YouTube Under FTC Investigation for Child Privacy Violations

Published on June 25, 2020

MOUNTAIN VIEW, Calif. — Last week, The Washington Post reported that the Federal Trade Commission (FTC) is in the advanced stages of an investigation into how YouTube handles videos aimed at children.

According to two individuals involved with the investigation, the inquiry stems from complaints by parents and consumer groups that YouTube has collected data belonging to many of its younger users. The complaints also allege that the video giant has allowed harmful and adult content to surface in searches for children’s content.

The FTC and Facebook

The investigation comes at an awkward time, as the FTC is poised to announce a settlement with Facebook over the social network’s mishandling of user data and potential violations of a 2011 consent decree with the agency stemming from previous privacy violations. The social media giant was also recently involved in a copyright infringement lawsuit with a home-design company.

This year alone, the FTC has turned closer attention to child privacy. Remember back in February, when Musical.ly, now known as TikTok, was fined a record $5.7 million by the FTC for violating child privacy laws? In March, the service fell under further public scrutiny over claims that it was deleting accounts…by mistake. YouTube faced a similar issue with one of its users, Life With Mak, a child creator with over 1.5 million subscribers who recently left the platform.

Under its settlement with the FTC, TikTok now directs users under 13 to a separate app experience that doesn’t allow them to share videos. To determine which users can access the regular interface and which must be redirected, TikTok has implemented an “age check” system that requires users to verify their birthdays.
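
As a rough, hypothetical sketch of how a birthday-based age gate of this kind might work (the cutoff constant, function names, and routing labels below are illustrative assumptions, not TikTok’s or YouTube’s actual code):

```python
from datetime import date

COPPA_AGE_CUTOFF = 13  # COPPA covers children under 13

def years_old(birthday: date, today: date) -> int:
    # Count full years, accounting for whether this year's birthday has passed.
    had_birthday = (today.month, today.day) >= (birthday.month, birthday.day)
    return today.year - birthday.year - (0 if had_birthday else 1)

def route_user(birthday: date, today: date) -> str:
    # Under-13 users are sent to the restricted, no-sharing experience.
    if years_old(birthday, today) < COPPA_AGE_CUTOFF:
        return "restricted_kids_experience"
    return "standard_experience"

# Example: a user born May 1, 2010 is routed to the restricted experience.
print(route_user(date(2010, 5, 1), date(2020, 6, 25)))  # restricted_kids_experience
```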

For the past two years, Twitter has been cleaning up its platform, removing fake bot accounts, extremist groups, and others who pose a threat to younger audiences. While Twitter wasn’t responsible for creating the apps that allegedly violated COPPA, it was still marketing those apps through its platform, according to a 2018 statement from the New Mexico Attorney General’s office.

The Children’s Online Privacy Protection Act

However, one must wonder whether these social media giants are also part of the problem by allowing users under 18 onto their systems in the first place. YouTube’s main site and mobile application are designed for viewers 13 and older, while the company directs younger children to the YouTube Kids app, which contains a filtered set of videos from the main site.

But should those age thresholds be reconsidered in light of the type of content that appears on YouTube’s main site and app versus YouTube Kids? YouTube’s distinction between its main service and YouTube Kids matters because of the rules governing disclosure and parental consent.

The Children’s Online Privacy Protection Act, or COPPA, is a federal law passed by Congress that puts parents in the driver’s seat when it comes to the information websites collect about their kids under age 13. Congress granted the FTC the authority to issue and enforce the COPPA Rule, which has been in effect since 2000.

The law applies to operators of commercial websites and online services directed at children under the age of 13 that collect personal information. Under the 2013 revisions, COPPA also applies to operators, including third parties such as ad networks, when they have “actual knowledge” that they are collecting personal information from children under 13.

Should YouTube Be Doing More?

Since the complaints, YouTube has been considering significant changes to how it handles children’s videos and the content around them, especially with respect to how its algorithms surface and recommend those videos.

“We consider lots of ideas for improving YouTube, and some remain just that—ideas. Others, we develop and launch, like our restrictions to minors live streaming or updated hate speech policy,” Andrea Faville, a YouTube spokeswoman, said in a statement to The New York Times.

YouTube has declined to comment further on the investigation.

Unfortunately, YouTube has struggled to keep inappropriate content away from children’s videos, in part because of the sheer volume of video uploaded to the platform. This is where something like Europe’s new intellectual property laws could come into play, shifting the burden away from content creators and onto the platform itself.

Back in February, YouTube came under fire after a video was uploaded documenting how pedophiles use the comments on videos of children to guide other predators. After facing public backlash, the company said it would disable comments on most videos featuring children under the age of 13.

Given the skepticism toward big tech giants throughout Silicon Valley, many believe FTC action and enforcement against Google/YouTube and Facebook would be both appropriate and extremely timely.

The question remains: should YouTube be doing more?

Andrew "Drew" Rossow is a former contract editor at Grit Daily.
