YouTube’s ongoing problems surrounding content that involves children have led to an investigation from the federal government, according to a new Washington Post report.
The Federal Trade Commission is said to be investigating YouTube’s data collection practices and its failure to protect children, according to the Post. The investigation is reported to be in its late stages and was prompted by complaints from consumer groups and privacy advocates. It also follows numerous reports and investigations from publications over the last several months demonstrating how YouTube’s autoplay and recommendation features allow predators to take advantage of content on the platform featuring children. A spokesperson for the FTC declined to comment.
Policy makers have started to respond to the investigation. Sen. Edward Markey (D-MA) called the investigation “into YouTube’s treatment of children online” overdue in a press release issued yesterday, adding that “the company has yet to take the necessary steps to protect its youngest users.”
Both YouTube and Google executives, including respective CEOs Susan Wojcicki and Sundar Pichai, have accelerated efforts to find a solution to the growing issue. The company decided to close comments on the majority of videos starring children in February as a way to prevent predatory comments from spreading. It has also prohibited minors from livestreaming without an adult in the room.
One of the biggest requests that YouTube executives have received from policy makers, critics, and even some employees is to stop recommending videos that contain children. A YouTube spokesperson told The New York Times earlier this month that doing so would hurt creators. Instead, the company has limited “recommendations on videos that it deems as putting children at risk,” according to the Times.
Wojcicki and Pichai appear to be looking into stronger measures to solve the problem, according to multiple reports over the last week, including two investigations from Bloomberg and the Wall Street Journal. One possible solution, according to the Journal, is to move all kids content over to YouTube’s standalone app, YouTube Kids. A YouTube spokesperson previously told The Verge the company considers “lots of ideas for improving YouTube and some remain just that — ideas.”
Even that comes with its own issues, though. The app, although generally safer than the main YouTube platform, has faced an array of moderation challenges, including videos with graphic discussions of pornography and suicide, explicit sexual language in cartoons, and content modeling unsafe behaviors like playing with lit matches.
There’s also the issue of kids not using the app as much as the main site. Bloomberg’s report cited internal sources at YouTube who suggested that children “tend to shift over to YouTube’s main site before they hit 13.” Even though the app sees more than 20 million people a week, a company spokesperson told Bloomberg, that’s just a fraction of the number of views popular kids channels are getting on the main site.
Content featuring children is pervasive on the site, posing daunting moderation problems. Kids are often incorporated into family vlogging, a quickly growing sector; they star in their own channels unboxing toys or playing with friends; many even partner with popular vloggers — top creator Jake Paul often collaborates with a four-year-old named Tydus for his videos.
It would be difficult for YouTube to remove all videos with children and migrate them over to YouTube Kids without affecting the greater creator community. While protecting kids is a top priority for the company, dealing with creator backlash is something it has always had to weigh, too.