A study released this month by researchers at Boston University, Cyprus University of Technology, the University of Alabama at Birmingham, and Telefonica Research shows that YouTube is endangering its youngest viewers: its recommendation algorithms lead them from safe videos, such as Peppa Pig cartoons, to violent and otherwise inappropriate content.
As the researchers put it: "Hundreds of toddler-oriented channels on YouTube offer inoffensive, well-produced, and educational videos. Unfortunately, inappropriate (disturbing) content that targets this demographic is also common. YouTube's algorithmic recommendation system regrettably suggests inappropriate content because some of it mimics or is derived from otherwise appropriate content. Considering the risk for early childhood development, and an increasing trend in toddlers' consumption of YouTube media, this is a worrying problem."
Prior to this study, only anecdotal evidence was available: it was widely reported that YouTubers had found bizarre videos marketed to toddlers showing cartoon characters committing murder or in sexual situations. Now there is scientific evidence that YouTube's algorithms direct toddlers to these videos.
The paper summarizes its contributions as follows: "We develop a classifier able to detect toddler-oriented inappropriate content on YouTube with 82.8% accuracy, and we leverage it to perform a first-of-its-kind, large-scale, quantitative characterization that reveals some of the risks of YouTube media consumption by young children. Our analysis indicates that YouTube's currently deployed countermeasures are ineffective in terms of detecting disturbing videos in a timely manner." Using their classifier, the researchers also assessed how prominent the problem is on the platform, finding that young children are likely to encounter disturbing videos when they randomly browse YouTube starting from benign videos.
The study found that out of 133,806 videos, a whopping 8.6 percent are inappropriate for toddlers. It also found a 5.8 percent probability that a toddler viewing a safe video will be shown a top-ten recommendation for an unsafe video, and a 45 percent probability that the toddler will then click on it. It takes as few as 10 hops to get from a safe video to an inappropriate one, the researchers say.
YouTube also struggles to remove inappropriate videos in a timely manner, according to the study. Who could forget the infamous "Baby Dance" video about raping babies, which prompted a nationwide campaign for its removal? Even then, YouTube moved at a snail's pace. The study goes on to report that "alarmingly, the amount of the deleted disturbing and restricted videos, is significantly low."
The study's graphs of what surfaced from seemingly safe search terms are disturbing.
One counterintuitive finding was that "suitable videos have more dislikes than disturbing videos." The researchers concluded that "taken altogether, these findings show that the problem of toddler-related inappropriate videos on YouTube is not negligible and that there is a very real chance that a toddler will be recommended an inappropriate video when watching a nondisturbing video."
While YouTube should certainly fix this glaring hole in its algorithm, it is perhaps more troubling that some parents let their toddlers browse YouTube unsupervised in the first place.