A bizarre social media scandal has emerged on YouTube Kids over the past year and a half. Dubbed #Elsagate, it involves aberrant imagery in videos, combined with attempts to game YouTube's algorithms so that inappropriate content is presented to children. In November, YouTube took action to remove content, shut down accounts, and block ads from some of the biggest offenders. Yet many are still asking whether that is enough to end the threat to children.
A flood of inappropriate videos has appeared on YouTube Kids, a dedicated app that uses algorithms to filter out content not suitable for children. YouTube Kids is supposed to be a place where parents know their kids will be safe and can view age-appropriate content. Starting in 2016, parents noticed videos with nonsensical themes, part of what artist James Bridle referred to as “infrastructural violence.” The videos include crude renditions of familiar characters such as Spider-Man, The Joker, and Elsa from Frozen (hence the name). They might be animated, amateur claymation, or live-action videos acted out by adults and children in Halloween costumes. Often, the videos include music but no dialogue — reportedly an effective method of bypassing algorithms set up to shield children from exploitative themes.
To grasp the vastness of the issue, one first needs to grasp the sheer number of videos on YouTube Kids. James Bridle wrote a must-read article for any parent, traveling step by step through several levels of increasingly disturbing video content.
The first level consists of a seemingly endless series of nonsense videos. Nothing wrong, necessarily, but off. They appear to be created by bots or other automated sources that take advantage of keywords in the title (e.g. “Wrong Heads Disney Wrong Ears Wrong Legs Kids Learn Colors Finger Family 2017 Nursery Rhymes”). The heavy use of keywords helps push the videos to the top of search results.
The second level of disturbing videos consists of popular characters being put in upsetting situations, such as Peppa Pig crying uncontrollably while getting a shot at the doctor’s office, or eating her father, or drinking bleach. The emotional manipulation is obvious and disturbing for adults. One can only imagine the effect of this manipulation on the emotions of a young child. It’s almost akin to the scene in A Clockwork Orange where Alex, played by Malcolm McDowell, is brainwashed.
A third level consists of real-life families filming themselves doing things that are, well, weird. Bridle wrote of the Toy Freaks channel (since removed by YouTube) that showed a father and his daughters acting out various scenarios like punishment and sucking on pacifiers. According to a report by BuzzFeed, the girls were shown wetting themselves, screaming in fear, spitting up, and bathing.
This is not some dark corner of the internet where only fetishists troll. The Toy Freaks channel was the 63rd most popular channel on YouTube, with 8 million subscribers and almost SEVEN BILLION views. The monthly revenue generated by ad placements on this channel approached seven figures.
Then you get into the truly bizarre and disturbing, like the animated video “Batman Eat Sh*t! Sh*t Sticky face Elsa and mouth Anna” (since deleted). Again, these videos appear to be created by bots and use the keyword strategy mentioned above to exploit algorithms and push this type of video to the top of YouTube’s search results. Bots are also deployed to generate fake clicks on the videos, inflating the number of views and subscribers. (A bot is an automated program that performs repetitive tasks on the internet.)
There’s way more, but you get the idea.
As Ars Technica described, the keyword strategy is key to getting videos to auto-play:
The unnerving reality is that it’s possible that many of those views came from YouTube’s “up next” and “recommended” video section that appears while watching any video. YouTube’s algorithms attempt to find videos that you may want to watch based on the video you chose to watch first. If you don’t pick another video to watch after the current video ends, the “up next” video will automatically play. Since some of these inappropriate videos showed up on YouTube Kids (and on the main YouTube app as well), it’s possible that any one of them was an “up next” video that automatically played after hours of kids watching other appropriate yet categorically similar content.
YouTube has received enough complaints about these videos that it has started to take action. Since November, it has shut down over 270 accounts and deleted 150,000 videos. In addition, it has pulled ads from another two million videos and 50,000 channels.
BuzzFeed interviewed several content creators whose channels were recently shut down. One of them, a creator named Orgill, says that YouTube is responsible for encouraging the exponential growth of this content:
In one video from Orgill’s now-shut-down channel, a Spider-Man character is tied up to a tree; later in the footage, a real baby, along with other “character babies,” have their diapers “changed” simultaneously.
Orgill argues that the platform is responsible for encouraging what he found to be objectionable, sexual, and violent superhero content ostensibly oriented toward children. “YouTube blames it on these people that were doing it, but for a year their algorithm pushed this content,” he said. “People were doing it because it was creating millions and millions and millions of views. They created a monster.” (He later told BuzzFeed News that he thinks what the platform is doing is “totally understandable,” because “there were kids getting taken advantage of on YouTube.”)
In emails Orgill provided to BuzzFeed News, twelve videos from his now-suspended channel were deemed “suitable for all advertisers” in the month of November (at least two videos were deemed unsuitable for ads). Two videos were approved for ads as late as Nov. 23, just one day before his account was shut down.
The article quotes several content creators who insist that they were encouraged by YouTube to monetize their channels because the algorithms allowed for rapid growth in subscribers and viewers.
#Elsagate has gained significant momentum on social media as more of these reports come out.
If you have a young child that watches videos on YouTube, please take a minute to read this graphic. Even if you're just an aunt or uncle or friend of a person with a child, this is worth reading and spreading. #ElsaGate pic.twitter.com/OnKXk5rHrq
— ☯︎ (@jjonessss_) December 12, 2017
YouTube has reportedly strengthened its algorithms and added more human moderators to filter out inappropriate content from YouTube Kids. Additionally, age-restricted content will no longer be eligible for ads. Time will tell whether these changes are sufficient to shield children from inappropriate content on YouTube Kids. In the meantime, it’s probably a good idea to check your kid’s viewing history and settings.