Elsagate (derived from Elsa and the -gate scandal suffix) is a controversy surrounding videos on YouTube and YouTube Kids that were labelled as "child-friendly" but contained themes inappropriate for children. These videos often featured fictional characters from family-oriented media, sometimes via crossovers, used without legal permission. The controversy also included channels that focused on real-life children, such as Toy Freaks, which raised concerns about possible child abuse.
Most videos in this category were produced either with live action or Flash animation, but some used claymation or computer-generated imagery. The videos were sometimes tagged in such a way as to circumvent YouTube's child safety algorithms, and some appeared on YouTube Kids. These videos were difficult to moderate due to the large scale of YouTube.[1] In order to capture search results and attract attention from users, their buzzword titles and descriptions featured the names of the fictional characters, as well as keywords such as "education", "learn colors", and "nursery rhymes".[2][3][4] They also included automatically placed commercials, making them lucrative to their owners and YouTube.[2]
Public awareness of the phenomenon grew in late 2017. That year, after reports on child safety on YouTube by several media outlets, YouTube adopted stricter guidelines regarding children's content. In late November, the platform deleted channels and videos falling into the Elsagate category, along with many other inappropriate videos and user comments relating to children.[5] While these efforts curbed much of the older content, as of the 2020s similar videos and channels (this time utilizing video games popular with children) remained pervasively accessible to children on YouTube.[6][7]
In June 2016, The Guardian published an article about the channel Webs and Tiaras, which had been created in March of the same year. The channel showed people dressed as characters like Spider-Man, Elsa, and the Joker engaging in bizarre or nonsensical actions. The videos themselves had background music but no dialogue. The lack of dialogue meant that there was no language barrier on the videos, which would normally hinder worldwide distribution. The article also reported that several nearly identical channels, named Toy Monster, The Superheroes Life, and The Kids Club, had appeared on YouTube.[8]
In January 2017, one channel under the control of a YouTube partner in Vietnam, Spiderman Frozen Marvel Superhero Real Life, blocked its Vietnamese subscribers after complaints from parents regarding the content of its videos.[9] The channel's owner was later fined by Vietnamese authorities.[10]
The increase in the number of views led some to voice concerns that such channels were gaming the system by using bots or click farms to inflate viewing figures; however, there is no evidence for this.[8] In early February 2017, Tubefilter interviewed one of Webs And Tiaras' creators, Eric, who insisted that the team had "never used any bots or any other method to increase our views." Given the channel's substantial numbers, he said, "If we were not respecting the rules, YouTube would certainly have taken notice." He also said that his team was based in Canada, that there was no larger company behind his channels, and that they represented a grassroots project among friends.[11]
In February 2017, The Verge commented that "adults dressing up in costume and acting out weird, wordless skits has become a booming industry on the world's biggest video platform" and that while many videos were "puerile but benign", others featured more questionable content, such as scatological humor and violent or sexual situations. The article noted that most videos were made with a very limited budget and "a few Halloween costumes", which made them easy to produce and contributed to their multiplication. It also attributed their success to the frequent use of "Freudian concerns", which young children may find fascinating, amusing, or frightening, such as "peeing, pooping, kissing, pregnancy, and the terrifying notion of going to the doctor and getting a shot".[12]
Also in February, The Awl published an article on Webs and Tiaras and similar channels, describing their content as "nonsensically nightmarish" and "pretty twisted for children's content: some videos involve Elsa giving birth, and in some others, Spider-Man injects Elsa with a brightly colored liquid. You half expect the scenarios to be porn setups." In most videos, the like and dislike options were disabled, which made it difficult to gauge how many users were actually engaging with them. Many videos featured hundreds of positive comments written by similar channels in an apparent attempt to attract more clicks.[13]
In March, the BBC ran a piece titled "The disturbing YouTube videos that are tricking children". The article focused on a Peppa Pig imitation in which the titular character's teeth are painfully pulled out by a dentist, and on a video featuring said character burning down an occupied house. The article also mentioned the existence of "hundreds" of similar videos, ranging from unauthorized but otherwise harmless copies of authentic animations to frightening and gory content.[14]
CTV News also reported in March about YouTube's "fake toons problem", with adult-themed imitations of popular children's shows frequently appearing on YouTube Kids: "In some cases, the video will feature a kid-friendly thumbnail, while the video itself might be entirely different" and be very unsuitable for small children. The network commented that such videos were "often nightmares to behold, with lots of frightening scenes involving monsters and blood. Many of these videos venture into dark territory, with the characters often being chased, attacked, or injured in a bloody manner."[15]
The term "Elsagate" was coined on the Internet in 2017. During the summer of that year, it became a popular hashtag on Twitter as users called attention to the presence of such material on YouTube and YouTube Kids.[16] On Reddit, an Elsagate subreddit (r/ElsaGate) was created on June 23 to discuss the phenomenon, soon attracting tens of thousands of users.[17]
In November 2017, several newspapers published articles about the YouTube channel Toy Freaks, which had been created five years earlier by a single father named Greg Chism. Toy Freaks had 8.54 million subscribers and was among the top 100 most-viewed channels before it shut down that month. The channel often featured Chism's daughters, and in most cases showed them scared or crying.[18][19]
Several individuals, including the rapper B.o.B and commentary channel h3h3Productions, discussed Elsagate on social media during this time.[20]
On November 4, The New York Times published an article about the "startling" videos slipping past YouTube's filters and disturbing children, "either by mistake or because bad actors have found ways to fool the YouTube Kids' algorithms".[2] On November 6, author James Bridle wrote on Medium about his worry about videos aimed at scaring, hurting, and exploiting children. He said these videos were common on YouTube, and observed that many of them were difficult to classify: obvious parodies and imitations interacted with algorithm-driven content creation, producing videos that mixed up popular tropes, characters, and keywords, to the point that footage of real people resembled automated content.[3] On November 17, Internet commentator Philip DeFranco posted a video addressing the issue.[21]
The New York Times found that one of the channels featuring counterfeit cartoons, Super Zeus TV, was linked to a website called SuperKidsShop.com, registered in Ho Chi Minh City, Vietnam. A man working for SuperKidsShop.com confirmed that his partners were responsible for the videos, which "a team of about 100 people" was producing. Subsequent requests for an interview went unanswered.[2]
On November 9, members of the satirical sound collage group Negativland presented an episode of their weekly radio show Over the Edge dedicated to Elsagate. "'Modern Animal Kids'[22] threads Elsagate through a remix of three '90s episodes of Over the Edge which focused on media for children, all broadcast in the final years before Teletubbies pioneered marketing to the 6- to 18-month-old demographic".[23]
On November 22, BuzzFeed News published an article about unsettling videos that depict children in disturbing and abusive situations. The information in the article was gathered with the assistance of journalist and human rights activist Matan Uziel, whose investigation and report on the matter were sent to the FBI on September 22, informing its leadership about "tens of thousands of videos available on YouTube that we know are crafted to serve as eye candy for perverted, creepy adults and online predators to indulge in their child fantasies".[24]
On November 23, French-Canadian outlet Tabloïd released a video investigation about Toy Monster, a channel linked to Webs and Tiaras. They confronted the videos' creators – based out of the south shore of Quebec City – who refused to be interviewed. One of the actors featured in the videos anonymously stated that he was contractually obligated to refrain from commenting. The investigation revealed that identical content was being posted on numerous channels apparently operated by the same people.[25]
According to a report by China News Service, many Elsagate clips had been re-uploaded onto Chinese video platforms from YouTube, but others were created domestically, involving youngsters being beaten with rulers and children giving each other injections. The authorities announced in January 2018 that they would launch a nationwide campaign against the videos.[26] A creator told The Beijing News that his daughters appeared in a video he made, in which they lined up and were spanked with a ruler.[27]
Although YouTube initiated a crackdown on Elsagate content in 2017, videos hosting similar content have continued to be found on the website in the following years. Much of this content is based on video game IPs popular with children, such as Minecraft, Among Us, or Poppy Playtime, and is both marketed towards, and freely accessible to, children. While YouTube Kids disallows inappropriate content and is intended to steer children away from the main app, the efficacy of that method has been called into question.[6][7]
An investigation, published by Wired on 30 March 2021, found dozens of "disturbing" or "grotesque" animated videos targeting Minecraft and Among Us fans that were featured under YouTube's "Topics" or "hashtags" pages for the games. The magazine argued that these newer discoverability features lacked moderation, and allowed "opportunistic" channels to display questionable or inappropriate content. One livestream showcased an animated video of a bare-chested, female Minecraft avatar opening a present containing a poop emoji, as well as a thumbnail featuring two inflated breasts holding up a "poop Minecraft brick" [sic], with other inappropriate or disturbing Minecraft or Among Us-style thumbnails easily seen via the Topics or hashtags sections. Still, the magazine acknowledged the 2017 purge and stated that their findings did not represent "a direct Elsagate repeat", noting that these videos were not on YouTube Kids, that much of their shocking content was limited to the thumbnails only, and that many of the more obvious ways of targeting children had been challenged by the purge.[6]
In 2022, Newsweek also reported on channels and videos containing similarly inappropriate content or thumbnails, featuring characters from horror games such as Five Nights at Freddy's and Poppy Playtime that were popular with children. The magazine described suggestive thumbnails, as well as content that featured drugs (including date rape drugs) and violence, such as murder, school shootings, or physical abuse towards children and female characters. YouTube responded, stating that "the content shared has not been found in the YouTube Kids app which is our recommended experience for children under 13." However, Sonia Livingstone, professor at the London School of Economics and expert on children's digital safety, argued that this was insufficient, given that the main app still had the content in question accessible to children, with poor moderation, labeling, and "age-gating". Newsweek further contended that, while YouTube encourages creators to age-restrict content intended for older users and disallows content that targets minors while displaying inappropriate themes, none of the videos they found had been age-restricted.[7]
With the advent of AI video generators, new variants of Elsagate content have been created using artificial intelligence. A common motif among these videos is the use of cats.[28]
The New York Times quoted pediatrics professor Michael Rich, who considered it upsetting that "characters [children] thought they knew and trusted" were shown behaving in an improper or violent manner.[2]
In response to the controversy, on 1 June 2017, YouTube changed its guidelines, specifically banning depictions of children's characters in inappropriate situations. That ban had little effect, however, and in the months that followed, the subreddit r/ElsaGate became a place for amateur investigators to raise the alarm on videos that slipped through YouTube's moderation system.[29]
In August 2017, YouTube announced its new guidelines on content and monetization. In an ongoing series of efforts to demonetize controversial and offensive videos, it was announced that creators would no longer be able to monetize videos that "made inappropriate use of family-friendly characters". In November of the same year, it announced that it would implement "a new policy that age restricts this content in the YouTube main app when flagged".[30]
The controversy extended to channels that featured not necessarily children's characters but actual children, who sometimes performed inappropriate or dangerous activities under the guidance of adults. As part of a broader action, YouTube terminated the channel Toy Freaks, which featured a father (Greg Chism) and his two daughters in potentially abusive situations.[31][32][33] Chism was subsequently investigated by child-protection officials in Illinois and Missouri for alleged child abuse.[18][34][35] In December 2017, authorities announced that Chism would not face criminal charges.[36] Before its removal, the channel had over 8.5 million subscribers.[31][32][33]
It was also revealed in the media that many videos featuring minors – frequently uploaded by the children themselves and showing innocent content – had attracted comments from pedophiles and other groups. Some of these videos were monetized. As a result of the controversy, several major advertisers froze spending on YouTube, forcing YouTube to ban children from its site, citing legal obligations.[37][38][39]
On November 22, 2017, YouTube announced that it had deleted over 50 channels and thousands of videos that did not fit the new guidelines.[40] On November 27, the company said in a statement to BuzzFeed News that it had "terminated more than 270 accounts and removed over 150,000 videos", "turned off comments on more than 625,000 videos targeted by child predators" and "removed ads from nearly 2 million videos and over 50,000 channels masquerading as family-friendly content".[41] Forbes contributor Dani Di Placido wrote that many problematic videos could still be seen on the platform, and that "the sheer volume of videos hastily deleted from the site prove that YouTube's algorithms were utterly ineffective at protecting young children".[37]
In December 2017, as advertisers pulled ads, YouTube CEO Susan Wojcicki announced new moderation guidelines, removing inappropriate children's content and bringing the total number of moderators up to 10,000.[29]