Deplatforming, also known as no-platforming, is a boycott of an individual or group by removing the platforms used to share their information or ideas.[1] The term is commonly associated with social media.
As early as 2015, platforms such as Reddit began to enforce selective bans based, for example, on terms of service that prohibit "hate speech".[2] A famous example of deplatforming was Twitter's ban of then-US President Donald Trump shortly after the January 6 United States Capitol attack.[3]
In the United States, the banning of speakers on university campuses dates back to the 1940s and was carried out under the policies of the universities themselves. The University of California had a policy known as the Speaker Ban, codified in university regulations under President Robert Gordon Sproul, that mostly, but not exclusively, targeted communists. One rule stated that "the University assumed the right to prevent exploitation of its prestige by unqualified persons or by those who would use it as a platform for propaganda." This rule was used in 1951 to block Max Shachtman, a socialist, from speaking at the University of California at Berkeley. In 1947, former U.S. Vice President Henry A. Wallace was banned from speaking at UCLA because of his views on U.S. Cold War policy,[4] and in 1961, Malcolm X was prohibited from speaking at Berkeley as a religious leader.
Controversial speakers invited to appear on college campuses have faced deplatforming attempts to disinvite them or to otherwise prevent them from speaking.[5] The British National Union of Students established its No Platform policy as early as 1973.[6] In the mid-1980s, visits by South African ambassador Glenn Babb to Canadian college campuses faced opposition from students opposed to apartheid.[7]
In the United States, recent examples include the March 2017 disruption by protesters of a public speech at Middlebury College by political scientist Charles Murray.[5] In February 2018, students at the University of Central Oklahoma rescinded a speaking invitation to creationist Ken Ham after pressure from an LGBT student group.[8][9] In March 2018, a "small group of protesters" at Lewis & Clark Law School attempted to stop a speech by visiting lecturer Christina Hoff Sommers.[5] In the 2019 film No Safe Spaces, Adam Carolla and Dennis Prager documented their own disinvitations, along with those of others.[10]
As of February 2020, the Foundation for Individual Rights in Education, a speech advocacy group, had documented 469 disinvitation or disruption attempts at American campuses since 2000,[11] including both "unsuccessful disinvitation attempts" and "successful disinvitations"; the group defines the latter category as including three subcategories: formal disinvitation by the sponsor of the speaking engagement; the speaker's withdrawal "in the face of disinvitation demands"; and "heckler's vetoes" (situations when "students or faculty persistently disrupt or entirely prevent the speakers' ability to speak").[12]
Beginning in 2015, Reddit banned several communities on the site ("subreddits") for violating the site's anti-harassment policy.[13] A 2017 study published in the journal Proceedings of the ACM on Human-Computer Interaction, examining "the causal effects of the ban on both participating users and affected communities," found that "the ban served a number of useful purposes for Reddit" and that "Users participating in the banned subreddits either left the site or (for those who remained) dramatically reduced their hate speech usage. Communities that inherited the displaced activity of these users did not suffer from an increase in hate speech."[13] In June 2020 and January 2021, Reddit also issued bans to two prominent online pro-Trump communities over violations of the website's content and harassment policies.
On May 2, 2019, Facebook and the Facebook-owned platform Instagram announced a ban of "dangerous individuals and organizations" including Nation of Islam leader Louis Farrakhan, Milo Yiannopoulos, Alex Jones and his organization InfoWars, Paul Joseph Watson, Laura Loomer, and Paul Nehlen.[14][15] In the wake of the 2021 storming of the US Capitol, Twitter banned then-president Donald Trump, as well as 70,000 other accounts linked to the event and the far-right movement QAnon.
Some studies have found that the deplatforming of extremists reduced their audience, although other research has found that some content creators became more toxic following deplatforming and migration to alt-tech platforms.[16]
On November 18, 2022, Elon Musk, as newly appointed CEO of Twitter, reopened previously banned Twitter accounts of high-profile users, including Kathy Griffin, Jordan Peterson, and The Babylon Bee, as part of the new Twitter policy.[17][18] As Musk tweeted, "New Twitter policy is freedom of speech, but not freedom of reach".
On August 6, 2018, Facebook, Apple, YouTube and Spotify removed all content by Jones and InfoWars for policy violations. YouTube removed channels associated with InfoWars, including The Alex Jones Channel.[19] On Facebook, four pages associated with InfoWars and Alex Jones were removed over repeated policy violations. Apple removed all podcasts associated with Jones from iTunes.[20] On August 13, 2018, Vimeo removed all of Jones's videos because of "prohibitions on discriminatory and hateful content".[21] Facebook cited instances of dehumanizing immigrants, Muslims and transgender people, as well as glorification of violence, as examples of hate speech.[22][23] After InfoWars was banned from Facebook, Jones used another of his websites, NewsWars, to circumvent the ban.[24][25]
Jones's accounts were also removed from Pinterest,[26] Mailchimp,[27] and LinkedIn.[28] As of early August 2018, Jones retained active accounts on Instagram,[29] Google+,[30] and Twitter.[31][32]
In September 2018, Jones was permanently banned from Twitter and Periscope after berating CNN reporter Oliver Darcy.[33][34] On September 7, 2018, the InfoWars app was removed from the Apple App Store for "objectionable content".[35] He was banned from using PayPal for business transactions, having violated the company's policies by expressing "hate or discriminatory intolerance against certain communities and religions."[36] After Elon Musk's purchase of Twitter, several previously banned accounts were reinstated, including those of Donald Trump, Andrew Tate, and Ye, prompting speculation that Jones would be unbanned as well. Musk, however, said that Jones would not be reinstated, criticizing him as a person who "would use the deaths of children for gain, politics or fame".[37]
InfoWars remained available on Roku devices in January 2019, a year after the channel's removal from multiple streaming services. Roku indicated that it does not "curate or censor based on viewpoint" and that it had policies against content that is "unlawful, incited illegal activities, or violates third-party rights," but that InfoWars was not in violation of these policies. Following a social media backlash, Roku removed InfoWars and stated, "After the InfoWars channel became available, we heard from concerned parties and have determined that the channel should be removed from our platform."[38][39]
In March 2019, YouTube terminated the Resistance News channel due to its reuploading of live streams from InfoWars.[40] On May 1, 2019, Jones was barred from using both Facebook and Instagram.[41][42][43] Jones briefly moved to Dlive, but was suspended in April 2019 for violating community guidelines.[44]
In March 2020, the InfoWars app was removed from the Google Play store due to claims of Jones disseminating COVID-19 misinformation. A Google spokesperson stated that "combating misinformation on the Play Store is a top priority for the team" and that apps that violate Play policy by "distributing misleading or harmful information" are removed from the store.[45]
On January 6, 2021, during a joint session of the United States Congress, the counting of the votes of the Electoral College was interrupted by a breach of the United States Capitol chambers. The rioters were supporters of President Donald Trump who hoped to delay and overturn the President's loss in the 2020 election. The event resulted in five deaths and at least 400 people being charged with crimes.[46] The certification of the electoral votes was completed only in the early morning hours of January 7, 2021. In the wake of several tweets by President Trump on January 7, 2021, Facebook, Instagram, YouTube, Reddit, and Twitter all deplatformed Trump to some extent.[47][48][49][50] Twitter deactivated his personal account, which the company said could be used to promote further violence. Trump subsequently tweeted similar messages from the President's official US government account, @POTUS, after which Twitter announced on January 8 that his ban from the platform would be permanent.[51]
Trump planned to return to social media through a new platform by May or June 2021, according to Jason Miller on a Fox News broadcast.[52][53]
The same week Musk announced Twitter's new freedom-of-speech policy, he tweeted a poll asking whether Trump's account should be reinstated.[54] The poll ended with 51.8% in favor of unbanning Trump.[54] Twitter reinstated Trump's account, @realDonaldTrump, on November 19, 2022, though by then Trump had moved to his own platform, Truth Social.[54][55]
In 2017, Andrew Tate was banned from Twitter for tweeting that women should "bare some responsibility" in response to the #MeToo movement.[56] In August 2022, Tate was banned from four more major social media platforms: Instagram, Facebook, TikTok, and YouTube.[56] These platforms indicated that Tate's misogynistic comments violated their hate speech policies.[57]
Tate has since been unbanned from Twitter under the platform's new freedom-of-speech policy.[57]
Social media platforms such as YouTube and Instagram allow their content producers or influencers to earn money from the content they create (videos, images, etc.), most typically through payment per a set number of new "likes", clicks, or views. When content is deemed inappropriate for compensation but is still left on the platform, this is called "demonetization": the content producer receives no compensation for the content they created, while the content itself remains available for the general public to view or listen to.[58] In September 2016, Vox reported on how demonetization applied to YouTube specifically.
Deplatforming tactics have also included attempts to silence controversial speakers through various forms of personal harassment, such as doxing,[59] the making of false emergency reports for purposes of swatting,[60] and complaints or petitions to third parties. In some cases, protesters have attempted to have speakers blacklisted from projects or fired from their jobs.[61]
In 2019, students at the University of the Arts in Philadelphia circulated an online petition demanding that Camille Paglia "should be removed from UArts faculty and replaced by a queer person of color." According to The Atlantic's Conor Friedersdorf, "It is rare for student activists to argue that a tenured faculty member at their own institution should be denied a platform." Paglia, a tenured professor for over 30 years who identifies as transgender, had long been unapologetically outspoken on controversial "matters of sex, gender identity, and sexual assault".[62]
In December 2017, after learning that a French artist it had previously reviewed was a neo-Nazi, the San Francisco punk magazine Maximum Rocknroll apologized and announced that it had "a strict no-platform policy towards any bands and artists with a Nazi ideology".[63]
In May 2021, the UK government under Boris Johnson announced a Higher Education (Freedom of Speech) Bill that would allow speakers at universities to seek compensation for no-platforming, impose fines on universities and student unions that promote the practice, and establish a new ombudsman charged with monitoring cases of no-platforming and academic dismissals.[64] In addition, the government published an Online Safety Bill that would prohibit social media networks from discriminating against particular political views or removing "democratically important" content, such as comments opposing or supporting political parties and policies.[65]
Some critics of deplatforming have proposed that governments should treat social media as a public utility to ensure that constitutional rights of the users are protected, citing their belief that an Internet presence using social media websites is imperative in order to adequately take part in the 21st century as an individual.[66] Republican politicians have sought to weaken the protections established by Section 230 of the Communications Decency Act, which provides immunity from liability for providers and users of an "interactive computer service" who publish information provided by third-party users, under allegations that the moderation policies of major social networks are not politically neutral.[67][68][69][70]
According to its defenders, deplatforming has been used as a tactic to prevent the spread of hate speech and disinformation.[13] Social media has evolved into a significant source of news reporting for its users, and support for content moderation and banning of inflammatory posters has been defended as an editorial responsibility required by news outlets.[71]
Supporters of deplatforming have justified the action on the grounds that it produces the desired effect of reducing what they characterize as hate speech.[13][72][73] Angelo Carusone, president of the progressive organization Media Matters for America, who had run deplatforming campaigns against conservative talk hosts Rush Limbaugh in 2012 and Glenn Beck in 2010, pointed to Twitter's 2016 ban of Milo Yiannopoulos, stating that "the result was that he lost a lot.... He lost his ability to be influential or at least to project a veneer of influence."[72]
In the United States, the argument that deplatforming violates rights protected by the First Amendment is sometimes raised as a criticism. Proponents say that deplatforming is a legal way of dealing with controversial users online or in other digital spaces, so long as the government is not involved with causing the deplatforming. According to Audie Cornish, host of the NPR show Consider This, "the government can't silence your ability to say almost anything you want on a public street corner. But a private company can silence your ability to say whatever you want on a platform they created."[74]
In the words of technology journalist Declan McCullagh, "Silicon Valley's efforts to pull the plug on dissenting opinions" began around 2018 with Twitter, Facebook, and YouTube denying service to selected users of their platforms; he said they devised "excuses to suspend ideologically disfavored accounts".[75] In 2019, McCullagh predicted that paying customers would become targets for deplatforming as well, citing protests and open letters by employees of Amazon, Microsoft, Salesforce, and Google who opposed policies of U.S. Immigration and Customs Enforcement (ICE), and who reportedly sought to influence their employers to deplatform the agency and its contractors.[75]
Law professor Glenn Reynolds dubbed 2018 the "Year of Deplatforming" in an August 2018 article in The Wall Street Journal. Reynolds criticized the decision of "internet giants" to "slam the gates on a number of people and ideas they don't like", naming Alex Jones and Gavin McInnes.[76] Reynolds cited further restrictions on "even mainstream conservative figures" such as Dennis Prager, as well as Facebook's blocking of a campaign advertisement by a Republican candidate "ostensibly because her video mentioned the Cambodian genocide, which her family survived."[76]
In a 2019 The Atlantic article, Conor Friedersdorf described what he called "standard practice" among student activists. He wrote: "Activists begin with social-media callouts; they urge authority figures to impose outcomes that they favor, without regard for overall student opinion; they try to marshal antidiscrimination law to limit freedom of expression."[62] Friedersdorf pointed to evidence of a chilling effect on free speech and academic freedom. Of the faculty members he had contacted for interviews, he said a large majority "on both sides of the controversy insisted that their comments be kept off the record or anonymous. They feared openly participating in a debate about a major event at their institution—even after their university president put out an uncompromising statement in support of free speech."[62]
Two of the videos featured anti-Muslim content, including one in which Jones claimed that Muslims had invaded Europe. Another was flagged for anti-transgender content in which Jones appeared to threaten transgender people. The fourth showed an adult man and a young boy engaged in a physical altercation under the title "How To Prevent Liberalism."