
Wikipedia's volunteer editor community has the responsibility of fact-checking Wikipedia's content.[1] Its aim is to curb the dissemination of misinformation and disinformation by the website.
Wikipedia is considered one of the major free content websites, where millions can read, edit, and document what reliable sources say about millions of topics, for free. Wikipedia therefore strives to provide its readers with well-verified sources. Meticulous fact-checking is an aspect of the broader reliability of Wikipedia.
Various academic studies about Wikipedia and the body of criticism of Wikipedia seek to describe the limits of Wikipedia's reliability, document who uses Wikipedia for fact-checking and how, and examine what consequences result from this use. Wikipedia articles can be of poor quality in many ways, including self-contradiction.[2] Such articles require improvement.
Large platforms including YouTube[3] and Facebook[4] use Wikipedia's content to confirm the accuracy of the information in their own media collections.
Wikipedia serves as a public resource for access to accurate information. The COVID-19 pandemic, for example, was an important topic on which people relied on Wikipedia for trustworthy information.[5] Seeking public trust is a major part of Wikipedia's publication philosophy.[6] Various reader polls and studies have reported public trust in the English Wikipedia's process for quality control.[6][7] In general, the public uses Wikipedia to counter fake news.[8]

At the 2018 South by Southwest conference, YouTube CEO Susan Wojcicki announced that YouTube was using Wikipedia to fact-check videos that YouTube hosts.[3][9][10][11] No one at YouTube had consulted anyone at Wikipedia about this development, and the news came as a surprise at the time.[9] The intent was for YouTube to use Wikipedia to counter the spread of conspiracy theories[9] by adding information boxes beneath videos on topics that attract conspiracy theorists.[citation needed]
Facebook uses Wikipedia in various ways. Following criticism of Facebook in the context of fake news around the 2016 United States presidential election, Facebook recognized that Wikipedia already had an established process for fact-checking.[4] Facebook's subsequent strategy for countering fake news included using content from Wikipedia for fact-checking.[4][12] In 2020, Facebook began to include information from Wikipedia's infoboxes in its own general reference knowledge panels to provide objective information.[13]
Mike Caulfield and Sam Wineburg present an approach to fact-checking as a type of media literacy, suggesting that information seekers emphasize lateral reading (skimming multiple reliable sources rather than thoroughly examining one), including by using Wikipedia as a starting point for learning about a topic.[14]
In her 2024 book, Renée DiResta advised victims of rumors, misinformation, or disinformation to ensure that factual information is available online, including on Wikipedia, especially in an era when AI chatbots often rely on Wikipedia for information.[15]
Fact-checking is one aspect of the general editing process on Wikipedia. The volunteer community develops processes for referencing and fact-checking through community groups such as WikiProject Reliability.[8] Wikipedia has a reputation for cultivating a culture of fact-checking among its editors.[16] Wikipedia's fact-checking process depends on the activity of its volunteer community of contributors, who numbered 200,000 as of 2018.[1]
The development of fact-checking practices is ongoing in the Wikipedia editing community.[6] One development that took years was the 2017 community decision to declare a particular news source, the Daily Mail, generally unreliable as a citation for verifying claims.[6][17] Through strict guidelines on verifiability, Wikipedia has been combating misinformation.[18] According to Wikipedia guidelines, all articles in Wikipedia's "mainspace" must be verifiable.[19]
When Wikipedia experiences vandalism, platforms that reuse Wikipedia's content may republish that vandalized content.[20] In 2016, journalists described how vandalism on Wikipedia undermines its use as a credible source.[21]
Wikipedia prohibits vandalism. The website suggests these steps for beginners handling vandalism: assess, revert, warn, watch, and finally report.[22]
In 2018, Facebook and YouTube were major users of Wikipedia for its fact-checking functions, but those commercial platforms were not contributing in any way to Wikipedia's free, nonprofit operations.[20]