Wikipedia in the press
Since its inception in 2001, Wikipedia has garnered substantial media attention. The following is a list of the project's press coverage received in 2025, sorted chronologically. Per WP:PRESS, this page excludes coverage that deals exclusively with a single Wikipedia article; coverage of (some aspect of) the project overall is wanted.
Joe Truzman, an expert on Palestinian militant groups with the Foundation for Defense of Democracies think tank, said the latest online controversy highlights the often inaccurate nature of Wikipedia's entries about Israel.
His first project was to try and add a Wikipedia page for every airport in Canada, no matter how small. Then he moved on to weather monitoring equipment. He was made an administrator after just months as an editor. He said that now, the process is quite rigorous, but at the time, it was pretty informal.
The Heritage Foundation plans to 'identify and target' volunteer editors on Wikipedia who it says are 'abusing their position' by publishing content the group believes to be antisemitic.
"The slideshow says the group's 'targeting methodologies' would include creating fake Wikipedia user accounts to try to trick editors into identifying themselves by sharing personal information or clicking on malicious tracking links that can identify people who click on them. It is unclear whether this has begun," according to the Forward.
This is how you get every Wikipedia article vaguely mentioning Israel becoming a catalogue of anti-Zionism.
Whatever would they do with editors' personal information? Send them fruit baskets, surely!
What did Wikipedia do with that $31.2 million? It gave much of it out as grant money. In the Middle East, it funded the Arab Reports for Investigative Journalism, which helps train journalists in Arab countries. In Brazil, it funded the InternetLab, which spent the money researching racial disparities and internet access in the country. In North America, it gave money to the Racial Equity in Journalism Fund, which helped fund local newsgathering.
And this must be said: The cascade of newish terms ending in "cide" — "scholasticide," "educide" and "domicide" — all have one thing in common on Wikipedia: The current war in Gaza is always used as an example.
The Heritage Foundation launched a new effort to track down volunteer editors on Wikipedia who are publishing what they believe to be antisemitic content.
By creating and deploying a new tool called INFOGAP, the researchers used artificial intelligence to look at how biographical information about LGBT people is presented across the English, Russian, and French versions of Wikipedia and found inconsistencies in how they are portrayed.
The pressure on Wikipedia comes as the online, user-generated encyclopedia has been under fire from some Jewish organizations for its coverage of the war in Gaza and for labeling the Anti-Defamation League a "generally unreliable source" on the war. Wikipedia says the ADL is a pro-Israel activist group that declares nearly any criticism of Jews or Israel antisemitic.
Speaking to The Journal, Uí Ríordáin – who is a full-time employee at Wikipedia Community Ireland – said that Vicipéid has helped to boost the confidence of students and other non-native speakers in their ability to use Irish.
The most viewed article was the one about Oleksandr Syrsky, the commander of the Armed Forces of Ukraine, which was viewed over 1.1 million times. In second place is an article about Ukraine, and rounding out the top three is an article about Ukrainian boxer Oleksandr Usyk.
The differences between Facebook, Instagram and Wikipedia are as vast as the Gulf of America, though: the goal of Wikipedia is to compile and spread accurate information. That is not the aim of Facebook, Instagram, or any other social network, and has never been one of its strong points.
...the first non-sponsored, non-AI search result: an entry from one of the most reliable places on the internet, Wikipedia. If I'd written that last sentence at the start of my career, no editor would have allowed it into print. You can't trust something that anyone can edit, the thinking went, and so it became a bad word in journalism and academia. Don't cite it; don't even look at it. Or if you do, for God's sake, don't let anyone see you.
Aravind Srinivas criticizes Wikipedia's neutrality, calling for unbiased alternatives.
Beyond its functional value, Wikipedia has become a cultural phenomenon. It has inspired memes, informed countless debates, and even become a trusted companion for breaking news. Its transparent editing history also provides a unique window into how society's understanding of events evolves over time.
Users on X were quick to react to the website turning 24.
It's always fascinating to me how, by and large, every page is so much better than I think it will be. Really high-energy, maybe "truth optional" pages for topics like cryptocurrencies. It's a small miracle that they're as orderly and reasonable as they end up being.
According to Bader, so far, no structured French group is working on Palestine, unlike the "other side" namely Israel, which is said to be "organized." On the English Wikipedia, the pro-Palestinians are "more numerous, they have been attacked, they have been targeted, but they are successful," Bader specifies. [Google translate]
Wikipedia's Arbitration Committee (ArbCom), which is the site's version of the Supreme Court, is on the verge of issuing indefinite topic bans to eight editors involved in the Israel-Palestine topic area, most of them anti-Israel editors.
Multiple anti-Israel Wikipedia editors are likely to be topic-banned after spreading misinformation and hate across the site, the Anti-Defamation League (ADL) announced on Friday.
On the side of Wikimedia France, whose ideological orientations Le Figaro has already reported on, murky links have existed for several years between members of the board of directors and the association "les sans pagEs", which pays employees to produce Wikipedia content on people belonging to groups identified as minorities and underrepresented in the encyclopedia. [Google translate]
You could just hit "random page" and not get bored for at least as long as it takes to convince your boss that you're working.
"Since legacy media propaganda is considered a "valid" source by Wikipedia, it naturally simply becomes an extension of legacy media propaganda!", he wrote on X.
"Wikipedia is completely ideologically captured. Deserves $0 in donations until they re-balance", the author wrote. After seeing the post, Musk chimed in demanding action taken against Wikipedia. "Defund Wikipedia until balance is restored!", he wrote.
"Defund Wikipedia until balance is restored!" Musk wrote on X. "Stop donating to Wokepedia until they restore balance to their editing authority." However, Wales replied to Musk, defending the site and criticizing the tech CEO for stirring up anger.
The editor of the blog "The Wikipedia Flood," however, said the topic bans are barely enough, especially given the large number of editors involved, and the fact that so many of those involved were not part of the disciplinary action.
The emerging coalition of MAGA supporters and Silicon Valley's tech bros and venture capitalists has found a new shared target to set its sights on: the world's largest free encyclopedia.
While Musk's animosity towards Wikipedia may focus outwardly on the hand gesture, Wikipedia's goal of factual neutrality makes it a natural adversary to X — a platform increasingly synonymous with heated culture wars, hate speech and disinformation. Wikipedia and the media at large — which Musk has increasingly criticized — also pose a threat by holding him accountable as he thrusts himself into the center of U.S. politics.
Eight Wikipedia editors accused of disruptive behavior have been barred from making changes to articles on the Israeli-Palestinian conflict, following a ruling issued Thursday by the crowd-sourced encyclopedia's highest oversight body.
She said her Wikipedia editing journey began back in 2018 with learning about what was considered notable on the site and identifying where there were gaps which really needed to be filled.
Bussigny told me ... "However, I was keen to reveal and prove through this immersion that fiercely anti-Zionist organized networks are targeting the online encyclopedia, which is extremely popular in France, especially among pupils and students who may be unknowingly influenced."
The behavior of anti-Israel Wikipedia editors has been in the news lately.
Yet Wikipedia remains one of the few major platforms where political debates can take place in a way that is both intense and calm, even on the most controversial subjects, Vermeirsche noted.
However, both languages benefit from the visibility and preservation opportunities provided by the platform. Qualitative content analysis demonstrated that both Wikipedia editions contain a mix of cultural, historical, and contemporary topics.
Might Wikipedia, the ever-evolving online compendium of human knowledge, become the latest target in the new administration's crackdown on public sources of information?
The bottom line: 84% of Left-leaning outlets have Wikipedia's stamp of approval, while 0% of right-leaning outlets even get a wink from the tech giant.
Wikipedia warns that if "no such source exists, that may suggest that the information is inaccurate." In other words, the only media reports that are considered trustworthy are those reported by leftist, legacy media.
It's no secret that Wikipedia's volunteer editors are predominantly ideological myopes favorable to leftist causes, ideas, and personalities and antipathetic to conservatives of various stripes.
It seems that both the CCP and Heritage believe that if you can't win an argument in the digital space of Wikipedia, it's fair game to destroy that person's life offline.
The source blacklist has zero to do with accuracy and everything to do with shutting down any journalist who doesn't bend the knee to the left. And stifling any discourse not approved by progressive would-be overlords in biz and government and the NGO sector. In other words, Wikipedia is engaged in an actual disinformation op.
I think that there's this political-industrial complex right now where everything is being politicized, right? And the right wing has an interest in portraying Wikipedia as left-wing and a kind of liberal media. ... But if I had to guess, I think it's going to get worse before it gets better in terms of partisan rhetoric about Wikipedia.
Wikipedia is certainly not immune to bad information, disagreement, or political warfare, but its openness and transparency rules have made it a remarkably reliable platform in a decidedly unreliable age. Evidence that it's an outright propaganda arm of the left, or of any political party, is thin.
The Media Research Center, a conservative organisation, released a report on the free online encyclopedia's list of "reliable sources". The report said that all the US news sites the centre categorised as right-leaning had failed to meet Wikipedia's criteria as a trusted resource for administrators.
Even Wikipedia recognises the gravity of the situation its contributors in Belarus now face, to the extent that they have overridden their own protocols and deleted the entire edit history for Belarus-related articles that could land its users in trouble.
As you scroll through the 2020s, though, you'll notice that the pages keep going: 2026, 2027, 2028 and so on. The reliably dull Wikipedia interface remains unchanged, even as recorded history cedes to speculative history.
In a series of calls and letters to the Wikimedia community over the last two weeks, Wikimedia executives have told editors that they are trying to figure out how to keep their users safe in an increasingly hostile political environment.
Many worry that Wikipedia contributors could be targeted next. According to documents obtained by the independent news organization Forward, the Heritage Foundation, a conservative think-tank responsible for Project 2025, wants to "use facial recognition software and a database of hacked usernames and passwords in order to identify contributors to the online encyclopedia, who mostly work under pseudonyms." It is not yet clear what the organization would do after identifying the contributors.
In response, Wikimedia is rolling out new security measures. One major change is the temporary accounts program, which will prevent unregistered editors' IP addresses from being visible to the public.
Overall, Wade says, "there's a bunch of old-school scientists who don't think this kind of science communication is credible". Yet, she stresses that Wikipedia editing is easy and rewarding, and a useful way to contribute to research culture.
Ultimately, this article calls for greater transparency and accountability in how big tech entities use open-access datasets like Wikipedia, advocating for collaborative frameworks prioritizing ethical considerations and equitable representation.
The culture wars have come for our public information sources. And Wikipedia is on the chopping block.
In November 2024, the Indian government reportedly formally raised concerns over bias and inaccuracies on the platform, citing complaints about a small group of editors exerting disproportionate influence over content neutrality. India Today reported that the government questioned whether Wikipedia should continue being classified as an intermediary or be held accountable as a publisher.
On February 14, the Wikimedia Foundation Board of Trustees and language committee approved the proposal of Sylheti Wikipedia.
That feeling of getting lost in the information rabbit hole is a quintessential Wikipedia experience that most people are familiar with.
He added that he prefers to keep his anonymity because he sometimes writes on contentious topics. He cited a massive defamation lawsuit filed last year by the government of India against the Wikimedia Foundation, and a more recent report about the conservative U.S. Heritage Foundation's plans to "identify and target" volunteer editors on Wikipedia.
But Wikipedia has proved remarkably resilient. Wales has stressed that the site is not for sale. And for two decades, a long time in tech years, it has stayed true to its crowdsourced, democratic ethos and to its commitment to facts. In 2025 America, that counts as a beacon of hope.
Wikipedia depends on the availability of existing published sources to verify the facts in its articles. But, because women have been left out of historical narratives and traditional sources of knowledge, many of these knowledge gaps are present on Wikipedia.
As these attacks continue, it's more important than ever for funders to continue supporting and safeguarding organizations that foster learning. Only then can we ensure that in the future, Wikipedia, and free knowledge on the web in general, don't share the fate of the great Library of Alexandria.
Ongoing war and other conflicts in the Middle East have spilled onto the pages of the online encyclopedia Wikipedia, where volunteer editors who maintain the website are sparring over how to frame recent events. At least 14 editors have been barred from working on pages related to the topic, Jewish organizations are claiming bias, and the conflict has reached the top levels of Wikipedia as the site's two founders, Jimmy Wales and Larry Sanger, are at odds over whether to unmask the anonymous editors involved in the turmoil.
With the new Trump administration's goal to tackle waste, fraud and abuse in the federal government, Sanger sees a prime opportunity for DOGE to take another look at Wikipedia. He believes there's evidence to raise questions about potential government influence on the website and warned there could be foreign influence from China or Russia on the website as well.
Every second, more than 8000 people read Wikipedia. Every minute, there are about 350 edits to the site. It's the most-read reference ever. This, of course, is according to Wikipedia - a sentence that would have been unlikely to appear in an article even a few years ago. But in a world where Meta has removed fact-checkers and AI gives laughably inaccurate answers, Wikipedia has emerged as a surprisingly reliable and increasingly respected source of information.
As we mark International Women's Day, it's high time we examined the barriers that keep Wikipedia from achieving true equity, and the efforts being made to close this digital divide.
The initiative is part of a longer-term collaboration between Times of Malta and Wikimedia Community Malta which sees academic Toni Sant serve as Times of Malta's Wikimedian-in-Residence. As part of that collaboration, Times of Malta will be making a number of photos from its historic photo library available on Wikipedia for public use under a Creative Commons Licence.
This portrait problem stems from Wikipedia's mission to provide free reliable information. All media on the site must be openly licensed, so that anyone can use it free of charge. That, in turn, means that most photos of notable people on the site are of notably poor quality.
Auckland Museum has been working alongside Wikipedia for a while, but especially closely since 2020. Their team keeps track of how often articles about their collections are looked at online: 60 million views a year on Wikipedia, compared with 600,000 on the museum's own website.
But if the common feeling in the room was that Wikipedia wasn't under existential threat, the editors still felt vulnerable.
Wikipedia's goal: 5,000 new articles. But beyond the numbers, the project is about preserving heritage and making knowledge accessible to all.
The issue arose after ANI sued Wikipedia for defamation alleging that the platform allowed defamatory edits by certain users referring to the news agency as a "propaganda tool" for the present Central government.
Subsequently, on November 11, 2024, the Delhi High Court closed Wikimedia's appeal against the single judge's order directing disclosure of the individuals' subscriber details. This came after both parties entered into a consent order resolving the matter.
In October, the Delhi High Court described Wikipedia's model as "dangerous", complaining that "anyone can edit a page". It also ordered the takedown of a Wikipedia page titled "Asian News International vs. Wikimedia Foundation" which contains details on the ongoing case.
ADL has found clear evidence that a group of at least 30 editors circumvent Wikipedia's policies in concert to introduce antisemitic narratives, anti-Israel bias, and misleading information.
On pages dedicated to major historical events, like several Israel-Arab wars or peace negotiations, editors would make "extensive edits" in "tone, content and perspective" to advance an anti-Israel narrative, the report found.
The ADL report did not call for abandoning Wikipedia but warned users to be skeptical of politically sensitive entries.
The ADL said editors appeared to coordinate changing relevant pages, downplaying Palestinian antisemitism, violence, and calls to destroy Israel, and adding more criticism of Israel.
Speaking to Jewish News, a spokesperson at the Wikimedia Foundation, the nonprofit that operates Wikipedia, said: ... "Though our preliminary review of this report finds troubling and flawed conclusions that are not supported by the Anti-Defamation League's data, we are currently undertaking a more thorough and detailed analysis. It is unfortunate that we were not asked to provide context that might have helped allay some of the concerns raised."
In its latest act of partisan truth-twisting, Wikipedia took a blowtorch to the reputations of President Donald Trump's nominees for his Cabinet. The partisan ploy failed to derail them, but it exposed the sinister agenda that informs everything the online encyclopedia touches.
Since that decision, "the ADL has continued to misrepresent Wikipedia's well-established guidelines, policies, and enforcement mechanisms that effectively address the issues outlined in the report and its recommendations," the Wikimedia Foundation said in its statement.
For five hours, two dozen or so volunteers congregate over laptops, cups of coffee and doughnuts iced with the Wikipedia logo. They're writing new entries for places lost in the fires, adding citations, updating information and uploading photos.
Have you ever wondered why there's a sea of differences between the celebrity portraits seen on IMDb as compared to those on Wikipedia? It turns out many of us have been scratching our heads over the same question.
The workshop covered various topics, including basic training on page editing, creating articles on Wikipedia, incorporating references, and understanding Wikipedia's layout and formatting. Thirty-seven students and researchers participated in the event.
For example, discussions about Hindu religious practices frequently center around Western feminist or secular critiques rather than incorporating viewpoints from Hindu practitioners themselves. This creates a situation where content about Hinduism may emphasize aspects like caste systems or gender inequalities while minimizing its philosophical depth or cultural significance to Indian society.
In response to The Forward article, Wikipedia editors launched a discussion known as a Request for Comment (RfC) on Jan. 8 about how editors should treat the think tank's reliability going forward.
Wikipedia is a vital link within the network of online information. Women with articles in Wikipedia are easier to find. We'll boost the discoverability of these women by creating and editing articles in Wikipedia. Help us share the stories of women musicians, fiber artists, journalists, doctors and more.
As Wikipedia becomes more central to the infrastructure of A.I., the organization is grappling with rising bot traffic, the need for attribution and how to sustain its ecosystem in the face of powerful new users.
Some enthusiasts launched WikiPortraits, a project to recruit a group of volunteer photographers around the world and get them accreditation to attend film festivals, conferences and other events.
The tension traces back to last year when Wikipedia editors deemed the ADL "generally unreliable" on the Israeli-Palestinian conflict due to its dual advocacy and research roles, though still "generally reliable" elsewhere.
Over on Bluesky, the Depths of Wikipedia account curates some absolute treasures from the open-collaboration encyclopaedia.
In its suit against Wikimedia Foundation and its officials, ANI has said that the former has allegedly published palpably false and defamatory content with malicious intent of tarnishing the news agency's reputation and to discredit its goodwill.
In October, the Delhi High Court had described Wikipedia's model as "dangerous", complaining that "anyone can edit a page". It also ordered the takedown of a Wikipedia page titled "Asian News International vs. Wikimedia Foundation" which contains details on the ongoing case. Wikipedia took the page down but also moved the Supreme Court.
In response, Wikipedia's site managers have imposed "case-by-case" rate limiting for the offending AI crawlers, or even banned them. But to address the problem over the long-term, the Wikimedia Foundation is developing a "Responsible Use of Infrastructure" plan, which notes the network strain from AI bot scrapers is "unsustainable."
Web-scraping bots have become an unsupportable burden for the Wikimedia community due to their insatiable appetite for online content to train AI models. Representatives from the Wikimedia Foundation, which oversees Wikipedia and similar community-based projects, say that since January 2024, the bandwidth spent serving requests for multimedia files has increased by 50 percent.
"Defendant No.1 [Wikimedia Foundation Inc] professes itself to be an encyclopaedia and people at large have a tendency to accept the statements made on the web pages of Defendant No.1 as gospel truth. The responsibility, therefore, of Defendant No.1 is higher," [the court] stated.
Last year the agency, ANI, sued Wikimedia for defamation in the Delhi High Court, citing a Wikipedia description that it faced criticism for being a government "propaganda tool" and sought removal of such statements.
Recently, the 30-ish folks behind the Wikimedians of Minnesota User Group have resurrected their meetups. It's an effort to better curate entries related to the state and geek-out on the minutia of online information chronicling, but it's also a great excuse to snack on stuffed sausage dates while sipping beers at Lake Monster Brewing Co., as they'll do this coming Sunday.
The non-profit Wikimedia Foundation, which operates Wikipedia, says since January 2024 it has seen a 50 per cent increase in network traffic requesting image and video downloads from its catalogue.
The Wikimedia Foundation is the second major tech platform, following X, to become embroiled in Indian court battles over orders to take down content in recent years.
The single judge had observed that Wikipedia cannot wash its hands of the contents published on it by merely claiming that it is an intermediary and cannot be held responsible for the statements published on the platform. Perusing ANI's Wikipedia page, the Court had said that the statements on it were all sourced from articles which were nothing but editorials and opinionated pages.
Unlike newspapers or scientific journals, the encyclopedia does not purport to publish new information; volunteers are instead expected to repeat, with attribution, or reproduce, with references, information originally published elsewhere, with a preference for reputable sources. In this light, the court order is problematic.
ANI argued that, as a public platform, Wikipedia should not possess the same freedoms as a private company. It also criticised the platform for restricting the page, which prevents the news agency from making edits while allowing Wikipedia editors to do so.
On Tuesday, the court observed that Wikipedia is regarded as an encyclopedia and should maintain neutrality and not take sides like a blog.
Wikipedia, which allows for user-generated modifications supported by credible sources, defended itself by citing its community moderation model and neutrality policy.
The cyber cell then issued a stern warning to Wikipedia that failure to comply could result in its services being blocked in India under Section 69A of the IT Act.
But Wikipedia has, increasingly, found itself at odds with the world. The rise of autocracy and totalitarianism last year means a growing number of governments are looking to control what Wikipedia says about them, while a global turn against traditional institutions has weakened trust in the website – which many see as a mouthpiece for legacy media.
Editors have starkly different views of what Wikipedia is and how it best serves readers.
Dr Johansson designed a program, dubbed "lsjbot", which generated millions of articles in several languages, but particularly Cebuano. It also laid bare a debate which Wikipedia has been grappling with since its inception, and which artificial intelligence (AI) is making ever more pressing.
Wikipedia is attempting to dissuade artificial intelligence developers from scraping the platform by releasing a dataset that's specifically optimized for training AI models.
The High Court issued summons to Wikipedia on July 9, 2024 and ordered it to disclose information about three people who made the edits on ANI's Wikipedia page. The order was resisted by Wikipedia, which itself chose to serve notices on these users instead of disclosing their identity in public.
No longer a straightforward source of facts, Wikipedia today is pure left-wing propaganda — and its intense campaign against Vice President JD Vance is just the latest example of its bias.
According to Ars Technica, bots that scrape Wikipedia and Wikimedia Commons pages have consumed 50 percent of its bandwidth, putting a massive strain on the nonprofit's entire operation.
In asking for the takedown of articles by interpreting critical information as defamation and by even threatening penal action against Wikipedia, judicial actions could unwittingly lead to the stifling of open discussion of entities on the encyclopaedia, thereby acting against the interest of the free flow of information.
The letter, which was obtained by The Free Press, accused the largest online encyclopedia of "allowing foreign actors to manipulate information and spread propaganda to the American public."
While Wikipedia has weathered occasional controversies throughout its history over the content of its articles, its emergence as a bogeyman of U.S. conservatives is relatively recent. In 2018, an Atlantic column dubbed it "the last bastion of shared reality" in an ever more polarized country.
Martin went on to complain that the Wikimedia board is "composed primarily of foreign nationals" who are "subverting the interests of American taxpayers."
Martin's letter reflects a broader trend of the right targeting Wikipedia.
Globally, Wikipedia has come under scrutiny from various governments. In Saudi Arabia, the government imprisoned two Wikipedia editors on charges of "swaying public opinion." In Turkey, Wikipedia was blocked entirely for nearly three years over content critical of the government until the country's highest court ruled the ban unconstitutional.
Mr. Martin said Wikipedia's operations are directed by a board composed primarily of foreign nationals "subverting the interests of American taxpayers." He said that its mission as a neutral educational resource is benefiting "foreign powers," and its tax-exempt status could be at risk for violating its "legal obligations and fiduciary duties."
The letter was cordial, if ridiculous. But on social media, Martin was (even) less professional. "Hey @Wikipedia: you can run but you can't hide!" he tweeted, linking to a post on his own letter.
The cohort of Wikipedia editors has softened the image of Islamist terrorist groups such as Hamas through removing any mention of their 1988 charter, which calls for the complete massacre of Jews and elimination of Israel. The editors also edited an article on Zionism, describing the movement for Jewish self-determination as "an ethnocultural nationalist movement" which was "pursued through the colonization of Palestine."
One might ask, "Who cares if Wikipedia is biased?" Lots of media are biased in one direction or another. And the notion that any nonprofit organization's political leaning requires its status be investigated is ludicrous, considering that three of the organizations hyping Wikipedia's alleged wrongdoing—the Heritage Foundation, the Manhattan Institute and the ADL—have the same tax-exempt status.
The letter follows recent ADL research that found widespread antisemitic and anti-Israel bias on its pages, across multiple languages, especially on content related to Israel and the Israeli-Palestinian conflict, as well as an apparent coordinated promotion of pro-Hamas propaganda.
The letter highlighted how the report documented instances in which these editors scrubbed the Wikipedia pages of certain public figures' support for terrorism and antisemitism and that the Hamas Wikipedia page whitewashes Hamas' terror activities.
But the rightwing obsession with going after public media and Wikipedia isn't solely about wanting to consolidate control over information. Like so many Trumpian culture-war "policies," it's driven by a deep well of resentment and revenge, a desire to get even with the snooty elites who must be punished for liking things that Real Americans don't.
Aharoni Lir revealed that interviews with 16 Jewish English Wikipedia editors had raised major concerns about the "difficulty, and at times impossibility, of correcting biased content directed against Israel," a problem she said had "notably intensified since October 7."
In their announcement, the committee said it had reviewed a 244-page dossier that The Journal published in the "Gaming the Wiki System" cover story chronicling the purported activities of the Wikipedia channel in the Tech for Palestine Discord server.
"In violation of Wikipedia's rules, Buzbee directed his employees to edit Wikipedia pages to enhance Buzbee's image and damage Mr. Carter's and Roc Nation's reputations,"Jay-Z's attorneys write in the amended complaint.
"I will remain in my role until a new CEO is in place," Iskander said in the note. "The hope is to welcome a successor by January 2026, a milestone that coincides with Wikipedia's 25th anniversary."
The Cologne-based firm has cooked up a collaboration with the Wikimedia Foundation, the nonprofit that operates Wikipedia and has found itself in the crosshairs of U.S. President Donald Trump and Tesla founder Elon Musk in recent months.
The charity that hosts Wikipedia is challenging the UK's online safety legislation in the high court, saying some of its regulations would expose the site to "manipulation and vandalism".
Lead counsel Phil Bradley-Schmieg said it was "unfortunate that we must now defend the privacy and safety of Wikipedia's volunteer editors from flawed legislation". The government told the BBC it was committed to implementing the act but could not comment on ongoing legal proceedings.
Well, true crime experts, now it's your turn to share. What's the scariest Wikipedia page about a missing person (or persons) you've ever read?
"We have no hesitation to hold that the direction issued by the High Court could not have been issued," the top court said.
The High Court today issued notice to Wikimedia Foundation (owner of Wikipedia) on a fresh application moved by ANI for an interim injunction against alleged defamatory content on Wikipedia's page about ANI. Justice Jyoti Singh heard the case today and issued notice to Wikipedia before listing the matter for further hearing on July 7.
The Supreme Court said the takedown order was the result of the high court reacting "disproportionately." "Courts, as a public and open institution, must always remain open to public observations, debates and criticisms," the 37-page Supreme Court order said.
But he stops short of calling it a shared reality. Wikipedia is not a place where you're "just seeing your preferential version of things served up to you as a delicious piece of confirmation bias," he says. "[But] there will be people who just reject Wikipedia out of hand when it doesn't conform with their concept of reality."
Iskander also sees lessons in Wikipedia's approach for AI companies as they seek to mitigate bias, reduce errors and ensure a healthy information ecosystem.
The original paper was published in 2022 and was criticised in a subsequent study published in May 2023, co-authored by High Court judge Mr Justice Richard Humphreys, which said citations in judgments are driven mainly by legal submissions and maintained there was no "Wikipedia effect".
Hearing Wikimedia's Special Leave Petition (SLP), the bench firmly rejected the HC's interpretation and its broad directive.
In conclusion, the Wikipedia v ANI decision is not merely a win for one online platform, it draws a line in the sand. The Supreme Court has reminded lower courts and litigants alike that judicial power must be exercised with restraint, especially in matters concerning speech.
Wikipedia has edited out the 'disparaging' parts of the pages on Indian news channels following a news report that called out the website's alleged bias. Wikipedia, in its profiles of leading news channels in India, allegedly labelled them as close to the BJP and stated that the channels reported in favour of the ruling party.
[Wikipedia] is run completely on the goodwill and hyperfixation disorders of complete randos out there. I really have no idea how this place runs, but it's been too long for me to ask. People have always trusted me because of how I look and I can't ruin that by asking questions. Regardless, we have other stuff we need money for that you don't know about. Just trust me, okay?
The two-year residency pilot program is a partnership between CUNY and Wikimedia New York City, and funded by Craig Newmark, founder of Craigslist and namesake of CUNY's journalism school.
The attacks by the ADL and Congress on Wikipedia purport to protect Jews, but in fact have nothing to do with antisemitism, and everything to do with controlling the narrative about Israel. But Wikipedia's processes around that narrative are admirable in their strenuous research.
The authors argue that Wikipedia's structure increasingly mirrors that of a social media platform rather than a neutral and reliable information source.
True story: Stoever Googled "How to use Wikipedia in your classroom," which led her to Wiki Education. She applied, and within weeks, the organization built a dashboard for her class, assigned students a Wikipedia expert as a guide, and gave the professor her own mentor, along with online talks and events throughout the semester.
At another point, Mr. Martin said he viewed himself as being engaged in a "war over information." He cited a letter he had written to Wikipedia accusing it of bias and improperly shielding itself from scrutiny through its tax-exempt status.
If anything, the ADL report downplays our findings, which point to the pervasive presence of antisemitism on the Wikipedia pages we analyzed.
These existential questions resonated with viewers, with his TikTok video reaching over 331.9K views and 31.8K likes. After all, most people take the internet's most trusted encyclopedia for granted.
In other words, a government-sanctioned organization is being platformed by the Wikipedia website to redact content with strong political undertones – promising to reimburse the winners with what appears to be governmental allocated funds.
While Wikipedia has never been known to be an infallible source free of bias or inaccuracies, generative AI has proven to be far more unreliable, thanks to widespread hallucinations and biases present in its training data.
We have over 2,000 African languages, yet our voices are missing on the Internet. While Swahili is making remarkable strides, other languages must rise too. Wikipedia is a powerful tool for digital inclusion, and we are here to change the narrative.
Elon Musk tweeted in 2023 that it had a "non-trivial left-wing tilt" and joked it should rename itself "Wokipedia". Conservative commentators have echoed similar criticisms, saying it takes its information disproportionately from "left-wing" media, and one study published last year suggested its editors wrote more positively of Democratic figures than Republican ones.
Although Wikipedia credits its volunteer editing for making it a vast encyclopedia with tens of millions of pages, it lacks the policies and tools to achieve neutrality on hotly contested topics like the Israel-Palestine conflict, according to a third editor who requested anonymity.
The Wikimedia Foundation, the nonprofit organization which hosts and develops Wikipedia, has paused an experiment that showed users AI-generated summaries at the top of articles after an overwhelmingly negative reaction from the Wikipedia editors community.
Even Wikipedia, the vast repository of knowledge famously maintained by an army of volunteer human editors, is looking to add robots to the mix. The site began testing AI summaries in some articles over the past week, but the project has been frozen after editors voiced their opinions. And that opinion is: "yuck."
When trying to find out more information about Wikipedia's game, not least whether I'm embarrassingly late to discovering it or one of the first to be offered it, I find there's the weirdest lack of information out there.
"The decision reflects a growing trend, in which Wikipedia undermines its foundational principles of neutrality and open knowledge by failing to disclose facts related to sensitive topics and actively concealing them, especially regarding subjects related to Israel," Lir said.
There is something ennobling, in other words, about the whole project – in all its vastness and eccentricity and frivolity and grandeur – just as there is something ennobling about democracy. The greatest thing about Wikipedia, of course, is that it works exceptionally well.
Sure, the scale of the reaction might feel over the top, but the instinct behind it is easy to grasp.
"This is exactly the problem with the absence of fairness and standards in websites like this," Daniel S. Mariaschin, CEO ofB'nai B'rith International, told JNS.
When Wikipedia and artificial intelligence disagreed, the AI wasn't more often right than Wikipedia. Sometimes, the AI even correctly criticized a sentence, but also provided false facts itself. That's why human review was so important. At the same time, most AI models are also trained on Wikipedia articles. The AI has therefore very likely overlooked some errors because it learned inaccurate information from Wikipedia (Google translate).
According to the Frankfurter Allgemeine Sonntagszeitung (FAS), it has examined 1000 randomly selected entries in the German-language Wikipedia for accuracy. ... At least 20 percent of the entries contained information that was "no longer up to date".
The fact that errors can be systematically researched and promptly corrected is a cornerstone of Wikipedia's credibility and quality as the most important repository of knowledge of our time. Of course, Wikipedia knowledge is always provisional and never completely secure – like knowledge in general (Google translate).
Exceptionally for such a popular resource, Wikipedia does not track you or sell any of your search information. It does not carry advertisements or monetize itself beyond regular appeals to users for small donations. It is fully accountable, with every keystroke credited and dated to a specific user. It is continually trying to improve its accuracy, reach, diversity of content and contributors. And beyond all this, it is a thing relentlessly and reliably useful.
Here are 12 stories that will make you spiral down the Wikipedia rabbit hole:
The Hello Kitty murder case, Junko Furuta, the Toy-Box Killer, and so many more nightmares ahead.
Wikipedia is the world's most trusted and widely used encyclopedia, with users across the world accessing its wealth of information and participating in free information exchange through the site. The OSA must not be allowed to diminish it and jeopardize the volunteers on which it depends.
It does expose the weakness of a website like Wikipedia. It relies on unpaid individuals to oversee an acceptable first version of history and enforce its policies around bias and undue emphasis. Given that, it's remarkable it's as good as it is.
To stay outside the scope of the regulation, Wikipedia could cap visitor numbers from the UK so it does not qualify as a "category one" site, which are defined as those with seven million users. This would make it harder for British users to access the site when they wanted.
Here's alarming news: AI bots rely heavily on Wikipedia, which feeds them a diet of half-truths, ideological bias and leftist lies — and they then pass along the propaganda to millions of unsuspecting readers.
People are increasingly using chatbots to seek out information they would have once found on Wikipedia. (Ironically the website is one of the most widely used data sources for AI training models owing to its open licence.) Some editors and researchers worry fewer people will visit the website directly.
The nonprofit is particularly concerned about the Categorisation Regulations contained within the bill, and how the website could be in the top tier: category one. It would require Wikipedia to enforce ID verification on its anonymous voluntary moderators, as well as visitors.
Wikipedia might seem like a place where colleges can easily shape their reputations. That's certainly what communications and marketing departments would like — and many have taken to the site to correct information or, in some cases, make it more flattering. But their efforts don't go very far.
These debates now matter, to reputation specialists, due to the rise of AI-powered search engines, says Bennett Kleinberg, founder of Jupiter Strategies. "The robots have to pay to read The Times of London," he says. "But everything on Wikipedia is in the public domain … The content on Wikipedia is driving so much of the content that is ultimately displayed, in AI."
They've rewritten the narrative around the expulsion of Palestinians, erased records of suicide bombings to portray Palestinians as more innocent, and revised articles on ancient history to downplay or omit the Jewish people's presence and historical connection to the land.
The solution Wikipedians came up with is to allow the speedy deletion of clearly AI-generated articles that broadly meet two conditions.
Large language models have made it easier than ever to generate convincing writing for Wikipedia, Miller and Kaffee said. It's an enticing shortcut for novice contributors or those with an agenda.
Despite varying viewpoints about AI across the Wikipedia community, the Wikimedia Foundation isn't against using it as long as it results in accurate, high-quality writing.
Most of Wikipedia's critics aren't pushing for better neutrality. They just don't like what Wikipedia says.
Although not in the Wikimedia Foundation's favor, the ruling "does not give Ofcom and the Secretary of State [for Science, Innovation and Technology] a green light to implement a regime that would significantly impede Wikipedia's operations," the court said.
The Wikimedia Foundation took legal action at London's High Court over regulations made under the law, which it said could impose the most stringent category of duties on Wikipedia.
Judge Jeremy Johnson dismissed its case but said the Wikimedia Foundation could bring a further challenge if regulator Ofcom "(impermissibly) concludes that Wikipedia is a Category 1 service".
Caroline Mwaura, chairperson of the Kenyan Wikimedia hub, said that they have reached out to Pwani University as part of a move to generate content and that the hub has plans to reach out to contributors from other universities.
The Wikimedian of the Year award winners are recognized across several categories, from new contributors to long-time volunteers, highlighting the diverse and vibrant community that makes the Wikimedia projects possible.
The operator of Wikipedia has been given permission by a high court judge to challenge the Online Safety Act if it is categorised as a high-risk platform, which would impose the most stringent duties.
Wikipedia has been caught in the stricter regulations due to its size and user-created content, even though it argues (convincingly) that it differs significantly from other user-to-user platforms, said Mona Schroedel, data protection litigation specialist at law firm Freeths.
If the regulator decides that Wikipedia falls under Category 1, the website would have to implement the same identity-verification tech that social media and pornography sites have had to use since July 25. In the organization's view, the consequences of it having to do so would be extensive and damaging for its users' privacy.
To Wikipedia's lawyers, the law itself is almost a case of mistaken identity. Wikipedia, they argue, isn't the kind of social-media platform that the law was intended to regulate.
Strassmann, who has since worked with the Wikimedia Foundation and served on the board of Wiki Education, first got the idea for incorporating Wikipedia into her lessons after observing her son's reliance on the site.
The court ultimately dismissed Wikimedia's judicial review, but Justice Johnson said Wikimedia has two grounds to lodge another.
The Online Safety Act is already being accused of censorship, as scenes of protests and speeches in Parliament are hidden from users who haven't proved they are over 18. But, as the Wikipedia case shows, the potential for hiding content from adults goes much further.
Wikipedia, the world's leading public education site, could suffer the same fate as Reddit. They don't want to track users or know who you are. They challenged the law to be exempted from the rules, but lost in court. If they are caught by the law's harshest rules, they could be fined, forced to restrict the site, or blocked from the UK altogether.
Speaking during a keynote panel at the first-ever Wikimania hosted in East Africa, Ugandan computer scientist Joyce Nakatumba-Nabende and South African data scientist Professor Vukosi Marivate warned that while generative AI holds vast potential to expand and improve access to knowledge, it also poses serious risks if unchecked.
Could the whole thing be some kind of "art project," with the real payoff being exposure and being written about? Perhaps. But whatever the motive behind the decade-long effort to boost Woodard on Wikipedia, the incident reminds us just how much effort some people are willing to put into polluting open or public-facing projects for their own ends.
For members of the physics community, that's where the American Physical Society's Wiki Scientist courses come in. The six-week program, offered by the Society's Office of Public Engagement in partnership with Wiki Education, teaches physicists how to add and edit content to Wikipedia pages for a variety of physics-related topics, from its history and people to its concepts and theories.
Wales and editors proceeded to get into it in the replies to his article. The basic disagreement is that Wales thinks that LLMs can be useful to Wikipedia, even if they are sometimes wrong, while editors think an automated system that is sometimes wrong is fundamentally at odds with the human labor and cooperation that makes Wikipedia so valuable to begin with.
Joe Sutherland, one of the people responsible at the Wikimedia Foundation, argued, in a text addressed to the community of editors, that the decision of the Portuguese court "harms the right to privacy and freedom of expression of the volunteers who contribute edits." [Translated from Portuguese]
If everyone who doesn't like a Wikipedia entry is allowed to sue Wikipedia as European laws allow, then every piece of content is a potential legal and financial risk.
While overall Wikipedia traffic remains steady, the researchers find that specific types of articles—those whose content closely resembles what ChatGPT would generate—have seen a noticeable drop in readership since ChatGPT's launch.
In addition to being influenced from the outside, the Wikimedia Foundation also doles out hundreds of thousands of dollars each year to activist groups seeking to bring the online encyclopedia more in line with traditionally left-of-center points of view. For instance, its 2022-2023 tax documents indicate that the philanthropy made grants to organizations such as Art+Feminism, Whose Knowledge, and Black Lunch Table, all of which conduct coordinated editing operations.
Wikipedia famously crowdsources its information through a network of volunteer contributors and editors. This combination of crowdsourced data and highly specific niche topics is a recipe for the misuse of AI.
Wikipedia has long faced accusations of its entries having a political bias, with the right-leaning Manhattan Institute releasing a report in 2024 that found Wikipedia entries are more likely to attach a negative sentiment to right-leaning terms.
The House Oversight Committee is investigating manipulation efforts to determine the role and methods of foreign individuals, those at academic institutions subsidized by United States taxpayer dollars, as well as Wikipedia's awareness and response.
In recent years, right-wing figures in the United States have raised concerns about perceived political bias in Wikipedia's content and other large online platforms.
Exerting control over media and information is one of the hallmarks of a fascist movement. Apropos of nothing, two prominent Republicans from the House Committee on Oversight and Government reform, Kentucky representative James Comer and South Carolina's Nancy Mace, have launched an inquiry into Wikipedia.
When it comes to open-source and decentralized information resources like Wikipedia, however, it appears the plan may be to find evidence of a nefarious conspiracy that justifies reworking the platform to your liking.
It was, in essence, a request by Congress for Wikipedia to "dox" many of its editors.
Two House Republicans are on the hunt for problematic Wikipedia contributors.
Interestingly, the letter specifically highlights systematic efforts to advance antisemitic and anti-Israel information in Wikipedia articles related to Israel.
SVT has compared a number of Wikipedia pages on Israel/Palestine-related topics, in English, Arabic and Hebrew. How do they differ?
Wikipedia has come to play a similar role of factual ballast to an increasingly unmoored internet, but without the same institutional authority and with its own methods developed piecemeal over the last two decades for arriving at consensus fact. How to defend it from political attacks is not straightforward.
While poring over new submissions for anything AI generated, they found errors, fake sources and people in places that were made up. But as AI advanced, the signs became more subtle, which is why Lebleu and other editors now look for less obvious tells, such as cliches.
On August 27, 2025, the U.S. House Committee on Oversight and Government Reform launched an investigation into the Wikimedia Foundation to examine potential foreign manipulation of Wikipedia, especially content related to Israel and antisemitism.
If they were to write about cable news, the Wikipedia results are as biased as ... well, cable news. Their entries on Fox News Channel and Newsmax are remarkably different from the ones on CNN and MSNBC.
After analyzing thousands of edits to law firm pages and speaking to multiple sources, Law.com International can reveal how some law firms have used paid editors, often covertly, or been blocked for conflicts of interest, and how details on sex scandals have quietly disappeared, political language has been softened, and hyperbole added, removed, and then reintroduced.
At its core, Wikipedia is a wrapper for the mainstream media. Its infamous "Reliable Sources" list of news outlets that can be used as references marks sources Wikipedia editors consider to be "reliable" as green and those they deem "unreliable" as red. The green sites read like a semi-official list of the mainstream media: New York Times, Washington Post, CNN, NBC News, CBS News, ABC News, Associated Press. It disproportionately marks conservative outlets as unreliable, while giving a neutral rating to the Chinese propaganda outlet China Daily.
So next time you search for who invented the airplane or who won a war, consider reading these articles in another language. You might find a different hero. And you will definitely find a different story.
What should be clear by now is that right-wing media coverage of Wikipedia isn't actually interested in explaining how the site works. The goal is to undermine Wikipedia's function as a volunteer-driven project that can produce an independent repository of facts that has (at least historically) been insulated from political interference.
Wikipedia has managed the onset of the AI era better than many other websites. It has not been flooded with AI bots or disinformation, as social media has been. It largely retains the innocence that characterized the earlier internet age. Wikipedia is open and free for anyone to use, edit, and pull from, and it's run by the very same community it serves. It is transparent and easy to use. But community-run platforms live and die on the size of their communities. English has triumphed, while Greenlandic has sunk.
Beating Wikipedia won't be easy; the free online encyclopedia is the seventh most-visited website in the world. Still, Musk is betting he can disrupt the status quo.
Powerful bots like Grok, Chat GPT and Gemini siphon up huge swaths of text from Wikipedia and then spit it out as though it's neutral and authoritative. It's not. It is trimmed and hewed to fit a particular leftist narrative that has excised a huge territory of conservative thought and reportage from its source-stream.
But while the approved list checks out, Wikipedia's treatment of other outlets is more complicated. Of those cited by Sanger, only Breitbart News is formally "blacklisted" with edits citing it automatically blocked.
The magic of Wikipedia is that anyone can contribute information, cite a source, and it's policed by contributors and editors who mostly try to keep things as objective and factual as possible while citing reliable sources. Generative AI tools like Grok create sentences by relying on their training data, and it's difficult to tinker with the weights to prioritize a right-wing view of the world without going full Nazi. We saw that play out in real-time twice now at scale, all thanks to Musk.
Sanger said on X that 85% of Wikipedia's "most influential accounts" are anonymous, calling this group the "Power 62." Musk reposted Sanger's comment by saying, "Curiouser and curiouser."
Carlson had claimed Wikipedia has become 'a weapon of ideological, theological war, used to destroy its enemies.' The former Fox News host said he once 'really believed in Wikipedia' and donated large sums of money because he was 'so thrilled by its existence.' 'Now it's like the leading source of dishonesty, or, I would say, disinformation,' he said.
I launched the site in 2001. Today, it's been captured by anonymous editors who manipulate articles to fit their ideological biases.
Musk criticized Wikipedia on Tuesday as "Wokipedia" following similar accusations from his allies. White House AI and crypto czar David Sacks slammed the site as "hopelessly biased." "An army of left-wing activists maintain the bios and fight reasonable corrections," Sacks alleged on X.
While his reportage on Wikipedia's supposed anti-India and anti-Hindu bias is yet to be shared on his Substack, Rindsberg and Sanger are part of a collection of people advocating against what has been called the "ideological hijacking" of the online resource. Wikipedia has been built on a foundation of straightforward, open-access information, but they argue that a left-liberal bias is corrupting the editors' framing of information on the online encyclopedia.
If Wikipedia rejects his "commonsense proposals," which are not "particularly, actually conservative," it shows the platform is beyond reform, he said.
Among his solutions, Sanger recommended abandoning the consensus model, as well as eliminating the approved sources list and returning to broader, clearly marked citations to allow readers to determine their own conclusions.
The offices for the communications minister and the eSafety Commissioner did not respond to questions about whether they consider Wikimedia's platforms within the bans, or how they will inform platforms of their requirements before the December 10 deadline.
But Wikipedia isn't some random website — it forms the foundation of many AI language models, which run on algorithms trained on its biased entries. What happens when every major AI bot runs exclusively on left-wing slop that's utterly divorced from reality?
This approach reflects one of the main strategies Wikipedia editors use to build a narrative: the creation of a web of articles, or a "networked narrative," designed to push the same key point. The repetition across multiple articles is what drives the narrative.
Now Wikipedia's estranged co-founder, Larry Sanger, has triggered a fresh chorus of Republican calls for reform, with a comprehensive proposal to overhaul the platform and make it more open to conservatives, fringe views and religious beliefs — and appearing on no less than Tucker Carlson's show to promote it.
In effect, those final two points mean information comes summarized from known media sources. Those policies—and how they're enforced—are what upset opponents such as billionaire Musk, White House AI czar David Sacks and others who don't like its perceived slant.
Scientists saw no drop in Wikipedia activity during the 36 months. In fact, they found an increase in page views and visitor numbers across all language editions, although the growth was smaller in languages where ChatGPT was available. Despite this, the study found no evidence that ChatGPT reduced the number of edits or editors on Wikipedia.
For twenty years, we could blame Wikipedia's skewed lens: too many male Western editors, too few Tamil citations, too much weight given to headlines from Delhi and London. Today, we are promised a fix. Grok's creators say their AI will scan everything and deliver the neutral residue. Yet, early evidence shows the fix is simply a new filter, tinted a different shade.
Musk has previously voiced criticism of Wikipedia, questioning the funding of the non-profit organisation and alleging that its content is influenced by left-leaning perspectives. He has suggested that traditional sources of information have, over time, misled the public and manipulated young minds.
The lesson is clear. Use Wikipedia for lists of monarchs, for summaries of chlorophyll or Caravaggio, or deep dives on the moons of Jupiter. For that it remains absolutely marvellous, still an internet miracle. But if you want to understand a political dispute, a culture war, a controversy, you must treat Wikipedia not as the final word, but as a cleverly illustrated propaganda pamphlet.
Among her many advocacy initiatives are, of course, her Wikipedia biographies, which she began writing in her 20s. In less than a decade, she has created more than 2,000 articles for the online encyclopedia, focusing on women and underrepresented scientists who have been overlooked or forgotten by history.
Cruz accused the foundation of "financially support[ing] left-wing organizations that contribute to Wikipedia content," and said that "a coordinated group of editors pushed antisemitic narratives on Wikipedia while whitewashing the activities of groups like Hamas." Wikipedia responded to edit wars on the Israeli–Palestinian conflict by banning eight editors in January.
Cruz's letter accused Wikipedia of pushing antisemitic narratives. He described the Wikimedia Foundation as "intervening in editorial decisions" in an apparent reference to an incident in which the platform's Arbitration Committee responded to editing conflicts on the Israeli–Palestinian conflict by banning eight editors.
But the project has managed to overcome prejudice, and while it's not perfect—just as paper encyclopedias weren't, let's face it—it's operational and rigorous, making it one of the most consulted websites in the world. Furthermore, its free nature makes it one of the few true tools for the democratization of knowledge that the pioneers of the internet promised us.
A Korean American Wikipedian who has spent years editing the world's largest online encyclopedia says too much of what global readers find about South Korea is biased, incomplete or simply missing – and he has made it his mission to change that. Through Wikipedia, he hopes readers can learn about South Korea with more depth and balance.
There are plenty of reasons why the site has remained so popular over the decades, Reagle says. But the biggest one is that it still plays an important service for people looking for information online. "It is also the last of the early generation websites that still serves the users instead of extracting value from them via advertising and algorithmic manipulation," Reagle adds.
Whether being willfully obtuse or just characteristically dense, Cruz is only the latest voice to wade into the fray. Earlier this year, the Heritage Foundation, the extremely powerful think tank behind Project 2025, said it would "identify and target" Wikipedia editors over their perceived political bias.
This profit model (or rather, lack thereof) is the encyclopedia's other greatest strength. Their freedom from shareholders means they are less susceptible to certain kinds of bias and can continue operating on a volunteer basis, where most contributors are simply passionate individuals who edit for the fun of it, and for the enjoyment of being in a community. The lack of profit incentive has protected Wikipedia from the "enshittification" or "platform decay" that is the downfall of many for-profit or ad-based digital services.
The result is endless discussions. Long lists of sources. And material that, according to a number of researchers, is getting better and better. A 2016 study found that both articles and contributors on Wikipedia become more neutral over time. Another study, published in Nature in 2019, also shows that the most polarizing articles on Wikipedia – about euthanasia and Leonardo DiCaprio, for example – are the ones of the highest quality, as each side of the argument continues to add citations that support its case. But that has not diminished the criticism.
A simple Wikipedia search shows barely a few active articles tied to Akwa Ibom's people, heritage, or achievements — a fraction of what states like Lagos, Anambra, or Kano command.
In an analysis of 35,000 Wikipedia entries of Australian places, only six per cent had an associated First Nations name and others showed attempts to remove Indigenous names, University of Technology, Sydney Professor Heather Ford found.
At a time when almost everything that Big Tech offers up is either a ruse to hoover up personal data or the expression of entrenched monopolies that make 19th Century US robber barons look like tax-and-spend social democrats, Wikipedia is a force for good. At its heart is a community of people who believe the imparting and recording of knowledge is something worth fighting for.
Wikipedia at least remains one of the few large-scale platforms that openly acknowledges and documents its limitations. Neutrality is enshrined as one of its five foundational principles. Bias exists, but so does an infrastructure designed to make that bias visible and correctable.
Ironically, while generative AI and search engines are causing a decline in direct traffic to Wikipedia, its data is more valuable to them than ever. Wikipedia articles are some of the most common training data for AI models, and Google and other platforms have for years mined Wikipedia articles to power their Snippets and Knowledge Panels, which siphon traffic away from Wikipedia itself.
For example, in his biogeochemical processes class, students have contributed to public knowledge about lakes, nitrogen cycling, microalgae, and more. So far this year, Cotner's students have added content to at least 47 Wikipedia articles, amounting to some two million views.
Members, who range in age from approximately 16 to 88 and work as academics, lawyers, photographers, recycling technicians and more, meet about once a month to socialize and share projects they're working on. Around twice a month, they also participate in events geared toward training new editors.
A man attending Wikipedia's annual conference in Manhattan was arrested Friday after brandishing a loaded revolver and threatening suicide inside an office building near Union Square, Bloomberg reported.
WikiConference North America 2025 kicked off at 9 a.m. at Civic Hall, a community center in Manhattan. During the opening ceremony, a man with a gun jumped onto the stage and pointed his weapon at the ceiling before being tackled by conference organizers, according to police and a Wikipedia spokesperson.
Wikipedia is famous for its real-time entries on unfolding disasters, but as of Friday evening it had not posted an entry about the near-tragedy, which unfolded at WikiConference North America at Civic Hall in Union Square. The annual gathering was being held in New York City for the first time in over a decade.
Also: If Wikipedia is "generally not considered a reliable source itself because it is a tertiary source that synthesizes information from other places," then what does that make a chatbot?
You see the notice at the top of a page sometimes that says, "The neutrality of this page has been disputed," or "The following section doesn't cite any sources." People like that. Not many places these days will tell you, Hey, we're not so sure here.
I also love a good Wikipedia entry, which I'll admit is an odd thing to say. But if you go deep enough down the rabbit hole, you can usually find some weird stuff some agitated editor has managed to slip into the record, quite possibly after months of debate with a different agitated editor. That's my favorite part.
Friday's incident wasn't the first reminder that Wikipedia editing can potentially be dangerous. The website has an article titled "List of people imprisoned for editing Wikipedia," which includes a Saudi Arabian volunteer sentenced to 32 years in prison for edits deemed "critical" of the government. A Belarusian editor was detained under the country's censorship laws requiring citizens to describe Russia's invasion of Ukraine as a "special military operation." Across the globe, editors deal with harassment, lawsuits, and imprisonment for their work. This stark reality has inspired a meme: "Editing Wikipedia Is Not a Crime."
Yes, neutrality sounds good, but Wikipedia's attackers are arguing in bad faith, said Stephen Harrison in Slate, and they really just want to "subordinate" reality to their own politics. Conservative commentators took particular umbrage with Wikipedia for "doing what an encyclopedia is supposed to do" after the killing of Charlie Kirk. It merely documented "what Kirk said." But what MAGA supporters of Kirk wanted was that the page "should double as a memorial."
Maybe, says Peel, AI will even enhance Wikipedia's value. In the age of artificial content, human-made work deserves a premium.
As Wikipedia continues to dig into its "trust" narrative, what it doesn't realise is that so much of that trust has already been eroded. And new players are ready to scoop it up, right when it matters most.
The Wikimedia Foundation, which oversees Wikipedia, is under scrutiny after failing to comply with a House Oversight Committee document request tied to allegations of anti-Israel bias and politically coordinated editing. The committee, chaired by Rep. Nancy Mace (R‑SC), requested documents by Sept. 10, but the Foundation has yet to fully respond.
Various studies over the years have tried to ascertain Wikipedia's political leanings, with some suggesting it leans moderately left in the context of U.S. politics, while others have found it to be generally down the middle. Studies also have indicated that articles tend to become more neutral over time as editors work on them.
The post suggests a major rework is underway to align Grokipedia with Musk's vision of "truth"—a vision that often clashes with mainstream moderation standards. Obviously, this opens the door for bias in the opposite direction, especially since the system lacks Wikipedia's core safeguards: transparent citations, edit histories and decentralized moderation.
New editors wrote about what they knew, or were excited about, not necessarily what a traditional encyclopaedia would consider most important. So while our articles about Shakespeare may have been thin, our coverage of Pokémon was deep and dazzling.
We want to communicate to everybody that Wikipedia is not a very comfortable place for extremists. If you want to rant about things and you want to be super biased, then go on, write your own blog. What we're looking for is kind and thoughtful people who care more about getting it right and being calm and factual.
On social media, if a user blocks another, they no longer see their content. Simple. But applied to Wikipedia, a free online encyclopaedia that relies on collaborative editing, it looks a lot more complicated.
Wikipedia's editorial guidelines stipulate that all entries must be written from a neutral point of view, "which means representing fairly, proportionately, and, as far as possible, without editorial bias, all the significant views that have been published by reliable sources on a topic." Wales added in the interview with BBC Science Focus that Wikipedia welcomes contributors from across the political spectrum, provided they follow its neutrality rules.
In a post on X, Musk announced that "version 0.1" of Grokipedia was now live and claimed "Version 1.0 will be 10X better, but even at 0.1 it's better than Wikipedia."
The site resembles Wikipedia in style and format, with articles on topics such as ChatGPT, Diane Keaton and the 2026 FIFA World Cup. But it appears significantly smaller, more opaque in its workings — and more right-leaning in how it framed some articles.
While many of the pages WIRED saw on launch day appeared fairly similar to Wikipedia in terms of tone and content, a number of notable Grokipedia entries denounced the mainstream media, highlighted conservative viewpoints, and sometimes perpetuated historical inaccuracies.
Wikipedia, which debuted almost 25 years ago, has faced increasing criticism from conservatives in recent months. Mr. Musk and his political allies have argued that the online encyclopedia is too "woke" and excludes conservative media outlets from its approved citations.
Here are some comparisons between Wikipedia articles and Grokipedia articles. These are copied verbatim from the intros of articles with the footnotes and links removed for ease of reading:
Grokipedia's operation differs from Wikipedia's in at least one major respect: no clear human authors. While volunteers write and edit Wikipedia, often anonymously, Grokipedia says its articles were "fact-checked" by Grok, the AI chatbot from Musk's startup xAI. Visitors to Grokipedia cannot make edits, though they can suggest edits via a pop-up form for reporting wrong information.
Grokipedia's design is pretty basic right now; like Wikipedia, the homepage is mostly just a big search bar, and entries resemble very basic Wikipedia entries, with headings, subheadings, and citations. I haven't seen any photos on the site yet. Wikipedia lets users edit pages, but it doesn't appear that users can currently do that on Grokipedia.
Good journalists struggle with questions like that every day. So do Wikipedia editors. And very often, they disagree. Even when everyone is committed to the principle of independence. Even when everyone is informed and thoughtful. Still, they disagree.
Observers have already noted some stark differences between the pages on Wikipedia and Grokipedia for topics like gender, Jan. 6, and Donald Trump. Take, for example, how Musk himself is described.
Any given Wikipedia page has been stress tested by actual humans who are discussing, for example, whether it's actually that unusual to get speared to death by a beach umbrella.
More recently, Musk has criticized Wikipedia as "an extension of legacy media propaganda" and called for the site to be defunded. Wikipedia has many well-documented issues with accuracy, racism, and bias, but they are not limited to any one worldview. By creating his own version of Wikipedia, Musk did not seem motivated to address those issues but rather gave himself the power to root out anything he deemed "woke" or leftist.
"Happy birthday Wikipedia! So glad you exist." So wrote billionaire Elon Musk on X, then Twitter, back in January 2021, on the 20th anniversary of the launch of the free online encyclopedia. Like the many others who regularly access the crowd-sourced site — more than a billion a month — he apparently regarded it as an invaluable free tool and noble undertaking that had democratized human knowledge the world over, all thanks to the tireless work of tens of thousands of volunteer editors and the nonprofit Wikimedia Foundation.
Conservatives have long accused Wikipedia of strong liberal bias, and Musk has accused Wikipedia of being "controlled by far-left activists." Users have already pointed out stark differences between Grokipedia and Wikipedia articles, starting with articles about Musk himself.
Presenting a detailed explanation of how Grokipedia and Wikipedia differed in their response to certain topics, Sanyal wrote: "Provide 20 Significant Instances of Grokipedia Corrections to Wikipedia's Ideological Biases on India-Related Issues. Below is Grok's answer".
Wikipedia has massive influence over AI systems, both as training data and as a go-to reference. It's one of the largest, cleanest collections of human knowledge online, which means large language models basically grew up reading it. That shapes everything AI "knows" about history, science, culture, you name it. When AI tools answer questions, they often echo Wikipedia's style and structure. Perplexity or Bing cite Wikipedia because it ranks so high in search results.
Much of the content on Grokipedia is also suspiciously similar to Wikipedia. In many cases, articles are practically — and sometimes literally — clones of their Wikipedia counterparts.
Musk baselessly claimed that the Wikimedia Foundation – the non-profit that hosts Wikipedia – was "controlled by far-left activists" and slammed it for devoting nearly $50 million of its $177 million budget for the 2023-24 fiscal year to diversity, equity and inclusion policies.
In contrast, Wikipedia is not perfect and, largely due to its open platform, is also filled with misinformation at any given moment, but there's a human-centric system in place to take care of it.
Wikipedia's entries sometimes include similarly controversial statements, but these usually have multiple footnotes referencing academic sources of different backgrounds — say Palestinian historians, Israeli historians and American Jewish historians — to prove that controversial lines in their entries are neutral and factual; these footnotes often include the relevant quote.
One major factor that makes Grokipedia different from Musk's other rival-fueled enterprises is that Wikipedia is quite popular, well-liked, and widely trusted. There's no substantial burgeoning dissatisfaction with or opposition to Wikipedia fueling demand for an alternative. To the contrary, in the world of mass-market information, it's one of the strongest brands out there.
If you've ever looked something up on Wikipedia (and who hasn't?), there's a fair chance you've read the work of Steven Pruitt. Known online by his operatic pseudonym 'Ser Amantio di Nicolao', Pruitt has made more than six million edits to the English-language site and created over 33,000 articles – more than anyone else in its near 25-year history.
The types of pages where Grokipedia seemed to beat Wikipedia were the unloved, scraggly entries on Wikipedia. You know the kind — where it truly seems like a bunch of people added in a single sentence once a year for the last 15 years.
Publishing rival articles on the same topic would fragment Wikipedia's core value proposition: a single, synthesised summary of human knowledge. And deciding on what's most likely to be true on the basis of a show of hands is not the wisest way to reach a conclusion.
Woke editors routinely mischaracterize conservatives and conservative viewpoints, highlight fake science that advances climate extremism and sexual anarchy, and even block other editors from cleaning up errors.
Even Grok, the xAI chatbot the new site is named after, told me that "while Grokipedia improves on specific Wikipedia flaws – like verbose, overly critical entries on conservative topics – its AI gatekeeping creates a centralised 'Musk's truth' filter, lacking Wikipedia's distributed checks", and that "it trades one set of biases for another, often with less accountability". Quite. But Musk's crusades will surely continue.
You can also see the fights that editors and contributors have had over changes, which provides a great deal of fun on its own. Best of all, Wikipedia remains a nonprofit to this day, which keeps its motives relatively pure in a world beset on all sides by bad faith.
Wikipedia owes its ubiquity today to platforms like Google. Google simply didn't want to strike deals with publishers and pay, and so Wikipedia obligingly sang the kumbaya tune of giving the world free information and ensured that its material could be freely lifted.
This outcome suggests there is no simple, direct line connecting higher reliability with one particular political viewpoint in Wikipedia's sources. The finding that sources with a liberal bent are chosen seems independent of a straightforward preference for the most reliable outlets available.
Contrary to what you were taught in middle school, Wikipedia is remarkably accurate. It's used by doctors and professional fact-checkers. Even large language models (LLMs) and the AI products they power, such as Elon Musk's new Wikipedia competitor "Grokipedia," rely on the site for training data.
Yet in a world where fact and fiction are increasingly difficult to tell apart, and as artificial intelligence (AI) reshapes how we create and consume knowledge, where does Wikipedia fit in?
In a statement to The Verge, Wikimedia Foundation spokesperson Lauren Dickinson said it's not unusual for Wales to comment on Wikipedia entries. "In his personal capacity during interviews about his new book, Jimmy Wales has discussed multiple Wikipedia articles and topics, expressing his own perspectives and reflections," Dickinson said.
"Wikipedia has never, ever treated all voices as equal, nor does policy demand we do. If we did, the Earth article would state that Earth's shape is under debate. But we don't do that because scholarly consensus is that Earth is roughly spherical. Instead, flat eartherism is presented as what it is: a fringe movement without scientific backing," the editor wrote.
On social media, others considered that Wales had humiliated himself by wanting to give the opinions of the Israel lobby equal weight to those of United Nations experts, human rights groups and genocide scholars, and by 'caving' to the Israel lobby:
"Have you ever considered," the joke goes, "that Wikipedia was built like the modern-day Library of Alexandria on the mere need of geeks to correct each other?" A joke, but also a picture of the digital encyclopedia that was long in vogue: a creation of geeks and non-profit enthusiasts. Astonishingly accurate. Extremely comprehensive, with detailed articles on everything from the biggest topics to the most obscure. A symbol of the optimism of the early internet, where information wanted to be free and voluntary collaboration would create something amazing.
Wikipedia is one of the latest institutions to face threats from the Trump administration. White House officials and Republicans in Congress are targeting the site over what they call a liberal bias.
Wikipedia is fundamentally an act of democratic faith: thousands of strangers believing that truth can emerge from disagreement, that evidence matters more than authority and that knowledge serves everyone better when it's collectively stewarded. Grokipedia represents a different faith entirely, one in the perfectibility of truth through algorithmic curation by a single individual.
Yet Musk's AI encyclopedia is also part of something broader: an escalating campaign to discredit Wikipedia and reshape what counts as a reliable source of basic information in the age of AI. Whatever the potential flaws of a crowdsourced reference site, many users often find Wikipedia more convenient, comprehensive, and reliable than any alternative.
Why do American Republicans take such issue with Wikipedia? Researcher Jeanne Vermeirsche believes it is because "it is a space that cannot be privatized, where knowledge cannot be immediately controlled and where everyone must abide by the rules."
The site Grokipedia is trying to replace is the result of an unprecedented bottom-up phenomenon in which millions of people contributed time, attention, and effort to create a shared resource, synthesizing existing information through a messy, flawed, but ultimately deliberative and productive process.
The Wikimedia Foundation, the non-profit entity that provides Wikipedia with technological and legal support, strongly advocates for human-generated content, stressing that AI's knowledge ultimately derives from human-created information.
There's a right-leaning or conservative criticism that Wikipedia relies too much on the liberal woke establishment (in its sourcing of articles). How does that connect with what we were just talking about?
The Wikimedia Foundation (WMF), the nonprofit behind the global encyclopedia Wikipedia, has offered up a stark message to generative-AI firms: quit scraping the freely accessible site, and instead pay for access via its enterprise API.
The Wikimedia Foundation's support of groups like Whose Knowledge, Art+Feminism, Black Lunch Table, and AfroCROWD is long-standing, with tax filings from past years showing hundreds of thousands of dollars more going to those groups.
Academic research has shown that Wikipedia does have biases, mostly due to its reliance on establishment media, academia and other institutional sources. Wikipedia also has an issue with the make-up of the volunteers who edit its articles – the vast majority are men. AI models, similarly, have been shown to exhibit biases contained within the data sets on which they are trained.
The result is that much of Wikipedia is now a cesspool of antisemitism and anti-Israel lies. The examples are too numerous to count. Here are just a few.
Six Nigerians from across the country's six geopolitical zones have received honorary awards from the Wikimedia Foundation as Wikipedia Nigeria hosted the Nigeria Wikipedia Community Distinguished Awards (NWCDA) 2025 and its 10th anniversary in Lagos.
Wikipedia editors, when they're acting at their best, they really do believe that there's something more than partisan politics—and that's the accurate reflection of reliable sources. That higher purpose can bring together people of different politics. One of the things I'm trying to make clear in my reporting is that Wikipedia is not a political monolith, one way or the other.
While Wikipedia focuses on initiatives like one called "Queering Wikipedia," what we don't see are similar projects dedicated to Christian, conservative or even politically centrist voices.
One of Wikipedia's many powers is its deep integration with the World Wide Web. Internal hyperlinks allow you to browse related articles and wander down rabbit-holes. References on Wikipedia link to external sites where possible. Beyond its immediate utility, this embrace of Web technologies is what led to it being ranked highly in the early days of Google results, and its subsequent continued success in it being heavily linked to from elsewhere today.
The inclusion of speculative content about Jesus' sexuality undermines Wikipedia's credibility and risks promoting misinformation about one of the most significant figures in human history. Faith leaders and scholars have called for greater editorial accountability, warning that the platform's handling of religious topics distorts centuries of well-documented teaching in favor of contemporary ideological trends.
Instead, it's time for Wikipedia to pull itself out of the culture wars by taking reasonable steps, such as those Sanger suggested, to ensure it practices the nonpartisanship it preaches. If it doesn't do so, Congress should consider expanding its investigation of Wikipedia to cover not just the site's antisemitism but its active censorship of Americans.
And here are some that were so creepy, people on Reddit just had to share them:
Wikipedia, as a rule, relies on published secondary sources and tells contributors to avoid doing their own original research. It also prohibits some sources that it views as unreliable or untrustworthy. But Grokipedia is different.
This is not the first time Wikipedia locked a page on a contentious topic. Thousands of such articles covering significant world events, key definitions, or celebrity figures have been temporarily locked in the past to prevent partisan edits by unvetted editors as well as acts of Wikipedia "vandalism" that range from funny to hateful.
Since 2023, Wikipedia editors have been working to get a handle on AI submissions, a project they call Project AI Cleanup. With millions of edits coming in each day, there's plenty of material to draw on, and in classic Wikipedia-editor style, the group has produced a field guide that's both detailed and heavy on evidence.