Wikipedia has an AI problem.
Wikipedia, once derided by academics and memes alike, has gotten a bit of a glow-up lately. As ChatGPT, Gemini, Claude, Grammarly, and other LLM tools have flooded the Internet with poorly written, often-inaccurate slop, many people have begun to long for "the good old days" of the Internet. Perhaps you've urged people on social media to use Wikipedia as a substitute for ChatGPT, Google's AI Summaries, or similar tools. Maybe you've begun to donate money to the site, to keep this mission alive.
Unfortunately, this view is outdated by several years. Wikipedia is no longer an "alternative" to AI. Wikipedia is becoming AI.
Since 2023, Wikipedia has been quietly overwhelmed by a flood of AI-generated text, much of it unnoticed and unedited. If you are reading a high-profile page on Wikipedia, there is a non-trivial chance that some of it was written by AI.
The problem is far worse than reported, extending to even the most visible and most-promoted articles. Nor is it the work of one or two bad actors. It is the cumulative result of thousands of editors over the years, and it continues to grow today.
Why is this happening? There are many reasons, which this site will explore in depth. But here are some of the big ones.
Wikipedia has no policy on AI.
No, really, it doesn't!
ChatGPT launched in late 2022. While Wikipedia acknowledged this, it failed to establish any enforceable policy or guidelines about its use, beyond a non-binding, somewhat obscure essay. Three years later, it still has not. The flood of AI content on the encyclopedia is the predictable result.
Wikipedia has fewer editors.
Wikipedia has experienced a substantial decline in human visitors since Google's AI Summaries (often scraped from Wikipedia) discouraged people from clicking through. However, the problem goes well beyond that; the site has been hemorrhaging editors since 2007.
This means there are fewer people making edits across the board, let alone doing the in-depth work of detecting and fixing AI content.
Wikipedia is enshittifying itself to juice its growth.
Faced with this decline in visitors, the Wikimedia Foundation has resorted to the same enshittifying tactics other sites have used to juice traffic and monetization. Several new features aimed at editor retention are making the AI problem worse: gamification features that encourage people to mass-edit with AI, outreach materials that tacitly endorse its use, and sometimes the Foundation's own homegrown slop.