By nature, all wikis (websites edited and managed by their users) are experiments in collaborative self-regulation, but few approach the scale of Wikipedia. It’s one of the most important websites ever created, perhaps “humanity’s best effort at collecting all our knowledge in one place,” with over 6.5 million articles, making it 90 times larger than the Encyclopedia Britannica. But with that vast catalog comes the massive task of regulating content, a uniquely daunting challenge for a resource edited by people all over the world. Because anyone can edit Wikipedia pages for free, there’s an ever-present possibility of ne’er-do-wells messing with information. The site takes proactive steps against this: “Wikipedia’s administrators protect some pages” over concerns of “‘vandalism’ – the addition of abusive language or falsehoods.”
False and biased edits come from all over the place, threatening the site’s integrity. Sometimes, companies corrupt Wikipedia pages to serve their own interests: employees of “Medtronic, a company that sells medical devices used for, among other things, kyphoplasties,” altered the page for kyphoplasty to replace the descriptor “‘controversial’” with “‘well documented and studied.’” Wikipedia has faced challenges with “undisclosed paid editing,” even banning a firm called Wiki-PR in 2013 “for using hundreds of dummy accounts to fabricate widespread support for pages that flattered its clientele.”
Wikipedia has even had to deal with political infiltrations. In the early 2000s, the CIA and FBI edited pages “on topics including the Iraq war and the Guantanamo prison.” And the day after January 6th, 2021, John Eastman, a key conspirator in Donald Trump’s failed attempt to seize the U.S. presidency, edited his own page with, among other changes, a “less accusatory description” of the attempted coup. His editing “drew immediate attention,” and within two hours, “his edits were made to disappear.”
But the most influential case of a false Wikipedia edit was actually a foolish prank. In 2005, John Seigenthaler Sr., “an established luminary in the world of journalism,” as well as “an administrative assistant and later a pallbearer for [Robert F.] Kennedy,” found out that for months, his Wikipedia page had claimed he’d been involved in Kennedy’s assassination. He went on a national news media blitz calling for accountability, and for years after, “no journalist could write a story [about Wikipedia] without mentioning the dark side of the site’s openness.” The culprit was discovered to be an employee of a small shipping company in Nashville, who “placed the entry as a joke,” and after the controversy, they resigned. But an impact had been made, and Wikipedia took action to combat the negative PR, tightening its policies for handling the pages, and thus the reputations, of living people. Now revisions to articles on living people, particularly edits by new users, “require a trusted editor to accept changes.”
In the decade and a half since the Seigenthaler incident, Wikipedia has developed tools to combat bad edits. The site uses bots that “highlight some of the most common patterns of undesired edits,” presenting likely problems for deletion at a human’s discretion. Other bots help weed out graphic pictures from the “‘bad image list,’” though some graphic content is allowed in certain contexts deemed “proper.”
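To make that concrete, below is a minimal sketch in Python of the kind of pattern-based screening such a bot might perform. Everything in it is my own illustrative invention: the patterns, weights, threshold, and the flag_edit function are hypothetical, and real anti-vandalism bots rely on far larger rule sets and trained models rather than a handful of regular expressions.

    import re

    # Hypothetical patterns loosely imitating anti-vandalism heuristics;
    # real bots use far larger rule sets and machine-learned classifiers.
    SUSPICIOUS_PATTERNS = [
        (re.compile(r"\b(stupid|idiot|sucks)\b", re.IGNORECASE), 2),  # abusive language
        (re.compile(r"(.)\1{9,}"), 1),        # "aaaaaaaaaa"-style keyboard mashing
        (re.compile(r"\b[A-Z]{10,}\b"), 1),   # shouting in all caps
    ]

    def flag_edit(old_text: str, new_text: str, threshold: int = 2) -> bool:
        """Return True if an edit looks suspicious enough to queue for human review."""
        # Crude stand-in for a diff: real bots compare revisions line by line.
        added = new_text.replace(old_text, "") if old_text else new_text
        score = sum(w for pattern, w in SUSPICIOUS_PATTERNS if pattern.search(added))
        # Blanking most of a page is itself a classic vandalism signal.
        if old_text and len(new_text) < 0.2 * len(old_text):
            score += 2
        return score >= threshold

    # An abusive rewrite gets queued for review; an ordinary copyedit does not.
    print(flag_edit("Kyphoplasty is a procedure.", "Kyphoplasty is stupid lol"))             # True
    print(flag_edit("Kyphoplasty is a procedure.", "Kyphoplasty is a surgical procedure."))  # False

The important design point matches the description above: the bot only flags likely problems, and the final call to delete or keep an edit stays with a human.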
Not everyone has access to those tools, though: despite being a collaborative effort, Wikipedia still has a hierarchy. At the bottom are unregistered editors, and above them are editors who have created a free Wikipedia account, which among other perks allows members to hide their IP addresses from the public. Well-established editors who are active in the Wikipedian community have more sway in discussions about issues requiring a collective consensus. Above editors are administrators, who have the power to delete pages, protect pages, block and unblock users, and edit fully protected pages. There’s also a group called bureaucrats, and three dozen users worldwide have been labeled stewards, wielding additional powers that I don’t entirely understand. To be honest, the structure gets pretty confusing.
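Purely as an illustration, the sketch below models that pecking order in Python, encoding only the powers this paragraph names; the AccessLevel names and strict ordering are my own simplification of a system that is, in reality, far more granular.

    from enum import IntEnum

    # A rough model of the hierarchy described above; Wikipedia's real
    # access-level system has many more roles and finer-grained permissions.
    class AccessLevel(IntEnum):
        UNREGISTERED = 0   # can edit most pages without an account
        REGISTERED = 1     # free account; IP address hidden from the public
        ADMINISTRATOR = 2  # can delete/protect pages, block users, edit protected pages
        BUREAUCRAT = 3
        STEWARD = 4        # roughly three dozen users worldwide

    def can_edit_fully_protected_page(user: AccessLevel) -> bool:
        """Only administrators and above may edit fully protected pages."""
        return user >= AccessLevel.ADMINISTRATOR

    print(can_edit_fully_protected_page(AccessLevel.REGISTERED))     # False
    print(can_edit_fully_protected_page(AccessLevel.ADMINISTRATOR))  # True

Treating the levels as an ordered enum captures the intuition that each tier carries the powers of the ones below it, though the real roles overlap in messier ways.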
Similarly, policies on the content of articles are overwhelmingly extensive while also oddly flexible. Their gist is to write only unbiased, factual information in articles, based on existing sources and not original research. But the details of what that means get very messy very quickly. I clicked through page after page of policies on Wikipedia. Some articles were long and some were short. Some were adamant about rules being followed; some made rules seem more like suggestions, or guidelines to follow with common-sense discretion. This lines up with my sense that Wikipedians informally self-regulate their population by discouraging outside participation, whether deliberately or not. The dense complexity of Wikipedian culture creates an insular community which, combined with entrenched inequities in the tech field, stifles diversity. Men heavily outnumber women at every user access level, with only one steward being a woman, and the Wikipedian population is based largely in Europe and North America, with significant gaps in Africa and South America. Thus an ostensibly level playing field (free access for all) replicates existing inequities.
Understanding the intricacies of Wikipedia’s processes and culture, while fascinating, would require much more space than I have here. However, I think it’s most important to recognize that while it is run by users, Wikipedia is neither egalitarian nor a democracy. Plenty of issues go through community review, but administrators, for example, make decisions about inclusion and omission every day using their personal discretion. They are still volunteers, but they have been given additional authority.
A limited number of paid employees maintain the site. Their salaries, as well as service costs, are covered by Wikipedia’s parent nonprofit, the Wikimedia Foundation, which raises and spends tens of millions of dollars every year to support Wikipedia and a few related projects. The allocation of this money is decided by the Board of Trustees, which generally leaves non-budgetary decisions to the actual Wikipedia community. The community has in fact flexed its collective power several times, pressuring the Board of Trustees with the threat of labor stoppages. Wikipedia’s regular operations are genuinely regulated almost entirely by the same volunteers who write its articles, which is a tremendous communal feat, regardless of caveats like the user hierarchy.
So far I’ve only discussed Wikipedia’s internal regulation, but since the Wikimedia Foundation is based in San Francisco and Wikipedia’s main database is in Tampa, the site is still subject to outside regulation by the U.S. government. It has no offices outside the U.S., only user associations with no formal ties to the Wikimedia Foundation. Lax U.S. libel laws and Section 230 protections make the U.S. an ideal home base. Still, the site has explicit internal policies against posting content that would be illegal, including libelous claims and copyright infringement.
International access to Wikipedia is also subject to government regulation. In China, for example, the government currently blocks all access to Wikipedia. And recently, Russian authorities “fined and threatened to silence Wikipedia for publishing ‘prohibited materials’ and ‘fake’ content about the war.” These attempts to stifle Wikipedia indicate just how important a resource it is. Though not infallible, Wikipedia has managed to create an enormous, fairly reliable collection of human knowledge, relying almost entirely on volunteer effort and community self-regulation.
Bibliography
Cohen, Noam. “VIPs Expect Special Treatment. At Wikipedia, Don’t Even Ask.” The Washington Post, October 28, 2021.
https://www.washingtonpost.com/outlook/wikipedia-jimmy-wales-john-eastman-editing/2021/10/28/f2d61bea-35fd-11ec-9bc4-86107e7b0ab1_story.html.
Dewey, Caitlin. “Wikipedia Has a Ton of Money. So Why Is It Begging You to Donate Yours?” The Washington Post, December 2, 2015.
https://www.washingtonpost.com/news/the-intersect/wp/2015/12/02/wikipedia-has-a-ton-of-money-so-why-is-it-begging-you-to-donate-yours/.
Khatsenkova, Sophia. “Russia’s Answer to Wikipedia: Propaganda or Common Sense Encyclopedia?” Euronews, September 15, 2022.
https://www.euronews.com/my-europe/2022/09/15/russias-answer-to-wikipedia-propaganda-or-common-sense-encyclopedia.
Lurie, Stephen. “The 36 People Who Run Wikipedia.” Medium, March 11, 2015.
https://medium.com/matter/the-36-people-who-run-wikipedia-21ecca70bcca.
Mannix, Liam. “Evidence Suggests Wikipedia Is Accurate and Reliable. When Are We Going to Start Taking It Seriously?” The Sydney Morning Herald, September 13, 2022.
https://www.smh.com.au/national/evidence-suggests-wikipedia-is-accurate-and-reliable-when-are-we-going-to-start-taking-it-seriously-20220913-p5bhl3.html.
Mikkelsen, Randall. “CIA, FBI Computers Used for Wikipedia Edits.” Reuters, August 16, 2007.
https://www.reuters.com/article/us-security-wikipedia/cia-fbi-computers-used-for-wikipedia-edits-idUSN1642896020070816.
Pinsker, Joe. “The Covert World of People Trying to Edit Wikipedia for Pay.” The Atlantic, August 12, 2015.
https://www.theatlantic.com/business/archive/2015/08/wikipedia-editors-for-pay/393926/.
“User Access Levels.” Wikipedia. Wikimedia Foundation. Accessed October 7, 2022.
https://en.wikipedia.org/wiki/Wikipedia:User_access_levels#Extended_confirmed_users.
Wall, Matthew. “Wikipedia Editing Rules in a Nutshell.” BBC News, April 22, 2015.
https://www.bbc.com/news/technology-32412121.
Walsh, Kathleen M., and Sarah Oh Lam. “Self-Regulation: How Wikipedia Leverages
User-Generated Quality Control Under Section 230.” SSRN, March 31, 2010.
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1579054.