When the Capitol Was Attacked, Wikipedia Went to Work
On January 6, Jason Moore was working from his home in Portland, Oregon, and flipping between CNN and MSNBC as Donald Trump supporters gathered outside the U.S. Capitol. "Watching what was unfolding in D.C. on cable news, I found it initially fascinating, and then, later, terrifying," he told me.
Moore, a digital strategist, is one of the top 55 contributors to the English-language version of Wikipedia. The free online encyclopedia has more than six million articles in English and is maintained by more than 100,000 regular volunteer editors like Moore. Around 1:30 p.m. Eastern time, Moore started a new Wikipedia page to document what was then just a protest. He titled it: "January 2021 Donald Trump rally."
"I have a personal interest just in documenting political movements," said Moore, who goes by the username Another Believer. He logs onto his Wikipedia watchlist, a feed of the changes that have been made to the pages he wants to track, several times a day, like someone else might log on to Twitter or Facebook. "I'm a bit of a political junkie."
As the Capitol protest escalated into a violent assault, Moore was tabbing between Google News, the Wikipedia article he had created, and the article's talk page, where volunteer editors could discuss changes with one another. Hundreds more volunteer editors were chiming in. As chronicled by Alex Pasternack in Fast Company, Wikipedians debated the reliability of different sources and the accuracy of terms, and documented the democratic cataclysm in real time. It became, said Moore, "this hurricane of people sifting through a lot of information at once."
Moore estimates he spent about ten hours editing the page, now titled "2021 storming of the United States Capitol," and closely related pages. The entry runs nearly 13,000 words and has hundreds of external source citations. It has sections on intelligence, or the lack thereof, leading up to the attack; on police preparations; on the participation of state lawmakers; on the House and Senate evacuations; on the completion of the electoral vote count; and more. More than 1,000 volunteer editors worked together on the entry, which is still being updated regularly.
The page is the result of a remarkably collaborative online community of volunteers who edit, verify, and generally obsess over the vast, always-in-motion encyclopedia. Wikipedia is not without faults; it doesn't take much poking around to find a page with a major error. (Last year, a Reddit user unearthed that an American teenager who did not speak Scots, a Scottish dialect, had written almost half of the articles on Scots Wikipedia. The pages were riddled with grammar mistakes.) Wikipedia is also not representative of the public; the vast majority of its volunteer editors are male, and fewer than 20 percent of Wikipedia's biographies are about women.
But Wikipedia, one of the most visited websites in the U.S., has avoided many pitfalls that have hobbled other online platforms. Twitter, Facebook, and YouTube are facing a backlash for their role in propagating misinformation. After Trump's repeated false claims about election fraud propelled his followers to break into the Capitol, all three companies suspended his accounts. It might have been the right call in the moment, but it also raised uncomfortable questions about the outsize power over discourse wielded by a tiny number of executives at private companies. Wikipedia's bottom-up model, shaped by thousands of volunteer editors, proves that there's another way to build online communities.
Several special volunteer roles help keep the site running. An arbitration committee, made up of vetted, experienced editors, settles the most contentious disputes; checkusers, an elite group of Wikipedia editors, are granted access to technical data to figure out whether several Wikipedia accounts are being operated by one person. These privileged editors help deal with difficult situations, but much of the day-to-day work of editing Wikipedia is handled by regular volunteers making changes, discussing issues, following the suggested dispute resolution process, and, ideally, landing on a consensus. The site even has principles for how editors can best collaborate, dubbed "Wikiquette."
As protestors at the Capitol turned violent, one major debate among Wikipedia editors was how to describe the event in the page's title. Was it a protest? A riot? An insurrection? A coup attempt? "There is a clear consensus that 'protest' is inadequate to describe these events," wrote a Wikipedia editor with the username Matthias Winkelmann. "'Riot' is a more appropriate label for the events that took place," responded a user called Bravetheif. "I oppose 'protests' and oppose 'storming,' but support '2021 United States Capitol Siege' or '2021 United States Capitol Breach,'" wrote another editor calling themselves RobLa. On the morning of January 7, an editor with the username CaptainEek set the page title to "2021 storming of the United States Capitol."
But the debate roared on, with editors making the case for their preferred terms. Volunteers catalogued which terms different reputable publications had used. Their list of generally reliable sources that had used "coup" included the Atlantic, BuzzFeed News, and the Los Angeles Times. The list for "insurrection" included the Associated Press, Axios, and NPR.
This appeal to reputable sources springs from the ethos behind Wikipedia's content policies. According to English Wikipedia's "Verifiability" policy, an editor can be sure something is true, but if it's not verifiable with a reputable source, it shouldn't be added to a page. The site has a chart of publications categorized by the current consensus view of their reliability. The consensus can and does change. In 2018, for example, Breitbart was "deprecated" by a consensus of editors, meaning it could no longer be cited as a reference for factual matters. A year prior, editors had made a similar decision about the Daily Mail, a British tabloid.
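To make that idea concrete, here is a rough, purely illustrative sketch, not a tool described in the article and not Wikipedia's actual enforcement mechanism (the Python requests dependency, the domain list, and the page title are assumptions): it pulls a page's wikitext from the public MediaWiki API and counts how often domains the community has deprecated, such as Breitbart and the Daily Mail, appear in it.

```python
# Illustrative only: flag citations to deprecated sources in a Wikipedia article.
# The domain list and the page title are assumptions for this example.
import re
import requests

API = "https://en.wikipedia.org/w/api.php"
DEPRECATED_DOMAINS = ["breitbart.com", "dailymail.co.uk"]  # deprecated by editor consensus

def page_wikitext(title: str) -> str:
    """Fetch the current wikitext of a page via the MediaWiki API."""
    params = {
        "action": "query",
        "titles": title,
        "prop": "revisions",
        "rvprop": "content",
        "rvslots": "main",
        "redirects": "1",       # follow the page if it has been renamed
        "format": "json",
        "formatversion": "2",
    }
    data = requests.get(API, params=params, timeout=10).json()
    return data["query"]["pages"][0]["revisions"][0]["slots"]["main"]["content"]

def deprecated_citations(title: str) -> dict:
    """Count how many times each deprecated domain appears in the page's wikitext."""
    text = page_wikitext(title)
    return {domain: len(re.findall(re.escape(domain), text))
            for domain in DEPRECATED_DOMAINS}

if __name__ == "__main__":
    print(deprecated_citations("2021 storming of the United States Capitol"))
```

On Wikipedia itself, this kind of check is carried out by editors rather than by an outside script; the sketch only shows how a deprecation list translates into something machine-checkable.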
The imperative to provide reliable sources is one way Wikipedia editors keep misinformation off contentious pages. When one user proposed an edit suggesting that the Capitol rioters were not really Trump supporters but rather antifa, an editor with the username Anachronist responded by interrogating the sources provided for the proposed edit:
Let's examine those sources. A student newspaper (byu.edu) isn't a reliable source. The Washington Times contradicts your proposal . . . explicitly saying that no Antifa supporters were identified. I could stop right there, but let's go on: Fox News is not considered a reliable source for political reporting, and the Geller Report is basically a blog, self-published, and therefore not usable.
The proposed edit never made it through, since administrators had placed the page under protection, meaning less experienced editors could not make changes directly to the page. That's a common step for entries on contentious topics. By the evening of January 6, the Storming page was placed under extended-confirmed protection, meaning that for the next two days, only editors who had made over 500 edits and had had their accounts for 30 days or more could make changes. (After two days, the page was set to a slightly lower level of protection.) "This helped enormously with the level of disruption," said Molly White, a longtime Wikipedia editor and administrator, in an email.
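Those protection settings are visible to anyone through the same MediaWiki API that powers the site. The short Python sketch below is illustrative only (the requests dependency and the page title are assumptions, and since the article has been renamed over time, the query follows redirects): it asks the API what protection a page currently carries and prints entries such as the edit-protection level and its expiry.

```python
# Illustrative only: check what protection a Wikipedia page currently has.
import requests

API = "https://en.wikipedia.org/w/api.php"

def protection_entries(title: str) -> list:
    """Return the protection settings (type, level, expiry) for a page."""
    params = {
        "action": "query",
        "titles": title,
        "prop": "info",
        "inprop": "protection",
        "redirects": "1",       # follow the page if it has been renamed
        "format": "json",
        "formatversion": "2",
    }
    data = requests.get(API, params=params, timeout=10).json()
    return data["query"]["pages"][0].get("protection", [])

if __name__ == "__main__":
    for entry in protection_entries("2021 storming of the United States Capitol"):
        # e.g. {"type": "edit", "level": "extendedconfirmed", "expiry": "infinity"}
        print(entry)
```

An "extendedconfirmed" entry corresponds to the 500-edit, 30-day threshold described above.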
White, a software developer in Cambridge, Massachusetts, who goes by the username GorillaWarfare, made multiple edits to the Capitol Storming page. "I was horrified and anxious to watch this all unfold," she explained, "but editing on Wikipedia felt better than doomscrolling. This is something I do often: if I'm trying to understand what's happening or learn more about something, I will go edit the Wikipedia article about it as I do." White primarily edits pages related to right-wing online extremism. She wrote much of the Wikipedia pages for Parler and Gab, alternative social media apps popular among Trump supporters and right-wing provocateurs, and contributed significantly to the entry on the Boogaloo movement.
Wikipedia can count on having humans in the loop on content decisions, rather than relying on artificial intelligence, because it's much smaller than YouTube or Facebook in terms of active monthly users, said Brian Keegan, an assistant professor of information science at the University of Colorado Boulder. That's helpful because content decisions often require understanding context, which algorithms don't always get right. Humans can also offer more nuanced feedback on why an edit is being reverted, or why a page is being taken down.
Of course, Wikipedia doesn't always get it right either. Less trafficked pages receive attention from fewer editors, which can easily result in significant factual errors. But pages that attract more attention from editors are often of high quality, thanks to a fairly functional system of collaboration and cross-checking. In fact, other social media companies have come to rely on Wikipedia as a source of reliable information. In 2018, YouTube announced it would link to Wikipedia pages alongside its videos about conspiracy theories in an effort to provide users with accurate information. In 2020, Facebook began testing Wikipedia-powered information boxes in its search results.
What Wikipedia illustrates is that the problems with Facebook, Twitter, YouTube, and other social media platforms aren't that they are social or that they're populated by user-generated content. It's their business models. All three are for-profit companies that make their money through micro-targeted advertising, which means they have strong incentives to show users content that will keep them on their platform for as long as possible and keep them coming back. Content that confirms users' beliefs or stokes their preexisting resentments can be good for business. That only overlaps with the truth some of the time.
As a nonprofit, Wikipedia operates within a fundamentally different set of incentives. It doesn't rely on advertising revenue, and it doesn't need to drive up user engagement. The Wikipedia community has instead been able to develop norms and policies that prioritize the integrity of the content. "A platform like Wikipedia has no compunction about shutting down access to editing their articles, or stopping people from creating accounts, all these things that would really hurt topline numbers at shareholder-driven organizations," said Keegan.
The irony of the Capitol Storming page is that so many volunteers worked so hard to accurately document an event fueled by lies. For every claim that the election had been stolen or that Mike Pence had the power to stop the count, there was a volunteer clicking through news reports, trying to get it right. Nearly a month later, the page still isn't complete. When I asked Molly White how she would know when to stop working on it, she wrote that Wikipedia is never finished, and pointed me to a corresponding Wiki entry titled "Wikipedia is a work in progress."
Update: A reference to Fast Company's article on the same Wikipedia page was added on February 8.