Wikipedia's New Code Of Conduct Gets One Thing Right; Another Will Be A Struggle – Forbes


A major social network announced a new set of rules for its members Tuesday, and by itself that might not rate as news.

But Wikipedia isn't just any social network, and its new rulebook stands apart from the terms of service handed down by commercial social platforms like Facebook and Twitter.

The Universal Code of Conduct announced Tuesday by the Wikimedia Foundation, the San Francisco nonprofit that hosts Wikipedia and related projects, isn't a top-down product. Instead, Wikipedians collaborated to write it, much as almost 356,000 of them regularly create or edit entries in that online encyclopedia.

"More than 1,500 Wikipedia volunteers from 19 different Wikipedia projects representing five continents and 30 languages participated in the creation of the universal code of conduct," Wikimedia's announcement notes.

That goes well beyond earlier moves by commercial social platforms to borrow the collective wisdom of their crowds. See, for example, Twitter adopting the foundational features of @ mentions and hashtags from its early users, or Facebook letting users vote on new terms of service before scrapping that experiment in 2012 after too few people bothered to cast virtual ballots.

At Wikimedia, the collective drafting of the new code began with input from around the world about the need for revisions to its earlier terms and involved months of collaboration.

"They're an alternative model to the private social experience that exists almost everywhere else," said Alex Howard, director of the Demand Progress Education Fund's Digital Democracy Project.

The results also differ from many other codes of conduct by virtue of being unusually short: under 1,700 words, or less than 1,300 if you subtract the introductory paragraphs.

The operative text starts not on a thou-shalt-not note, but with a you-should list of expected behavior of any user: "Practice empathy"; "Assume good faith, and engage in constructive edits"; "Respect the way that contributors name and describe themselves"; "Recognize and credit the work done by contributors," among others.

"The organization is saying, here are our values," Howard said. "They're giving people scaffolding to interact with each other."

An "Unacceptable behavior" list follows, including a broadly constructed ban on harassment. This covers the usual categories, such as insults targeting personal characteristics, threats, and doxing, but also extends to the broader category of being a jerk.

That's both necessary, because people who punch down a little in public often do more in private, and tricky, because these lesser fouls aren't as obvious.

"People at times assume that it's unintentional," said Caroline Sinders, founder of Convocation Design + Research and an expert in online harassment research who's worked with the Ford Foundation, Amnesty International and others (including an earlier stint at Wikimedia itself).

Or, she added, the offense will go unrecorded and then forgotten without a "ladder of accountability" that recognizes how unchecked minor abuses can lead to more toxic behavior.

These provisions also cover behavior outside Wikimedia projects. For example, the doxing clause notes that sharing other contributors' private information, "such as name, place of employment, physical or email address," without their explicit consent is out of line either on the Wikimedia projects or elsewhere.

There's a complicating factor here in Wikimedia's understandable lack of a real-names policy: enforcing one would endanger marginalized communities, and in particular those living under abusive governments. Wikipedia doesn't even require an email address to create a contributor account.

Wikimedia Foundation communications lead Chantal De Soto noted this issue in an email: "enforcing any breaches of conduct that happen on other platforms is often very difficult"; "verifying connections between Wikimedia accounts, and, for example, a Twitter account, is often not straightforward."

But it's important that Wikimedia communities make that effort, considering all the evidence now available of how online radicalization can erupt in the physical world.

"All we have to do is look at January 6 to get a sense of what happens when that goes too far," Howard said of the riots that took place at the U.S. Capitol.

The next chapter in Wikimedia's effort will involve more collaboration on enforcement policies and mechanisms. This may be the most difficult part, since it will involve setting up structures that can work at scale and across cultures.

"A community needs to think about how they're going to document these cases, who has access to them, how they're keeping track of things, how they're going to respond to harassment," said Sinders.

Done right, this may require hiring more dedicated trust-and-safety professionals.

"In open-source communities, a lot of this arduous labor is falling to volunteers," Sinders warned. "And that leads to community burnout."
