Archive for the ‘Wikipedia’ Category

Why Is This Flower on Wikipedia Suddenly Getting 90 Million Hits Per Day? – VICE

The Michaelmas daisy is an innocuous purple flower that grows in fields in the American northeast, and for some reason, an image of it is suddenly getting millions of hits a day. A picture of this perfectly normal flower hosted on Wikipedia is currently responsible for 20 percent of the traffic on one of Wikimedia's data centers.

"We've noticed today that we get about 90M hits per day from various ISPs in India," a post on Phabricator, Wikimedia's collaboration platform, said. "These are very strange, as they come from wildly different IPs, follow a daily traffic pattern, so we are hypothesizing there is some mobile app predominantly used in India that hotlinks the above image for e.g. a splash screen. We need to investigate this further as this kind of request constitutes about 20 percent of all requests we get in EQSIN for media."

EQSIN is the name of a Wikimedia data cluster in Singapore. For more than six months, 20 percent of the traffic to that server was requests to look at the daisy. Wikimedia's data is public, and a chart of daily requests to access the picture of the flower shows a clear trend. Before June 8, the flower had pretty low numbers, averaging a few hundred views a day. On June 9, the number jumped to 2,154. On June 10, it hit 15,037. By June 30, it had more than 15 million daily hits.
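
Those daily counts come from Wikimedia's public analytics service, so anyone can reproduce the chart. Below is a minimal sketch of how one might pull the numbers, assuming the Analytics mediarequests endpoint behaves as documented; the file path used here is a hypothetical placeholder, not the real path of the daisy image.

import requests

# Daily media requests for a single file from the Wikimedia Analytics REST API.
# FILE_PATH is a hypothetical, URL-encoded upload.wikimedia.org path; replace it
# with the actual path of the image you want to inspect.
FILE_PATH = "%2Fwikipedia%2Fcommons%2F1%2F1d%2FExample_daisy.jpg"
URL = (
    "https://wikimedia.org/api/rest_v1/metrics/mediarequests/per-file/"
    f"all-referers/user/{FILE_PATH}/daily/20200601/20200731"
)

resp = requests.get(URL, headers={"User-Agent": "traffic-check/0.1 (example)"})
resp.raise_for_status()

# Each item in the response carries a timestamp and the request count for that day.
for item in resp.json().get("items", []):
    print(item["timestamp"], item["requests"])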

Chris Albon, director of machine learning at Wikimedia, flagged the weird trend on Twitter. One reply noted that the huge upsurge in requests to see the flower coincided with India banning TikTok and several other Chinese apps. India banned TikTok on June 29, 2020. The flower was already getting more views than normal before then, but it experienced a huge surge in popularity after the ban.

After the TikTok ban, clones of the app flourished in India, and some of the people investigating the mystery speculated that one of these new apps is accessing the flower picture. "It is most likely an app, given the header information above and also based on some other connection attributes," one investigator said. "The question is which app though as some of us have gone through the popular apps in India but haven't been able to identify which app it is." It is also possible that the code was embedded in some app and that it requests the image but does not display it.

After several days of investigation, the Wikimedia team tracked down the app and confirmed that it was, indeed, a mobile app. "I just wanted to share that we have identified the app and will update this task tomorrow," an investigator said. "And yes, it was a mobile app."

Wikimedia hasn't yet revealed the name of the app and, according to its data reports, EQSIN is still getting hammered with requests to see the pretty purple daisy.

Wikimedia did not immediately respond to Motherboard's request for comment.

Original post:
Why Is This Flower on Wikipedia Suddenly Getting 90 Million Hits Per Day? - VICE

Wikipedia has a new Universal Code of Conduct to deal with harassment, misinformation – The Indian Express

Wikipedia now has its own Universal Code of Conduct, a first-of-its-kind document that will create a global set of community standards for addressing negative behaviour on the site.

The code is the result of recommendations made as part of a 2018 global consultation with Wikipedia communities called the 2030 Movement Strategy, Amanda Keton, General Counsel of the Wikimedia Foundation, told indianexpress.com over email. The global consultation included 200-plus salons, community-organised regional gatherings, spread across 50 countries and involving over 2,000 Wikipedia community members.

Before this new universal code, there was no consistent way of dealing with harassment on the platform; incidents were handled on a case-by-case basis and varied from project to project, she pointed out.

Keep in mind that Wikimedia is made up of more than 300 language Wikipedias and other related projects such as Wiktionary, Wikimedia Commons and Wikidata; the new code will apply a standard protocol and consistent framework for dealing with harassment across all of these projects and Wikipedias, according to the foundation.

While some of the projects, like English Wikipedia, followed more established standards for dealing with harassment, others were not as far along in their journey, according to Keton. The universal code will try to solve this challenge.

Wikimedia also felt that, given how various tactics are used to spread misinformation in the current internet era, it was important to "enhance our mechanisms and establish new measures" for dealing with deliberate attempts to add false information to the site, she said.

"Our new universal code of conduct creates binding standards to elevate conduct on the Wikimedia projects, and empower our communities to address harassment and negative behaviour across the Wikimedia movement. Through this effort, we can create a more welcoming and inclusive environment for contributors and readers, and a more representative source of knowledge for the world," Katherine Maher, CEO of the Wikimedia Foundation, said in a statement.

Wikimedia says the new code is transparent, at only 1,600 words long, and not opaque as community standards at other tech companies tend to be. The goal of this code is to define harassment and unacceptable behaviour.

The code's distinguishing standards include delineating harassment on and off the projects for all Wikipedia participants, preventing the abuse of power and influence to intimidate others, combating the deliberate introduction of false or inaccurate content, and providing a consistent enforcement process and shared responsibility between the Foundation and volunteer communities.

The code also explains the reasons why it was adopted, stating that it defines a minimum set of guidelines for expected and unacceptable behaviour. Further, this code will apply to everyone who interacts with and contributes to online and offline Wikimedia projects and spaces, including new and experienced contributors, functionaries within the projects, event organisers and participants, employees and board members of affiliates, and employees and board members of the Wikimedia Foundation.

Wikipedia's universal code also expands on what will constitute harassment on the platform. On insults, for instance, the code explains that these may refer to perceived characteristics like intelligence, appearance, ethnicity, race, religion (or lack thereof), culture, caste, sexual orientation, gender, sex, disability, age, nationality, political affiliation, or other characteristics.

Even repeated mockery, sarcasm, or aggression can collectively constitute insults, according to the code.

It further adds that trolling, defined as deliberately disrupting conversations or posting in bad faith to intentionally provoke, will count as harassment.

Further, doxxing (the disclosure of personal information), sexual harassment of any kind, threats, whether physical or calls for unfair and unjustified reputational harm, and intimidation by suggesting gratuitous legal action to win an argument or force someone to behave a certain way are all defined as harassment. The code notes that hounding someone over their work in the projects will also be considered harassment.

While Wikimedia has announced the Universal Code, it still needs to work out how local and regional Wikipedia projects will enforce the new standards. This will be part of the next phase of the code's implementation, explained the spokesperson.

Read this article:
Wikipedia has a new Universal Code of Conduct to deal with harassment, misinformation - The Indian Express

Wikipedias New Code Of Conduct Gets One Thing Right; Another Will Be A Struggle – Forbes


A major social network announced a new set of rules for its members Tuesday, and by itself that might not rate as news.

But Wikipedia isn't just any social network, and its new rulebook stands apart from the terms of service handed down by commercial social platforms like Facebook and Twitter.

The Universal Code of Conduct announced Tuesday by the Wikimedia Foundation, the San Francisco nonprofit that hosts Wikipedia and related projects, isn't a top-down product. Instead, Wikipedians collaborated to write it, much as almost 356,000 of them regularly create or edit entries in that online encyclopedia.

"More than 1,500 Wikipedia volunteers from 19 different Wikipedia projects representing five continents and 30 languages participated in the creation of the universal code of conduct," Wikimedia's announcement notes.

That goes well beyond earlier moves by commercial social platforms to borrow the collective wisdom of their crowds. See, for example, Twitter adopting the foundational features of @ mentions and hashtags from its early users, or Facebook letting users vote on new terms of service before scrapping that experiment in 2012 after too few people bothered to cast virtual ballots.

At Wikimedia, the collective drafting of the new code began with input from around the world about the need for revisions to its earlier terms and involved months of collaboration.

"They're an alternative model to the private social experience that exists almost everywhere else," said Alex Howard, director of the Demand Progress Education Fund's Digital Democracy Project.

The results also differ from many other codes of conduct by virtue of being unusually short: under 1,700 words, or less than 1,300 if you subtract the introductory paragraphs.

The operative text starts not on a thou-shalt-not note, but with a you-should list of expected behavior of any user: "Practice empathy," "Assume good faith, and engage in constructive edits," "Respect the way that contributors name and describe themselves," and "Recognize and credit the work done by contributors," among others.

"The organization is saying, here are our values," Howard said. "They're giving people scaffolding to interact with each other."

An "Unacceptable behavior" list follows, including a broadly constructed ban on harassment. This covers the usual categories (for instance, insults targeting personal characteristics, threats, and doxing) but also the broader category of being a jerk.

That's both necessary, because people who punch down a little in public often do so more in private, and tricky, because these lesser fouls aren't as obvious.

"People at times assume that it's unintentional," said Caroline Sinders, founder of Convocation Design + Research and an expert in online harassment research who's worked with the Ford Foundation, Amnesty International and others (including an earlier stint at Wikimedia itself).

Or, she added, the offense will go unrecorded and then forgotten without a ladder of accountability that recognizes how unchecked minor abuses can lead to more toxic behavior.

These provisions also cover behavior outside Wikimedia projects. For example, the doxing clause notes that sharing other contributors' private information, such as their name, place of employment, or physical or email address, without their explicit consent is out of line either on the Wikimedia projects or elsewhere.

There's a complicating factor here in Wikimedia's understandable lack of a real-names policy: enforcing one would endanger marginalized communities, and in particular those living under abusive governments. Wikipedia doesn't even require an email address to create a contributor account.

Wikimedia Foundation communications lead Chantal De Soto noted this issue in an email: enforcing any breaches of conduct that happen on other platforms is often very difficult, since verifying connections between Wikimedia accounts and, for example, a Twitter account is often not straightforward.

But it's important that Wikimedia communities make that effort, considering all the evidence now available of how online radicalization can erupt in the physical world.

"All we have to do is look at January 6 to get a sense of what happens when that goes too far," Howard said of the riots that took place at the U.S. Capitol.

The next chapter in Wikimedia's effort will involve more collaboration on enforcement policies and mechanisms. This may be the most difficult part, since it will involve setting up structures that can work at scale and across cultures.

"A community needs to think about how they're going to document these cases, who has access to them, how are they keeping track of things, how are they going to respond to harassment," said Sinders.

Done right, this may require hiring more dedicated trust-and-safety professionals.

"In open-source communities, a lot of this arduous labor is falling to volunteers," Sinders warned. "And that leads to community burnout."

Read the original post:
Wikipedias New Code Of Conduct Gets One Thing Right; Another Will Be A Struggle - Forbes

When the Capitol Was Attacked, Wikipedia Went to Work – Washington Monthly

On January 6, Jason Moore was working from his home in Portland, Oregon, and flipping between CNN and MSNBC as Donald Trump supporters gathered outside the U.S. Capitol. "Watching what was unfolding in D.C. on cable news, I found it initially fascinating, and then, later, terrifying," he told me.

Moore, a digital strategist, is one of the top 55 contributors to the English-language version of Wikipedia. The free online encyclopedia has more than six million articles in English and is maintained by more than 100,000 regular volunteer editors like Moore. Around 1:30 p.m. Eastern time, Moore started a new Wikipedia page to document what was then just a protest. He titled it "January 2021 Donald Trump rally."

"I have a personal interest just in documenting political movements," said Moore, who goes by the username Another Believer. He logs onto his Wikipedia watchlist, a feed of the changes that have been made to the pages he wants to track, several times a day, like someone else might log on to Twitter or Facebook. "I'm a bit of a political junkie."

As the Capitol protest escalated into a violent assault, Moore was tabbing between Google News, the Wikipedia article he had created, and the article's talk page, where volunteer editors could discuss changes with one another. Hundreds more volunteer editors were chiming in. As chronicled by Alex Pasternack in Fast Company, Wikipedians debated the reliability of different sources and the accuracy of terms, and documented the democratic cataclysm in real time. It became, said Moore, "this hurricane of people sifting through a lot of information at once."

Moore estimates he spent about ten hours editing the page now titled "2021 storming of the United States Capitol" and closely related pages. The entry runs nearly 13,000 words and has hundreds of external source citations. It has sections on intelligence, or the lack thereof, leading up to the attack; on police preparations; on the participation of state lawmakers; on the House and Senate evacuations; on the completion of the electoral vote count; and more. More than 1,000 volunteer editors worked together on the entry, which is still being updated regularly.

The page is the result of a remarkably collaborative online community of volunteers who edit, verify, and generally obsess over the vast, always-in-motion encyclopedia. Wikipedia is not without faults; it doesn't take much poking around to find a page with a major error. (Last year, a Reddit user unearthed that an American teenager who did not speak Scots, a Scottish dialect, had written almost half of the articles on Scots Wikipedia. The pages were riddled with grammar mistakes.) Wikipedia is also not representative of the public; the vast majority of its volunteer editors are male, and fewer than 20 percent of Wikipedia's biographies are about women.

But Wikipedia, one of the most visited websites in the U.S., has avoided many pitfalls that have hobbled other online platforms. Twitter, Facebook, and YouTube are facing a backlash for their role in propagating misinformation. After Trump's repeated false claims about election fraud propelled his followers to break into the Capitol, all three companies suspended his accounts. It might have been the right call in the moment, but it also raised uncomfortable questions about the outsize power over discourse wielded by a tiny number of executives at private companies. Wikipedia's bottom-up model, shaped by thousands of volunteer editors, proves that there's another way to build online communities.

Other special volunteer roles help keep the site running. An arbitration committee, made up of vetted, experienced editors, settles the most contentious disputes; checkusers, an elite group of Wikipedia editors, are granted access to technical data to figure out whether several Wikipedia accounts are being operated by one person. These privileged editors help deal with difficult situations, but much of the day-to-day work of editing Wikipedia is handled by regular volunteers making changes, discussing issues, following the suggested dispute resolution process, and, ideally, landing on a consensus. The site even has principles for how editors can best collaborate, dubbed "Wikiquette."

As protestors at the Capitol turned violent, one major debate among Wikipedia editors was how to describe the event in the page's title. Was it a protest? A riot? An insurrection? A coup attempt? "There is a clear consensus that 'protest' is inadequate to describe these events," wrote a Wiki editor with the username Matthias Winkelmann. "Riot is a more appropriate label for the events that took place," responded a user called Bravetheif. "I oppose protests and oppose storming, but support 2021 United States Capitol Siege or 2021 United States Capitol Breach," wrote another editor calling themselves RobLa. On the morning of January 7, an editor with the username CaptainEek set the page title to "2021 storming of the United States Capitol."

But the debate roared on, with editors making a case for their preferred term. Volunteers catalogued which terms different reputable publications had used. Their list of generally reliable sources that had used "coup" included the Atlantic, Buzzfeed News, and the Los Angeles Times. The list for "insurrection" included the Associated Press, Axios, and NPR.

This appeal to reputable sources springs from the ethos of Wikipedia content. According to English Wikipedia's Verifiability policy, an editor can be sure something is true, but if it's not verifiable with a reputable source, it shouldn't be added to a page. The site has a chart of publications categorized by the current consensus view of their reliability. The consensus can and does change. In 2018, for example, Breitbart was deprecated by a consensus of editors, meaning it could no longer be cited as a reference for factual matters. A year prior, editors had made a similar decision about the Daily Mail, a British tabloid.

The imperative to provide reliable sources is one way Wikipedia editors keep misinformation off contentious pages. When one user proposed an edit suggesting that the Capitol rioters were not really Trump supporters, but rather antifa, an editor with the username Anachronist responded, interrogating the sources provided for the proposed edit:

"Let's examine those sources. A student newspaper (byu.edu) isn't a reliable source. The Washington Times contradicts your proposal . . . explicitly saying that no Antifa supporters were identified. I could stop right there, but let's go on: Fox News is not considered a reliable source for political reporting, and the Geller Report is basically a blog, self-published, and therefore not usable."

The proposed edit never made it through, since administrators had placed the page under protection, meaning less experienced editors could not make changes directly to the page. That's a common step for entries on contentious topics. By the evening of January 6, the Storming page was placed under extended-confirmed protection, meaning that for the next two days, only editors who had made over 500 edits and had held their account for 30 days or more could make changes. (After two days, the page was set to a slightly lower level of protection.) "This helped enormously with the level of disruption," said Molly White, a long-time Wiki editor and administrator, in an email.
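
A page's protection settings are public, so readers can check them for themselves. Below is a minimal sketch of how one might query them through the standard MediaWiki Action API; the level names shown in the comments, such as "extendedconfirmed", are part of English Wikipedia's configuration, and the response layout is assumed to follow the default JSON format.

import requests

# Ask the English Wikipedia Action API for a page's current protection settings.
API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "prop": "info",
    "inprop": "protection",
    "titles": "2021 storming of the United States Capitol",
    "format": "json",
}

pages = requests.get(API, params=params).json()["query"]["pages"]
page = next(iter(pages.values()))  # a single title was requested, so take the only entry
for entry in page.get("protection", []):
    # e.g. type="edit", level="extendedconfirmed" or "autoconfirmed", plus an expiry
    print(entry["type"], entry["level"], entry.get("expiry"))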

White, a software developer in Cambridge, Massachusetts, who goes by the username GorillaWarfare, made multiple edits to the Capitol Storming page. "I was horrified and anxious to watch this all unfold," she explained, but editing on Wikipedia felt better than doomscrolling. "This is something I do often: if I'm trying to understand what's happening or learn more about something, I will go edit the Wikipedia article about it as I do." White primarily edits pages related to right-wing online extremism. She wrote much of the Wikipedia pages for Parler and Gab, alternative social media apps popular among Trump supporters and right-wing provocateurs, and contributed significantly to the entry on the Boogaloo movement.

Wikipedia can count on having humans in the loop on content decisions, rather than relying on artificial intelligence, because it's much smaller than YouTube or Facebook in terms of active monthly users, said Brian Keegan, an assistant professor of information science at the University of Colorado Boulder. That's helpful because content decisions often require understanding context, which algorithms don't always get right. Humans can also offer more nuanced feedback on why an edit is being reversed, or why a page is being taken down.

Of course, Wikipedia doesn't always get it right either. Less trafficked pages receive attention from fewer editors, which can easily result in significant factual errors. But pages that attract more attention from editors are often of high quality, thanks to a fairly functional system of collaboration and cross-checking. In fact, other social media companies have come to rely on Wikipedia as a source of reliable information. In 2018, YouTube announced it would link to Wikipedia pages alongside its videos about conspiracy theories in an effort to provide users with accurate information. In 2020, Facebook began testing Wikipedia-powered information boxes in its search results.

What Wikipedia illustrates is that the problem with Facebook, Twitter, YouTube, and other social media platforms isn't that they are social or that they're populated by user-generated content. It's their business models. All three are for-profit companies that make their money through micro-targeted advertising, which means they have strong incentives to show users content that will keep them on their platform for as long as possible and keep them coming back. Content that confirms users' beliefs or stokes their preexisting resentments can be good for business. That only overlaps with the truth some of the time.

As a nonprofit, Wikipedia operates within a fundamentally different set of incentives. It doesn't rely on advertising revenue, and it doesn't need to drive up user engagement. The Wikipedia community has instead been able to develop norms and policies that prioritize the integrity of the content. "A platform like Wikipedia has no compunction about shutting down access to editing their articles, or stopping people from creating accounts, all these things that would really hurt topline numbers at shareholder-driven organizations," said Keegan.

The irony of the Capitol Storming page is that so many volunteers worked so hard to accurately document an event fueled by lies. For every claim that the election had been stolen or Mike Pence had the power to stop the count, there was a volunteer clicking through news reports, trying to get it right. Nearly a month later, the page still isn't complete. When I asked Molly White how she would know when to stop working on it, she wrote that Wikipedia is never finished, and pointed me to a corresponding Wiki entry titled "Wikipedia is a work in progress."

Update: A reference to Fast Company's article on the same Wikipedia page was added on Feb 8.

See the rest here:
When the Capitol Was Attacked, Wikipedia Went to Work - Washington Monthly

Wikipedia Gatekeeping The About This Result Google Feature? – Search Engine Roundtable

I honestly think this new "about this result" feature is not a big deal (I could be wrong), but with anything new in Google Search, SEOs tend to obsess over it. So now SEOs are concerned about the information within that feature. Specifically, why is most (if not all) of it coming from Wikipedia, and when Wikipedia does not have information, why does it just show when Google first indexed the site?

First, let's show the two basic kinds of information you see today for a normal snippet; not the local or other types of snippets, like the answers you get in search.

Here is this site, which does not have a Wikipedia entry:

Here is the WSJ, which does have a Wikipedia entry:

So then you have this debate: if it is either Wikipedia or Google's first crawl date, why not show data that the site owner can give Google? Why do we have to be subjugated to either Wikipedia or Google for the information about the site we own? Right, it is not fair!

Danny Sullivan from Google said no. "It's not gatekeeping," he said. "We show additional information about the source of a result as an *option* people can choose to view *if they want*. Wikipedia is one source; what we know of a domain is another. As a beta launch, we'll be looking to further improve going forward," Danny added.

But the conversation gets entertaining; is it or is it not gatekeeping?

Do SEOs need to jump into new things and react when we shouldn't?

It is new and in BETA, so relax.

The thing is, the old Danny Sullivan would have pointed out these concerns on Search Engine Land. It is this type of feedback that Google can potentially listen to and use to adjust the feature going forward.

Personally, I don't think this feature will last. I don't see searchers, normal searchers, using it. That is why this does not bother me too much. But I could be wrong. I do think Google could give site owners a way, as it does with knowledge panels in general, to claim these panels and potentially suggest edits. Of course, I understand why Google does not want to let SEOs control what that says. I can see SEOs trying to inject fun marketing messages and who knows what.

Forum discussion at Twitter.

View original post here:
Wikipedia Gatekeeping The About This Result Google Feature? - Search Engine Roundtable