Archive for the ‘Wikipedia’ Category

This ‘High School Musical’ Star Has 1 of The Most-Translated Wikipedia Pages, More Than Shakespeare and Donald Trump – Showbiz Cheat Sheet

Wikipedia is the go-to place for quick facts on someone or something. It's highly editable, so there's always a sense of caution you need to have while reading, but for the most part, the general stuff is typically true. And while High School Musical was big, it's a bit odd that one of its stars has the fifth-most-translated Wikipedia page right now.

In 2013, a project named Pantheon collected data on Wikipedia pages around the world. The project came out of the MIT Media Lab and, according to BuzzFeed, was "working collaboratively to quantify, analyze, measure and visualize global culture."

What does this mean? If you go to the site today, seven years later, users can still filter through pages. One can pick a country, a birthdate range, and an occupation for the people they're searching for. There are then columns of information, one of which is how many translated Wikipedia pages exist for each person.
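
That translation count maps onto something anyone can check against Wikipedia itself: each article lists "langlinks," the versions of the same page in other language editions. Here is a minimal sketch of counting them through the public MediaWiki API; the endpoint and parameters are real, but this is only an illustration of the underlying metric, not Pantheon's actual methodology.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def count_language_editions(title):
    """Count how many other language editions an English Wikipedia article has."""
    params = {
        "action": "query",
        "titles": title,
        "prop": "langlinks",   # interlanguage links = versions in other languages
        "lllimit": "max",
        "format": "json",
    }
    pages = requests.get(API, params=params).json()["query"]["pages"]
    page = next(iter(pages.values()))
    return len(page.get("langlinks", []))

print(count_language_editions("Corbin Bleu"))
```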

The most-translated page goes to President Ronald Reagan with 250 language pages, with Jesus Christ right behind at 246. Then come Michael Jackson and President Barack Obama with 233 and 230 pages, respectively. And in fifth place? Mr. Corbin Bleu, the Disney Channel star known for his role as Chad in 2006's High School Musical.

It's truly a wild find, and in 2013 he was at number three. Even though he's gone down a few pegs, he still has a massive number of language pages on Wikipedia: 216. To put this into perspective, the next living person on the list, below Bleu, is President Donald Trump with 205.

And it's not even Zac Efron, Vanessa Hudgens, or Ashley Tisdale. Efron only has 86 translated pages, Hudgens has 69, and Tisdale has 61. That's not to say Bleu didn't leave an impact on the world as Chad, but still. He has more pages than Shakespeare and Leonardo da Vinci.

BuzzFeed actually got ahold of Bleu in 2013 and told him about the find; he was shocked.

"What?" Bleu said. "Holy sh*t! Really? I wonder why that is! Are that many people looking me up? What the hell! That's amazing. That's ridiculous, actually. That is unnecessary, but I will definitely put that on my resume."

As for why, in 2013 no one really knew; everyone was just as baffled by it as Bleu was. But in 2019, it seems Reddit came up with an equally complex answer.

Insider reported that Reddit was on the case. Someone posted this fact in the r/UnresolvedMysteries subreddit and an answer was found within hours.

According to Reddit user u/Lithide (whose account has since been deleted), Wikipedia user Zimmer610, AKA Chace Watson from (presumably) Saudi Arabia, made them all. They're apparently a polyglot Corbin superfan.

"I actually think there's a dedicated fan of Corbin Bleu from Saudi Arabia who wanted to make sure there were Wikipedia articles for their idol in every language possible and also spent a few dozen hours working on the Arabic-language article," Lithide wrote.

One of the original Reddit post's updates also noted that they might have had a run-in with Wikipedia authorities in doing their translations. Allegedly, they were banned from the English Wikipedia and Wikimedia Commons. Apparently, the Arabic page for Bleu is a featured Arabic Wikipedia page because it's so well done.

It's a whole, complicated find on Reddit, with many layers and a lot of detective work. There also doesn't seem to be a known motive yet. But if one is looking for a good distraction in the year 2020, this mystery (solved or not) is a great thing to dive into.

Read the rest here:
This 'High School Musical' Star Has 1 of The Most-Translated Wikipedia Pages, More Than Shakespeare and Donald Trump - Showbiz Cheat Sheet

This could lead to the next big breakthrough in common sense AI – MIT Technology Review

AI models that can parse both language and visual input also have very practical uses. If we want to build robotic assistants, for example, they need computer vision to navigate the world and language to communicate about it to humans.

But combining both types of AI is easier said than done. It isn't as simple as stapling together an existing language model with an existing object recognition system. It requires training a new model from scratch with a data set that includes text and images, otherwise known as a visual-language data set.

The most common approach for curating such a data set is to compile a collection of images with descriptive captions. A picture of a cat in a suitcase, for example, might be captioned "An orange cat sits in the suitcase ready to be packed." This differs from typical image data sets, which would label the same picture with only one noun, like "cat." A visual-language data set can therefore teach an AI model not just how to recognize objects but how they relate to and act on one another, using verbs and prepositions.
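
To make the distinction concrete, here is a minimal sketch of the two kinds of records; the field names are invented for illustration and are not MS COCO's actual schema.

```python
# A typical image-classification record: one noun per picture
label_record = {"image": "cat_001.jpg", "label": "cat"}

# A visual-language record: the same picture paired with a full sentence,
# so a model can learn relations ("sits in") as well as objects
caption_record = {
    "image": "cat_001.jpg",
    "caption": "An orange cat sits in the suitcase ready to be packed.",
}
```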

But you can see why this data curation process would take forever. This is why the visual-language data sets that exist are so puny. A popular text-only data set like English Wikipedia (which indeed includes nearly all the English-language Wikipedia entries) might contain nearly 3 billion words. A visual-language data set like Microsoft Common Objects in Context, or MS COCO, contains only 7 million. It's simply not enough data to train an AI model for anything useful.

"Vokenization" gets around this problem, using unsupervised learning methods to scale the tiny amount of data in MS COCO to the size of English Wikipedia. The resultant visual-language model outperforms state-of-the-art models in some of the hardest tests used to evaluate AI language comprehension today.

"You don't beat state of the art on these tests by just trying a little bit," says Thomas Wolf, the cofounder and chief science officer of the natural-language processing startup Hugging Face, who was not part of the research. "This is not a toy test. This is why this is super exciting."

Let's first sort out some terminology. What on earth is a "voken"?

In AI speak, the words that are used to train language models are known as tokens. So the UNC researchers decided to call the image associated with each token in their visual-language model a "voken." "Vokenizer" is what they call the algorithm that finds vokens for each token, and "vokenization" is what they call the whole process.

The point of this isn't just to show how much AI researchers love making up words. (They really do.) It also helps break down the basic idea behind vokenization. Instead of starting with an image data set and manually writing sentences to serve as captions (a very slow process), the UNC researchers started with a language data set and used unsupervised learning to match each word with a relevant image (more on this later). This is a highly scalable process.

The unsupervised learning technique here is ultimately the paper's main contribution. How do you actually find a relevant image for each word?

Let's go back for a moment to GPT-3. GPT-3 is part of a family of language models known as transformers, which represented a major breakthrough in applying unsupervised learning to natural-language processing when the first one was introduced in 2017. Transformers learn the patterns of human language by observing how words are used in context and then creating a mathematical representation of each word, known as a "word embedding," based on that context. The embedding for the word "cat" might show, for example, that it is frequently used around the words "meow" and "orange" but less often around the words "bark" or "blue."

This is how transformers approximate the meanings of words, and how GPT-3 can write such human-like sentences. It relies in part on these embeddings to tell it how to assemble words into sentences, and sentences into paragraphs.
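
To make the idea of contextual embeddings concrete, here is a minimal sketch using the open-source transformers library, with BERT standing in for the much larger GPT-3; the UNC paper's actual model and pooling details differ.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def contextual_embedding(sentence, word):
    """Return the contextual vector for the first occurrence of `word`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

# The same word gets a different vector in each context
a = contextual_embedding("the cat meowed at the orange", "cat")
b = contextual_embedding("the cat sat in the suitcase", "cat")
print(torch.cosine_similarity(a, b, dim=0))  # similar, but not identical
```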

There's a parallel technique that can also be used for images. Instead of scanning text for word usage patterns, it scans images for visual patterns. It tabulates how often a cat, say, appears on a bed versus on a tree, and creates a "cat" embedding with this contextual information.
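
The article doesn't name a specific vision model. As a sketch under that assumption, one common way to get a visual embedding is to take a pretrained image classifier and strip off its final classification layer, keeping the feature vector underneath; this is a generic stand-in, not the paper's method.

```python
import torch
from torchvision import models, transforms
from PIL import Image

# Pretrained CNN with its classifier head removed: the remaining
# 2048-dimensional output serves as a visual embedding of the image
backbone = models.resnet50(pretrained=True)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def visual_embedding(path):
    image = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return backbone(image)[0]  # shape: (2048,)
```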

The insight of the UNC researchers was that they should use both embedding techniques on MS COCO. They converted the images into visual embeddings and the captions into word embeddings. What's really neat about these embeddings is that they can then be graphed in a three-dimensional space, and you can literally see how they are related to one another. Visual embeddings that are closely related to word embeddings will appear closer in the graph. In other words, the visual "cat" embedding should (in theory) overlap with the text-based "cat" embedding. Pretty cool.

You can see where this is going. Once the embeddings are all graphed and compared and related to one another, it's easy to start matching images (vokens) with words (tokens). And remember, because the images and words are matched based on their embeddings, they're also matched based on context. This is useful when one word can have totally different meanings. The technique successfully handles that by finding different vokens for each instance of the word.
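
Once token and voken embeddings live in a shared space (the part the paper actually learns; this sketch assumes that step is already done), the matching itself is just nearest-neighbor retrieval. And because each row of `token_embs` is a contextual vector, two occurrences of the same word in different sentences can be assigned different vokens.

```python
import numpy as np

def assign_vokens(token_embs, voken_embs):
    """For each token embedding, pick the nearest voken by cosine similarity.

    token_embs: (num_tokens, d) array, one row per token occurrence in context
    voken_embs: (num_vokens, d) array, one row per candidate image
    """
    t = token_embs / np.linalg.norm(token_embs, axis=1, keepdims=True)
    v = voken_embs / np.linalg.norm(voken_embs, axis=1, keepdims=True)
    similarity = t @ v.T               # (num_tokens, num_vokens)
    return similarity.argmax(axis=1)   # index of the best voken per token
```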

Go here to read the rest:
This could lead to the next big breakthrough in common sense AI - MIT Technology Review

Wikipedia is better prepared for Election Day than Facebook or Twitter – Vox.com

If you're looking for up-to-the-minute results on election night, Wikipedia might be one of the first sites to pop up in your Google search. But, in this case, the crowd-sourced encyclopedia of human knowledge likely won't have the immediate answers you seek. And that's by design.

In yet another election cycle defined by copious amounts of misinformation from a variety of sources, Wikipedia wants, and is set up, to be a carefully curated resource of impartial facts. There's no rush to be the first to declare a winner (quite the opposite, in fact). It's also difficult for trolls to vandalize associated pages, let alone keep those edits up for a prolonged period of time or to allow them to spread.

For the 2020 United States presidential election page, as well as the pages for presidential candidates Donald Trump and Joe Biden and vice presidential candidate Kamala Harris, only editors whose accounts are at least 30 days old and who have made at least 500 edits can change the article. This is what Wikipedians, the editors who run the site, call "extended confirmed protection."

The election page lock was put in place on October 21 by Molly White, who goes by the handle GorillaWarfare on the site. She's been a Wikipedia editor for almost 15 years and also serves as an administrator. This gives her some additional abilities, like the power to lock pages. But White is not anticipating any major issues on Wikipedia with regard to the upcoming election.

"For the most part, things will be business as usual on Wikipedia," White told Recode. "Wikipedia editors and administrators have plenty of tools at our disposal to ensure that our readers are only seeing accurate information, even as things are changing quickly behind the scenes."

This probably won't be the case elsewhere online. Like Wikipedia, social media companies run on user-generated content, and they're once again scrambling to come up with ways to stop the spread of misinformation and disinformation on their platforms. After being blamed for influencing the outcome of the 2016 election, Facebook is particularly concerned with how it will handle Election Day this year.

But Wikipedia, which will be 20 years old on January 15, has been around longer than Facebook, Twitter, and YouTube. This will be the sixth presidential election in Wikipedia's lifetime, and the site's all-volunteer army of thousands of editors has used those years of experience to develop and refine methods of combating lies and inaccuracies during prominent breaking news events while also identifying and deleting anything incorrect or poorly sourced that happens to make it onto their pages.

Wikipedia editors are currently discussing how to handle Election Day and its results in public forums on the site. They're debating how many sources to use for election-related updates, which ones to rely on when a presumptive winner is declared, and how long after polls close to start adding the results to the page.

"Wikipedia is intended to be an encyclopedia, not a news organization, and so we are much more concerned with being accurate than we are with being quick," White said.

Indeed, Wikipedia's stated mission is to be a repository for all human knowledge. The site has 55 million articles across its 300 versions; the most popular version, English, has 6.2 million articles. Wikipedia is also one of the most-read websites in the world, with 1.5 billion unique visitors per month.

So while huge social media platforms tend to expose their users to content that generally fits their existing worldview and political sensibilities, Wikipedia has quietly emerged as a website for people who are actively seeking accurate information. What's behind the effort is a community that strives to provide that information as neutrally and as accurately sourced as possible.

Wikipedia is ruled by consensus, its articles are fluid, and discussions over how and why they should be changed are ongoing. Wikipedia putting up information about the presidential election is no different.

Most pages associated with the election and candidates have some kind of edit protection on them, though the level of protection might vary. For example, while Harris currently has extended confirmed protection, her opponent, Mike Pence, has a page that is only semi-protected. That means edits can only be made by registered users whose accounts are at least four days old and have made at least 10 edits, though, again, this might change as Election Day nears.
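
These protection levels are machine-readable, so anyone can verify them. Here is a minimal sketch against the real MediaWiki API; the protections returned will, of course, reflect whenever you run it, not October 2020.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def protection_levels(title):
    """Return the active protections on an English Wikipedia article."""
    params = {
        "action": "query",
        "titles": title,
        "prop": "info",
        "inprop": "protection",
        "format": "json",
    }
    pages = requests.get(API, params=params).json()["query"]["pages"]
    page = next(iter(pages.values()))
    return page.get("protection", [])

# e.g. [{"type": "edit", "level": "extendedconfirmed", "expiry": ...}, ...]
print(protection_levels("Kamala Harris"))
```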

Similarly, many United States politics-associated pages are also subject to additional rules, such as limits on reversing a previous edit or requirements that challenged edits reach consensus before being reapplied. To reach consensus, editors will typically argue their respective viewpoints on an article's accompanying talk page, citing various Wikipedia rules and procedures to back up their case until a majority of editors agree on what to do next. Administrators can block or ban editors who don't follow those rules.

When it comes to the election results, editors are still hashing out whether the Associated Press's projections are a good enough single source or if at least three news sources should be used. They're also considering just locking certain pages from edits for everyone except administrators for a set period of time.

With standards, rules, and a community of editors to uphold them, moving slowly has been a Wikipedia "superpower," Noam Cohen recently wrote in Wired. That, Cohen added, makes the site a less attractive target to those bent on campaigns of misinformation with immediate payoffs. Vandalism is hard to add, usually doesn't stay up for long, and therefore doesn't spread widely.

While Facebook and Google have spent billions of dollars on content moderators and other measures to combat misinformation and abuse on their platforms, Wikipedia's editors do this work for free. Wikipedia is hosted by the nonprofit Wikimedia Foundation, which covers its associated costs, including servers, software, and legal fees. The Foundation relies on donations and gifts and gets a lot of them: the organization received $113 million last year alone.

"The Foundation's role is to support those folks in every way that they need us to," Ryan Merkley, the Wikimedia Foundation's chief of staff, told Recode. "That means everything from keeping the servers up and running, to running our security operation, to communications, fundraising. But also working with trust and safety, and then supporting [editors] with the tools that they need in order to edit."

Some of those tools include bots that can quickly detect article vandalism and either get rid of it or flag it to an editor. Editors can also add articles to their watchlists to be immediately alerted of any changes (nearly 550 editors have put the 2020 US presidential election page on their watchlists). And they can lock pages that might become, or already have become, targets for vandalism.
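
The article doesn't describe how those bots work internally. As a rough sketch of the kind of monitoring involved, here is how a tool might poll Wikipedia's public recent-changes feed; the API call is real, but the size-drop heuristic is a deliberately crude stand-in for the machine-learning scoring that real anti-vandalism bots use.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def recent_changes(limit=50):
    """Fetch the latest edits from English Wikipedia's public API."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcprop": "title|user|comment|sizes",
        "rctype": "edit",
        "rclimit": limit,
        "format": "json",
    }
    return requests.get(API, params=params).json()["query"]["recentchanges"]

for change in recent_changes():
    # Crude heuristic: flag edits that remove a large chunk of an article
    if change["oldlen"] - change["newlen"] > 5000:
        print(f"Possible blanking on {change['title']} by {change['user']}")
```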

The Foundation has also done some of its own work to prepare for the election.

"We put together an internal task force, with staff representatives from every part of the foundation who relate to disinformation," Merkley said. "So that includes the security team, trust and safety, legal policy, communications, our partnerships group that works with the other platforms that engage with Wikimedia content."

The guiding principle behind Wikipedia is that anyone can contribute anything to it. This being the internet, not everyone operates in good faith or knows what they're talking about, so the site has a longstanding reputation for inaccuracy. That's no longer wholly deserved, but Wikipedia itself will tell you that it's not a reliable source for this very reason.

The site has also been criticized for systemic bias, with a lack of representation from certain demographics (contributors skew heavily toward white, English-speaking men) that can create a hostile environment for minority editors. The lack of diversity also has the potential for bias to make it into the articles themselves. The Wikimedia Foundation and Wikipedians have made efforts to improve this, but they still have work to do.

Other things get overlooked on a site as big as Wikipedia, too. For instance, you might stumble across vandalized articles, usually lurking in Wikipedia's lower-trafficked corners, that have managed to escape the notice of editors. You may even find a version of Wikipedia that contains thousands of articles written by someone who doesn't really know the language they're supposed to be written in.

While anyone can become a Wikipedia editor, only a tiny fraction of Wikipedia's readers actually will. And it's deceptively difficult. The initial process of making an edit is as simple as signing in and changing some text, but Wikipedia's editorial rules and processes, and the various code words and language around them, can be a barrier to doing it correctly, which is necessary for the edit to be accepted.

But the people who "get it," like White, may spend a considerable amount of their time doing unpaid work on the site. They might also become the target of harassment as a result. White, who spends two or three hours a day working on Wikipedia, said she's been doxxed, threatened with violence and lawsuits, and people have even tried to get her fired from her day job because of it.

"It is at best frustrating and at worst extremely frightening, but I both care deeply about the importance of Wikipedia and I am also a very stubborn person who does not like to feel like I am giving in to threats," White said, attributing some of that harassment to her position as an administrator, her gender, and the controversial articles and topics she often works on (she created the Boogaloo movement page, for example).

And Wikipedia is important. It's one of the top results for most internet searches, and so, for better or worse, Wikipedia is the site people are most likely to visit when they want more information about something. That means the stakes are high when big topics are involved.

Notably, its coverage of Covid-19 has drawn praise. This involved the creation of a WikiProject dedicated to the virus with over 200 participating editors (anyone can join!) who may focus on pandemic case data, the virus's impact on specific locations, or the industries affected. One professor who studies misinformation told the Washington Post that Wikipedia was "a ray of hope in a sea of pollution" and handled the virus exceptionally well.

"There's a lot of really great work done through these WikiProjects, especially during times of crisis where a lot of hard-hitting, late-breaking stuff is coming out," Zachary J. McDowell, an assistant professor in the Department of Communication at the University of Illinois at Chicago, told Recode.

So if Wikipedia, with its high visibility and wide-open door for anyone's contributions, can still provide readers with well-sourced, neutral articles, why can't the social media platforms that play such a big role in the spread of misinformation do the same? Clearly, some of them see the merits of Wikipedia's work; Facebook and Google use Wikipedia articles to provide additional knowledge in user searches.

Social media is designed to keep users on their platforms for as long as possible, both to show them as many ads as possible and to collect their data, which is then used to show them even more ads. They are incentivized to keep your attention, not to ensure that what you're reading or seeing is accurate. That business model is unlikely to change anytime soon. Meanwhile, Wikipedia's model is quite different.

"[Wikipedia has] no algorithms designed to serve content in certain ways to some people," Merkley said. "None of that structure exists which can be later gamed, in order to advance this post about a person or to target this message to that person."

Wikipedia is also very transparent, Merkley said. An article's associated history and talk pages will tell you, in great and granular detail, all the edits that have been made, who made them, and any associated discussions between editors about them.

This transparency helps create trust, but good luck getting, say, Facebook to implement it. Facebook is notoriously secretive about its algorithms, which determine what you see on the site, from ads to posts from your friends to recommendations for groups you should join or people you should befriend. These algorithms create filter bubbles of information that tends to line up with your political viewpoints, offering little exposure to anything that might conflict with them. You get what Facebook thinks you want to hear or watch what YouTube thinks you want to watch, and that's not always what's true.

"It is essentially a game where the entire system is already rigged for disinformation, fake news," McDowell said. "It's monetarily incentivized to get people riled up and to click. It will always be a game where those who are trying to control the information flow will be the ones who are one step behind."

McDowell's studies include Wikipedia's value as a teaching tool for information literacy. He stresses that Wikipedia itself shouldn't be seen as a source but rather as a collection of information, clearly cited, that users can follow if they want to learn more or verify what they've read.

"Having a critical eye toward information is absolutely imperative right now," McDowell said. "And a lot of people don't."

For their part, social media platforms have, in recent years, tried to hold back the flow of misinformation in some cases, including during the election. Facebook has made rules around political ads, voter suppression, and even premature declarations of victory. But social media still receives plenty of criticism from both sides of the aisle, and it will almost certainly be blamed for influencing the outcome of the election in some way, regardless of the winner.

Wikipedia, on the other hand, will just tell you who reliable sources say the winner is as soon as its editors reach a consensus on what those sources are.

Go here to see the original:
Wikipedia is better prepared for Election Day than Facebook or Twitter - Vox.com

The Senate Race That Could be Pivotal for America, and Wikipedia – WIRED

A political newcomer, Greenfield has never held public office, and her life lacks the typical arc of a political climber. In 1988, her husband died in a freak accident; the Social Security benefits she received allowed her family to survive, a story that has become the centerpiece of her campaign. After earning a college degree, Greenfield became the president of a small Des Moines real estate firm.

This has made Greenfield an unusual candidate for national office: Her tragedies have been private, while her ambitions, if not modest, were focused: trying to raise two children as a single parent with a business. Greenfield's lack of notability, which she shares with the vast majority of people she is running to represent, is in many ways a primary theme of her campaign.

In short, Wikipedia's notability litmus test doesn't just advantage political incumbents; it advantages the kind of people (insiders, celebrities, men) who already enjoy notable status in a social and economic hierarchy that others in politics may wish to democratize.

Greenfield's dilemma is one that can often face female candidates: what might be called a notability trap. Political challengers who are deemed non-notable tend to be women, and they are often faced with only one path to getting a page on Wikipedia: winning their race. In 2018, for example, Alexandria Ocasio-Cortez saw her Wikipedia entry appear on June 27, the day after she won an upset primary victory.

In the blue wave later that year, 88 newcomers would win election to Congress. Of the 52 challengers considered notable enough to have Wikipedia entries before their elections, almost 70 percent were men and 30 percent women. And among the 10 challengers already considered notable for their private-life achievements, eight were men: a liquor store magnate; the brother of Vice President Pence; a former NFL wide receiver; and a California man who won the lottery. Meanwhile, among the women not considered notable were a Navy commander, an Air Force captain and sports company executive, a key architect of the auto-industry bailout, a law professor, and an Iowa state official. All received their Wikipedia articles shortly after they won election.

The notability trap has become a topic of controversy outside of politics, too. In 2018, Canadian physicist Donna Strickland was repeatedly denied a Wikipedia page for lack of notability. That changed one day in October, around 9:56 am, the morning she won the Nobel Prize. Strickland shared the prize with a male colleague, Gérard Mourou, who has had a Wikipedia page since 2005. Earlier that year, when users attempted to create a page for Strickland, a moderator denied the request, replying that the article's references "do not show that the subject qualifies" for Wikipedia.

For activists, the Greenfield example reflects a familiar pattern. "Absences on Wikipedia echo throughout the Internet, and that is universal for any field: art, politics, and so on," says Kira Wisniewski, the executive director of the organization Art+Feminism, a group founded in 2014 to correct what it saw as gender imbalances in the arts on Wikipedia. Wisniewski pointed to a 2011 survey that suggested more than 90 percent of Wikipedia editors were male, one reason she suspects women might be less likely to have their past achievements deemed notable.

Lih, the Wikipedia expert, is more reluctant to attribute Greenfield's rejection to gender (some male Senate candidates, like Al Gross in Alaska, similarly did not have a Wikipedia page for much of this year) but nevertheless calls Wikipedia's political rules a serious problem. "It's pretty obvious an article was merited," he says of the Greenfield case, later adding: "We're not doing the right thing."

Yet that wasn't so obvious on Wikipedia. As the Iowa race became a virtual toss-up, Greenfield's proponents became increasingly heated. They pointed to the growing national interest in the campaign. "This draft now clearly exceeds [the] notability threshold," wrote one user.

But the other side insisted that Greenfield's life was just not notable, and never would be, unless she won. "Drop the stick, and move away from the [horse] carcass," wrote Muboshgu. "She'll get an article if she wins." Another user evaluated Greenfield's biography and wrote, "I don't think that gives her a meaningful career outside of her current Senate run," adding that if Greenfield lost, she "will very likely be seen as insignificant."

Here is the original post:
The Senate Race That Could be Pivotal for America, and Wikipedia - WIRED

A vicious culture war is tearing through Wikipedia – Wired.co.uk

In July 2019 an anonymous Wikipedia editor added a line to the article about "Jai Shri Ram," a Hindi expression that translates as "Glory to Lord Rama." The editor made what would prove to be an extremely controversial addition, noting the phrase was also used as a war cry.

The edit was the first in a struggle that raged for more than a year, with one side claiming it constituted a form of Hinduphobia and the other side saying it was an accurate portrayal of the religious term, which had been embraced by India's ruling party, the BJP, and, according to some in the media, had become a dog whistle for nationalists.

The edit war spilled over to other articles on Wikipedia, including one about the 2020 Delhi riots. There, editors also noted the claim that the war cry was part of a rising trend of violent Hindu mobs in India beating up Muslims and forcing them to chant "Jai Shri Ram." The edit to the original page also claimed this trend became more prominent after the Hindu nationalist Narendra Modi was re-elected as Prime Minister of India.

In fact, it was only when Modi's nationalist BJP party was reelected that Manisha, a student from Mumbai who edits Wikipedia under the username Papayadaily, started to notice what she calls widespread anti-Hindu bias on Wikipedia.

"Every article on Wikipedia is against the ruling party, and whitewashes the Indian National Congress," she says, referencing the former ruling party which, under the leadership of Mahatma Gandhi, led the country to independence in 1947. "There are even conspiracy theories that there is involvement by members of the Congress, as every article is in their favour," she says.

Another editor, Raj Aryan, who edits under the username Factual Indian Hindu, says that the page for Modi is full of criticism, while that for Rahul Gandhi is full of praise.

"I don't side with any party," says Manisha, who identifies as anti-left. "This is not a liberal versus conservative debate. It's about hatred and lies and propaganda," she says. "Sometimes articles I follow will be changed within minutes. It makes me think this editing is a funded effort by either communists or religious minorities." The latter is a thinly veiled euphemism for the country's Muslim community, which makes up about 15 per cent of the population and is being increasingly targeted by Modi's right-wing government. The claims against Wikipedia show how facts are being weaponised as part of India's political struggle.

Aryan runs FactualHindu, an Instagram account that flags examples of Wikipedia's purported bias on social media. It's not just articles about Modi and the BJP that editors like these see as skewed: recently, Aryan lambasted Wikipedia over an article about an early Indian nationalist leader, Subhas Chandra Bose, whom the encyclopaedia had labeled a "radical."

Such perceived slights seem to strike a chord with some Indian editors, who have now made it their mission to seek out instances of the alleged prejudice across Wikipedia. "Wikipedia pages are defaming the Indian culture and its roots," Aryan says, claiming the Wikipedia pages for Christianity and Islam were positive, while that for Hinduism was negative and stressed issues like casteism instead of the more progressive sides of the faith.

Allegations of political biases on the part of volunteer-run Wikipedia are common. The open encyclopaedia, now entering its twentieth year, has been accused of partiality by figures on both the left and the right of the political spectrum across the world with regularity.

This trend is worrying to another Indian editor called Subhashish, who sees these campaigns as an attempt to tarnish Wikipedia. "The fact is that Wikipedia is not a singular body, but a collective and therefore has many, many biases." For him, the goal should be to fix these biases, rather than criticising the whole project. Instead, he says, "we are seeing political leaders accusing Wikipedia of spreading false information."

Some BJP officials have been vocally opposed to the free encyclopaedia. In August 2020, when Wikipedia began its annual fundraising drive, Nupur Sharma, the BJP's national spokesperson, tweeted the site was "no longer neutral" and "known to carry fake info." She also suggested Wikipedia had been "completely taken over by a certain cabal." Others on social media claimed Wikipedia has an anti-Hindu and even anti-India bias, in what local media called a campaign against the open encyclopaedia.

Wikipedia has long been popular in India. In 2011, Hindi Wikipedia, written in the local Devanagari script, became the first non-English Wikipedia to pass 100,000 articles, and as of this year it received more than 47 million page views. However, English is the main language in which readers in India access and edit Wikipedia. Today, traffic from the subcontinent accounts for roughly five per cent of all the traffic to English Wikipedia. So far this year, India is the fourth country by traffic to English Wikipedia, compared to seventh in 2017 and tenth in 2012.

Three local editors say that in recent months, there has been "a push by the right wing to prove Wikipedia has a particular bias," as Subhashish puts it. Wikipedia articles on everything from Brahmanism and Islamophobia in India, to Jai Shri Ram and the 2020 Delhi riots, have been beset by massive edit wars.

Even the article for a local Hindu guru deemed not notable enough for a Wikipedia page (a common occurrence) caused a stir when it was deleted, as the guru's followers took to social media to cry foul.

Wikipedia articles on Indian culture, history, and entertainment have also been pulled into the fray. In recent weeks, the most viewed celebrity death on English Wikipedia has been not that of RBG, but rather SSR, or Sushant Singh Rajput, a Bollywood star whose suicide has attracted massive public attention in India. In a weird turn of events, SSR's suicide also spawned political conspiracy theories on social media, which are spreading like wildfire in India's increasingly polarised and politicised society, and inevitably spilled over to Wikipedia, too.

When Covid-19 hit India, these tensions reached boiling point. It was an article about the origin of the virus's spread in the country that finally thrust Wikipedia into the centre of India's culture wars. The article is about what is now termed the Tablighi Jamaat coronavirus hotspot in Delhi. It was first created in April, and focuses on a mosque that hosted an event at the beginning of March for the local Muslim community. The mosque later became the focal point of religious and social tensions in India, with many accusing the event of being the actual origin of the virus's spread in the country.

As the highly contested article now carefully states, the religious congregation of the Sunni sect of Tablighi Jamaat in Delhi was a coronavirus super-spreader event, with more than 4,000 confirmed cases and at least 27 deaths linked to the event reported across the country.

The event's Wikipedia page became a perfect storm of religious acrimony, political hate-mongering and Covid-19 misinformation. Claims by Hindu politicians that the local Muslim community was to blame for the virus's spread in India, or that Islamic leadership was not doing enough to stop it, grew rampant both on the Wikipedia article and offline. Questionable reports claiming Muslims had refused to let cops and health officials enter the building to conduct medical examinations were noted in the article, and stoked tensions around a possibly true statement that the mosque had failed to properly follow social distancing procedures.

Conversely, claims that Muslims were now being targeted in vengeance for the mosque super-spreader event also started to appear in the article. "The government hospital in Rajasthan's Bharatpur refused to admit a pregnant Muslim woman citing her religion," one edit claimed, showing the tit-for-tat dynamic such articles can take even when their tone is neutral.

Alongside factual information added by regular editors, more politically driven users dragged the article into racist infighting. One deleted version briefly claimed that Muslims arriving at a local medical centre "created a ruckus [...] claiming that the government wants to kill them." Citing unsubstantiated reports by right-wing media, this version claimed that in the hospital, members of the community "were seen molesting nurses and spitting on hospital staff... [and even] reportedly found defecating in the hospital corridor."

With editing reaching fever pitch, it became clear no compromise or consensus could be reached, and it was decided to put the article's very existence to a vote. The article was deleted in a highly controversial move that further highlighted how toxic the discourse had become.

"There is a lot of reliable coverage of this Islamic religious gathering contributing to the spread of Covid-19 in India," the final decision said, noting that there was a factual basis to the article's existence. "At the same time, there are a lot of tensions between Hindus and Muslims in India, and there is also increasing state-sanctioned Islamophobia and persecution of Muslims in India." The issue of Covid-19 was highly volatile and rife with misinformation, and keeping the article, it was feared, could have a disruptive impact on the real world.

Contributing to the storm around the article was the fact that its first version was written by an undisclosed paid editor (for-profit editing is only allowed if disclosed in the edit) who has since been banned from the project. To make matters worse, the deletion was reversed in a subsequent vote on April 10, sparking another edit war.

The page spun so out of control that Jimmy Wales, Wikipedia's co-founder, waded into the controversy after being called out about the article on Twitter. Wales was accused of taking bribes from Muslims to have the article deleted, and spent some time explaining on Twitter how Wikipedia works to an army of critics. After he called the article "poorly written" and with "zero sources," he pleaded, "this isn't about religious sentiments, it's about not putting junk into Wikipedia." (Wales's press office did not respond to a request for comment about the controversy by the time of publication.)

The ability to maintain what Wikipedians call "good faith" was gone. After Wales's comments, and what seemed to be his foreign interference on an Indian issue on behalf of a minority, the article began making headlines in India. The Times of India, India's paper of record and most reliable news source, reported on Wikipedia's battle against "communalism," editing that promoted tribalism over facticity, just when reliable information regarding Covid-19 was needed most.

One outlet stood out: OpIndia, a conservative news site that hosts opinion pieces, launched a de facto campaign against Wikipedia and has even interviewed Wikipedia's other co-founder-cum-critic, Larry Sanger, about Wikipedia's left-wing bias. "The tensions relating to religious, geopolitical and social views is an ongoing occurrence and such tussles are going to stay no matter what the current issue is. What is really unfortunate is that we have an immoral political environment here in India," says Subhashish, who believes it is a lack of understanding of Wikipedia, either through ignorance or by intent, that is the cause of the problem. He blames the media environment and politicians that he says are threatening to ban Wikipedia "like they did in China."

Claims that Wikipedia has a liberal bias have long gone hand-in-hand with campaigns by media outlets displeased with the encyclopaedia. Conservapedia was set up in 2006 to provide a more evangelical-friendly version of the encyclopaedia for Americans reluctant to accept what mainstream sources say about climate change and evolution. After The Daily Mail was deprecated as a source on Wikipedia, it too joined a growing chorus of right-wing criticism of Wikipedia. In recent years, Breitbart has also focused on the issue.

Now OpIndia seems to have picked up the mantle, reporting to its readers about a recent decision to downgrade the status of Fox News as a source on Wikipedia. "Wikipedia is clearly being politicised by a particular group of people who identify themselves as left-liberal," an OpIndia spokesperson says. They add that Wikipedia's bias "can manifest as anti-Hindu on a few occasions, but the bias is not anti-Hindu primarily."

Even Manisha, the editor worried about Wikipedia's anti-Hindu bias, agrees the situation has gone too far. She says that pro-BJP media outlets like OpIndia, which initially helped her call attention to anti-Hindu biases on Wikipedia, are now part of the problem. "Initially, people weren't aware of how many articles Wikipedia has against a single community in India," she says, referencing the Hindu community. "In wake of their work, people started to read and fact check Wikipedia, which is good. But nowadays they sometimes exaggerate," she says.

"They report about issues that are not really issues and then people come to edit articles and make them worse. So there's this polarisation and the grey area gets left behind," Manisha says. "Everything is now polarised, everything is left or right and there is no common ground."

View post:
A vicious culture war is tearing through Wikipedia - Wired.co.uk