Archive for the ‘Wikipedia’ Category

Coronavirus updates in Hindi, Bangla, Tamil and 6 more Indian languages on Wikipedia – The Indian Express

By: Tech Desk | New Delhi | Updated: March 27, 2020 2:32:09 pm

Coronavirus-related information is now available in 9 Indian languages on Wikipedia.

Cases of COVID-19 are rising significantly in India with every passing day; in just a week the count has crossed the 700 mark, worrying the Indian government as well as citizens. While India is under a 21-day lockdown until April 14, people are stuck at home and completely dependent on the internet, social media and television for information related to the coronavirus pandemic.

Sadly, the internet is currently flooded with unverified information, fake news and rumours, making authentic information difficult to find. This is where Wikipedia is playing a big role, informing people with verified information related to the coronavirus pandemic. Wikipedia is providing COVID-19 related information in 9 Indian languages, including Bangla, Bhojpuri, Hindi, Kannada, Arabic, Malayalam, Tamil, Telugu, and Urdu.

The World Health Organisation (WHO) has warned against an "infodemic" of inaccurate information that makes it hard for people to find trustworthy sources and reliable guidance when they need it. This is a monumental challenge that a group of Wikipedia editors is tackling as they add, review, and improve COVID-19 information in English as well as 9 Indian languages.


Given that Wikipedia follows an open editing model designed to prevent bias, it has partnered with SWASTHA, a branch of another, much larger Wikipedia group, WikiProject Medicine, which includes doctors and experts from around the globe. WikiProject Medicine has so far produced more than 35,000 medical articles across different languages, monitored by more than 150 editors. With this partnership, Wikipedia aims to make critical coronavirus health information freely accessible to all Indians.

"Verifying what is a coronavirus fact versus fiction is a huge job, and we are calling on local universities to help as we increase efforts to translate and review local Indic content about the pandemic," said Abhishek Suryawanshi, who is a part of the newly formed Wikipedia group. India's volunteer editors have created multiple articles, such as the Wikipedia article about the coronavirus pandemic in India, across multiple Indian languages. The English Wikipedia article alone has been edited 1,400 times by more than 100 editors.


SWASTHA works with India's National Health Authority and Ministry of Health, as well as with international pandemic control experts from Johns Hopkins University in the United States and the World Health Organisation in Switzerland. SWASTHA said that, as the pandemic grows, it will require more help from local partners to reach local communities.


Excerpt from:
Coronavirus updates in Hindi, Bangla, Tamil and 6 more Indian languages on Wikipedia - The Indian Express

We Need the Wisdom of Wikipedia – The LumberJack

We've all been there. You're sitting in a class. Your professor wants you to write a paper on the different types of asexual reproduction of the Sanderia malayensis jellyfish or some other arcane drivel. Your first reaction is to hit up Wikipedia. Then comes the kicker: you can't cite Wikipedia. You scowl and snarl under your breath.

Wikipedia is cool and it is useful. Turning a blind eye to Wikipedia as a reliable source is shortsighted and has implications beyond the realm of encyclopedias. Distrusting Wikipedia represents academia's unwillingness to open the gates of collaborative truth-knowledge.


Contrary to what your professors may tell you, Wikipedia, as a source, is statistically just as accurate as published encyclopedias for most of its content. A 2005 study by the research journal Nature, "Internet encyclopaedias go head to head," found errors in both encyclopedias, but among the entries tested, the difference in accuracy was small.

Wikipedia, in its signature self-aware style, has reported on its own reliability as well: the site does not guarantee validity, but it is an invaluable research resource.

Inaccurate information on Wikipedia is usually corrected quickly. Hyperlinked citations back up nearly every claim made in an entry. The Sanderia malayensis jellyfish's page cites six sources, from international professionals and biologists to a handbook on poisonous jellies.

Scientific papers, meanwhile, are far from perfect. Soft sciences have suggested cures for unhappiness or boosts to confidence through simple behavioral change, but when other researchers try to replicate the experiments, their conclusions are significantly different. This indicates a serious error in the scientific method: if science isn't replicable, science is null.

In the last few years, a plethora of papers have fallen under criticism after researchers failed to reproduce their results; it's been called the replication crisis. The crisis may have a few sources.


First, it's not hard to get published. University World News said in 2018 that too much scientific research is being published, estimating that nearly 30,000 scientific journals are in circulation, publishing approximately two million articles each year. That volume, it said, burdens the peer review system and makes it dysfunctional.

Second, the media likes to be first to report on news, including science news. Journalists can be wrong, and often are, when it comes to reporting on science, especially when they're grasping to be the first to report new findings. This race puts inaccurate, unconfirmed, flawed science in front of audiences before the study can be replicated.

Mistakes happen on Wikipedia too, and it is always essential to be critical of anything you read. Search around, find supporting articles for any claim made and be aware that there may be flaws. But be able to recognize valid and sound knowledge.

Critical review by the editors of Wikipedia, who can be any person, is what makes Wikipedia so powerful and so accurate. It's the world's largest encyclopedia, about 50 times larger than Britannica, with over six million entries and over 200,000 contributors. Wikipedia should serve as a banner for collaboration, especially between diverse groups.

In "The wisdom of polarized crowds," a 2019 study published in Nature Human Behaviour, researchers found that politically diverse teams created more accurate entries than teams with less political diversity.

Wikipedia comes in clutch, often. Using it as a source may be frowned upon by professors, but have a short chat with most of them and they'll say Wikipedia is an excellent place to start. The website is a tool, not a cheat code. It would be ignorant to ignore it, but if it's used appropriately, maybe, just maybe, we could learn something about jellyfish.


Read the original here:
We Need the Wisdom of Wikipedia - The LumberJack

Wikipedia is flooded with information but it has a blind spot – Grist

Remember the 2018 floods in Sudan?

If the answer is no, you're not alone, and it's not just disaster fatigue making you lose track. The devastating floods that killed 23 people and affected 70,000 more barely caused a blip in American media.

That erasure now lives on in Wikipedia, the collaboratively written online encyclopedia that receives billions of visits per year. According to a recent study, the 2018 Sudan floods are just one of many major floods in low-income countries, especially in Africa, Asia, and Latin America, that either have truncated Wikipedia pages or lack pages altogether.


The researchers cross-referenced three global databases to identify major floods that occurred between 2016 and 2019 and found that fewer than 20 percent of major floods in low-income countries have Wikipedia pages in English. Meanwhile, 68 percent of major floods in high-income countries have been immortalized on Wikipedia. Although a bias toward English-speaking places can be expected in English-language Wikipedia articles, the authors found that language alone doesn't account for the magnitude of difference in article quantity and quality between the Global North and South.
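For a sense of how such a gap can be measured, here is a minimal sketch in Python of the kind of existence check involved: the public MediaWiki API reports whether English Wikipedia has a page under a given title. The event titles and the script itself are illustrative assumptions; the study worked from three flood databases, not a hand-picked list.

```python
# Minimal sketch: ask the MediaWiki API whether English Wikipedia
# has an article under a given title. The titles are illustrative
# examples, not the study's dataset.
import requests

titles = ["2018 Sudan floods", "2016 Louisiana floods"]

resp = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={"action": "query", "titles": "|".join(titles), "format": "json"},
    headers={"User-Agent": "flood-coverage-sketch/0.1 (demo)"},
)
resp.raise_for_status()

# Titles the wiki doesn't have come back flagged with a "missing" key.
for page in resp.json()["query"]["pages"].values():
    status = "no article" if "missing" in page else "article exists"
    print(f"{page['title']}: {status}")
```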

Wikipedia allows anybody to edit or submit articles and has come under fire before for non-representative or biased coverage in areas ranging from women in STEM to black history. The root problem is often diagnosed as a lack of editor diversity: the vast majority of editors are white men, a trend that's even more pronounced among frequent editors.

Wikipedia's information gap about floods also reflects the broader dearth of media coverage of disasters in low-income countries. Wikipedia authors are required to cite sources when they write or edit articles, and a severe lack of media coverage of floods and other disasters in underreported regions makes the corresponding Wikipedia articles inevitably less detailed. Articles about poorer and non-English-speaking places that do surface in Western media tend to reinforce stereotypes or provide a flat, truncated snapshot of the situation.

These omissions on Wikipedia make even basic information about disasters in overlooked areas much more difficult for citizens or local policymakers to find, since data can be scattered across different flood databases. And people living in countries that tend to be ignored by Wikipedia editors might also be missing out on an important benefit that Wikipedia provides to higher-income communities as disasters are happening.

"In particular in disasters, those pages are created when the disaster unfolds, minutes or seconds after the disaster hits," said study author Carlos Castillo, a data science professor at Universitat Pompeu Fabra. "And then they are visited very heavily. They're clearly providing a service to the community." English-language Wikipedia pages clearly wouldn't do much good in areas where most people don't speak or read English, or areas without reliable internet access. But more robust Wikipedia coverage could provide helpful and accurate information in areas that are considered the most vulnerable to extreme weather events related to climate change.

If Wikipedia's breathtaking variety of pedantic, obscure, and borderline pointless articles is any indication (i.e., this bizarre, New Yorker-length article about the urge to defecate upon entering a bookstore), the collective army of online editors should certainly be capable of amassing information about major disasters in developing countries. Efforts to tackle other famous gaps in Wikipedia's coverage, like articles on female scientists or African-American public figures and history, have included mass edit-a-thons focused on boosting information about a single topic or subject area.

Changing the makeup of the people who write (or don't write) the information that millions of people reference seems to be the best, and perhaps only, solution. "All information about disasters is biased," Castillo added. "But Wikipedia in reality is only as representative as the editors it has."


Read the original post:
Wikipedia is flooded with information but it has a blind spot - Grist

Building the bots that keep Wikipedia fresh – GCN.com


While we can all learn from Wikipedia's 40 million articles, government bot builders specifically can get a significant education by studying the creation, vetting and roles of the 1,601 bots that help maintain the site and interact with its more than 137,000 human editors.

Researchers at Stevens Institute of Technology classified the Wikipedia bots into nine roles and 25 associated functions, with the goal of understanding what bots do now and what they might do in the future. Jeffrey Nickerson, professor and associate dean of research at the Stevens School of Business and an author of "The Roles Bots Play in Wikipedia," published in November 2019, likened the classification to the way humans talk about occupations and professions, the skills required to do them and the tasks that must be performed.

Each bot performs a unique job: some generate articles based on templates; some fix typos, spelling mistakes and errors in links; some identify spam, vandals or policy violations; some interact with humans by greeting newcomers, sending notifications or providing suggestions.
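As a concrete illustration of the "fixer" pattern described above, here is a minimal sketch using Pywikibot, the Python framework many Wikipedia bots are built on. The typo table and the target sandbox page are illustrative assumptions, and a real bot of this kind would need Bot Approval Group sign-off before touching live pages.

```python
# Minimal sketch of a fixer-style bot built on Pywikibot. The typo
# list and target page are illustrative; production bots run only
# after approval by the Bot Approval Group.
import pywikibot

TYPOS = {"recieve": "receive", "occured": "occurred"}  # hypothetical fix list

site = pywikibot.Site("en", "wikipedia")
page = pywikibot.Page(site, "Wikipedia:Sandbox")  # safe place to test edits

new_text = page.text
for wrong, right in TYPOS.items():
    new_text = new_text.replace(wrong, right)

# Save only if something actually changed, with an edit summary
# that explains the bot's action to human editors.
if new_text != page.text:
    page.text = new_text
    page.save(summary="Bot: fixing common typos (test run)")
```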

The nine main roles account for about 10% of all activity on the site and up to 88% of activity on subsections, such as the Wikidata platform, where more than 1,200 fixer bots have made a total of more than 80 million edits, according to the report.

Anyone can build a bot -- an automated, artificial intelligence-powered software tool -- for use in Wikipedia, but before it's deployed, it needs the blessing of the Bot Approval Group. Members determine what the bot will do and which pages it will touch, and they review a trial run of the bot on sample data. That may be all that's required, or the group may also ask to check the source code, Nickerson said. That entire process is public.

"It's a good place to start [for bot builders] because you can actually see it," Nickerson said. "You can see the bots that are successful, and you can see the conversations take place there, and you can see the way the developers of the bots actually talk to the end users."

Builders consider, for example, the risks and advantages of their bots, what functions they will start with and which features will come later, and how their bot might interact with others that perform similar functions, he said.

"There's this vetting of the bot," Nickerson said. "If the bot is going to do something fairly minor and not on very many pages, there may be less vetting than if the bot is going to create a whole bunch of new pages or is going to do a lot of edits."

Another feature of the Wikipedia bots is how they work with human editors. Often, editors create a bot to automate some of their editing processes, Nickerson said. Once they build it, they set it loose and check on it periodically. That frees the editors to do the work that most interests them, but they also become bot maintainers.

The subsection of Wikipedia called Wikidata, a collaboratively edited knowledge base of open source data, is especially bot-intensive. The platform is a knowledge graph, meaning that every piece of knowledge has a little fact involved, and because of the way these are hooked together, the value of it can be a link to another fact; essentially, it forms a very, very large graph, Nickerson said.

Wikidata's factual information is used in knowledge production in Wikipedia articles, thanks to adviser and fixer bots. For example, when there's an election, the results will populate in Wikidata, and pages about a city's government will automatically update the name of the mayor by extracting election information from Wikidata.
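To make that flow concrete, here is a minimal sketch of the kind of lookup such an updater bot performs, querying Wikidata's public SPARQL endpoint for a city's current head of government. The item Q64 (Berlin) and property P6 (head of government) are real Wikidata identifiers, but the script itself is an illustrative assumption, not code from the report.

```python
# Sketch: read a city's current head of government from Wikidata's
# public SPARQL endpoint -- the kind of fact an updater bot would
# pull into a Wikipedia article. Berlin (Q64) is just an example item.
import requests

QUERY = """
SELECT ?headLabel WHERE {
  wd:Q64 wdt:P6 ?head .   # Berlin -> head of government (P6)
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
"""

resp = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "wikidata-lookup-sketch/0.1 (demo)"},
)
resp.raise_for_status()

for row in resp.json()["results"]["bindings"]:
    print(row["headLabel"]["value"])
```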

Bots' interactions with human editors are critical to the success of a website based on knowledge production. On Wikipedia, if someone makes an incorrect edit, a bot may reverse that change and explain what was wrong. Being corrected by a machine can be unpleasant, Nickerson said, but bots can also be diplomatic.

The researchers call these first- and second-order effects. The former are the knowledge artifacts the bots help protect or create, while the latter are the reactions they bring out in humans.

"They can actually pay attention to what people are interested in," he said. "They can be patient. They can direct somebody toward a page that they know with high probability is going to be the kind of page where that person can actually make an important contribution. The instinct of some people is to go to the pages that are actually very highly edited and very mature and try to make changes to those pages, and that's actually not the right place to start. The place to start is with a page that is newer and needs a particular kind of expertise."

When human editors have a positive interaction with bots right out of the gate, that helps with the cultural aspect of bot building. It also provides insight into what makes a bot successful -- a topic Nickerson plans to study more in the future.

Researchers at MIT, meanwhile, have developed a system to further automate the work done by Wikipedias human editors. Rather than editors crafting updates, a text generating system would take unstructured information and rewrite the entry in a humanlike fashion.

Unlike the rules-based bots on the site, MIT's bot takes as input an outdated sentence from a Wikipedia article, plus a separate "claim" sentence that contains the updated and conflicting information, according to a report in MIT News. The system updates the facts but maintains the existing style and grammar. That's an easy task for humans, but a novel one in machine learning, it added.

About the Author

Stephanie Kanowitz is a freelance writer based in northern Virginia.

Read the rest here:
Building the bots that keep Wikipedia fresh - GCN.com

Wikipedia Is the Last Best Place on the Internet – WIRED

The site's innovations have always been cultural rather than computational. It was created using existing technology. This remains the single most underestimated and misunderstood aspect of the project: its emotional architecture. Wikipedia is built on the personal interests and idiosyncrasies of its contributors; in fact, without getting gooey, you could even say it is built on love. Editors' passions can drive the site deep into inconsequential territory: exhaustive detailing of dozens of different kinds of embroidery software, lists dedicated to bespectacled baseball players, a brief but moving biographical sketch of Khanzir, the only pig in Afghanistan. No knowledge is truly useless, but at its best, Wikipedia weds this ranging interest to the kind of pertinence where Larry David's "Pretty, pretty good!" is given as an example of rhetorical epizeuxis. At these moments, it can feel like one of the few parts of the internet that is improving.

One challenge in seeing Wikipedia clearly is that the favored point of comparison for the site is still, in 2020, Encyclopedia Britannica. Not even the online Britannica, which is still kicking, but the print version, which ceased publication in 2012. If you encountered the words Encyclopedia Britannica recently, they were likely in a discussion about Wikipedia. But when did you last see a physical copy of these books? After months of reading about Wikipedia, which meant reading about Britannica, I finally saw the paper encyclopedia in person. It was on the sidewalk, being thrown away. The 24 burgundy-bound volumes had been stacked with care, looking regal before their garbage-truck funeral. If bought new in 1965, each of them would have cost $10.50, the equivalent of $85 adjusted for inflation. Today, they are so unsalable that thrift stores refuse them as donations.

Wikipedia and Britannica do, at least, share a certain lineage. The idea of building a complete compendium of human knowledge has existed for centuries, and there was always talk of finding some better substrate than paper: H. G. Wells thought microfilm might be the key to building what he called the World Brain; Thomas Edison bet on wafer-thin slices of nickel. But for most people who were alive in the earliest days of the internet, an encyclopedia was a book, plain and simple. Back then, it made sense to pit Wikipedia and Britannica against each other. It made sense to highlight Britannica's strengths (its rigorous editing and fact-checking procedures; its roster of illustrious contributors, including three US presidents and a host of Nobel laureates, Academy Award winners, novelists, and inventors) and to question whether amateurs on the internet could create a product even half as good. Wikipedia was an unknown quantity; the name for what it did, crowdsourcing, didn't even exist until 2005, when two WIRED editors coined the word.


That same year, the journal Nature released the first major head-to-head comparison study. It revealed that, for articles on science at least, the two resources were nearly comparable: Britannica averaged three minor mistakes per entry, while Wikipedia averaged four. (Britannica claimed almost everything about the journal's investigation was wrong and misleading, but Nature stuck by its findings.) Nine years later, a working paper from Harvard Business School found that Wikipedia was more left-leaning than Britannica, mostly because the articles tended to be longer and so were likelier to contain partisan code words. But the bias came out in the wash. The more revisions a Wikipedia article had, the more neutral it became. On a per-word basis, the researchers wrote, the political bent hardly differs.

But some important differences don't readily show up in quantitative, side-by-side comparisons. For instance, there's the fact that people tend to read Wikipedia daily, whereas Britannica had the quality of fine china, as much a display object as a reference work. The edition I encountered by the roadside was in suspiciously good shape. Although the covers were a little wilted, the spines were uncracked and the pages immaculate: telltale signs of 50 years of infrequent use. And as I learned when I retrieved as many volumes as I could carry home, the contents are an antidote for anyone waxing nostalgic.

I found the articles in my '65 Britannica mostly high quality and high minded, but the tone of breezy acumen could become imprecise. The section on Brazil's education system, for instance, says it is "good or bad depending on which statistics one takes and how they are interpreted." Almost all the articles are authored by white men, and some were already 30 years out of date when they were published. Noting this half-life in 1974, the critic Peter Prescott wrote that encyclopedias are "like loaves of bread: the sooner used, the better, for they are growing stale before they even reach the shelf." The Britannica editors took half a century to get on board with cinema; in the 1965 edition, there is no entry on Luis Buñuel, one of the fathers of modern film. You can pretty much forget about television. Lord Byron, meanwhile, commands four whole pages. (This conservative tendency wasn't limited to Britannica. Growing up, I remember reading the entry on dating in a hand-me-down World Book and being baffled by its emphasis on sharing milkshakes.)

The worthies who wrote these entries, moreover, didn't come cheap. According to an article in The Atlantic from 1974, Britannica contributors earned 10 cents per word on average, about 50 cents in today's money. Sometimes they got a full encyclopedia set as a bonus. They apparently didn't show much gratitude for this compensation; the editors complained of missed deadlines, petulant behavior, lazy mistakes, and outright bias. "People in the arts all fancy themselves good writers, and they gave us the most difficult time," one editor told The Atlantic. At Britannica rates, the English-language version of Wikipedia would cost $1.75 billion to produce.
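The $1.75 billion figure is back-of-the-envelope arithmetic, and it checks out if you assume English Wikipedia's rough word count at the time, about 3.5 billion words, an outside figure not given in the article:

```python
# Back-of-the-envelope check of the $1.75 billion claim. The ~3.5
# billion word count for English Wikipedia circa 2020 is an assumed
# outside figure, not a number from the article.
words = 3.5e9   # approximate words in English Wikipedia
rate = 0.50     # Britannica's 10 cents/word, in today's dollars
print(f"${words * rate / 1e9:.2f} billion")  # -> $1.75 billion
```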

There was another seldom-remembered limitation to these gospel tomes: they were, in a way, shrinking. The total length of paper encyclopedias remained relatively finite, but the number of facts in the universe kept growing, leading to attrition and abbreviation. It was a zero-sum game in which adding new articles meant deleting or curtailing incumbent information. Even the most noteworthy were not immune; between 1965 and 1989, Bach's Britannica entry shrank by two pages.

By the time the internet came into being, a limitless encyclopedia was not just a natural idea but an obvious one. Yet there was still a sense, even among the pioneers of the web, that although the substrate was new, the top-down, expert-driven Britannica model should remain in place.

Original post:
Wikipedia Is the Last Best Place on the Internet - WIRED