Archive for the ‘Wikipedia’ Category

Battle of the macrons: Debate about Māori words on Wikipedia ends – Newstalk ZB

A Christchurch man's campaign for Wikipedia to officially adopt the use of macrons in Māori words has been a success.

Axel Wilke proposed Wikipedia change its New Zealand naming conventions, saying the website is "one of the last bastions of macron resistance for place names".

Debates and editing battles have raged on the popular site for years, with various editors repeatedly adding and removing macrons from words.

Christchurch man Axel Wilke (left) and museum curator Mike Dickison have led the campaign to make the use of macrons official on Wikipedia. Photo / Supplied

"Macrons have been used in Wikipedia for some time: every use of the word 'Māori' has its macron, and articles are increasingly adopting macrons in their names... But place names have always been a sticking point. For some reason, people feel especially attached to towns and rivers, and resist changing their spelling," Wilke said.

"Wikipedia rules have, for years, stated that place names were 'under discussion', and macrons have not been used in the meantime for place names," Wilke said.

Wikipedia is written by volunteers, with naming conventions discussed and decided on by contributors.

He suggested the naming conventions be amended to include macrons in cases where the New Zealand Geographic Board has adopted them.

Wilke said the decision was finally made by a user in England.

"It may be surprising to some people that I, a non-New Zealander, appear to be deciding this matter," the user wrote.

"This is a red herring. As a discussion closer, my role is not to decide, but to determine what the community has decided in the discussion below.

"I determine this in the way set out at Wikipedia:Closing discussions, and my role is to evaluate what we call the 'consensus', which on Wikipedia is not unanimity but 'rough consensus'."

In the end, 33 editors voted for the adoption of the new guideline, with five against.

"This marks a big change for Wikipedia. The idea was first raised on Wikipedia discussion pages in 2007 with no clear consensus," Wilke said.

In 2018, volunteers engaged in a battle over whether the Kāpiti town of Paekākāriki should have macrons in its name, with editors repeatedly removing and replacing the macrons on the page.

"It's gratifying to see that after a well-researched proposal was put forward, agreement on macron use for place names has now been achieved."

Originally posted here:
Battle of the macrons: Debate about Māori words on Wikipedia ends - Newstalk ZB

Coronavirus updates in Hindi, Bangla, Tamil and 6 more Indian languages on Wikipedia – The Indian Express

By: Tech Desk | New Delhi | Updated: March 27, 2020 2:32:09 pm

Coronavirus-related information is now available in 9 Indian languages on Wikipedia.

Cases of COVID-19 are rising significantly in India with every passing day; in just a week the count has crossed the 700 mark, worrying the Indian government as well as citizens. While India is under a 21-day lockdown until April 14, people are stuck at home and completely dependent on the internet, social media and television for information related to the coronavirus pandemic.

Sadly, given the current state of things, the internet is flooded with unverified information, fake news, and rumours, and finding authentic information is difficult. This is where Wikipedia is playing a big role, informing people with verified information related to the coronavirus pandemic. Wikipedia is providing COVID-19-related information in 9 Indian languages, including Bangla, Bhojpuri, Hindi, Kannada, Arabic, Malayalam, Tamil, Telugu, and Urdu.

The World Health Organisation (WHO) has warned against an "infodemic" of inaccurate information that makes it hard for people to find trustworthy sources and reliable guidance when they need it. This is a monumental challenge that a group of Wikipedia editors is tackling as they add, review, and improve COVID-19 information in English as well as 9 Indian languages.

Given that Wikipedia follows an open editing model designed to prevent bias, it has partnered with SWASTHA, a branch of the much larger group WikiProject Medicine, which includes doctors and experts from around the globe. WikiProject Medicine has so far produced more than 35,000 medical articles across different languages, monitored by more than 150 editors. With this partnership, Wikipedia aims to make critical coronavirus health information freely accessible to all Indians.

"Verifying what is a coronavirus fact versus fiction is a huge job, and we are calling on local universities to help as we increase efforts to translate and review local Indic content about the pandemic," said Abhishek Suryawanshi, who's a part of the newly formed Wikipedia group. India's volunteer editors have created multiple articles, such as the Wikipedia article about the coronavirus pandemic in India, across multiple Indian languages. The English Wikipedia article alone has been edited 1,400 times by more than 100 editors.

SWASTHA works with India's National Health Authority and Ministry of Health, as well as with international pandemic-control experts from Johns Hopkins University in the United States and the World Health Organisation in Switzerland. To produce better results, SWASTHA said it will require more help from local partners as the pandemic grows, to help it reach local communities.

Excerpt from:
Coronavirus updates in Hindi, Bangla, Tamil and 6 more Indian languages on Wikipedia - The Indian Express

We Need the Wisdom of Wikipedia – The LumberJack

We've all been there. You're sitting in a class. Your professor wants you to write a paper on the different types of asexual reproduction of the Sanderia malayensis jellyfish or some other arcane drivel. Your first reaction is to hit up Wikipedia. Then comes the kicker: you can't cite Wikipedia. You scowl and snarl under your breath.

Wikipedia is cool and it is useful. Turning a blind eye to Wikipedia as a reliable source is shortsighted and has implications beyond the realm of encyclopedias. Distrusting Wikipedia represents academia's unwillingness to open the gates of collaborative truth-knowledge.

Contrary to what your professors may tell you, Wikipedia, as a source, is statistically just as accurate as published encyclopedias for most of its content. A 2005 study in the research journal Nature, "Internet encyclopaedias go head to head", found errors in both encyclopedias, but among the entries tested, the difference in accuracy was small.

Wikipedia, in its signature self-aware style, has reported on its own reliability as well: Wikipedia does not guarantee validity, but it is an invaluable research resource.

Inaccurate information on Wikipedia is usually corrected quickly. Hyperlinked citations back up nearly every claim made in an entry. The Sanderia malayensis jellyfish's page hosts six sources, from international professionals and biologists to a handbook on poisonous jellies.

Scientific papers, meanwhile, are far from perfect. Soft sciences have suggested cures for unhappiness or boosts to confidence through simple behavioral change, but when other researchers try to replicate the experiments, their conclusions are significantly different. This indicates a serious error in the scientific method: if science isn't replicable, science is null.

In the last few years, a plethora of papers have come under criticism after researchers failed to reproduce their results; it's been called the replication crisis. The crisis may have a few sources.

First, it's not hard to get published. University World News said in 2018 that too much scientific research is being published, estimating that nearly 30,000 scientific journals are in circulation, publishing approximately two million articles each year. It said the volume burdens the peer-review system and makes it dysfunctional.

Second, the media likes to be first to report on news, including science news. Journalists can be wrong, and often are, when it comes to reporting on science, especially when they're grasping to be the first to report new findings. This bad practice delivers inaccurate, unconfirmed, flawed science to audiences before a study can be replicated.

Mistakes happen on Wikipedia too, and it is always essential to read critically. Search around, find supporting articles for any claim made, and be aware that there may be flaws. But be able to recognize valid and sound knowledge.

Critical review by the editors of Wikipedia, who can be any person, is what makes Wikipedia so powerful and so accurate. It's the world's largest encyclopedia, about 50 times larger than Britannica, with over six million entries and over 200,000 contributors. Wikipedia should serve as a banner for collaboration, especially between diverse groups.

In "The wisdom of polarized crowds", a 2019 study in Nature Human Behaviour, researchers found that politically diverse teams created more accurate entries than teams with less political diversity.

Wikipedia comes in clutch, often. Using it as a source may be frowned upon by professors, but have a short chat with most of them and they'll say Wikipedia is an excellent place to start. The website is a tool, not a cheat code. It would be ignorant to ignore it, but if it's used appropriately, maybe, just maybe, we could learn something about jellyfish.

Read the original here:
We Need the Wisdom of Wikipedia - The LumberJack

Wikipedia is flooded with information but it has a blind spot – Grist

Remember the 2018 floods in Sudan?

If the answer is no, you're not alone, and it's not just disaster fatigue making you lose track. The devastating floods that killed 23 people and affected 70,000 more barely caused a blip in American media.

That erasure now lives on in Wikipedia, the collaboratively written online encyclopedia that receives billions of visits per year. According to a recent study, the 2018 Sudan floods are just one of many major floods in low-income countries especially in Africa, Asia, and Latin America that either have truncated Wikipedia pages or lack pages altogether.

The researchers cross-referenced three global databases to identify major floods that occurred between 2016 and 2019 and found that fewer than 20 percent of major floods in low-income countries have Wikipedia pages in English. Meanwhile, 68 percent of major floods in high-income countries have been immortalized on Wikipedia. Although a bias toward English-speaking places can be expected in English-language Wikipedia articles, the authors found that language alone doesn't account for the magnitude of difference in article quantity and quality between the Global North and South.

Wikipedia allows anybody to edit or submit articles and has come under fire before for non-representative or biased coverage in areas ranging from women in STEM to black history. The root problem is often diagnosed as a lack of editor diversity: the vast majority of editors are white men, a trend that's even more pronounced among frequent editors.

Wikipedia's information gap about floods also reflects the broader dearth of media coverage of disasters in low-income countries. Wikipedia authors are required to cite sources when they write or edit articles, and a severe lack of media coverage of floods and other disasters in underreported regions inevitably makes the corresponding Wikipedia articles less detailed. Articles about poorer and non-English-speaking places that do surface in Western media tend to reinforce stereotypes or provide a flat, truncated snapshot of the situation.

These omissions on Wikipedia make even basic information about disasters in overlooked areas much more difficult for citizens or local policymakers to find, since data can be scattered across different flood databases. And people living in countries that tend to be ignored by Wikipedia editors might also be missing out on an important benefit that Wikipedia provides to higher-income communities as disasters are happening.

"In particular in disasters, those pages are created when the disaster unfolds, minutes or seconds after the disaster hits," said study author Carlos Castillo, a data science professor at Universitat Pompeu Fabra. "And then they are visited very heavily. They're clearly providing a service to the community." English-language Wikipedia pages clearly wouldn't do much good in areas where most people don't speak or read English, or areas without reliable internet access. But more robust Wikipedia coverage could provide helpful and accurate information in areas considered the most vulnerable to extreme weather events related to climate change.

If Wikipedia's breathtaking variety of pedantic, obscure, and borderline pointless articles is any indication (see the bizarre, New Yorker-length article about the urge to defecate upon entering a bookstore), the collective army of online editors should certainly be capable of amassing information about major disasters in developing countries. Efforts to tackle other famous gaps in Wikipedia's coverage, like articles on female scientists or African-American public figures and history, have included mass edit-a-thons focused on boosting information about a single topic or subject area.

Changing the makeup of the people who write (or don't write) the information that millions of people reference seems to be the best, and perhaps only, solution. "All information about disasters is biased," Castillo added. "But Wikipedia in reality is only as representative as the editors it has."

Read the original post:
Wikipedia is flooded with information but it has a blind spot - Grist

Building the bots that keep Wikipedia fresh – GCN.com

While we can all learn from Wikipedia's 40 million articles, government bot builders specifically can get a significant education by studying the creation, vetting and roles of the 1,601 bots that help maintain the site and interact with its more than 137,000 human editors.

Researchers at Stevens Institute of Technology classified the Wikipedia bots into nine roles and 25 associated functions, with the goal of understanding what bots do now and what they might do in the future. Jeffrey Nickerson, professor and associate dean of research at the Stevens School of Business and an author of "The Roles Bots Play in Wikipedia", published in November 2019, likened the classification to the way humans talk about occupations and professions, the skills required to do them and the tasks that must be performed.

Each bot performs a unique job: some generate articles based on templates; some fix typos, spelling mistakes and errors in links; some identify spam, vandals or policy violations; some interact with humans by greeting newcomers, sending notifications or providing suggestions.
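The fixer role described above is easy to picture: a bot scans article text and applies a list of pattern-based corrections. The Python sketch below is a hypothetical illustration of that idea; the rule list and function name are invented for this example, not drawn from any real Wikipedia bot.

```python
import re

# Hypothetical misspelling -> correction rules, purely for illustration;
# real fixer bots maintain much larger, community-reviewed rule sets.
TYPO_RULES = {
    r"\bteh\b": "the",
    r"\brecieve\b": "receive",
    r"\boccured\b": "occurred",
}

def fix_typos(wikitext: str) -> str:
    """Apply each regex rule to the article text, as a fixer bot might."""
    for pattern, replacement in TYPO_RULES.items():
        wikitext = re.sub(pattern, replacement, wikitext)
    return wikitext

print(fix_typos("An error occured when they tried to recieve teh file."))
# -> An error occurred when they tried to receive the file.
```

A production bot would, of course, also fetch and save pages through the MediaWiki API and log every change, which is where the approval process described below comes in.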

The nine main roles account for about 10% of all activity on the site and up to 88% of activity on subsections, such as the Wikidata platform, where more than 1,200 fixer bots have made a total of more than 80 million edits, according to the report.

Anyone can build a bot -- an automated, artificial intelligence-powered software tool -- for use in Wikipedia, but before it's deployed, it needs the blessing of the Bot Approval Group. Members determine what the bot will do and which pages it will touch, and they review a trial run of the bot on sample data. That may be all that's required, or the group may also ask to check the source code, Nickerson said. That entire process is public.

"It's a good place to start [for bot builders] because you can actually see it," Nickerson said. "You can see the bots that are successful, and you can see the conversations take place there, and you can see the way the developers of the bots actually talk to the end users."

Builders consider the risks and advantages of their bots, which functions they will start with and which features will come later, and how their bot might interact with others that perform similar functions, for example, he said.

"There's this vetting of the bot," Nickerson said. "If the bot is going to do something fairly minor and not on very many pages, there may be less vetting than if the bot is going to create a whole bunch of new pages or is going to do a lot of edits."

Another feature of the Wikipedia bots is how they work with human editors. Often, editors create a bot to automate some of their editing processes, Nickerson said. Once they build it, they set it loose and check on it periodically. That frees the editors to do the work that most interests them, but they also become bot maintainers.

The subsection of Wikipedia called Wikidata, a collaboratively edited knowledge base of open data, is especially bot-intensive. The platform is a knowledge graph, meaning that every piece of knowledge "has a little fact involved and because of the way these are hooked together, the value of it can be a link to another fact, and essentially it forms a very, very large graph," Nickerson said.

Wikidata's factual information is used in knowledge production in Wikipedia articles, thanks to adviser and fixer bots. For example, when there's an election, the results populate Wikidata, and pages about a city's government can automatically update the name of the mayor by extracting the election information from Wikidata.
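The mayor example maps cleanly onto Wikidata's query model: "head of government" is Wikidata's real property P6, and a bot can ask the public SPARQL endpoint (query.wikidata.org) for its current value. The Python sketch below only builds such a query; the function name and surrounding structure are illustrative, not the code of any actual bot.

```python
def head_of_government_query(city_qid: str) -> str:
    """Build a SPARQL query for the head of government (property P6)
    of the Wikidata item city_qid (e.g. "Q60" for New York City)."""
    return (
        "SELECT ?mayorLabel WHERE {\n"
        f"  wd:{city_qid} wdt:P6 ?mayor .\n"
        '  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }\n'
        "}"
    )

# Sending this query to https://query.wikidata.org/sparql (with
# format=json) returns the current officeholder's name, which a
# fixer bot could then write into the relevant article.
print(head_of_government_query("Q60"))
```

Because the fact lives in one place, every language edition that draws its infobox from Wikidata picks up the change at once, which is part of why the platform is so bot-intensive.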

Bots' interactions with human editors are critical to the success of a website based on knowledge production. On Wikipedia, if someone makes an incorrect edit, a bot may reverse that change and explain what was wrong. Being corrected by a machine can be unpleasant, Nickerson said, but bots can also be diplomatic.

The researchers call these first- and second-order effects. The former are the knowledge artifacts the bots help protect or create, while the latter are the reactions they bring out in humans.

"They can actually pay attention to what people are interested in," he said. "They can be patient. They can direct somebody toward a page that they know with high probability is going to be the kind of page where that person can actually make an important contribution. The instinct of some people is to go to the pages that are actually very highly edited and very mature and try to make changes to those pages, and that's actually not the right place to start. The place to start is with a page that is newer and needs a particular kind of expertise."

When human editors have a positive interaction with bots right out of the gate, that helps with the cultural aspect of bot building. It also provides insight into what makes a bot successful -- a topic Nickerson plans to study more in the future.

Researchers at MIT, meanwhile, have developed a system to further automate the work done by Wikipedias human editors. Rather than editors crafting updates, a text generating system would take unstructured information and rewrite the entry in a humanlike fashion.

Unlike the rules-based bots on the site, MIT's bot takes as input an outdated sentence from a Wikipedia article, plus a separate claim sentence that contains the updated and conflicting information, according to a report in MIT News. The system updates the facts but maintains the existing style and grammar. That's an easy task for humans, but a novel one in machine learning, the report added.

About the Author

Stephanie Kanowitz is a freelance writer based in northern Virginia.

Read the rest here:
Building the bots that keep Wikipedia fresh - GCN.com