Archive for the ‘Wikipedia’ Category

Wikipedia, tampons and the Simpsons: Things to do in Hamilton March 7, 2017 – CBC.ca


Here's our list of great things to do today in Hamilton. Anything we're missing? Let us know on Twitter, by email or on Facebook.


Help expand Wikipedia’s entries on women in the arts at these local edit-a-thons – Generocity

An average of 800 Wikipedia articles are created per day, but there's always more work to be done in making sure there's correct information on the people, places and events that have shaped our history.

Hence the many edit-a-thons being held where people gather to edit and create entries together, as well as learn how to do it themselves. Philadelphians have been pretty good at organizing these events, such as when Bryn Mawr College hosted an edit-a-thon for women in STEM or when WHYY's Terry Gross wanted more entries on the guests she's interviewed on air.

March is dedicated to the annual Art+Feminism Wikipedia Edit-a-thon, started in 2014, in which people around the world gather to enhance the information available on women in the arts.

Temple University just hosted its edit-a-thon this past Friday and got some pretty good work done: 17 editors edited a total of 19 articles, with 137 total edits made.

With the Moore College of Art and Design set to host its event next on March 17, plus one at University of the Arts on March 18 and one at the Philadelphia Museum of Art on March 26, you still have plenty of time to get prepared and get involved.

The PMA even has a crowdsourced list of suggestions for entries to create and improve upon, and as you can see, it's pretty extensive. The list includes a bunch of historical figures such as painters Susan H. Bradley and Blanche Dillaye, who both studied at PAFA.

But there are plenty of people and organizations around today that we at Generocity feel could always use some fine-tuning with their entries: maybe someone like Michelle Taylor, a.k.a. Feminista Jones, program manager for Witnesses to Hunger.

Register for any of the above events here.

Albert Hong is Generocity's contributing reporter. He started hanging around the Technically Media office as a summer intern for Technical.ly and eventually made his way to freelancing for both news sites. While technology and video games are two of his main interests, he's grown to love Philadelphia as a city and is always excited to hear someone else's story.


The Wikipedia Battle Over Really Short Articles – Slate Magazine

How short is too short?


You probably wouldn't expect a blood protein to create a major fuss on one of the internet's largest platforms. Yet here we are.

As Andrea James described on Boing Boing in February, Wikipedia editors recently went to battle over the removal of an article on the blood protein hemovanadin. (It has since been restored.) Even though the article is three sentences long, it is well-sourced, and while it is unlikely to become much longer, it obviously is scientific and potentially useful to Wikipedia readers. After all, good coverage of obscure, academic topics is one of Wikipedia's advantages. In a follow-up piece, James argued that the hemovanadin incident is an example of deletionism, an extreme version of Wikipedia editing philosophy. What's more, James said that deletionism is a threat to Wikipedia, as it leads to eliminating valuable seed contributions. If you, like so many, rely on Wikipedia to settle dinner-table disputes or start work on a term paper, reading about a threat to Wikipedia should be alarming.

But it's a complicated story that requires you to understand certain things about how Wikipedia actually works. Wikipedia is edited entirely by volunteers, who create articles and stubs, debate changes, and try to enforce the site's many policies and guidelines. Subjects must meet certain notability standards to be included, but those standards vary depending on the topic. While in some areas, like the notability of academics, the criteria are quite clear, in others there is a lot of interpretive freedom, and different editors make judgment calls about keeping or deleting articles based on their gut feeling (which may well have been the case with hemovanadin).

Deleting is much easier than writing.

Even if we optimistically assumed that Wikipedia volunteers all know the policies by heart (and that is virtually impossible; I once checked and found that the different regulatory documents on Wikipedia run to more than 150,000 words), they all interpret them differently. The removal of the hemovanadin article and other examples don't necessarily mean that the whole system of selecting articles for deletion is broken. People make mistakes, even Wikipedians, who are typically hard-working, dedicated to the common good, and generally knowledgeable. Still, the way Wikipedia treats short articles, and how it approaches deleting content in general, is detrimental to it in the long run.

Deletionists, as opposed to inclusionists, generally believe that the threshold for notability of topics covered on Wikipedia should be high. They also think that all content added to Wikipedia, even if it is meant as a stub to be developed later, like the hemovanadin item, should meet the high editorial standards of the world's leading encyclopedia.

This approach can be utterly frustrating and demotivating, especially to new editors. They can get discouraged when their stub articles get deleted and they don't really understand why, and no one tells them how they can improve their work for the future. To make matters worse, even a relatively small number of dedicated deletionists can make a huge impact, as deleting is much easier than writing.

In fact, the very ease of this process may be the reason for deletionism's prevalence: Many Wikipedians suffer from editcountitis, the state of being overly obsessed with the number of edits one makes. Deleting is a quick and easy way to score. The phenomenon is dangerous, as a lot of Wikipedia's powerful model relies on micro-contributions. Most people first get involved with Wikipedia, one of the largest social movements in history, by making some minor corrections or starting a small article that is missing. If their contributions get deleted, especially without sufficient explanation of why, they are likely to quit. That is quite destructive to the community's long-term survival, as Wikipedia has struggled for quite a while with editor retention. Deletionism also often affects very specialized fields: For niche topics, an editor who is unfamiliar with them can find it really difficult to ascertain notability correctly.

On the other hand, deletionists have some points, too. After all, we don't need encyclopedic articles for every single Pokémon. In fact, Wikipedia used to have them all described under separate articles. At some point inclusionists even referred to a Pokémon test as an argument for a given article's inclusion: They argued that if a single Pokémon can have its own article, then surely the discussed topic is encyclopedic, too. But in early 2007, many of the articles about Pokémon were merged into one main entry, and others were deleted. Now the prevailing thought is that just because something can be described by verifiable sources doesn't necessarily mean it's notable.

Stubs are a particular point of contention for deletionists. When a stub is created, a link to the article from elsewhere on Wikipedia turns from red to blue, and the article no longer appears to be missing. Editors are generally encouraged to create red links to nonexistent articles if they want to indicate that the topic is notable and worth covering. Research shows that red links help Wikipedia grow, or at least they did in the past: Editors perceive such red links as invitations to create articles. But if only a short stub is created, editors, no longer seeing those red links that scream out, may feel the topic is already covered. Short stubs can exist for years, and they do not do justice to the typically high accuracy and informational saturation of Wikipedia articles.

In theory, instead of deleting, Wikipedia editors could just add more references or slightly expand the stub to make it better. Still, deleting is much quicker. Also, sometimes stubs are deleted not just because of a lack of information or references but because of their style. An article about early childhood trauma and resilience is a great example: While the knowledge contained in the article is really useful and well-developed, it is different stylistically from typical encyclopedic articles, and it does not follow the typical referencing syntax. It is perfectly understandable why it may be easier to delete the article rather than help improve it.

Nevertheless, deletionism in its current form and the general approach to stubs are damaging to Wikipedia. We need a cultural shift to prioritize support for goodwill, to encourage generation of fleshed-out articles about notable topics, and to be more forgiving and more inviting to the general public.

First, it would be useful if stub articles were not deleted as often but instead flagged for expansion or improvement, with clear notation that they are works in progress. Wikipedia already has a work in progress template, which could and should be used for this purpose, but unfortunately it is not very popular among editors. Making it so would require a change in Wikipedians' behavior, so it will likely prove difficult.

Second, better sorting of stubs would help. Even though stubs are already marked as such, Wikipedians do not often focus on expanding them, possibly because it is not easy to filter out stubs from one's specific areas of interest. Sadly, categorization of stubs is not consistently applied, although some important efforts are made in this respect. (A dedicated task force spends considerable time sorting stubs.)

Third, in an even bolder move, we could consider introducing a different color for links leading to stubs and more aggressive flagging of incomplete articles. Such a change would go against the historical trend, though: On some projects (like the German and Polish ones), stubs are already not marked at all.

Fourth, editors with deletionist inclinations should put effort into constructive criticism; after all, the authors put considerable effort into developing the articles. Just like in academia, writing useful suggestions for improvement is difficult, but it helps achieve a much better result in the end, while not frustrating newcomers with sheer, imprecise negativity. If the Wikipedia community wanted to enforce this behavior, deleting promising, easily expandable stubs on clearly notable subjects without proper feedback to the author should be considered damaging to Wikipedia.

Fifth, whatever threshold for notability we agree on, it is even more important that it not be applied selectively. For instance, if we have very detailed articles about popular culture, we should put even more effort into developing articles not just about the sciences but also about more culturally diverse topics, covering phenomena, institutions, and people from other countries with the same notability threshold (in practice, not just in theory) as the one used on the English-language Wikipedia. A lot of misunderstandings and conflicts stem from the fact that Wikipedia's notability criteria seem to be very uneven across fields, and they are also prone to possible gender bias.

Finally, more experienced editors should make a more serious effort to expand their contributions, if they can. Sometimes it is better to create one solid starting article than three stubs, and writing three stubs is much more useful than deleting six. Experienced Wikipedians usually know other editors and can ask them for help in developing articles, so they should at least make an effort not to leave poor stubs unattended. Some of them should also be politely advised to use their own personalized sandboxes before publishing half-baked stubs.

Deleting someone's work without proper feedback has a very bad effect on his or her engagement. Sometimes, if the person is a troll, that's a good thing, but if it affects good editors, it damages Wikipedia in the long term. After all, the two most typical reactions to one's work being deleted are fighting or fleeing. And obviously, it is not only newcomers who get upset when their articles disappear; it affects well-seasoned Wiki-veterans, too. This is why it is so important to put sufficient effort into explaining the reasons for justified deletions and to support goodwill contributors, even if their work is not good enough to keep.

Though the author currently serves on the board of trustees of the Wikimedia Foundation, the views expressed in this article are solely his own.

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.


Intellipedia: How to Access the Wikipedia of US Secret Services – Guiding Tech (blog)

We're well aware of Wikipedia and the plethora of information it stores about everything in this world, but are you aware that the US government maintains its own secret wiki, Intellipedia, which stores information in a similar fashion to Wikipedia, just with a few tweaks?

Intellipedia is an encyclopaedia for US government secret services, as well as other government organisations with similar clearance, and the website has been active for a little more than a decade.

The website has three levels of classification for the data it contains: one consists of sensitive but unclassified documents, the mid-level one contains secret information, and the third holds top-secret information.

The top-secret wiki accounts for almost 40% of the website's total of 269,000 articles.

Since 2014, multiple applications have been filed under the Freedom of Information Act, which has allowed public access to several unclassified documents present in the secret service encyclopaedia.

The pages of Intellipedia are essentially copies of Wikipedia pages for the same topic but with additional sensitive and critical information added by analysts of the intelligence community.

Official access to Intellipedia is restricted to authorised personnel only, presumably members of the 17 intelligence agencies of the US government, and anyone else found trying to gain unauthorised entry into the database will face criminal prosecution, as the website mentions.

However, since the Freedom of Information Act (FOIA) grants citizens the right to ask for information even from the intelligence agencies, news media outlets such as MuckRock and other websites such as The Black Vault have been filling their pages with findings from FOIA applications.

You can access multiple files from Intellipedia, including information about Area 51, Benghazi, the JFK assassination, Project MK Ultra, UFOs, the Greenbrier files, Freemasonry, the Bay of Pigs files and many more interesting reads that have been declassified.

There are loads of other interesting topics that you might wish to learn about on this secret wiki, and John Greenwald's Black Vault is a sure stop if you undertake this journey of finding out what extra information the secret service guys hold.


Study: Bot-on-Bot Editing Wars Raging on Wikipedia’s pages | Sci … – Sci-Tech Today

For many, it is no more than the first port of call when a niggling question raises its head. Found on its pages are answers to mysteries from the fate of the male anglerfish to the joys of dorodango and the improbable death of Aeschylus.

But beneath the surface of Wikipedia lies a murky world of enduring conflict. A new study from computer scientists has found that the online encyclopedia is a battleground where silent wars have raged for years.

Since Wikipedia launched in 2001, its millions of articles have been ranged over by software robots, or simply bots, that are built to mend errors, add links to other pages, and perform other basic housekeeping tasks.

In the early days, the bots were so rare they worked in isolation. But over time, the number deployed on the encyclopedia exploded, with unexpected consequences. The more the bots came into contact with one another, the more they became locked in combat, undoing each other's edits and changing the links they had added to other pages. Some conflicts only ended when one or the other bot was taken out of action.

"The fights between bots can be far more persistent than the ones we see between people," said Taha Yasseri, who worked on the study at the Oxford Internet Institute. "Humans usually cool down after a few days, but the bots might continue for years."

The findings emerged from a study that looked at bot-on-bot conflict in the first ten years of Wikipedia's existence. The researchers at Oxford and the Alan Turing Institute in London examined the editing histories of pages in 13 different language editions and recorded when bots undid other bots' changes.
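
The core of that method can be reduced to a revert-detection pass over each page's edit history. Here is a minimal Python sketch, not the researchers' actual pipeline: it assumes a simplified revision log of (page, timestamp, editor, checksum) tuples, and uses the common "identity revert" heuristic, where an edit that restores the exact content checksum of an earlier revision is treated as undoing its predecessor. The data layout and function name are illustrative assumptions.

```python
from collections import defaultdict

def count_bot_reverts(revisions, bots):
    """revisions: iterable of (page, timestamp, editor, sha1) tuples.
    bots: set of editor names known to be bots.
    Returns {(reverting_bot, reverted_bot): count}."""
    pairs = defaultdict(int)
    by_page = defaultdict(list)
    for rev in revisions:
        by_page[rev[0]].append(rev)
    for revs in by_page.values():
        revs.sort(key=lambda r: r[1])        # chronological order per page
        seen = set()                         # checksums already seen on this page
        for i, (_page, _ts, editor, sha1) in enumerate(revs):
            if sha1 in seen and i > 0:
                # This edit restored earlier content: it undoes the previous edit.
                reverted = revs[i - 1][2]
                if editor in bots and reverted in bots and editor != reverted:
                    pairs[(editor, reverted)] += 1
            seen.add(sha1)
    return dict(pairs)
```

Run over a real history dump, the resulting pair counts are exactly the kind of figures the article quotes, such as one bot undoing another's edits thousands of times across a language edition.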

They did not expect to find much. The bots are simple computer programs that are written to make the encyclopedia better. They are not intended to work against each other. "We had very low expectations to see anything interesting. When you think about them they are very boring," said Yasseri. "The very fact that we saw a lot of conflict among bots was a big surprise to us. They are good bots, they are based on good intentions, and they are based on the same open source technology."

While some conflicts mirrored those found in society, such as the best names to use for contested territories, others were more intriguing. Describing their research in a paper entitled "Even Good Bots Fight" in the journal PLOS ONE, the scientists reveal that among the most contested articles were pages on former president of Pakistan Pervez Musharraf, the Arabic language, Niels Bohr and Arnold Schwarzenegger.

One of the most intense battles played out between Xqbot and Darknessbot, which fought over 3,629 different articles between 2009 and 2010. Over that period, Xqbot undid more than 2,000 edits made by Darknessbot, with Darknessbot retaliating by undoing more than 1,700 of Xqbot's changes. The two clashed over pages on all sorts of topics, from Alexander of Greece and Banqiao district in Taiwan to Aston Villa football club.

Another bot, named after Tachikoma, the artificial intelligence in the Japanese science fiction series Ghost in the Shell, had a two-year running battle with Russbot. The two undid more than a thousand of each other's edits on more than 3,000 articles, ranging from Hillary Clinton's 2008 presidential campaign to the demography of the UK.

The study found striking differences in the bot wars that played out on the various language editions of Wikipedia. German editions had the fewest bot fights, with bots undoing others' edits on average only 24 times in a decade. But the story was different on the Portuguese Wikipedia, where bots undid the work of other bots on average 185 times in ten years. The English version saw bots meddling with each other's changes on average 105 times a decade.

The findings show that even simple algorithms that are let loose on the internet can interact in unpredictable ways. In many cases, the bots came into conflict because they followed slightly different rules to one another.
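
This failure mode is easy to reproduce in miniature. The following toy Python illustration is not actual Wikipedia bot code; the two bots and their date-format rules are invented for the example. Each bot applies its own perfectly reasonable style rule, yet because the rules disagree, every "fix" by one bot triggers a counter-edit by the other.

```python
# Two hypothetical bots, each enforcing a slightly different style rule.
def bot_a(text):
    return text.replace("2017/03/07", "2017-03-07")  # rule A: hyphenated dates

def bot_b(text):
    return text.replace("2017-03-07", "2017/03/07")  # rule B: slashed dates

def run(text, rounds):
    """Alternate the two bots on the same page; count the edits they make.
    Every edit after the first is an undo of the other bot's work."""
    edits = 0
    for _ in range(rounds):
        for bot in (bot_a, bot_b):
            new = bot(text)
            if new != text:
                edits += 1
                text = new
    return edits
```

Starting from a page containing "2017/03/07", every round produces two more edits: the conflict never converges on its own and stops only when one bot is retired, mirroring how some of the real disputes ended.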

Yasseri believes the work serves as an early warning to companies developing bots and more powerful artificial intelligence (AI) tools. An AI that works well in the lab might behave unpredictably in the wild. Take self-driving cars. "A very simple thing that's often overlooked is that these will be used in different cultures and environments," said Yasseri. "An automated car will behave differently on the German autobahn to how it will on the roads in Italy. The regulations are different, the laws are different, and the driving culture is very different," he said.

As more decisions, options and services come to depend on bots working properly together, harmonious cooperation will become increasingly important. As the authors note in their latest study: "We know very little about the life and evolution of our digital minions."

Earlier this month, researchers at Google's DeepMind set AIs against one another to see if they would cooperate or fight. When the AIs were released on an apple-collecting game, the scientists found that the AIs cooperated while apples were plentiful, but as soon as supplies got short, they turned nasty. It is not the first time that AIs have run into trouble. In 2011, scientists in the US recorded a conversation between two chatbots. They bickered from the start and ended up arguing about God.

2017 Guardian Web under contract with NewsEdge/Acquire Media. All rights reserved.
