Archive for the ‘Wikipedia’ Category

Wikipedia’s Moment of Truth – The New York Times

In the future, Sastry added, A.I. systems might interpret whether a query requires a rigorous factual answer or something more creative. In other words, if you wanted an analytical report with citations and detailed attributions, the A.I. would know to deliver that. And if you desired a sonnet about the indictment of Donald Trump, well, it could dash that off instead.

In late June, I began to experiment with a plug-in the Wikimedia Foundation had built for ChatGPT. At the time, this software tool was being tested by several dozen Wikipedia editors and foundation staff members, but it became available in mid-July on the OpenAI website for subscribers who want augmented answers to their ChatGPT queries. The effect is similar to the retrieval process that Jesse Dodge surmises might be required to produce accurate answers. GPT-4's knowledge base is currently limited to data it ingested by the end of its training period, in September 2021. A Wikipedia plug-in helps the bot access information about events up to the present day. At least in theory, the tool (lines of code that direct a search for Wikipedia articles that answer a chatbot query) gives users an improved, combinatory experience: the fluency and linguistic capabilities of an A.I. chatbot, merged with the factuality and currency of Wikipedia.

One afternoon, Chris Albon, who's in charge of machine learning at the Wikimedia Foundation, took me through a quick training session. Albon asked ChatGPT about the Titan submersible, operated by the company OceanGate, whose whereabouts during an attempt to visit the Titanic's wreckage were still unknown. "Normally you get some response that's like, 'My information cutoff is from 2021,'" Albon told me. But in this case ChatGPT, recognizing that it couldn't answer Albon's question ("What happened with OceanGate's submersible?"), directed the plug-in to search Wikipedia (and only Wikipedia) for text relating to the question. After the plug-in found the relevant Wikipedia articles, it sent them to the bot, which in turn read and summarized them, then spit out its answer. As the responses came back, hindered by only a slight delay, it was clear that using the plug-in always forced ChatGPT to append a note, with links to Wikipedia entries, saying that its information was derived from Wikipedia, which was made by volunteers. And this: "As a large language model, I may not have summarized Wikipedia accurately."
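The retrieval flow the article describes can be sketched in a few lines. This is a minimal, illustrative sketch with hypothetical names and a stubbed-out corpus, not the actual plug-in's code: the model detects that a query falls past its training cutoff, a tool searches Wikipedia, and the model summarizes the retrieved text and appends an attribution note.

```python
TRAINING_CUTOFF = 2021  # GPT-4's stated knowledge cutoff year

def needs_retrieval(query_year: int) -> bool:
    """Fall back to retrieval for events after the model's knowledge cutoff."""
    return query_year > TRAINING_CUTOFF

def search_wikipedia(query: str) -> list:
    """Stand-in for the plug-in's search step; a real tool would call the
    Wikipedia search API and return matching article extracts."""
    corpus = {
        "titan submersible": "The Titan was a submersible operated by OceanGate...",
    }
    return [text for title, text in corpus.items() if title in query.lower()]

def answer(query: str, query_year: int) -> str:
    if needs_retrieval(query_year):
        articles = search_wikipedia(query)
        # Stand-in for LLM summarization of the retrieved article text.
        summary = " ".join(a[:60] for a in articles)
        return summary + "\n[Derived from Wikipedia; summaries may be imperfect.]"
    return "Answered from the model's built-in knowledge."

print(answer("What happened with the Titan submersible?", 2023))
```

The key design point mirrors the article: the model itself decides whether to invoke the tool, and the attribution note is appended unconditionally whenever retrieval is used.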

But the summary about the submersible struck me as readable, well supported and current: a big improvement from a ChatGPT response that either mangled the facts or lacked real-time access to the internet. Albon told me, "It's a way for us to sort of experiment with the idea of 'What does it look like for Wikipedia to exist outside of the realm of the website,' so you could actually engage in Wikipedia without actually being on Wikipedia.com." Going forward, he said, his sense was that the plug-in would continue to be available, as it is now, to users who want to activate it, but that eventually, "there's a certain set of plug-ins that are just always on."

In other words, his hope was that any ChatGPT query might automatically result in the chatbot's checking facts with Wikipedia and citing helpful articles. Such a process would probably block many hallucinations as well: For instance, because chatbots can be deceived by how a question is worded, false premises sometimes elicit false answers. Or, as Albon put it, "If you were to ask, 'During the first lunar landing, who were the five people who landed on the moon?' the chatbot wants to give you five names." Only two people landed on the moon in 1969, however. Wikipedia would help by offering the two names, Buzz Aldrin and Neil Armstrong; and in the event the chatbot remained conflicted, it could say it didn't know the answer and link to the article.

Link:
Wikipedia's Moment of Truth - The New York Times

The shocking truth about Wikipedia's Holocaust disinformation – Forward

Artistic rendering of an editor adding Holocaust distortions to Wikipedia. Photo by iStock/Creative Commons/Forward Montage

Shira Klein June 14, 2023

Manipulating Wikipedia is all the rage these days. Companies, governments and even presidential candidates reportedly do it.

Yet we sleep well at night because we trust Wikipedia's editors will protect us from blatant disinformation. After all, there are 125,000 active editors on English Wikipedia, 460 administrators and a 12-member Arbitration Committee, often dubbed Wikipedia's Supreme Court. Above these volunteers towers the Wikimedia Foundation, with its 700-strong staff. Together, they comprise an entire security system.

This month, we are seeing the system fail. And it is time for the Wikimedia Foundation to get involved.

My colleague and I recently exposed a persistent Holocaust disinformation campaign on English Wikipedia.

The study, which I published with Jan Grabowski from the University of Ottawa, examined two dozen Wikipedia articles on the Holocaust in Poland and over 300 back pages (including talk pages, noticeboards, and arbitration cases, spaces where editors decide what the rest of the world will accept as fact).

To our dismay, we found dozens of examples of Holocaust distortion which, taken together, advanced a Polish nationalist narrative, whitewashed the role of Polish society in the Holocaust and bolstered harmful stereotypes about Jews.

People who read these pages learned about Jews' supposed complicity in their own catastrophe, gangs of Jewish collaborators aiding the Gestapo and Jews supporting the communists to betray Poles. A handful of distortions have been corrected since our publication, but many remain.

A fraction of it is true: There were scattered instances of Jewish collaboration in WWII, for example. But Wikipedia inflates their scale and prominence. In one article that remains gravely distorted, alleged Jewish collaboration with the Nazis takes up more space than the Ukrainian, Belorussian and ethnic German collaboration combined.

In one glaring hoax discovered by an Israeli reporter, Wikipedia claimed for 15 years that the Germans annihilated 200,000 non-Jewish Poles in a giant gas chamber in the middle of Warsaw.

Wikipedia's ArbCom just released a ruling responding to our study, sanctioning several editors. While this may seem promising, in fact, ArbCom's actions should concern anyone who cares about disinformation.

The problem is not the individual arbitrators, nor even ArbCom as a whole; the committee's mandate is to judge conduct, never content. This is a good policy. We wouldn't want arbitrators, who are anonymous volunteers with no expertise in any particular subject, to control content. Wikipedia's strength lies in its enabling anyone to edit, democratizing knowledge like never before.

But this leaves a gaping hole in Wikipedias security apparatus. Its safeguards only protect us from fake information when enough editors reach a consensus that the information is indeed fake. When an area is dominated by a group of individuals pushing an erroneous point of view, then wrong information becomes the consensus.

Wikipedia's structure leaves it vulnerable to exploitation by any small group of people willing to spend the time to control its content, whether they are from a government or a corporation or are simply ideologically driven private individuals.

In theory, anyone can edit Wikipedia; no editor has any ownership over any article. Yet over the years, anyone who tried to fix distortions related to Holocaust disinformation faced a team of fierce editors who guard old lies and produce new ones.

These few editors, with no evident ties to any government, sport playful pseudonyms, such as Piotrus ("Little Peter" in Polish) or Volunteer Marek. But they are a resilient team whose seniority and prolific editing across the encyclopedia give them high status in Wikipedia's editorial community. Methodically and patiently, they go from article to article, removing and adding content until it aligns with a Polish nationalist worldview. They misrepresent sources, use unreliable sources, and push fringe points of view.

To be sure, Wikipedia has policies in place to prevent source misrepresentation, unreliable sources and fringe claims. If an editor commits these violations repeatedly, administrators and arbitrators can kick them out.

But administrators and arbitrators lack the expertise to recognize when a source has been misrepresented. Instead, they focus on editors' interpersonal conduct. Editors who are uncivil, aggressive or long-winded find themselves sanctioned, while those who are polite and show a willingness to compromise generally emerge unscathed, regardless of the content they author.

This problem is not unique to Wikipedia's treatment of the Holocaust. A similar disinformation campaign is taking place in Wikipedia's articles on Native American history, where influential editors misrepresent sources to the effect of erasing Native history and whitewashing American settler colonial violence. The Wikipedia article on Andrew Jackson, plagued by such manipulations, attracts thousands of readers a day.

This was the third ArbCom case on the Holocaust to make the same mistakes. ArbCom paid lip service to the importance of tackling source manipulations, while completely disregarding dozens of such problems presented to them by our study and by concerned editors. By ignoring egregiously false content, and focusing only on editors' civility, ArbCom sends the message that there's no problem with falsifying the past, as long as you are nice about it.

The results are tragic: The arbitrators have banned one editor who, as our article showed, had brought in trustworthy scholarship to rebut the distortions. They sanctioned another editor for documenting the distortionists' whitewashing of current Polish antisemitic figurines (called, tellingly, "Jew with a Coin").

Worse still, they have described as "exemplary" a distortionist editor who has defended Holocaust revisionist Ewa Kurek. Kurek has claimed that Jews "had fun" in the Warsaw ghetto and that COVID-19 is a "Jewfication of Europe." Two additional editors who were banned are indeed distortionists, but the ban (appealable in 12 months) responded to their bad manners, not their manipulation of history.

The Wikimedia Foundation needs to intervene, as it has already done to stem disinformation in Chinese Wikipedia, Saudi Wikipedia and Croatian Wikipedia, with excellent results. It must do so in English Wikipedia as well.

In a statement issued last week in response to press inquiries about our study and the recent ArbCom decision, the foundation said, "Wikipedia's volunteer editors act as a vigilant first line of defense."

But what is the second line of defense? What happens when cases keep bouncing back to ArbCom, as has occurred with the Holocaust in Poland, India-Pakistan, Armenia-Azerbaijan and gender and sexuality, to mention just a few controversies?

The Wikimedia Foundation must harness subject-matter experts to assist volunteer editors. In cases where Wikipedia's internal measures fail repeatedly, the foundation should commission mainstream scholars (those currently publishing in reputable peer-reviewed presses and working in universities unencumbered by state dictates) to weigh in.

In the case of Wikipedia's coverage of Holocaust history, there is a need for an advisory board of established historians who would be available to advise editors on a source's reliability, or help administrators understand whether a source has been misrepresented.

The foundation certainly has the resources to build more bridges with academia: It boasts an annual revenue of $155 million, mostly from the public's donations. The public deserves a Wikipedia that provides not just any knowledge, but accurate knowledge, and asking for academics' help is a necessary next step in Wikipedia's ongoing development.

This is no radical departure from Wikipedia's ethos of democratized knowledge that anyone can edit. This is an additional safeguard to ensure Wikipedia's existing content policies are actually upheld.

Academia must also play its part to keep Wikipedia accurate. Scholars should uncover Wikipedia's weaknesses and flag them for editors to fix, instead of snubbing Wikipedia as unreliable. Wikipedia is the seventh-most-visited site on the internet, most people's first and last stop for information. All the more so with ChatGPT, which amplifies online content to a deafening pitch.

Volunteer editors and professional experts need to work together to get it right.


Shira Klein is an associate professor of history at Chapman University in California and co-author of the study, "Wikipedia's Intentional Distortion of the History of the Holocaust," in The Journal of Holocaust Research.

The views and opinions expressed in this article are the author's own and do not necessarily reflect those of the Forward. Discover more perspective in Opinion.

Read more from the original source:
The shocking truth about Wikipedia's Holocaust disinformation - Forward

New Mall-Based Minnesota Kohl's Store Coming in 2024 | Joel … – NewsBreak Original

Despite a recent industry-wide department store downturn that fueled online speculation regarding the future of the company, the newest Kohl's entity may not be the last.

This article is based on corporate postings and accredited media reports. Linked information within this article is attributed to the following outlets: Wikipedia.org, ScrapeHero.com, and EchoPress.com.

Wikipedia features a comprehensive and highly attributed overview of the Kohl's chain: "Kohl's (stylized in all caps) is an American department store retail chain, operated by Kohl's Corporation... The company was founded by Polish immigrant Maxwell Kohl, who opened a corner grocery store in Milwaukee, Wisconsin, in 1927. It went on to become a successful chain in the local area, and in 1962 the company branched out by opening its first department store."

Regarding location count, ScrapeHero.com features the most recently updated number: "There are 1,157 Kohl's retail stores in the United States as of December 11, 2022. The state with the most number of Kohl's locations in the U.S. is California, with 117 retail stores, which is about 10% of all Kohl's retail stores in the U.S."

It should be noted NewsBreak has published several articles of mine on the Kohl's franchise, most detailing issues of a sale that was expected industry-wide in 2022 and the anticipated repercussions thereof.

That expected sale, however, did not happen.

See here for "Plans For Kohl's Closings in 2022," and here for "Plans For Kohl's Closings in 2022 Update: Sales Negotiations Terminated" for further information.

Regardless, a new Kohl's entity will be opening in 2024.

Let us explore.

According to a report from EchoPress.com, entitled "Kohl's to Open in Alexandria's Viking Plaza Mall Next April," the identity of the mall's new tenant had been closely guarded until recently.

As excerpted from the article: "National retailer Kohl's is coming to the west wing of the Viking Plaza Mall in 2024. The mall is owned by New Jersey-based company Lexington Realty International, and managed by a regional team led by Nicki Martineau, Midwest Portfolio Manager in Minnesota. The Viking Plaza's Senior Facility Manager is Sergio Rolfzen."

The new store is reportedly scheduled to open next April.

The EchoPress.com piece goes on to state: "The Viking Plaza Mall, along with Lexington Realty International, is excited to bring Kohl's to the Alexandria community. We appreciate the hard work and collaboration of staff at Lexington, Viking Plaza Mall, Tradesmen Construction, and the City of Alexandria," Martineau said in a news release.

In the event of pertinent updates to these matters, I will share them here on NewsBreak.

Thank you for reading.

Original post:
New Mall-Based Minnesota Kohl's Store Coming in 2024 | Joel ... - NewsBreak Original

Open Access Makes Research More Widely Cited, Helping Spread … – Techdirt

from the share-the-knowledge dept

Open access has been discussed many times here on Techdirt. There are several strands to its story. It's about allowing the public to access research they have paid for through tax-funded grants, without needing to take out often expensive subscriptions to academic titles. It's about saving educational institutions money that they are currently spending on over-priced academic journals, and which could be better spent elsewhere. It's about helping to spread knowledge without the friction that traditional publishing introduces, ideally moving to licenses that allow academic research papers to be distributed freely and without restrictions.

But there's another aspect that receives less attention, revealed here by a new paper that looks at how open access articles are used in a particular and important context: that of Wikipedia. There is a natural synergy between the two, which both aim to make access to knowledge easier. The paper seeks to quantify that:

we analyze a large dataset of citations from Wikipedia and model the role of open access in Wikipedias citation patterns. We find that open-access articles are extensively and increasingly more cited in Wikipedia. What is more, they show a 15% higher likelihood of being cited in Wikipedia when compared to closed-access articles, after controlling for confounding factors. This open-access citation effect is particularly strong for articles with low citation counts, including recently published ones. Our results show that open access plays a key role in the dissemination of scientific knowledge, including by providing Wikipedia editors timely access to novel results. These findings have important implications for researchers, policymakers, and practitioners in the field of information science and technology.

What this means in practice is that for the general public open access articles are even more beneficial than those published in traditional titles, since they frequently turn up as Wikipedia sources that can be consulted directly. They are also advantageous for the researchers who write them, since their work is more likely to be cited on the widely-read and influential Wikipedia than if the papers were not open access. As the research notes, this effect is even more pronounced for articles with low citation counts: basically, academic work that may be important but is rather obscure. This new paper provides yet another compelling reason why researchers should be publishing their work as open access as a matter of course: out of pure self-interest.
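To make the headline number concrete: a "15% higher likelihood" means the citation rate for open-access articles is 1.15 times that of closed-access ones. The sketch below uses entirely synthetic counts (not the paper's data, and without the regression-based confounder controls the authors applied) just to show how such a ratio is computed:

```python
def citation_rate(cited: int, total: int) -> float:
    """Fraction of articles in a group that are cited in Wikipedia."""
    return cited / total

# Hypothetical counts, chosen only to illustrate the arithmetic.
oa_cited, oa_total = 2300, 100_000  # open-access articles
ca_cited, ca_total = 2000, 100_000  # closed-access articles

oa_rate = citation_rate(oa_cited, oa_total)
ca_rate = citation_rate(ca_cited, ca_total)
relative_likelihood = oa_rate / ca_rate  # 1.15 with these synthetic counts

print(f"Open-access articles are {relative_likelihood:.2f}x as likely to be cited")
```

The paper's actual estimate comes from a model that controls for confounding factors (such as article age and field), so the raw ratio above is a simplification of their method.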

Follow me @glynmoody on Mastodon. Originally posted to the Walled Culture blog.

Filed Under: access to information, open access, research, sharing knowledge, studies Companies: wikipedia

Go here to read the rest:
Open Access Makes Research More Widely Cited, Helping Spread ... - Techdirt

Russia fines Wikipedia owner for failing to delete Azov battalion content – Ifax – Yahoo News

MOSCOW (Reuters) - A Russian court on Tuesday fined the Wikimedia Foundation, which owns Wikipedia, 3 million roubles ($36,854) for refusing to delete an article on Ukraine's Azov battalion, the Interfax news agency reported.

Wikimedia did not immediately respond to a Reuters request for comment.

It has previously said information that Russian authorities complained about was well-sourced and in line with Wikipedia standards.

The Azov battalion, a unit of Ukraine's military, has been designated a terrorist group by Russia.

Wikipedia is one of the few surviving independent sources of information in Russia since a state crackdown on online content intensified after Moscow sent its armed forces into Ukraine.

Russia has said it was not yet planning to block Wikipedia, but has handed the online encyclopaedia a series of fines.

Wikimedia has previously criticised the penalties as "part of an ongoing effort by the Russian government to limit the spread of reliable, well-sourced information in the country".

"We are against such efforts as pressure tactics, and see them as an attempt to use legal liabilities to try to curb free knowledge," the foundation has said.

Russia fined Meta's messenger service WhatsApp for the first time last week for not deleting banned content.

Rakuten Group's messaging app Viber also faces a first-time fine of up to 4 million roubles over content, TASS reported on Tuesday.

($1 = 81.4025 roubles)

(Reporting by Reuters; Writing by Alexander Marrow; Editing by Louise Heavens and Sriraj Kalluvila)

Read this article:
Russia fines Wikipedia owner for failing to delete Azov battalion content - Ifax - Yahoo News