Archive for the ‘Wikipedia’ Category

Why Wikipedia is so imperative for public relations – PR Daily

Rhiannon Ruff is a co-founder of the digital agency Lumino and the author of the new book, Wikipedia & Crisis Communications.

What's the most important platform for public relations? Twitter (I mean X)? Cable news? YouTube? The right answer, of course, is Wikipedia. The free encyclopedia that anyone can edit has become the internet's most prominent piece of real estate, with its 6 million articles attracting a staggering 260 million monthly viewers. No single source of information is referenced more frequently, and no other web result does more to shape perceptions of the people, organizations and brands we hear about in the news and search for online.

Just consider how Wikipedia dominates Google: Wikipedia content appears as a top organic result in countless searches, especially for queries about people, places or things, because Wikipedia articles provide comprehensive summaries of these topics and are thus highly relevant to user "search intent," a technical term Google uses to describe the reason someone conducts a specific search. (Just think of how many times in the past week alone you've landed at Wikipedia after Googling a particular topic.)

And here's the thing: users don't even need to click over to the site to see its content, as Google includes descriptions from Wikipedia in the knowledge panels it displays at the top of search results. You can also find Wikipedia popping up in featured snippet responses and People Also Ask results.

"Okay," you're saying to yourself, "what about voice search?"

Well, Alexa, Siri and Google Voice often read directly from Wikipedia articles when answering questions. In fact, a Voicebot report found that when users were asking about brands, these programs relied on Wikipedia for 99% of correct answers. The report doesn't specify this, but these answers are probably the same two-sentence brand descriptions that Google uses in its knowledge panels. This content is pulled from the first two sentences of the respective Wikipedia articles.
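That lead text is easy to inspect yourself. The sketch below is only an illustration of where those first sentences live, not how Google or Alexa actually fetch them; it pulls an article's plain-text lead from Wikipedia's public REST summary endpoint, and the article title and User-Agent string are illustrative.

```python
# A minimal sketch (not Google's or Amazon's actual pipeline) of fetching the
# plain-text lead of a Wikipedia article via the public REST summary endpoint.
# The article title and User-Agent string are illustrative.
import requests

def wikipedia_lead(title: str, lang: str = "en") -> str:
    """Return the plain-text lead extract of a Wikipedia article."""
    url = f"https://{lang}.wikipedia.org/api/rest_v1/page/summary/{title.replace(' ', '_')}"
    resp = requests.get(url, headers={"User-Agent": "pr-research-demo/0.1"}, timeout=10)
    resp.raise_for_status()
    return resp.json().get("extract", "")

if __name__ == "__main__":
    # Roughly the same short description a knowledge panel or voice assistant reads out.
    print(wikipedia_lead("OceanGate"))
```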

What about AI chatbots? That's the future of our collective knowledge, after all. Users won't even need to search the web; ChatGPT will just have the answer!

Well, here's where it gets really interesting.

A journal article from the ChatGPT engineering team confirmed that the chatbot was trained on Wikipedia, with the encyclopedia likely helping the program learn patterns of language related to particular people, places and things. So when you ask ChatGPT about a brand or prominent individual, there's a good chance that at least some of the information it provides in response will come from Wikipedia. Google's Bard AI, meanwhile, cites the encyclopedia directly in its responses.

In short, no matter where you go to search for information, you're eventually going to get content from Wikipedia. This is especially true when topics like, say, your brand or organization are gaining exposure in the news.

A 2018 study by the Wikimedia Foundation, Wikipedia's parent organization, found that media coverage of particular topics was the second-largest driver of traffic to the site. This data probably mirrors your own anecdotal experience: When you want to learn more about something you heard in the news, you skip the press releases, company website and social accounts and head directly to Wikipedia.

OceanGate is a perfect example of this. A Wikipedia article for the deep-sea research and tourism company was created in 2015. The entry would have been a top branded-search result, but the page still received low page views, as few people had any reason to search for the company. In fact, in 2023 the article accumulated a whopping zero views until June 20, when 80 readers suddenly showed up. A few days later that number was up to half a million.

Source: https://pageviews.wmcloud.org/
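Those view counts come from the public Wikimedia Pageviews API, the same data charted at pageviews.wmcloud.org. The snippet below is a minimal sketch of querying it for one article's daily views; the article title and date range are illustrative.

```python
# A minimal sketch of querying the Wikimedia Pageviews API, the same data
# charted at pageviews.wmcloud.org. Article title and date range are illustrative.
import requests

def daily_views(article: str, start: str, end: str, project: str = "en.wikipedia") -> dict:
    """Return {YYYYMMDD: views} for one article between start and end (YYYYMMDD)."""
    url = (
        "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
        f"{project}/all-access/user/{article}/daily/{start}/{end}"
    )
    resp = requests.get(url, headers={"User-Agent": "pr-research-demo/0.1"}, timeout=10)
    resp.raise_for_status()
    return {item["timestamp"][:8]: item["views"] for item in resp.json()["items"]}

if __name__ == "__main__":
    # Daily views of the OceanGate article around the Titan news cycle, June 2023.
    for day, views in daily_views("OceanGate", "20230615", "20230630").items():
        print(day, views)
```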

OceanGate was, of course, in the middle of a tragic news cycle following the loss of the Titan submersible and its five passengers somewhere above the Titanic wreckage. Page viewers likely included not only cable news fans second-screening with their phones, but also journalists scrambling to research the company. Wikipedia editors, meanwhile, frantically added details to the article as revelations continued to emerge.

During that period, anyone who wanted to know more about OceanGate was visiting the Wikipedia entry.

So what does this all mean for public relations?

Wikipedia's impact on reputation can't be overstated, and brands should have a plan for engaging with the site. Having a well-written and accurate Wikipedia page can enhance an organization's credibility and reputation. Conversely, having a short article filled with outdated information can create perception problems, especially for companies that have rebranded or altered their services or business model.

However, be aware that Wikipedia has strict guidelines about neutrality, verifiability and conflicts of interest. For example, when you have a conflict of interest with a topic, you should never edit the article directly. Instead, you must disclose your connection to the topic and make a request on the topic's Talk page, which volunteer editors with no connection to the topic can review. These uninvolved editors should be the ones to implement suggested edits, never a brand or PR firm attached to the topic.

PR professionals must adhere to these guidelines when contributing to or creating Wikipedia content. Attempts to manipulate or excessively promote an organization can lead to the removal of content and editor animosity toward the brand or organization, or, even worse, the generation of critical news coverage, as we've seen in the past with politicians and prominent brands that tried to edit their own pages.

See the original post here:
Why Wikipedia is so imperative for public relations - PR Daily

More Wikipedia taunts as Max Verstappen erases a Lewis Hamilton World title – Yahoo Eurosport UK

Race winner Max Verstappen speaking with second-placed Lewis Hamilton, Spain, June 2023. Credit: Alamy

Wikipedia has become the insult of choice in Formula 1 2023, with Toto Wolff calling Max Verstappen's run of 10 wins "a Wikipedia stat" and the Red Bull driver hitting back by questioning how many titles Lewis Hamilton has won.

After all, he doesn't read Wikipedia.

After Verstappen claimed a 10th successive grand prix win at the Italian Grand Prix, a new Formula 1 record, Mercedes motorsport boss Wolff downplayed the achievement.

"It is not something that would be important for me; those numbers are for Wikipedia and nobody reads that anyway," the Austrian told Sky Sports, before going on to call the record "completely irrelevant".

Hamilton doubled down on that as he told the media, including PlanetF1.com: "I mean, I don't care about statistics in general. Good for him."

Verstappen has managed to get in the last word, at least for now, by questioning Hamilton's tally of World titles.

According to the Express, he told a journalist: "With Lewis, I don't know for sure, but I mean, for someone who has won six World Championships, you must know…"

"It is seven. It's in the history books. You can look it up," the reporter replied.

To which Verstappen asked: "Are you sure it's seven, not six?

"I mean, I'm not very sure, you know. I don't read Wikipedia."

Meanwhile Helmut Marko, having previously declared that Red Bull don't bother themselves with Mercedes as they are "not a serious opponent", had another dig at the Brackley squad on ServusTV.

Responding to Wolff's "Wikipedia stat" taunt, Marko said: "Maybe you can say that to Wolff, it's the most-read medium ever! It's not that insignificant either.

"So we're happy and we'll take these records with us."

In fact, the 80-year-old says continuing that winning streak to 11 and beyond is what is driving Red Bull at the moment.

"It's part of the motivation," he said. "And the more we win, the more important it becomes.

"With every record comes even more motivation, even more passion. And that's the strength of this team, that it's not about money or anything else."

The article "More Wikipedia taunts as Max Verstappen erases a Lewis Hamilton World title" appeared first on PlanetF1.com.

Read more from the original source:
More Wikipedia taunts as Max Verstappen erases a Lewis Hamilton World title - Yahoo Eurosport UK

Local Teacher Becomes First Malaysian To Win Wikimedian Award … – The Rakyat Post

A local teacher recently received the Wikimedian of the Year award. Taufik Rosman was presented with the accolade for his contributions to the Malay edition of Wikipedia at this year's award ceremony in Singapore.

The accomplishment makes him the first Malaysian ever to bag the award since it was established in 2011. The Universiti Sains Malaysia graduate was ecstatic to have earned the title, especially since he has been a fan of Wikipedia's accessible nature.

Speaking with Free Malaysia Today, Taufik revealed that he was first drawn to the site when he was 13. He had realised that there were not many entries on the Malay edition of Wiktionary, the dictionary counterpart to Wikipedia.

This inspired him to add Malay words to the site, and eventually gave him the idea of spreading the word about Malaysian culture on Wikipedia as well.

"Usually, what I translate is related to culture, both Malaysian and cultures from abroad. I've translated articles on Japanese and Maori culture, among others, in the past," said Taufik.

He was also recognised for his efforts to spread knowledge of Malaysian culture at this year's Wikimedian of the Year award.

Credible enough?

Taufik, however, is aware of the public's perception of Wikipedia. Since anyone can edit the information on a page, he understands why the public questions its credibility. In response, he points to the various sources that Wikipedia cites for the claims on each page.

"Most people have been told that Wikipedia is unreliable, and I mean, it is. You cannot cite Wikipedia, but all Wikipedia articles have sources and citations, which you can find at the bottom of the article," Taufik argued.

The sources, according to him, make the articles reliable. Hence, he calls on other Malaysians to join in on the effort to contribute their thoughts and knowledge about Malaysia to the website.

"I guess I can say each one of us knows something about the world. If we all could gather in one place to add this knowledge online, it would make information about our country, Malaysia, more accessible."

While locals, especially youngsters, may not be sure of their knowledge, Taufik still believes sharing is the way to go.

"I would love to see the Malaysian public having this culture of free knowledge. Having people share knowledge with one another is quite beneficial because it shows that Malaysians care about providing knowledge for free and making it accessible," he concluded.

More about the Wikimedian of the Year

The Wikimedian of the Year is an award dedicated to honouring outstanding contributions to the Wikimedia movement, which spans projects such as Wikipedia, Wiktionary and Wikimedia Commons.

Established in August 2011 by Wikipedia co-founder Jimmy Wales, the award is presented to its recipients at Wikimania, the movement's annual conference.

Prior to Taufik, no Malaysian had managed to bring home the accolade. Previous title holders have come from countries and regions such as Ghana, the Arab world and the United States.

Thus, Taufik's win marks a monumental moment for both Malaysians and individuals within the Wikimedia community.

Continue reading here:
Local Teacher Becomes First Malaysian To Win Wikimedian Award ... - The Rakyat Post

Wikipedia’s Moment of Truth – The New York Times

In the future, Sastry added, A.I. systems might interpret whether a query requires a rigorous factual answer or something more creative. In other words, if you wanted an analytical report with citations and detailed attributions, the A.I. would know to deliver that. And if you desired a sonnet about the indictment of Donald Trump, well, it could dash that off instead.

In late June, I began to experiment with a plug-in the Wikimedia Foundation had built for ChatGPT. At the time, this software tool was being tested by several dozen Wikipedia editors and foundation staff members, but it became available in mid-July on the OpenAI website for subscribers who want augmented answers to their ChatGPT queries. The effect is similar to the retrieval process that Jesse Dodge surmises might be required to produce accurate answers. GPT-4's knowledge base is currently limited to data it ingested by the end of its training period, in September 2021. A Wikipedia plug-in helps the bot access information about events up to the present day. At least in theory, the tool (lines of code that direct a search for Wikipedia articles that answer a chatbot query) gives users an improved, combinatory experience: the fluency and linguistic capabilities of an A.I. chatbot, merged with the factuality and currency of Wikipedia.

One afternoon, Chris Albon, who's in charge of machine learning at the Wikimedia Foundation, took me through a quick training session. Albon asked ChatGPT about the Titan submersible, operated by the company OceanGate, whose whereabouts during an attempt to visit the Titanic's wreckage were still unknown. "Normally you get some response that's like, 'My information cutoff is from 2021,'" Albon told me. But in this case ChatGPT, recognizing that it couldn't answer Albon's question ("What happened with OceanGate's submersible?"), directed the plug-in to search Wikipedia (and only Wikipedia) for text relating to the question. After the plug-in found the relevant Wikipedia articles, it sent them to the bot, which in turn read and summarized them, then spit out its answer. As the responses came back, hindered by only a slight delay, it was clear that using the plug-in always forced ChatGPT to append a note, with links to Wikipedia entries, saying that its information was derived from Wikipedia, which was "made by volunteers." And this: "As a large language model, I may not have summarized Wikipedia accurately."
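The general shape of that retrieve-then-summarize step is straightforward to sketch. The code below is not the Wikimedia plug-in's actual implementation; it is a minimal illustration of the same pattern, using Wikipedia's public search and summary endpoints, with `ask_llm` as a hypothetical stand-in for whatever chat-completion API you would call.

```python
# Not the Wikimedia plug-in's actual code: a minimal sketch of the same
# retrieve-then-summarize pattern, using Wikipedia's public search and summary
# endpoints. `ask_llm` is a hypothetical stand-in for whichever chat-completion
# API you would actually call.
import requests

HEADERS = {"User-Agent": "wiki-rag-demo/0.1"}

def search_wikipedia(question: str, limit: int = 3) -> list[str]:
    """Return titles of the Wikipedia articles most relevant to the question."""
    params = {"action": "query", "list": "search", "srsearch": question,
              "srlimit": limit, "format": "json"}
    data = requests.get("https://en.wikipedia.org/w/api.php",
                        params=params, headers=HEADERS, timeout=10).json()
    return [hit["title"] for hit in data["query"]["search"]]

def lead_extract(title: str) -> str:
    """Fetch the plain-text lead section of one article."""
    url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{title.replace(' ', '_')}"
    return requests.get(url, headers=HEADERS, timeout=10).json().get("extract", "")

def ask_llm(prompt: str) -> str:
    """Hypothetical stand-in: wire up your chat-model client of choice here."""
    raise NotImplementedError("plug in a chat-completion API")

def answer_from_wikipedia(question: str) -> str:
    """Search Wikipedia, gather lead summaries, and ask the model to answer from them."""
    titles = search_wikipedia(question)
    context = "\n\n".join(f"## {t}\n{lead_extract(t)}" for t in titles)
    prompt = (
        "Answer the question using ONLY the Wikipedia excerpts below. "
        "Cite the article titles you relied on, and say if the excerpts "
        f"do not contain the answer.\n\n{context}\n\nQuestion: {question}"
    )
    return ask_llm(prompt)
```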

But the summary about the submersible struck me as readable, well supported and current, a big improvement from a ChatGPT response that either mangled the facts or lacked real-time access to the internet. Albon told me, "It's a way for us to sort of experiment with the idea of, What does it look like for Wikipedia to exist outside of the realm of the website, so you could actually engage in Wikipedia without actually being on Wikipedia.com." Going forward, he said, his sense was that the plug-in would continue to be available, as it is now, to users who want to activate it, but that eventually "there's a certain set of plug-ins that are just always on."

In other words, his hope was that any ChatGPT query might automatically result in the chatbot's checking facts with Wikipedia and citing helpful articles. Such a process would probably block many hallucinations as well: For instance, because chatbots can be deceived by how a question is worded, false premises sometimes elicit false answers. Or, as Albon put it, "If you were to ask, 'During the first lunar landing, who were the five people who landed on the moon?' the chatbot wants to give you five names." Only two people landed on the moon in 1969, however. Wikipedia would help by offering the two names, Buzz Aldrin and Neil Armstrong; and in the event the chatbot remained conflicted, it could say it didn't know the answer and link to the article.

Link:
Wikipedia's Moment of Truth - The New York Times

The shocking truth about Wikipedia's Holocaust disinformation – Forward

Artistic rendering of an editor adding Holocaust distortions to Wikipedia. Photo by iStock/Creative Commons/Forward Montage

Shira Klein June 14, 2023

Manipulating Wikipedia is all the rage these days. Companies, governments and even presidential candidates reportedly do it.

Yet we sleep well at night because we trust Wikipedia's editors will protect us from blatant disinformation. After all, there are 125,000 active editors on English Wikipedia, 460 administrators and a 12-member Arbitration Committee, often dubbed Wikipedia's Supreme Court. Above these volunteers towers the Wikimedia Foundation, with its 700-strong staff. Together, they comprise an entire security system.

This month, we are seeing the system fail. And it is time for the Wikimedia Foundation to get involved.

My colleague and I recently exposed a persistent Holocaust disinformation campaign on English Wikipedia.

The study, which I published with Jan Grabowski from the University of Ottawa, examined two dozen Wikipedia articles on the Holocaust in Poland and over 300 back pages (including talk pages, noticeboards, and arbitration cases, spaces where editors decide what the rest of the world will accept as fact).

To our dismay, we found dozens of examples of Holocaust distortion which, taken together, advanced a Polish nationalist narrative, whitewashed the role of Polish society in the Holocaust and bolstered harmful stereotypes about Jews.

People who read these pages learned about Jews' supposed complicity in their own catastrophe, gangs of Jewish collaborators aiding the Gestapo and Jews supporting the communists to betray Poles. A handful of distortions have been corrected since our publication, but many remain.

A fraction of it is true: There were scattered instances of Jewish collaboration in WWII, for example. But Wikipedia inflates their scale and prominence. In one article that remains gravely distorted, alleged Jewish collaboration with the Nazis takes up more space than the Ukrainian, Belorussian and ethnic German collaboration combined.

In one glaring hoax discovered by an Israeli reporter, Wikipedia claimed for 15 years that the Germans annihilated 200,000 non-Jewish Poles in a giant gas chamber in the middle of Warsaw.

Wikipedia's ArbCom just released a ruling responding to our study, sanctioning several editors. While this may seem promising, in fact, ArbCom's actions should concern anyone who cares about disinformation.

The problem is not the individual arbitrators, nor even ArbCom as a whole; the committee's mandate is to judge conduct, never content. This is a good policy. We wouldn't want arbitrators, who are anonymous volunteers with no expertise in any particular subject, to control content. Wikipedia's strength lies in its enabling anyone to edit, democratizing knowledge like never before.

But this leaves a gaping hole in Wikipedia's security apparatus. Its safeguards only protect us from fake information when enough editors reach a consensus that the information is indeed fake. When an area is dominated by a group of individuals pushing an erroneous point of view, then wrong information becomes the consensus.

Wikipedia's structure leaves it vulnerable to exploitation by any small group of people willing to spend the time to control the content, whether they are from a government or a corporation or are simply ideologically driven private individuals.

In theory, anyone can edit Wikipedia; no editor has any ownership over any article. Yet over the years, anyone who tried to fix distortions related to Holocaust disinformation faced a team of fierce editors who guard old lies and produce new ones.

These few editors, with no evident ties to any government, sport playful pseudonyms, such as Piotrus ("Little Peter" in Polish) or Volunteer Marek. But they are a resilient team whose seniority and prolific editing across the encyclopedia give them high status in Wikipedia's editorial community. Methodically and patiently, they go from article to article, removing and adding content until it aligns with a Polish nationalist worldview. They misrepresent sources, use unreliable sources, and push fringe points of view.

To be sure, Wikipedia has policies in place to prevent source misrepresentation, unreliable sources and fringe claims. If an editor commits these violations repeatedly, administrators and arbitrators can kick them out.

But administrators and arbitrators lack the expertise to recognize when a source has been misrepresented. Instead, they focus on editors' interpersonal conduct. Editors who are uncivil, aggressive or long-winded find themselves sanctioned, while those who are polite and show a willingness to compromise generally emerge unscathed, regardless of the content they author.

This problem is not unique to Wikipedia's treatment of the Holocaust. A similar disinformation campaign is taking place in Wikipedia's articles on Native American history, where influential editors misrepresent sources to the effect of erasing Native history and whitewashing American settler colonial violence. The Wikipedia article on Andrew Jackson, plagued by such manipulations, attracts thousands of readers a day.

This was the third ArbCom case on the Holocaust to make the same mistakes. ArbCom paid lip service to the importance of tackling source manipulations, while completely disregarding dozens of such problems presented to them by our study and by concerned editors. By ignoring egregiously false content, and focusing only on editors' civility, ArbCom sends the message that there's no problem with falsifying the past, as long as you are nice about it.

The results are tragic: The arbitrators have banned one editor who, as our article showed, had brought in trustworthy scholarship to rebut the distortions. They sanctioned another editor for documenting the distortionists' whitewashing of current Polish antisemitic figurines (called, tellingly, "Jew with a Coin").

Worse still, they have described as "exemplary" a distortionist editor who has defended Holocaust revisionist Ewa Kurek. Kurek has claimed that Jews "had fun" in the Warsaw ghetto and that COVID-19 is a "Jewfication" of Europe. Two additional editors who were banned are indeed distortionists, but the ban (appealable in 12 months) responded to their bad manners, not their manipulation of history.

The Wikimedia Foundation needs to intervene, as it has already done to stem disinformation in Chinese Wikipedia, Saudi Wikipedia and Croatian Wikipedia, with excellent results. It must do so in English Wikipedia as well.

In a statement they issued last week in response to press inquiries about our study and the recent ArbCom decision, the foundation said, "Wikipedia's volunteer editors act as a vigilant first line of defense."

But what is the second line of defense? What happens when cases keep bouncing back to ArbCom, as has occurred with the Holocaust in Poland, India-Pakistan, Armenia-Azerbaijan and gender and sexuality, to mention just a few controversies?

The Wikimedia Foundation must harness subject-matter experts to assist volunteer editors. In cases where Wikipedia's internal measures fail repeatedly, the foundation should commission scholars to weigh in: mainstream scholars who are currently publishing in reputable peer-reviewed presses and who work in universities unencumbered by state dictates.

In the case of Wikipedia's coverage of Holocaust history, there is a need for an advisory board of established historians who would be available to advise editors on a source's reliability, or help administrators understand whether a source has been misrepresented.

The foundation certainly has the resources to build more bridges with academia: It boasts an annual revenue of $155 million, mostly from the public's donations. The public deserves a Wikipedia that provides not just any knowledge, but accurate knowledge, and asking for academics' help is a necessary next step in Wikipedia's ongoing development.

This is no radical departure from Wikipedia's ethos of democratized knowledge that anyone can edit. This is an additional safeguard to ensure Wikipedia's existing content policies are actually upheld.

Academia must also play its part to keep Wikipedia accurate. Scholars should uncover Wikipedia's weaknesses and flag them for editors to fix, instead of snubbing Wikipedia as unreliable. Wikipedia is the seventh-most-visited site on the internet, most people's first and last stop for information. All the more so with ChatGPT, which amplifies online content to a deafening pitch.

Volunteer editors and professional experts need to work together to get it right.

Shira Klein is an associate professor of history at Chapman University in California and co-author of the study "Wikipedia's Intentional Distortion of the History of the Holocaust" in The Journal of Holocaust Research.

The views and opinions expressed in this article are the author's own and do not necessarily reflect those of the Forward. Discover more perspective in Opinion.

Read more from the original source:
The shocking truth about Wikipedia's Holocaust disinformation - Forward