Archive for the ‘Wikipedia’ Category

Wikipedia buttons up key pages ahead of U.S. election – Reuters

(Reuters) - Wikipedia has locked down its main election page ahead of the U.S. presidential election so that only certain editors can make changes, part of preparations to combat potential disinformation and abuses related to Tuesday's vote.

The online encyclopedia's articles, written primarily by unpaid volunteers, are relied on by platforms from Alphabet Inc's Google to Amazon Inc's voice assistant Alexa to give their users information and context.

"We're not worried about vandals who want to just mess up an article in order to cause a little trouble. The Wikipedia community deals with those issues for breakfast," Ryan Merkley, chief of staff at the Wikimedia Foundation, the nonprofit organization which hosts Wikipedia, said in a phone interview.

"We're really worried about coordinated actors ... trying to find a way to disseminate information ... in a way that could cause people, for example, to choose not to vote or to influence the outcome of the election based on something that was not true."

Internet researchers say Wikipedia, which says it is committed to neutrality, has emerged as a relatively trusted site, while major social platforms like Facebook and Twitter have struggled to curb viral misinformation.

This year, Merkley said, the Wikimedia Foundation for the first time put together a disinformation task force to run election exercises with staff and community members.

Last week, community members moved to add extra protections to the 2020 United States presidential election article so only users who have had a registered account for more than 30 days and have made 500 edits on the site can alter the page.
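
As a rough illustration of how that eligibility rule works, here is a minimal sketch in Python. The function and parameter names are hypothetical, invented for this example; MediaWiki's actual implementation of extended confirmed protection differs.

```python
from datetime import datetime, timedelta, timezone

def can_edit_protected_article(registered_at: datetime, edit_count: int) -> bool:
    # The threshold described above: a registered account older than
    # 30 days AND at least 500 edits on the site.
    account_age = datetime.now(timezone.utc) - registered_at
    return account_age > timedelta(days=30) and edit_count >= 500

# Example: an account registered 45 days ago qualifies only if it also
# clears the 500-edit bar.
registered = datetime.now(timezone.utc) - timedelta(days=45)
print(can_edit_protected_article(registered, 612))  # True
print(can_edit_protected_article(registered, 80))   # False
```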

Merkley said Wikipedia has seen the creation of fake accounts, people making false edits to screenshot and share on social media, and attempts to use content from unreliable sources or to skew articles toward a particular bias.

He said the Wikimedia Foundation had been meeting with industry partners and U.S. government officials, including from the Federal Bureau of Investigation and Department of Homeland Security, but that it had not yet seen any state actors flagged by government officials operating on Wikipedia.

A Wikimedia spokeswoman said there are currently 72 English-Wikipedia articles related to the U.S. election and that there are about 2,600 editors watching those pages who get alerts for any edits.

Merkley said staff rarely make interventions but there could be instances around the election, such as direct calls for violence, where they would remove content or take action against a user.

Reporting by Elizabeth Culliford; Editing by Greg Mitchell and Nick Zieminski

Read the original post:
Wikipedia buttons up key pages ahead of U.S. election - Reuters

Wikipedia Locks Key Pages to Combat Disinformation Ahead of US Presidential Elections – Gadgets 360

Wikipedia has locked down its main election page ahead of the US presidential election so that only certain editors can make changes, part of preparations to combat potential disinformation and abuses related to Tuesday's vote.

The online encyclopedia's articles, written primarily by unpaid volunteers, are relied on by platforms from Alphabet's Google to Amazon's voice assistant Alexa to give their users information and context.

"We're not worried about vandals who want to just mess up an article in order to cause a little trouble. The Wikipedia community deals with those issues for breakfast," Ryan Merkley, chief of staff at the Wikimedia Foundation, the nonprofit organisation which hosts Wikipedia, said in a phone interview.

"We're really worried about coordinated actors ... trying to find a way to disseminate information ... in a way that could cause people, for example, to choose not to vote or to influence the outcome of the election based on something that was not true."

Internet researchers say Wikipedia, which says it is committed to neutrality, has emerged as a relatively trusted site, while major social platforms like Facebook and Twitter have struggled to curb viral misinformation.

This year, Merkley said, the Wikimedia Foundation for the first time put together a disinformation task force to run election exercises with staff and community members.

Last week, community members moved to add extra protections to the '2020 United States presidential election' article so only users who have had a registered account for more than 30 days and have made 500 edits on the site can alter the page.

Merkley said Wikipedia has seen the creation of fake accounts, people making false edits to screenshot and share on social media, and attempts to use content from unreliable sources or to skew articles toward a particular bias.

He said the Wikimedia Foundation had been meeting with industry partners and US government officials, including from the Federal Bureau of Investigation and Department of Homeland Security, but that it had not yet seen any state actors flagged by government officials operating on Wikipedia.

A Wikimedia spokeswoman said there are currently 72 English-Wikipedia articles related to the US election and that there are about 2,600 editors 'watching' those pages who get alerts for any edits.

Merkley said staff rarely make interventions but there could be instances around the election, such as direct calls for violence, where they would remove content or take action against a user.

Thomson Reuters 2020

Originally posted here:
Wikipedia Locks Key Pages to Combat Disinformation Ahead of US Presidential Elections - Gadgets 360

Minecraft Developer Read A Wikipedia Article For Inspiration On Crystals – TheGamer

While we still don't have a solid idea of what the amethyst crystals will be capable of, we do know where to look for ideas on what could be coming.

Minecraft's 1.17 Caves and Cliffs update may be months away, but it seems that we get more information about it every week. This time, we learned that a Minecraft developer read a Wikipedia article for inspiration on the new amethyst crystals being added in 1.17.

At the beginning of October, we learned that the next big update to Minecraft would be called Caves and Cliffs. This update will include new mobs, like the Glow Squid; new cave types, like the Mesh caves; and a new copper ore. Of course, the Minecraft Live event that introduced these elements didn't elaborate on much, leading many to wonder what the specifics of the new additions would be.

While we still don't have a solid idea of what the amethyst crystals will be capable of, we do know where to look for ideas on what could be coming. During a Q&A on October 30, a group of developers was asked if you would be able to craft binoculars, as well as a spyglass, with the new crystals. While developer Corey Scheviak said that, no, you will not be able to craft binoculars, he did tell us how they thought to add the spyglass.

Apparently, Scheviak was sitting in bed thinking of what the team could do with crystals when he decided to google the word "crystal". That led him through a Wikipedia page on everything that crystals can do. He explains: "I gave her [Agnes Larsson] like 50 ideas, and she was like, 'That one! That one right there! The spyglass!'"

In our attempt to replicate that search (we went to Wikipedia and searched "uses for crystals"), we found a lot of interesting potential uses, including redstone and health applications. For example, crystals have been used to create radios and as healing remedies in the real world. It wouldn't be too much of a stretch to think that these uses would easily translate to your Minecraft world.

Scheviak said that there is "at least one" additional use for crystals coming in 1.17 and one that he is hoping will be added in. Although he did not elaborate on what those would be, he did reference the list of items (that he got from Wikipedia) before mentioning this. Time will tell what will be added, but the possibilities are fascinating, even if those possibilities won't include colored lights.

Go here to read the rest:
Minecraft Developer Read A Wikipedia Article For Inspiration On Crystals - TheGamer

7 Marketing Disasters That Turned Out to Be Precious Lessons – Search Engine Journal

Some people love horror stories and others don't, but probably at some point, all of us have experienced a terrifying twist in life.

This inspired the SE Ranking team to reach out to SEO and marketing experts from all over the world and ask them to share some nightmare-of-a-case stories they had in their careers.

Our assumption was that since SEO is so fickle and marketing success depends on so many factors, our colleagues would surely have some enticing stories to share.

Luckily, they didn't mind dragging their skeletons out of the closet.

This article is all about mistakes that led to devastating consequences.

From lack of caution or vision to unexpected obstacles and bad luck, the reasons behind their marketing failures vary.

But fortunately, most of the stories had a happy ending.

And they all once again prove that the road to success is paved with failure.

They teach us that every mistake we make will ultimately steer us in the right direction as long as we learn the lesson and retain the necessary vigor to keep going.

A warning had been on the Wikipedia article for two years.

Then, in early July 2020, a new warning appeared.

Two weeks later, the Wikipedia article about me had been deleted by the administrators.

Within a week, my entity had disappeared from Google's Knowledge Graph and the knowledge panel on my personal Brand SERP had gone.

A major nightmare for someone who calls himself The Brand SERP Guy.

Worse, a week later, the Wikipedia article about my folk-punk band from the 90s was gone.

Two days later, the article about my TV series from the 90s was gone, too.

Seems someone at Wikipedia had it in for me.

In fact, truth be told, it was my own fault.

In the interests of experimenting to see how much I could feed Google's Knowledge Graph and control the knowledge panel on my personal Brand SERP, and those of my music group and TV series, I had (over) edited all three Wikipedia articles.

Which is against the rules.

So what happened?

Read on, because this horror story actually has a redemptive ending.

I rebuilt it all, took control of the entities, learned a lot about knowledge panels, and got some amazing insights into how the Knowledge Graph functions.

I panicked when the Wikipedia page was deleted and moved the structured data about me on my site from the home page to a dedicated About Page.

That turned out to be a mistake.

As described, the knowledge panel disappeared and the entry in the Knowledge Graph got deleted.

Once again, my own fault.

This is the folk-punk group I mentioned earlier.

There had previously been a mix-up of information in the knowledge panel due to the ambiguity of the name, but last year I had sorted it out using:

The deletion of the Wikipedia article brought back the mix-ups.

However, because of all the work I had done and the schema markup I had added, Google now saw my site as the main authority about the band.

That means I could now change things quite easily.

Including the description in the knowledge panel (updates take 10 minutes).

I had control.

These are the cartoon characters and TV series I mentioned earlier.

Following the deletion of the Wikipedia article about the TV series, the Knowledge Graph entity remained in place, and the information in the knowledge panel remained as-was, except the description, which disappeared.

Three weeks later that was back, but this time from my site (it has since switched to the official site).

Once again, my site and the schema markup I provide were Google's fallback, the second-best source of information about the entity.

Once again, the deletion of the Wikipedia page gave me control.

Every entity needs a home.

Preferably on your site.

For all three entities, my site was the home, the source of information Google uses as its point of reference in the absence of a Wikipedia article.

It appears that, when a substantial source of information about an entity, such as Wikipedia, disappears, that home is the fallback crutch Google uses to reassure itself that the Knowledge Graph is correct.

The schema markup on your site that describes you and related entities is vitally important to Google's understanding of those entities, and to its confidence in that understanding.
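
To make that concrete, here is a minimal sketch of the kind of schema.org entity markup being described, emitted as JSON-LD with Python. Every name and URL below is an illustrative placeholder, not the author's actual markup or WordLift's output.

```python
import json

# Illustrative schema.org "Person" entity for a dedicated About page,
# expressed as JSON-LD. All names and URLs are placeholders.
person = {
    "@context": "https://schema.org",
    "@type": "Person",
    "@id": "https://example.com/about#me",
    "name": "Jane Example",
    "url": "https://example.com/about",
    "jobTitle": "Brand consultant",
    "sameAs": [
        "https://twitter.com/janeexample",
        "https://www.linkedin.com/in/janeexample",
    ],
}

# Embedded in the page's HTML head, markup like this signals which page
# is the entity's "home" in the absence of a Wikipedia article.
print('<script type="application/ld+json">')
print(json.dumps(person, indent=2))
print("</script>")
```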

The good news is that by leveraging the (rather groovy) entity-based markup provided by WordLift, in just 6 weeks I created a completely new entity in the Knowledge Graph and rebuilt the entire knowledge panel better than ever.

Google now uses my site as the reference for information about me (rather than Wikipedia).

And that means what appears in the knowledge panel is now (semi) controlled by me and no longer affected by anonymous Wikipedia editors who know nothing about me or about what information is important about me.

Brilliant!

Nobody likes to see their organic traffic and rankings drop.

When a drop happens, though, you can usually figure out the cause.

But the scariest moment for me was when a client faced a traffic and rankings drop with no apparent cause.

Overnight, this client lost half their organic traffic.

The terms they had ranked highly for were simply gone.

There was no algorithm update, no changes to the website, no alterations to the content; there wasn't even a surge in server errors (or any error) in any tool we looked at.

Competitors hadn't changed anything either.

There was no growth in external backlinks.

Search Console wasn't reporting a manual action.

The content was highly authoritative within this clients industry and the company had (and still has) a strong brand reputation.

Mysteriously, overnight, this company's organic traffic was simply gone.

Any traffic drop is scary enough, but what made this a true nightmare scenario was that we couldn't find any cause no matter where we looked.

For some unknown reason, Google decided to kick this site out of the index.

Without a cause, there wasnt a clear place to begin recovering the traffic.

Do we start by fixing content?

Keep looking for a technical problem?

Maybe something happened with links?

Like any good mystery, the solution is only to be found via careful investigation.

So, we pushed through the nightmare and kept digging.

As we dug in, we started to find some hidden and underlying problems that had been lurking on this site for years.

The phrase "legacy code" has always worried me, but this project made me realize that legacy code is one of the scarier parts of any website.

Given how scary legacy code can be, we maybe ought to rename it.

Maybe "zombie code" would be more fitting?

Thankfully, this story ended well.

After months of digging, we figured out that Google's bots had stumbled across one of the nastier legacy areas of the site and deranked the website given what they had found there.

That one bad section of the site had caused Google to reevaluate the website in a negative light.

Here is the original post:
7 Marketing Disasters That Turned Out to Be Precious Lessons - Search Engine Journal

How Wikipedia will fight Election Day misinformation – nation.lk – The Nation Newspaper

From CNN Business' Kaya Yurieff

Staffers at Wikipedia's parent organization and the volunteer editors who maintain its millions of pages have a plan to ensure that election-related entries aren't improperly edited.

Last week, the Wikipedia community placed extended protections on the 2020 United States presidential election page, which means only experienced volunteers with at least 500 edits and 30 days on the platform can make changes. Other pages related to the election and presidential candidates already have protections, like the articles for Hunter Biden, the son of Democratic presidential nominee Joe Biden; Jared Kushner, President Donald Trump's son-in-law; and the pages for both the Trump and Biden campaigns.

Generally, anyone can go into an article and make a change; however, there are varying levels of protection for what Wikipedia calls contested pages, which range from political topics to more obscure subjects over which editors disagree.

There are over 70 English-language articles about the 2020 election, according to the Wikimedia Foundation, Wikipedia's parent. It said more articles may be protected as Election Day nears.

Editors will be monitoring a list of relevant articles on Election Day and beyond. If someone makes an edit to those pages, over 500 people will get an email alerting them that there could be something worth checking.

Wired previously reported that editors have been actively discussing what measures they are considering for election night on a public page.

Since late August, some Wikimedia staff have been running through different scenarios of what could happen on its site during the election, such as how it would handle malicious content or a coordinated attack by multiple accounts making edits across several Wikipedia pages on Election Day.

"We are under no illusions that we will prevent every bad edit from making it onto the site," said Wikimedia chief of staff Ryan Merkley, who leads its new internal US election task force. "We think our responsibility is to make sure that we are as prepared to respond and that we can do it as swiftly as possible and ideally prevent its spread broadly."

From CNN Business' Kaya Yurieff

Ahead of Election Day, Instagram has moved to temporarily restrict a popular way to browse posts.

Instagram announced that it will temporarily hide the "Recent" tab from showing up on all hashtag pages whether they're related to politics or not. The company said it hopes the move will help prevent the spread of misinformation and harmful content related to the election.

Hashtag pages will still work, they'll just only show "Top Posts" as determined by the platform's algorithms. This may include some recent posts.

An Instagram spokesperson said the change was rolled out Thursday evening, and there is no specific timeline for when the action will end.

Other social platforms have also implemented similar temporary changes ahead of Election Day. For example, Twitter is encouraging users to quote tweet rather than to retweet, hoping people will add context or a reaction before spreading information.

From CNN Business' Donie O'Sullivan and Marshall Cohen

Twitter labeled a video from the Russian-state controlled broadcaster RT as election misinformation on Thursday.

RT is registered with the US Justice Department as an agent of the Russian government. It is the first time Twitter has taken action against RT for US election misinformation in this way, Twitter confirmed to CNN.

The four-minute video posted by RT was titled "Questions mount amid voter fraud, rigging claims ahead of #USelection."

Twitter deactivated the retweet feature on the video, to reduce how much it can be shared, and slapped a label over it that read, "Some or all of the content shared in this Tweet is disputed and might be misleading about how to participate in an election or another civic process."

The Kremlin uses RT to spread English-language propaganda to American audiences, and RT was part of Russia's election meddling in 2016, according to US intelligence agencies.

A report released by the US intelligence community in 2017 said RT has historically "portrayed" the US electoral process as undemocratic and amplified false narratives claiming that "US election results cannot be trusted."

The four-minute video that RT posted Thursday touches on many of these themes. It raises concerns about fraud and echoes many of the lies President Donald Trump has spread about mail-in voting. The segment cites Fox News, which has championed many of Trump's attacks against the electoral process. It highlights isolated incidents of ballot mishaps, many of which have already been deemed by local authorities to be accidents and errors, not fraud.

Earlier this year, an internal intelligence bulletin issued by the Department of Homeland Security said Russia was amplifying disinformation about mail-in voting as part of a broader effort "to undermine public trust in the electoral process."

CNN Business' Gabe Ramirez

Facebook has hired a network of fact-checkers across America. CNN talks to two who have received threats for simply doing their jobs during the 2020 election cycle.

CNN Business' Kaya Yurieff

Ted Cruz yelled. His Democratic colleague Brian Schatz called the hearing in which he was speaking "a sham." Committee chair Roger Wicker couldn't pronounce the last name of Google's CEO. Just another day on Capitol Hill for Big Tech.

In a contentious hearing on Wednesday, the CEOs of Facebook (FB), Google (GOOG) and Twitter (TWTR) were questioned by Senators on the Commerce Committee over their content moderation policies. Some demanded more transparency while others sought explanations on a few specific cases in which content was removed or labeled by platforms. Though the hearing was meant to focus on a crucial law, known as Section 230, that protects the companies' ability to moderate content as they see fit, Senators strayed from the brief and confronted the executives on other topics, including antitrust and election interference.

Schatz and other senators slammed the timing of the hearing, which comes less than a week before the US election. "This is bullying and it is for electoral purposes," Schatz said. "Do not let the United States Senate bully you into carrying water for those who want to spread misinformation."

Cruz angrily went after Twitter CEO Jack Dorsey, pressing him on the platform's decision to restrict content posted by the New York Post. He concluded by shouting at Dorsey: "Mr. Dorsey, who the hell elected you and put you in charge of what the media are allowed to report and what the American people are allowed to hear, and why do you persist in behaving as a Democratic super PAC silencing views to the contrary of your political beliefs?"

From CNN Business' Brian Fung

TikTok said Wednesday it will reduce the distribution of claims of election victory before official results are confirmed by authoritative sources.

Eric Han, TikTok's US head of safety, announced that premature claims of victory surrounding the 2020 election will be restricted if the Associated Press has not declared a result. Han also said the company is working with third-party fact-checkers who are "on expedited call during this sensitive time."

"Out of an abundance of caution, if claims can't be verified or fact-checking is inconclusive, we'll limitdistribution of the content, Han addedin a blog post."We'll also add a banner pointing viewers to our election guide on contentwith unverifiable claims about voting, premature declarations of victory, or attempts to dissuade peoplefrom voting by exploiting COVID-19 as a voter suppression tactic."

The policy is similar to ones previously announced by other social media companies like Facebook and Twitter.

From CNN Business' Rishi Iyengar

One of Facebook's top executives in India, where it has more users than anywhere else, has resigned months after being linked to allegations of political bias and hate speech against the platform.

Ankhi Das, Facebook's head of public policy in India, allowed a politician from the country's ruling party to remain on its platform even though his anti-Muslim posts flouted its rules against hate speech, current and former Facebook employees told the Wall Street Journal in August. Das reportedly opposed banning the politician (which Facebook ultimately did weeks later) because doing so would hurt Facebook's business in the country.

"Ankhi has decided to step down from her role in Facebook to pursue her interest in public service," Ajit Mohan, the company's vice president and managing director in India, said in a statement. "We are grateful for her service and wish her the very best for the future."

Facebook has long faced controversies over harmful misinformation and hate speech in India, whose 600 million-plus internet users are increasingly important to its business as it's locked out of China and looks for future growth.

The Indian government has repeatedly called on Facebook to do more to curb misinformation, particularly on its mobile messaging platform WhatsApp, after viral hoaxes in 2018 were linked to more than a dozen lynchings.

WhatsApp misinformation may be finding its way into the upcoming US presidential election, with Reuters reporting that misleading messages about Democratic candidate Joe Biden have been making the rounds on the private messaging service, particularly within the Indian-American community.

WhatsApp counts India as its largest market, with around 400 million users.

From CNN Business' Kaya Yurieff

A misleading video clip of Democratic presidential candidate Joe Biden has been spreading on social media without any warning labels since Saturday after having been promoted by members of President Donald Trump's inner circle.

In the 24-second clip from an interview with the podcast Pod Save America, which is hosted by four former members of the Obama administration, Biden is heard saying, in part, that "we have put together, I think, the most extensive and inclusive voter fraud organization in the history of American politics."

The clip was first posted by RNC Research, a Twitter account operated by the Republican National Committee.

It's clear in context that Biden is talking about an effort to combat voter suppression and provide resources for those seeking to vote, not an organized effort to perpetrate voter fraud.

The clip is part of a longer response by the former Vice President to a two-part question from host Dan Pfeiffer about Biden's message to people who haven't voted and those who already have.

In his response, Biden encouraged people to "make a plan exactly how you're going to vote, where you're going to vote, when you're going to vote. Because it can get complicated, because the Republicans are doing everything they can to make it harder for people to vote, particularly people of color, to vote..."

He continued a few sentences later: "We have put together, I think, the most extensive and inclusive voter fraud organization in the history of American politics. What the President is trying to do is discourage people from voting by implying that their vote won't be counted, it can't be counted, we're going to challenge it..."

Biden goes on to explain that his campaign has arranged for legal assistance for people who feel their right to vote has been challenged.

On Saturday evening, White House Press Secretary Kayleigh McEnany posted the shortened clip from her personal Twitter account saying: "BIDEN ADMITS TO VOTER FRAUD!"

Fact-checking website Snopes debunked the claim as false.

McEnany's post has been retweeted more than 32,000 times. The clip has been viewed 7.9 million times on Twitter. Eric Trump also posted the video on both Twitter and Facebook without any false commentary.

President Trump's verified YouTube account also posted the clip, with the title: "Joe Biden brags about having 'the most extensive and inclusive VOTER FRAUD organization' in history." It's been viewed nearly 500,000 times.

A Twitter spokesperson said it will not label the tweets by McEnany or Eric Trump. The social network did not provide further detail.

Facebook did not immediately respond to requests for comment, but the platform has not added any information labels or fact-checking resources to the clip.

According to Twitter's rules, users may not "deceptively share synthetic or manipulated media that are likely to cause harm." However, it's unclear if a clip taken out of context, but not technologically manipulated, would fall into this category.

Facebook's manipulated media policy states users should not post video that has been "edited or synthesized ... in ways that are not apparent to an average person, and would likely mislead an average person to believe that a subject of the video said words that they did not say."

A YouTube spokesperson said the video does not violate its rules.

"While the video shared with us by CNN does not violate our Community Guidelines, we have robust policies prohibiting deceptive practices such as technically manipulating content in a way that misleads users (beyond clips taken out of context) and may pose egregious harm," said Ivy Choi, a YouTube spokesperson.

The Trump campaign did not respond to a request for comment. Biden's national press secretary TJ Ducklo said: "The President of the United States has already demonstrated he's willing to lie and manipulate our country's democratic process to help himself politically, which is why we have assembled the most robust and sophisticated team in presidential campaign history to confront voter suppression and fight voter fraud however it may present itself."

When asked if the RNC stood by the clipped video, and if it's the official position of the RNC that Biden was endorsing and explicitly encouraging voter fraud, RNC Rapid Response Director Steve Guest said: "You should ask Joe Biden if he stands by the words he uttered, not us for sharing them. It's not the RNC's responsibility to clarify for the Biden campaign their candidate's repeated blunders."

From CNN Business' Kaya Yurieff and Rishi Iyengar

Facebook announced its latest crackdown on foreign actors trying to interfere in the US election.

On Tuesday, Facebook said it took down three different networks, two of which targeted the United States. The other originated in and targeted Myanmar.

Facebook said it identified and removed these networks before they were able to build up a substantial audience. The people within each network worked with each other and used fake accounts to mislead people about who they were, according to a blog post announcing the takedowns from Nathaniel Gleicher, Facebook's head of security policy.

The company announced it took action against three distinct networks. The first network consisted of two Facebook pages and 22 Instagram accounts and was taken down for violating Facebook's policy against foreign interference, which covers accounts from outside a particular country trying to influence behavior in that country. These accounts and pages were run by individuals from Mexico and Venezuela and targeted the United States. They posted in Spanish and English about US news and current events, including "memes and other content about humor, race relations and racial injustice, feminism and gender relations, environmental issues and religion," Gleicher said.

Facebook also removed 12 Facebook accounts, 6 pages and 11 Instagram accounts for government interference, with the company saying it found links to individuals associated with the Iranian government. That network started in Iran and primarily targeted the United States and Israel.

Facebook said the accounts focused on Saudi Arabia's activities in the Middle East and also spread claims about an alleged massacre.

A third network, comprising 9 Facebook accounts, 8 pages, two groups and two Instagram accounts, originated in Myanmar and targeted local audiences there. Its posts, mainly in Burmese, included criticism of Myanmar's armed forces, Facebook said.

The removals are part of Facebook's effort to crack down on "coordinated inauthentic behavior" and widespread misinformation, particularly leading up to the US presidential election.

Last month, the company said it had identified and shut down a network of fake accounts that included fictitious personas it said were tied to Russian military intelligence.

While those accounts had not been primarily targeting the United States, Facebook pointed to concerns that similar accounts could be used in Russian influence operations as the November 3 election draws closer.

"In recent weeks, government agencies, technology platforms and security experts have alerted the public to expect attempts to spread false information about the integrity of the election," Gleicher said in his blog post on Tuesday. "Were closely monitoring for potential scenarios where malicious actors around the world may use fictitious claims, including about compromised election infrastructure or inaccurate election outcomes, to suppress voter turnout or erode trust in the poll results."

See more here:
How Wikipedia will fight Election Day misinformation - nation.lk - The Nation Newspaper