Archive for the ‘Wikipedia’ Category

7 Marketing Disasters That Turned Out to Be Precious Lessons – Search Engine Journal

Some people love horror stories and others don't, but probably at some point all of us have experienced a terrifying twist in life.

This inspired the SE Ranking team to reach out to SEO and marketing experts from all over the world and ask them to share some nightmare-of-a-case stories they had in their careers.

Our assumption was that since SEO is so fickle and marketing success depends on so many factors, our colleagues would surely have some enticing stories to share.

Luckily, they didn't mind dragging their skeletons out of the closet.

This article is all about mistakes that led to devastating consequences.

From lack of caution or vision to unexpected obstacles and bad luck, the reasons behind their marketing failures vary.

But fortunately, most of the stories had a happy ending.

And they all once again prove that the road to success is paved with failure.

They teach us that every mistake we make will ultimately steer us in the right direction as long as we learn the lesson and retain the necessary vigor to keep going.

A warning had been on the Wikipedia article for two years.

Then early July 2020, a new warning appeared.

Two weeks later, the Wikipedia article about me had been deleted by the administrators.

Within a week, my entity had disappeared from Google's Knowledge Graph and the knowledge panel on my personal Brand SERP was gone.

A major nightmare for someone who calls himself The Brand SERP Guy.

Worse, a week later, the Wikipedia article about my folk-punk band from the 90s was gone.

Two days later, the article about my TV series from the 90s was gone, too.

Seems someone at Wikipedia had it in for me.

In fact, truth be told, it was my own fault.

In the interests of experimenting to see how much I could feed Google's Knowledge Graph and control the knowledge panel on my personal Brand SERP, and those of my music group and TV series, I had (over) edited all three Wikipedia articles.

Which is against the rules.

So what happened?

Read on, because this horror story actually has a redemptive ending.

I rebuilt it all, took control of the entities, learned a lot about knowledge panels, and got some amazing insights into how the Knowledge Graph functions.

I panicked when the Wikipedia page was deleted and moved the structured data about me on my site from the home page to a dedicated About Page.

That turned out to be a mistake.

As described, the knowledge panel disappeared and the entry in the Knowledge Graph got deleted.

Once again, my own fault.

This is the folk-punk group I mentioned earlier.

There had previously been a mix-up of information in the knowledge panel due to the ambiguity of the name, but last year I had sorted it out using:

The deletion of the Wikipedia article brought back the mix-ups.

However, because of all the work I had done and the schema markup I had added, Google now saw my site as the main authority about the band.

That means I could now change things quite easily.

Including the description in the knowledge panel (updates take 10 minutes).

I had control.

These are the cartoon characters and TV series I mentioned earlier.

Following the deletion of the Wikipedia article about the TV series, the Knowledge Graph entity remained in place, and the information in the knowledge panel remained as-was, except the description, which disappeared.

Three weeks later that was back, but this time from my site (it has since switched to the official site).

Once again, my site and the schema markup I provide were Google's fallback, the second-best source of information about the entity.

Once again, the deletion of the Wikipedia page gave me control.

Every entity needs a home.

Preferably on your site.

For all three entities, my site was the home: the source of information Google uses as its point of reference in the absence of a Wikipedia article.

It appears that, when a substantial source of information about an entity, such as its Wikipedia article, disappears, your site is the fallback crutch Google uses to reassure itself that the Knowledge Graph is correct.

The schema markup on your site that describes you and related entities is vitally important to Google in its understanding of those entities and its confidence in its understanding of those entities.
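Entity markup of this kind is typically expressed as schema.org JSON-LD embedded in the page. As a minimal sketch of the pattern described above, not the author's actual markup or WordLift's output, the example below builds a Person entity; every name and URL in it is an illustrative placeholder.

```python
import json

# A minimal schema.org Person entity of the kind an "About" page might embed
# in a <script type="application/ld+json"> tag. All names and URLs below are
# placeholders, not taken from any real site.
person = {
    "@context": "https://schema.org",
    "@type": "Person",
    "@id": "https://example.com/about#me",       # stable identifier for the entity
    "name": "Jane Doe",
    "url": "https://example.com/",
    "description": "Consultant specializing in Brand SERPs and knowledge panels.",
    # sameAs links help Google reconcile this entity with other profiles
    "sameAs": [
        "https://twitter.com/janedoe",
        "https://www.linkedin.com/in/janedoe",
    ],
}

markup = json.dumps(person, indent=2)
print(markup)
```

Keeping this block on one dedicated, stable page (the entity's "home") is the approach the article describes: it gives Google a single authoritative point of reference when no Wikipedia article exists.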

The good news is that by leveraging the (rather groovy) entity-based markup provided by WordLift, in just 6 weeks I created a completely new entity in the Knowledge Graph and rebuilt the entire knowledge panel better than ever.

Google now uses my site as the reference for information about me (rather than Wikipedia).

And that means what appears in the knowledge panel is now (semi) controlled by me and no longer affected by anonymous Wikipedia editors who know nothing about me or about which information about me is important.

Brilliant!

Nobody likes to see their organic traffic and rankings drop.

When a drop happens, though, you can usually figure out the cause.

But the scariest moment for me was when a client faced a traffic and rankings drop with no apparent cause.

Overnight, this client lost half their organic traffic.

The terms they had ranked highly for were simply gone.

There was no algorithm update, no changes to the website, no alterations to the content; there wasn't even a surge in server errors (or any error) in any tool we looked at.

Competitors hadn't changed anything either.

There was no growth in external backlinks.

Search Console wasn't reporting a manual action.

The content was highly authoritative within this client's industry and the company had (and still has) a strong brand reputation.

Mysteriously, overnight, this company's organic traffic was simply gone.

Any traffic drop is scary enough, but what made this a true nightmare scenario was that we couldn't find any cause, no matter where we looked.

For some unknown reason, Google decided to kick this site out of the index.

Without a cause, there wasnt a clear place to begin recovering the traffic.

Do we start by fixing content?

Keep looking for a technical problem?

Maybe something happened with links?

Like any good mystery, the solution is only to be found via careful investigation.

So, we pushed through the nightmare and kept digging.

As we dug in, we started to find some hidden and underlying problems that had been lurking on this site for years.

The phrase "legacy code" has always worried me, but this project made me realize that legacy code is one of the scarier parts of any website.

Given how scary legacy code can be, we maybe ought to rename it.

Maybe zombie code would be more fitting?

Thankfully, this story ended well.

After months of digging, we figured out that Google's bots had stumbled across one of the nastier legacy areas of the site and deranked the website given what they had found there.

That one bad section of the site had caused Google to reevaluate the website in a negative light.

Here is the original post:
7 Marketing Disasters That Turned Out to Be Precious Lessons - Search Engine Journal

How Wikipedia will fight Election Day misinformation – nation.lk – The Nation Newspaper

From CNN Business' Kaya Yurieff

Staffers at Wikipedia's parent organization and the volunteer editors who maintain its millions of pages have a plan to ensure that election-related entries aren't improperly edited.

Last week, the Wikipedia community placed extended protections on the 2020 United States presidential election page, which means only experienced volunteers with at least 500 edits and 30 days on the platform can make changes. Other pages related to the election and presidential candidates already have protections, like the articles for Hunter Biden, the son of Democratic presidential nominee Joe Biden, Jared Kushner, President Donald Trump's son-in-law, and the pages for both the Trump and Biden campaigns.
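The eligibility threshold described above can be sketched as a simple predicate. The function name and structure below are mine, assuming only the two criteria the article states: at least 500 edits and at least 30 days on the platform.

```python
# Sketch of the "extended confirmed" eligibility rule described above:
# an editor may change an extended-protected page only with at least
# 500 edits AND at least 30 days of account age. Both conditions must hold.
def can_edit_extended_protected(edit_count: int, account_age_days: int) -> bool:
    return edit_count >= 500 and account_age_days >= 30

# A prolific but brand-new account still does not qualify:
print(can_edit_extended_protected(600, 10))   # False
print(can_edit_extended_protected(600, 45))   # True
```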

Generally, anyone can go into an article and make a change; however, there are varying levels of protections for what Wikipedia calls contested pages, which range from political topics to more obscure subjects over which editors disagree.

There are over 70 English-language articles about the 2020 election, according to the Wikimedia Foundation, Wikipedia's parent. It said more articles may be protected as Election Day nears.

Editors will be monitoring a list of relevant articles on Election Day and beyond. If someone makes an edit to those pages, over 500 people will get an email alerting them that there could be something worth checking.

Wired previously reported that editors have been actively discussing what measures they are considering for election night on a public page.

Since late August, some Wikimedia staff have been running through different scenarios of what could happen on its site during the election, such as how it would handle malicious content or a coordinated attack by multiple accounts making edits across several Wikipedia pages on Election Day.

"We are under no illusions that we will prevent every bad edit from making it onto the site," said Wikimedia chief of staff Ryan Merkley, who leads its new internal US election task force. "We think our responsibility is to make sure that we are as prepared to respond and that we can do it as swiftly as possible and ideally prevent its spread broadly."

From CNN Business' Kaya Yurieff

Ahead of Election Day, Instagram has moved to temporarily restrict a popular way to browse posts.

Instagram announced that it will temporarily hide the "Recent" tab from showing up on all hashtag pages whether they're related to politics or not. The company said it hopes the move will help prevent the spread of misinformation and harmful content related to the election.

Hashtag pages will still work, they'll just only show "Top Posts" as determined by the platform's algorithms. This may include some recent posts.

An Instagram spokesperson said the change was rolled out Thursday evening, and there is no specific timeline for when the action will end.

Other social platforms have also implemented similar temporary changes ahead of Election Day. For example, Twitter is encouraging users to quote tweet rather than to retweet, hoping people will add context or a reaction before spreading information.

From CNN Business' Donie O'Sullivan and Marshall Cohen

Twitter labeled a video from the Russian-state controlled broadcaster RT as election misinformation on Thursday.

RT is registered with the US Justice Department as an agent of the Russian government. It is the first time Twitter has taken action against RT for US election misinformation in this way, Twitter confirmed to CNN.

The four-minute video posted by RT was titled "Questions mount amid voter fraud, rigging claims ahead of #USelection."

Twitter deactivated the retweet feature on the video, to reduce how much it can be shared, and slapped a label over it that read, "Some or all of the content shared in this Tweet is disputed and might be misleading about how to participate in an election or another civic process."

The Kremlin uses RT to spread English-language propaganda to American audiences, and the network was part of Russia's election meddling in 2016, according to US intelligence agencies.

A report released by the US intelligence community in 2017 said RT has historically "portrayed the US electoral process as undemocratic" and amplifies false narratives claiming that "US election results cannot be trusted."

The four-minute video that RT posted Thursday touches on many of these themes. It raises concerns about fraud and echoes many of the lies President Donald Trump has spread about mail-in voting. The segment cites Fox News, which has championed many of Trump's attacks against the electoral process. It highlights isolated incidents of ballot mishaps, many of which have already been deemed by local authorities to be accidents and errors and not fraud.

Earlier this year, an internal intelligence bulletin issued by the Department of Homeland Security said Russia was amplifying disinformation about mail-in voting as part of a broader effort "to undermine public trust in the electoral process."

CNN Business' Gabe Ramirez

Facebook has hired a network of fact-checkers across America. CNN talks to two who have received threats for simply doing their jobs during the 2020 election cycle.

CNN Business' Kaya Yurieff

Ted Cruz yelled. His Democratic colleague Brian Schatz called the hearing in which he was speaking "a sham." Committee chair Roger Wicker couldn't pronounce the last name of Google's CEO. Just another day on Capitol Hill for Big Tech.

In a contentious hearing on Wednesday, the CEOs of Facebook (FB), Google (GOOG) and Twitter (TWTR) were questioned by Senators on the Commerce Committee over their content moderation policies. Some demanded more transparency while others sought explanations on a few specific cases in which content was removed or labeled by platforms. Though the hearing was meant to focus on a crucial law, known as Section 230, that protects the companies' ability to moderate content as they see fit, Senators strayed from the brief and confronted the executives on other topics, including antitrust and election interference.

Schatz and other senators slammed the timing of the hearing, which comes less than a week before the US election. "This is bullying and it is for electoral purposes," Schatz said. "Do not let the United States Senate bully you into carrying water for those who want to spread misinformation."

Cruz angrily went after Twitter CEO Jack Dorsey, pressing him on the platform's decision to restrict content posted by the New York Post. He concluded by shouting at Dorsey: "Mr. Dorsey, who the hell elected you and put you in charge of what the media are allowed to report and what the American people are allowed to hear, and why do you persist in behaving as a Democratic super PAC silencing views to the contrary of your political beliefs?"

CNN Business will cover the hearing live. You can read our updating coverage here.

From CNN Business' Brian Fung

TikTok said Wednesday it will reduce the distribution of claims of election victory before official results are confirmed by authoritative sources.

Eric Han, TikTok's US head of safety, announced that premature claims of victory surrounding the 2020 election will be restricted if the Associated Press has not declared a result. Han also said the company is working with third-party fact-checkers who are "on expedited call during this sensitive time."

"Out of an abundance of caution, if claims can't be verified or fact-checking is inconclusive, we'll limit distribution of the content," Han added in a blog post. "We'll also add a banner pointing viewers to our election guide on content with unverifiable claims about voting, premature declarations of victory, or attempts to dissuade people from voting by exploiting COVID-19 as a voter suppression tactic."

The policy is similar to ones previously announced by other social media companies like Facebook and Twitter.

From CNN Business' Rishi Iyengar

One of Facebook's top executives in India, where it has more users than anywhere else, has resigned months after being linked to allegations of political bias and hate speech against the platform.

Ankhi Das, Facebook's head of public policy in India, allowed a politician from the country's ruling party to remain on its platform even though his anti-Muslim posts flouted its rules against hate speech, current and former Facebook employees told the Wall Street Journal in August. Das reportedly opposed banning the politician (which Facebook ultimately did weeks later) because doing so would hurt Facebook's business in the country.

"Ankhi has decided to step down from her role in Facebook to pursue her interest in public service," Ajit Mohan, the company's vice president and managing director in India, said in a statement. "We are grateful for her service and wish her the very best for the future."

Facebook has long faced controversies over harmful misinformation and hate speech in India, whose 600 million-plus internet users are increasingly important to its business as it's locked out of China and looks for future growth.

The Indian government has repeatedly called on Facebook to do more to curb misinformation, particularly on its mobile messaging platform WhatsApp, after viral hoaxes in 2018 were linked to more than a dozen lynchings.

WhatsApp misinformation may be finding its way into the upcoming US presidential election, with Reuters reporting that misleading messages about Democratic candidate Joe Biden have been making the rounds on the private messaging service, particularly within the Indian-American community.

WhatsApp counts India as its largest market, with around 400 million users.

From CNN Business' Kaya Yurieff

A misleading video clip of Democratic presidential candidate Joe Biden has been spreading on social media without any warning labels since Saturday after having been promoted by members of President Donald Trump's inner circle.

In the 24-second clip from an interview with the podcast Pod Save America, which is hosted by four former members of the Obama administration, Biden is heard saying, in part, that "we have put together, I think, the most extensive and inclusive voter fraud organization in the history of American politics."

The clip was first posted by RNC Research, a Twitter account operated by the Republican National Committee.

It's clear in context that Biden is talking about an effort to combat voter suppression and provide resources for those seeking to vote, not an organized effort to perpetrate voter fraud.

The clip is part of a longer response by the former Vice President to a two-part question from host Dan Pfeiffer about Biden's message to people who haven't voted and those who already have.

In his response, Biden encouraged people to "make a plan exactly how you're going to vote, where you're going to vote, when you're going to vote. Because it can get complicated, because the Republicans are doing everything they can to make it harder for people to vote, particularly people of color, to vote..."

He continued a few sentences later: "We have put together, I think, the most extensive and inclusive voter fraud organization in the history of American politics. What the President is trying to do is discourage people from voting by implying that their vote won't be counted, it can't be counted, we're going to challenge it..."

Biden goes on to explain that his campaign has arranged for legal assistance for people who feel their right to vote has been challenged.

On Saturday evening, White House Press Secretary Kayleigh McEnany posted the shortened clip from her personal Twitter account saying: "BIDEN ADMITS TO VOTER FRAUD!"

Fact-checking website Snopes debunked the claim as false.

McEnany's post has been retweeted more than 32,000 times. The clip has been viewed 7.9 million times on Twitter. Eric Trump also posted the video on both Twitter and Facebook without any false commentary.

President Trump's verified YouTube account also posted the clip, with the title "Joe Biden brags about having 'the most extensive and inclusive VOTER FRAUD organization' in history." It's been viewed nearly 500,000 times.

A Twitter spokesperson said it will not label the tweets by McEnany or Eric Trump. The social network did not provide further detail.

Facebook did not immediately respond to requests for comment, but the platform has not added any information labels or fact-checking resources to the clip.

According to Twitter's rules, users may not "deceptively share synthetic or manipulated media that are likely to cause harm." However, it's unclear if a clip taken out of context, but not technologically manipulated, would fall into this category.

Facebook's manipulated media policy states users should not post video that has been "edited or synthesized ... in ways that are not apparent to an average person, and would likely mislead an average person to believe that a subject of the video said words that they did not say."

A YouTube spokesperson said the video does not violate its rules.

"While the video shared with us by CNN does not violate our Community Guidelines, we have robust policies prohibiting deceptive practices such as technically manipulating content in a way that misleads users (beyond clips taken out of context) and may pose egregious harm," said Ivy Choi, a YouTube spokesperson.

The Trump campaign did not respond to a request for comment. Biden's national press secretary TJ Ducklo said: "The President of the United States has already demonstrated he's willing to lie and manipulate our country's democratic process to help himself politically, which is why we have assembled the most robust and sophisticated team in presidential campaign history to confront voter suppression and fight voter fraud however it may present itself."

When asked if the RNC stood by the clipped video, and if it's the official position of the RNC that Biden was endorsing and explicitly encouraging voter fraud, RNC Rapid Response Director Steve Guest said: "You should ask Joe Biden if he stands by the words he uttered, not us for sharing them. It's not the RNC's responsibility to clarify for the Biden campaign their candidate's repeated blunders."

From CNN Business' Kaya Yurieff and Rishi Iyengar

Facebook announced its latest crackdown on foreign actors trying to interfere in the US election.

On Tuesday, Facebook said it took down three different networks, two of which targeted the United States. The other originated in and targeted Myanmar.

Facebook said it identified and removed these networks before they were able to build up a substantial audience. The people within each network worked with each other and used fake accounts to mislead people about who they were, according to a blog post announcing the takedowns from Nathaniel Gleicher, Facebook's head of security policy.

The company announced it took action against three distinct networks. The first network consisted of two Facebook pages and 22 Instagram accounts and was taken down for violating Facebook's policy against foreign interference, which covers accounts from outside a particular country trying to influence behavior in that country. These accounts and pages were run by individuals from Mexico and Venezuela and targeted the United States. They posted in Spanish and English about US news and current events, including "memes and other content about humor, race relations and racial injustice, feminism and gender relations, environmental issues and religion," Gleicher said.

Facebook also removed 12 Facebook accounts, 6 pages and 11 Instagram accounts for government interference, with the company saying it found links to individuals associated with the Iranian government. That network started in Iran and primarily targeted the United States and Israel.

Facebook said the accounts focused on Saudi Arabia's activities in the Middle East and also spread claims about an alleged massacre.

A third network, comprising 9 Facebook accounts, 8 pages, two groups and two Instagram accounts, originated in Myanmar and targeted local audiences there. Its posts, mainly in Burmese, included criticism of Myanmar's armed forces, Facebook said.

The removals are part of Facebook's effort to crack down on "coordinated inauthentic behavior" and widespread misinformation, particularly leading up to the US presidential election.

Last month, the company said it had identified and shut down a network of fake accounts that included fictitious personas it said were tied to Russian military intelligence.

While those accounts had not been primarily targeting the United States, Facebook pointed to concerns that similar accounts could be used in Russian influence operations as the November 3 election draws closer.

"In recent weeks, government agencies, technology platforms and security experts have alerted the public to expect attempts to spread false information about the integrity of the election," Gleicher said in his blog post on Tuesday. "We're closely monitoring for potential scenarios where malicious actors around the world may use fictitious claims, including about compromised election infrastructure or inaccurate election outcomes, to suppress voter turnout or erode trust in the poll results."

See more here:
How Wikipedia will fight Election Day misinformation - nation.lk - The Nation Newspaper

Wikipedia locks Elon Musk’s page after he begs Twitter users to trash him on site – The Tribune India

San Francisco, August 17

Multi-billionaire tech mogul Elon Musk's Wikipedia page has been locked for editing after the Tesla CEO encouraged people to "trash" him on the site.

"History is written by the victors except on Wikipedia," Musk tweeted on Sunday.

In a follow-up tweet, he told his followers to "trash" him on the site, which typically allows anyone to edit its pages.

"Please trash me on Wikipedia, I'm begging you," Musk said.

The Tesla and SpaceX CEO responded to many of his followers who took up the challenge and shared their edits.

"All major wars, diseases and financial disasters of the last century can be directly attributed to Mr. Musk or one of his companies," one user edited the page to say, before sending the update straight to Musk, who signalled his approval.

Musk also approved of a suggestion to edit his page to describe him as a "business magnet" instead of a "business magnate." Due to the overwhelming traffic, Musk's Wikipedia page was quickly locked, limiting changes to only editors approved by the site itself.

This is not the first time Musk has mentioned Wikipedia on Twitter. Earlier, he expressed disappointment with the site and its accuracy, especially regarding him.

"Just looked at my wiki for 1st time in years. It's insane! Btw, can someone please delete 'investor.' I do basically zero investing," Musk tweeted last year.

IANS

Original post:
Wikipedia locks Elon Musk's page after he begs Twitter users to trash him on site - The Tribune India

Whiny baby edits Jalen Hurd’s Wikipedia after 49ers receiver tears his ACL – SFGate

San Francisco 49ers wide receiver Jalen Hurd (17) against the Dallas Cowboys in an NFL preseason football game on Aug. 10, 2019. Hurd tore his ACL during a 49ers practice over the weekend. (AP Photo/John Hefti)

San Francisco 49ers wide receiver Jalen Hurd tore his ACL during Sunday's practice and will miss the 2020 NFL season, his agent Doug Hendrickson told KNBR on Monday evening.

"It was a fluke thing, his knee just kind of gave out on an on-the-side route [running] drill," Hendrickson said.

Hurd, a third-round draft pick in 2019 from the University of Tennessee and Baylor, suffered a stress fracture in his lower back last preseason that kept him from playing in 2019 as well. He had rehabbed his way back into shape, and was expected to play an important role in San Francisco's offense this year.

Hurd's Twitter and Instagram accounts are now deleted or deactivated, which some 49ers fans on social media have asserted was a response to the vitriol in Hurd's mentions after news of his injury began circulating over the weekend. We asked the 49ers if that was indeed the reason for Hurd's social media scrubbing, but they declined to comment.

The timeline does check out, though, and on Wikipedia, someone who's very angry at a pro athlete for suffering a second debilitating injury, this time in the middle of a pandemic when football matters even less than usual, changed Hurd's entry to read, "His bones and ligaments are possibly made of low quality glass or wet toilet paper."

Alex Shultz is the SFGate sports editor. Email: alex.shultz@sfgate.com | Twitter: @AlexShultz

Read the rest here:
Whiny baby edits Jalen Hurd's Wikipedia after 49ers receiver tears his ACL - SFGate

The Wikipedia War That Shows How Ugly This Election Will Be – The Atlantic

By Wednesday morning, the pro-Trump video blogging duo Diamond and Silk were claiming, "Kamala is not even Black. So you know when she talks about racism, that's why she don't care about it. She's not even Black. I don't know what she is." And Newsweek ran an op-ed arguing, using a fringe legal theory, that Harris may not be eligible to be vice president.

The battles over Harris's Wikipedia page played out primarily over the specific term "African American." That debate began in earnest last week, when Harris was only a much-discussed potential running mate.

"How exactly can you describe someone with south-asian ancestry as African American?" one anonymous user wrote. "Does this term now mean black, dark skinned, or simply non-white? Please, this is disrespectful to people from the Indian sub-continent who have their own very distinct identities."

"She identifies as African-American. End of story," replied a user named Fowler&fowler, who frequently edits India-related articles on Wikipedia. "Please don't rehearse tired old banalities about who is black and who is African-American. Please also don't imply (if you are doing so) for the hundredth time the other pieties about whether the Middle-Passage is a sine qua non for being one or another ... By any definition, she is more African-American (in the traditional meaning of the word) than Barak [sic] Obama is."

(Harris's Senate bio describes her as "the second African-American woman and first South Asian-American senator in history." When she was sworn in as California attorney general, her office described her as the first African American to hold the position.)

With Wikipedias article protection preventing them from making the edits themselves, a series of mostly anonymous users instead asked for Harris to be described as something other than African American:

Some information is wrong. She is not a black. Her mother is Indian, her father is Latino. She is not either black nor native American.

She is Asian American. India, where her mother immigrated from, is part of Asia. Jamaica, where her father immigrated from, is part of North America.

Harris is the second African American woman is wrong, she is Jamaican / Indian.

Kamala Harris is not African American as this states. Her mother is Indian and her father Jamaican.

Change first African American to Jamaican American. She is not African American because her father is from Jamaica and her mother is Indian. There is no Africa there.

She was born to a foreigner from India and a Foreigner from Jamaica. How does this combination make her an African American???????????????????

She is of Jamaican decent [sic] not African.

Wikipedia administrators and other senior editors have stood by the use of African American, noting the many years of news stories and official documents that have identified her as such. Wikipedia editing thrives on consensus, and amid all the debate, the consensus of editors was that a consensus had been reached. (The word consensus currently appears 49 times in the discussion.)

See the original post:
The Wikipedia War That Shows How Ugly This Election Will Be - The Atlantic