Facebook has a government-size censorship responsibility without the structure to handle it

With nearly 2 billion users, Facebook reaches nearly a quarter of the people on the planet. And while its broadcasting power can be used for promoting good causes and unleashing viral cat videos, it can also be used to distribute hateful and violent content. This has put Facebook in the uncomfortable position of making judgment calls about whether the millions of posts flagged by its users as objectionable each week should be allowed to stay, labeled as disturbing for other users, or removed completely. It's an unprecedented responsibility at this scale.

The range of issues is broad (from bullying and hate speech to terrorism and war crimes) and complex, Monika Bickert, Facebook's head of global policy management, recently wrote in an op-ed. To meet this challenge, she said, "our approach is to try to set policies that keep people safe and enable them to share freely."

Once Facebook sets these rules, it relies on 4,000 human content moderators to apply them to individual flagged posts.

The job isn't straightforward. According to a Guardian report based on thousands of pages of Facebook's content moderator training materials, "Someone shoot Trump" should be removed, but the phrase "Let's beat up fat kids" can be permitted. Digitally created art showing sexual activity should be removed, but handmade erotic art is fine. Videos showing abortions are also permitted, as long as they don't feature nudity.

Guidelines like these illustrate the complexity of content regulation, which, until social media came around, involved questions that, for the most part, only governments faced at scale. What constitutes dangerous speech? Should some people, such as the president, be treated differently when they make criticisms, threats, or hate speech (paywall)? When is it in the public interest to show obscenity or violence? Should nudity be permitted, and in what contexts?

Some of Facebook's answers to these difficult questions mimic content regulation laws created by democratic governments. According to the Guardian, for instance, Facebook tolerates some violent content "unless it gives us a reasonable ground to accept that there is no longer simply an expression of emotion but a transition to a plot or design." This is somewhat similar to how the US views violent content, which tends to be protected unless it incites immediate violence. (Many European countries, meanwhile, have laws that prohibit violent content or hate speech.)

But the process Facebook uses to create and apply these policies has little in common with democratic governments, which have long, often transparent processes for creating new laws, and courts that weigh each case with considerations that aren't available to Facebook moderators. Facebook could improve its content moderation policies, some suggest, by also borrowing some of these ideas, related to process rather than policy, from democratic governments.

"The multiplication of guidelines," says Agnès Callamard, the director of Global Freedom of Expression at Columbia University, "as well meaning and well written as they may be, cannot be the answer."

Time to a decision: Facebook relies on thousands of content moderators to make decisions about whether to remove, permit, or label specific content as disturbing based on its rules. To deal with the massive scale on Facebook, the company recently said it would hire 3,000 additional people to review posts. It has also invested in artificial intelligence that could reduce the amount of work for human moderators.

For now, according to one report, a typical Facebook content moderator makes a decision about a flagged piece of content about once every 10 seconds (a Facebook spokesperson declined to confirm or deny this number, saying she didn't have the data). "Context is so important," Facebook's Bickert told NPR last year. "It's critical when we are looking to determine whether or not something is hate speech, or a credible threat of violence," she said. "We look at how a specific person shared a specific post or word or photo to Facebook. So we're looking to see why did this particular share happen on Facebook? Why did this particular post happen?" Those questions take time to evaluate effectively.

That's one reason why, in most democratic countries, Callamard says, content regulation by media regulators and the courts involves decisions that take days or weeks.

Debate: Content moderators on Facebook don't hear arguments for why they should either permit or remove a piece of content. Users whose pages or accounts they remove do have an option to appeal the decision by submitting it for another review (Facebook recommends they remove the violating content first).

Government content regulators usually have more input from opposing sides. "[Decisions] will often involve a judicial process, including several parties arguing one side or the other [as well as] judges reviewing the various arguments and making a decision," Callamard says.

Open discussion of rules: Facebook publishes broad guidelines for what it allows and disallows on its site, but, to keep users from gaming the system, the specifics are only shared in internal documents like the hundreds of training manuals, spreadsheets, and flowcharts that leaked to the Guardian.

A Facebook spokesperson says the company consults experts and local organizations to inform its community standards, but the public doesn't know all of Facebook's content moderation rules, nor is it involved in creating them.

By contrast, Callamard says, in a democratic government "the laws upon which these decisions are made have been discussed and debated in Parliament by members of Parliament; by government ministers; and, where they exist, by regional inter-governmental bodies. These laws or decrees would have been the object of several readings, and in the best case scenarios, the general public (including those particularly concerned by the law, e.g. the media) would have been brought into a formal consultation process."

Fundamental context: Governments have different goals than Facebook. In a democratic society, fundamental guiding principles include freedom of expression, freedom of political debate, and protecting content related to the public interest. At an advertising business like Facebook, success involves attracting and retaining users, many of whom don't want to visit a website that shows them offensive or dangerous content. "This is a fundamental dimension of the way, in my opinion, Facebook always approaches content regulation," Callamard says. "It cannot go so far as to undermine or weaken a business model based upon, and driven by, data and more data (individuals' data)."
