Facebook Needs to Be More Transparent About Why It Censors Speech


The more Facebook tries to move beyond its original role as a social network for sharing family photos and other ephemera, the more it finds itself in an ethical minefield, torn between its desire to improve the world and its need to curb certain kinds of speech.

The tension between these two forces has never been more obvious than it is now, thanks to two recent examples of how those impulses can go wrong and the damage that can result. The first involves a Pulitzer Prize-winning journalist whose account was restricted; the second relates to Facebook's leaked moderation guidelines.

In the first case, investigative reporter Matthew Caruana Galizia had his Facebook account suspended recently after he posted documents related to a story about a politician in Malta.

Caruana Galizia was part of a team that worked with the International Consortium of Investigative Journalists to break the story of the Panama Papers, a massive dump of documents that were leaked from an offshore law firm last year.

The politician, Maltese prime minister Joseph Muscat, was implicated in a scandal as a result of those leaked documents, which referred to shell companies set up by him and two other senior politicians in his administration.


Facebook not only suspended Caruana Galizia's account, it also removed a number of the documents he had posted related to the story. It later restored access to his account after The Guardian and a Maltese news outlet wrote about the suspension, but some of the documents never reappeared.

The social network has rules that are designed to prevent people from posting personal information about other users, but it's not clear whether that's why the account was suspended.

Some of what Caruana Galizia posted contained screenshots of passports and other personal data, but many of these documents have remained available, while others have been removed. He is being sued by Muscat for libel, which has raised concerns about whether Facebook suspended the account because of pressure from officials in Malta.

A spokesman for Facebook told the Guardian that it was working with the reporter "so that he can publish what he needs to, without including unnecessary private details that could present safety risks. If we find that we have made errors, we will correct them."

Caruana Galizia said the incident was enlightening "because I realized how crippling and punitive this block is for a journalist." It also clearly reinforces the risks that journalists and media outlets take when they decide to use the social network as a distribution channel.

If nothing else, these and other similar incidents make it obvious that Facebook needs to do far more when it comes to being transparent about when and why it removes content, especially when that content is of a journalistic nature.

In an unrelated incident, the world got a glimpse into how the social network makes some of its content decisions, thanks to a leaked collection of guidelines and manuals for the 4,500 or so moderators it employs, which was posted by The Guardian.

Outlined in the documents are rules about what kinds of statements are considered too offensive to allow, how much violence the site permits in videos (including on Facebook Live, which has been the subject of significant controversy recently), and what to do with sexually suggestive imagery.

Much like Twitter, Facebook appears to be trying to strike a balance between removing offensive content and leaving room for freedom of expression.

In the process, however, it has raised questions about why the giant social network makes some of the choices it does. Statements within the guidelines about violence towards women, for example, such as "To snap a bitch's neck, make sure to apply all your pressure to the middle of her throat," are considered acceptable because they are not specific threats.

Facebook has already come under fire for some of its decisions around what to show on its live-streaming feature. There have been several cases in which people committed suicide and streamed it on Facebook Live, and in at least one case a man killed his child and then himself.

The guidelines say that while videos of violence and even death should be marked as disturbing, in many cases they do not have to be deleted because they can "help create awareness of issues such as mental illness," and because Facebook doesn't want to "censor or punish people in distress."

As a private corporation, Facebook is entitled to make whatever rules it wants about the type of speech that is permitted on its platform because the First Amendment only applies to the actions of governments. But when a single company plays such a huge role in the online behavior of more than a billion people, it's worth asking questions about the impact its rules have.

If Facebook censors certain kinds of speech, then for tens of millions of people that speech effectively ceases to exist, or becomes significantly harder to find.

The risks of this kind of private control over speech are obvious when it comes to things like filter bubbles or the role that "fake news" plays in political movements. But there's a deeper risk as well, which is that thanks to the inscrutability of Facebook's algorithm, many people won't know what they are missing when information is removed.

Facebook may not want to admit that it is a media entity, but the reality is that it plays a huge role in how billions of people see the world around them. And part of the responsibility that comes with that kind of role is being more transparent about why and how you make decisions about what information people shouldn't be able to see.
