Archive for the ‘Censorship’ Category

Apple Runs Up Against State Censorship in China, Again – The Mac Observer

Apple is once again running into issues with state censorship in China, according to Xinhua. Two different agencies will call Apple into their offices to demand tighter controls over streaming apps in the App Store.

The move is part of a crackdown on streaming content from three Chinese sites: toutiao.com, huoshanzhibo.com, and huajiao.com. Government regulators said those three sites were offering illegal content, including porn.

Those companies had apps in Apple's App Store in China. The Beijing Public Security Bureau and Beijing Cultural Market Administrative Law Enforcement Team want Apple more involved in policing such things.

This is part and parcel of the struggle Apple faces in China. On the one hand, China's government is an authoritarian communist government in the hands of a single party focused on perpetuating that party's control. Really, that's the other hand, too, but the other hand is also highly interested in tamping down the success of Western companies in China.

And thus we have Apple forced to shut down its iBooks and movie offerings on iTunes. More recently, Apple was forced to pull The New York Times app from the Chinese app store. China hates the idea of its people getting unfettered access to information.

Apple is far from the first U.S. tech giant to face such pressures. Facebook is banned outright. Microsoft chose to censor Bing to stay in business in China, while Google closed down its China business and redirected Chinese queries to its Hong Kong operation.

The problem for Apple is that these kinds of pressures are bound to increase. The bigger Apple gets, the more interest China has in knocking it down. At the same time, the bigger Apple gets, the more it becomes a pawn in political jousting between China and the U.S.

It's a tricky spot for Apple to be in, to be sure.


Murder on Facebook raises big censorship questions: What should social-media companies do about violent content? – Salon

On Easter Sunday, horrific footage of a 74-year-old man being gunned down on a Cleveland sidewalk was posted on Facebook by his killer, reigniting an ongoing debate over how social-media content should be policed.

But effective strategies for blocking every piece of offensive and illegal content have been elusive and may never be 100 percent effective, according to some experts. Others, including Facebook itself, say more can and should be done to root out offensive content, including hate speech, horrific and illegal snuff videos, and fake news items that mold the opinions of gullible users.

Facebook says it receives millions of complaints objecting to content every week from its nearly 2 billion active users. When the company receives a complaint, an algorithm automatically flags the content, which is then reviewed by moderators to quickly determine if it violates the law or the company's terms and conditions.
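To make that pipeline concrete, here is a minimal Python sketch of a "complaint, automatic flag, human review" queue of the kind the article describes. The names (Report, ReviewQueue) and the trivial flagging heuristic are invented for illustration; Facebook's actual moderation systems are not public and are far more elaborate.

```python
# A minimal sketch of the "complaint -> automatic flag -> human review" pipeline
# described above. All names and thresholds here are hypothetical illustrations,
# not Facebook's real implementation.
from collections import deque
from dataclasses import dataclass


@dataclass
class Report:
    content_id: str
    reason: str             # e.g. "violence", "hate_speech", "spam"
    report_count: int = 1   # how many users have complained about this content


class ReviewQueue:
    """Holds automatically flagged content until a human moderator decides."""

    def __init__(self) -> None:
        self._pending = deque()

    def flag(self, report: Report) -> None:
        # Stand-in for the automatic check: serious categories or heavily
        # reported items are queued for human review, the rest are ignored.
        if report.reason in {"violence", "hate_speech"} or report.report_count >= 5:
            self._pending.append(report)

    def review_next(self) -> str:
        # A human moderator would apply the law and the terms of service here;
        # this toy version simply removes anything flagged as violent.
        if not self._pending:
            return "nothing to review"
        report = self._pending.popleft()
        decision = "remove" if report.reason == "violence" else "leave up"
        return f"{report.content_id}: {decision}"


queue = ReviewQueue()
queue.flag(Report(content_id="video-123", reason="violence", report_count=40))
print(queue.review_next())  # -> video-123: remove
```

The point of the structure is simply that the algorithm only prioritizes; the final remove-or-keep decision still falls to a human reviewer.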

Footage of the murder of Robert Godwin Sr. by deranged killer Steve Stephens, 37, who committed suicide on Tuesday following a police chase in Pennsylvania, was publicly viewable on Stephens' Facebook profile for about two hours on Sunday. Facebook said it disabled Stephens' account 23 minutes after it received reports of the murder video, but it was publicly viewable long enough for users to capture the footage, prompting a plea on Twitter from one of Godwin's grandchildren for people to stop sharing the video.

Desmond Patton, an assistant professor of social work at Columbia University, said that while the Godwin murder video should clearly have been taken down, it is one extreme example of a larger issue. Companies like Facebook, Twitter and Google (which owns YouTube), he said, need to recruit specialists and elicit feedback from community leaders to improve how content is moderated, including material that might not seem offensive to every user.

"I study violence on social media, and all of the [problematic] content that I see almost never gets taken down," Patton told Salon. "If you're just using tech people from Silicon Valley [as content monitors], you're going to miss a lot of things. You need to diversify who makes these decisions."

Facebook declined to comment to Salon about the strategies it's considering to fortify its efforts to block objectionable and illegal content uploaded by its users, but having a more aggressive content filtering system could have unintended consequences. For example, would a stricter policy lead to the censorship of footage like the July 2016 shooting of Philando Castile by a Minnesota police officer? It could be argued that this video serves the public's interest because it viscerally highlights the ongoing problem of excessive force inflicted on African-Americans by members of law enforcement.

Sarah Esther Lageson, a sociologist at Rutgers University's School of Criminal Justice, said that Facebook is under intense pressure to take a stance and define its position on monitoring user-uploaded content, which could lead to more surveillance, something that not all Facebook users will welcome. But she said the benefits of having an open and easy way to produce and share online videos, which can highlight injustices and expose crimes, outweigh the negative effects of giving people so much freedom.

"Facebook will likely provide an array of creative solutions and will likely do their best to streamline oversight of user-uploaded content using [artificial intelligence] or machine learning, but I won't make an argument that those efforts would catch every instance of an extremely rare event like this," Lageson told Salon in an email.

Besides, she said in a follow-up phone conversation, horrific crimes take place in public no matter what we do to prevent them; it's the new medium by which criminals can advertise their crimes that concerns people.

"This is clearly an innovative way of doing something that has always been done: People have always killed people in public, mass shootings happen," she said. "That being said, the internet is a way to get into people's homes, which I think is what scares people, that you can't even feel protected from witnessing a crime on your cell phone or your laptop. It's one thing to see a crime happen on the street and another thing to see it when you're on your couch."

As Facebook and other social-networking service providers struggle to moderate the immense content stream coming at them from their users, the solution to the many problems that can arise is complicated. It requires, as Patton suggested, more feedback from experts and community members about how to establish policies for all types of harmful, violent and offensive content. And as Lageson pointed out, the fact that people can produce and share content so easily has helped fight crime and injustice.

The solution to the problem of preventing offensive, hateful, violent and murderous content from being distributed on social networks is as complicated as people are themselves, and there may never be a solution that satisfies everyone's concerns.


Bill To Protect Arizona Student Journalists From Censorship Hits A Roadblock – KJZZ


A measure that would protect student journalists from censorship hit a roadblock in the state legislature. House Majority Leader John Allen pulled the bill from consideration after more than an hour of debate. The measure would declare that student ...


Censorship on TV? Soaps, reality shows are crossing all limits, says Pahlaj Nihalani – Hindustan Times

Central Board of Film Certification chief Pahlaj Nihalani has said that restrictions must be imposed on the inflow of software on TV before it's too late. Reports suggest that the Ministry of Information and Broadcasting is actually considering a more stringent policy to channelise and restrain the free flow of content on the medium.

Far from abolishing film censorship, the Ministry Of Information & Broadcasting is actually considering a more stringent monitoring agency to channelise and restrain the free flow of content on the home-viewing medium.

While the CBFC chief Pahlaj Nihalani refrained from discussing the I&B's plans to monitor content on television, he lashed out hard at what he considers the free flow of muck into homes. "Television soaps, reality shows and crime shows are crossing all limits. Shows like Crime Patrol and Savdhan India show the most gruesome and heinous crimes in graphic detail.

"Real-life people are named in the fictional recreation of crime stories. Women are raped in incestuous attacks, housewives and minor girls are shown to be violated. If the same content was shown in any film, we at the CBFC would have to clamp down heavily on the content," he said.

Nihalani feels restrictions must be imposed on the inflow of software on television before it's too late. "Why are filmmakers required to get a new censor certification for their films to be shown on television when all the rest of content made specially for television gets to go on air unchecked? This free flow of content in television must stop. It's affecting the natural psychological development of young minds. Parents are worried," he said.



China’s internet censors allow one-on-one complaining, but won’t let … – The Verge

Everyone knows that China has some of the most sophisticated censorship tools in the world, but the details of how they actually work, what they censor and when, are often not fully understood. A new report by Citizen Lab, a research group studying the web, human rights, and global security, sheds some light on one particularly fruitful target for Chinese censorship: mobile messaging.

Citizen Lab looked at how the Chinese government censors discussion on WeChat, a popular messaging app. WeChat is the fourth biggest messaging service in the world, with more than 768 million active users, but is also deeply embedded in Chinese society, where it's used not only for chatting, but for tasks like banking, paying bills, booking holidays, calling cabs, and much more.

The cornerstone of WeChat censorship is keyword filtering, which blocks messages that contain terms like "human rights," "mass arrest," and "spiritual freedom." However, Citizen Lab found that the censors don't just block messages containing any one specific phrase, but instead look for combinations of different terms. So you can send a message with the words "human rights lawyer" in it, but if you combine that with the name of a specific lawyer (Jiang Tianyong, who was recently disappeared by the government), the message is blocked.

When a message is censored, users are not notified of this fact. They see it as sent in their own app, but it just never reaches its intended recipient. The system works by examining every message as it passes through WeChat's servers. The list of filtered keywords is also reactive, changing in relation to the news, and the filtering applies only to WeChat accounts using mobile phone numbers registered in the Chinese mainland. Citizen Lab says much of the censorship on WeChat is currently focused around the 709 Crackdown, a series of arrests of civil dissenters that began on the 9th of July 2015 (hence the name).

An interesting quirk of WeChat censorship discovered by Citizen Lab is that it's stricter when it comes to group discussions. The group found that more keyword combinations were blocked in chats containing multiple users than in one-on-one conversations. The reason for this isn't clear, but it could be that the Chinese government thinks it prudent to allow limited discussion of sensitive topics, while group conversations are more dangerous, perhaps leading to organized dissent. WeChat Moments (a feature similar to Facebook's News Feed) was also more heavily censored, with certain images filtered out as well.
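As a rough illustration of the combination-based filtering Citizen Lab describes, here is a toy Python sketch: a message is silently dropped only when it contains every term of some blocked combination, and group chats are checked against a larger rule list than one-on-one conversations. The rule lists, term choices, and function name are invented for the example; the real WeChat keyword lists are far longer and change with the news.

```python
# Toy model of combination-based keyword filtering: a message is dropped only
# when it contains *all* terms of some blocked combination, and group chats use
# a larger rule list than one-on-one chats. The rules below are illustrative
# examples, not the actual WeChat keyword lists.

ONE_ON_ONE_RULES = [
    {"human rights lawyer", "jiang tianyong"},  # either term alone is allowed
]

GROUP_CHAT_RULES = ONE_ON_ONE_RULES + [
    {"mass arrest", "709 crackdown"},           # extra combinations for group chats
]


def is_blocked(message: str, is_group_chat: bool) -> bool:
    """Return True if the message would be silently dropped by the filter."""
    text = message.lower()
    rules = GROUP_CHAT_RULES if is_group_chat else ONE_ON_ONE_RULES
    return any(all(term in text for term in combo) for combo in rules)


# A single sensitive phrase goes through; the combination does not. The sender
# would still see the message as "sent" -- it simply never arrives.
print(is_blocked("I spoke with a human rights lawyer today", is_group_chat=False))              # False
print(is_blocked("Human rights lawyer Jiang Tianyong has been detained", is_group_chat=False))  # True
print(is_blocked("They discussed the 709 crackdown and the mass arrest", is_group_chat=True))   # True
```

A real filter would also normalize text and, as on Moments, inspect images, which this sketch ignores.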

The report notes: "The greater attention to group chat and Moments in particular may be due to the semi-public nature of the two features. Messages can reach and inspire discussions among wider audiences, making it subject to a higher level of scrutiny."

For a full list of censored keywords and combinations, you can read Citizen Lab's report in full here.
