
Blood, gore, nipples... and Facebook

By Staff Writer, ITWeb
Johannesburg, 24 Feb 2012

The guidelines that underlie Facebook's photo approval system have been exposed in the media this week, and the social networking site has come under scrutiny for its attitude towards sex and violence as a result.

A disgruntled Moroccan employee of one of the firms contracted to moderate reported Facebook content, Amine Derkaoui, gave internal documents to news site Gawker. Derkaoui was paid $1 an hour for his services, and has accused the social network of exploiting the third world.

Facebook has come under fire in the past for its censorship of some content, such as the removal of a photo of a kiss between gay men, a nude drawing and breastfeeding photos. While the network's statement of “Rights and Responsibilities” is fairly vague, the confidential documents that have now emerged give a highly specific blueprint as to what is allowed and what isn't.

The guidelines for content moderators include categories such as sex and nudity, hate content, graphic content, illegal drug use, and bullying and harassment. Some of the content that should be deleted by moderators includes (quoted verbatim):

* Naked “private parts” including female nipple bulges and naked butt cracks; male nipples are ok.
* Mothers breastfeeding without clothes on.
* People “using the bathroom”.
* Blatant (obvious) depiction of camel toes and moose knuckles.
* Photos and images showing internal organs, bone, muscle, tendons etc. Deep flesh wounds are ok to show, excessive blood is ok to show.
* Crushed heads, limbs, etc are ok, as long as no insides are showing.
* Images of drunk and unconscious people, or sleeping people with things drawn on their faces.

The dark side

Content that is to be escalated includes poaching of endangered animals and credible threats against people or public figures. Under the “International Compliance” category, banned content includes Holocaust denial, and images of the Turkish flag being burnt must be escalated.

Responding to the Gawker report, a Facebook spokesperson is quoted as saying the policy document in question provides “a snapshot in time” of the network's standards with regard to one of its contractors. The spokesperson says the most up-to-date information can be found on the Facebook Community Standards page.

Facebook is said to process millions of items of reported content per day, and moderators working for the third parties hired to wade through the content often don't last long before quitting.

Gawker quotes one former moderator as saying: "Paedophilia, necrophilia, beheadings, suicides, etc. I left [because] I value my mental sanity."
