Facebook’s Internal Rulebook On Types Of Allowable Content Leaks


Facebook’s secret rules and guidelines for deciding what its 2 billion users can and cannot post on the platform have been disclosed for the first time in a set of leaked documents. The Guardian reported that the documents show how the social media giant moderates issues such as hate speech and self-harm on its platform.

How Facebook deals with content on violence, fake accounts and sex

The Guardian has seen more than 100 internal training manuals, flowcharts and spreadsheets that give insight into how Facebook moderates issues like hate speech, racism, self-harm, pornography, violence and terrorism. The documents were supplied to the company’s moderators within the last year.

The documents revealed that Facebook reviews more than 6.5 million reports of potentially fake accounts per week, known internally as FNRP (fake, not real person). There are also guidelines on topics such as cannibalism and match-fixing. Recently, moderators were told to “escalate” to senior managers any content related to 13 Reasons Why, a Netflix original series centered on the suicide of a high-school student, because Facebook feared it could inspire copycat behavior.

According to the leaked documents, photos of animal abuse can be shared, with only “extremely upsetting imagery” marked as “disturbing.” Handmade art showing sexual activity and nudity is allowed on the platform, while digitally made art showing sexual activity is not. When it comes to child abuse, some pictures of bullying and non-sexual physical abuse of children do not have to be deleted or “actioned” unless there is a celebratory or sadistic element.

In a statement to The Guardian, Monika Bickert, Facebook’s Head of Global Policy Management, said that keeping people on the platform safe is the most important thing the company does, and that it works hard to make Facebook as safe as possible while enabling free speech.

“This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously,” Bickert said.

A hard time for moderators

A source told The Guardian, “Facebook cannot keep control of its content. It has grown too big, too quickly.”

The news outlet found that new challenges like “revenge porn” have overwhelmed moderators with so much work that they usually have only “ten seconds” to make a decision.

Several moderators are said to have concerns about the inconsistency of the policies. The Guardian reported that the policies on sexual content, for example, are considered the most confusing and complex. Another point of contention is Facebook’s stance that not all “disagreeable or disturbing content” violates its community standards.

Nevertheless, the leaked documents give the first look at the rules and codes formulated by the site, which is under political pressure in both the United States and Europe. The guidelines may also worry critics who argue that Facebook is now a publisher and must do more to remove violent, hurtful and hateful content.
