Why Did Facebook Remove Your Post? This Doc Might Help


Ever wonder how Facebook decides what—and who—to remove from its platform?

Wonder no more, because the social network just published the lengthy "Community Standards" its reviewers use to determine what is and isn't allowed on Facebook.

The standards are broken down into six categories: violence and criminal behavior, safety, objectionable content, integrity and authenticity, respecting intellectual property, and content-related requests. They outline how Facebook deals with everything from threats of violence to suicide, self-injury, child porn and sexual exploitation, nudity, bullying, harassment, hate speech, and more.

The move to publish these once-internal guidelines comes after The Guardian last year obtained and posted snippets of the company's exhaustive and sometimes contradictory rules.

Facebook's VP of Global Policy Management, Monika Bickert, said the company is now going public with this information to "help people understand where we draw the line on nuanced issues" and as a way to solicit feedback on how it can improve its guidelines. Next month, the company plans to launch a series of public events in the US, UK, Germany, France, India, and Singapore, called "Facebook Forums: Community Standards," to get people's feedback in person.

Facebook relies on artificial intelligence technology and reports from users to identify posts, photos, and other content that may violate its standards. Upon receiving a report, a member of the company’s 24/7 Community Operations team reviews the content in question to determine whether or not it should be taken down. Facebook currently employs more than 7,500 content reviewers.

Bickert acknowledged that Facebook’s reviewers sometimes make the wrong decision.

“In some cases, we make mistakes because our policies are not sufficiently clear to our content reviewers; when that’s the case, we work to fill those gaps,” she wrote. “More often than not, however, we make mistakes because our processes involve people, and people are fallible.”

Meanwhile, Facebook is now, for the first time, giving users the right to appeal its decisions on individual posts. This way, if the company removes your post and you think it made a mistake in doing so, you can ask for a second opinion.

At this point, you can only appeal posts removed for nudity/sexual activity, hate speech, or graphic violence. If Facebook removes something you posted for one of those reasons, it will notify you of the action and give you the option to request an additional review. Within 24 hours of initiating an appeal, you should know whether Facebook plans to restore your content or keep it off the platform for good.

This article was originally published by PC Mag.