For the first time, Facebook is publishing the once-secret rules and guidelines that tell users what they can (and can't) do on the platform, so its 2.2 billion users know what is and isn't allowed on the social network — covering hate speech, pornography, and even cannibalism.
Last May, The Guardian published a leaked copy of Facebook’s content moderation guidelines, which describe the company’s policies for determining whether posts should be removed from the service. Almost a year later, Facebook is making an expanded set of those guidelines available to the public, a move designed to gather input from users around the world. The company is also introducing a new appeals process, allowing users to request a review if they believe their post has been removed unfairly.
Until now, Facebook had not publicly disclosed the lengthy rules given to its content reviewers to guide their decisions on whether to remove posts flagged for violating the Silicon Valley company's policies. And Facebook users whose content was removed had little recourse if they believed reviewers made the wrong call.
The community standards run 27 pages and cover topics including bullying, violent threats, self-harm, and nudity, among many others.
Monika Bickert, head of global policy management at Facebook, explained in an interview with reporters:
“These are issues in the real world,” she said, adding: “The community we have using Facebook and other large social media mirrors the community we have in the real world. So we’re realistic about that. The vast majority of people who come to Facebook come for very good reasons. But we know there will always be people who will try to post abusive content or engage in abusive behaviour. This is our way of saying these things are not tolerated. Report them to us, and we’ll remove them.”
The guidelines will apply in every country in which the social media giant operates, and have been translated into more than 40 languages. The company says it developed them in conjunction with a “couple hundred” experts and advocacy groups from around the world. As the guidelines evolve — and they will evolve, Bickert said — they will be updated simultaneously in every language.
Facebook also announced plans to develop a more robust process for appealing takedowns that were made in error. The company has faced regular criticism for high-profile takedowns over the years, whether it’s over a picture of a woman breastfeeding her child or an iconic wartime photo.
Now users will be able to request that the company review takedowns of content they posted personally. If a post is taken down, the user will be notified on Facebook with an option to “request review.” Facebook says it will review the request within 24 hours, and if the company decides it made a mistake, it will restore the post and notify the user. By the end of this year, users who report a post and are told it does not violate the community standards will be able to request a review of that decision as well.