Facebook has a new appeals process in place

Captions and context matter in this case because Facebook does allow such images in some circumstances: when they are shared to condemn them, to report news, or in a medical setting.

Facebook has published its previously internal Community Standards, which explain on what grounds it might delete a post or ban a poster.

"And for the first time we're giving you the right to appeal our decisions on individual posts so you can ask for a second opinion when you think we've made a mistake".

"We chose to publish these internal guidelines for two reasons," the company said. "First, it will help people to understand where we draw the line on controversial issues." Facebook added that "many of us have worked on the issues of expression and safety long before coming to Facebook", and that "at times we will allow content that might otherwise violate our standards if we feel that it is newsworthy, significant or important to the public interest". "That's why we have developed a set of Community Standards that outline what is and is not allowed on Facebook." What's more, the social network pledged to increase its team of content reviewers from 7,500 to 20,000 by the end of 2018, according to Recode.

Facebook states: "Our policies are only as good as the strength and accuracy of our enforcement, and our enforcement isn't flawless." After clarifying how it collects user data through third-party apps even when one is not logged into Facebook, the company has now updated its community standards guidelines to elaborate on what kinds of posts users are not allowed to publish on the site.

Most of the videos were spam or included adult content, and a majority (6.7 million) "were first flagged for review by machines" rather than humans, using technology introduced in June 2017, the company said in an official blog post.

From this point forward, if Facebook deletes a post for any of the above reasons, the person who posted the content will be notified and given a chance to ask for a second review. "At Facebook, we have a meeting every two weeks of the content standards forum with our teams from around the world to discuss new developments", she said, adding that usually feedback here is integrated into the policy in another couple of weeks.

Facebook for years has had "community standards" for what people can post.

In a surprise move toward transparency, Facebook has chosen to reveal its 27-page community standards document, which outlines what content is banned from the platform and why user accounts may be suspended for publishing certain content. Within 24 hours of initiating an appeal, you should know whether Facebook plans to restore your content or keep it off the platform for good. "We make mistakes because our processes involve people, and people are not infallible." Recognizing that mistakes can be made is probably a reason why Facebook chose to implement an appeals process.

Ms Cummiskey said Facebook was in the process of recruiting more people to help review content. "We believe giving people a voice in the process is another essential component of building a fair system." Another initiative, tentatively set to launch in May, is "Facebook Open Dialogue", a series of events in Paris, Berlin, the United Kingdom, the US, India and Singapore to gather feedback on the company's policies. "You can click on that icon and get information about who that publisher is or who is behind that speech so that you can make a more informed choice," she said.

In addition, the site now provides tools for users who wish to appeal a censorship decision made by Facebook.