
Facebook Reveals Its Internal Rules For Removing Controversial Content

26 April 2018

On Tuesday, Facebook chose to reveal the internal community standards and policies it uses to determine which posted content is acceptable, and which should be removed.

There was uproar, for example, in 2016 when it censored what is generally regarded as an iconic image from the Vietnam War of a naked child running from a napalm attack.

An example of what the appeals process will look like on the user's end.

Facebook will soon hold public consultations in Delhi to improve its community standards policy, in an effort to provide transparency on how the social networking giant reviews and blocks content. "Now everybody out there can see how we're instructing these reviewers," Monika Bickert, Vice President of Product Policy and Counterterrorism, said last week at a press briefing at Facebook's Menlo Park, California headquarters, according to an article on CNET.

"Starting today we're making transparent our internal guidelines for how exactly we define hate speech, violence, nudity, terrorism, and other content we don't allow".

"Based on this feedback, as well as changes in social norms and language, our standards evolve over time", she writes. "But we know there will always be people who will try to post abusive content or engage in abusive behavior. This is our way of saying these things are not tolerated".


Objectionable content includes some of the topics in other categories and more. Facebook says its reviewers work 24/7 in over 40 languages.

Videos of people wounded by cannibalism are not permitted, for instance, but such imagery is allowed with a warning screen if it is "in a medical setting".

Facebook has faced fierce criticism from governments and rights groups in many countries for failing to do enough to stem hate speech and prevent the service from being used to promote terrorism, stir sectarian violence and broadcast acts including murder and suicide.

The social network has been caught up in a whirlwind of controversy since the start of the year.

Earlier this month, lawmakers called on Facebook chief Mark Zuckerberg to appear in Congress.

Quartz noted that some of the new rules were "clearly developed in response to a backlash Facebook received in the past". The community standards also do not cover false information: Facebook does not prohibit it, but is trying to reduce its spread. "Every week, our team seeks input from experts and organizations outside Facebook so we can better understand different perspectives on safety and expression, as well as the impact of our policies on different communities globally", Bickert writes.


The leading social network "took action" on 1.9 million pieces of IS- or al-Qaeda-linked content in the first three months of this year - almost double the amount from the previous quarter, according to vice president of global policy management Monika Bickert and global head of counterterrorism policy Brian Fishman. Bickert said the company plans to bring its public consultations to New Delhi in June or July this year. It conducts weekly audits to review its decisions but recognizes mistakes are inevitable. Users will also be able to request review for posts that were flagged but not removed. If the company wants to make appeals more constructive, it needs to give users the option to appeal when it doesn't take down content - users are often frustrated when content they find offensive or harmful isn't removed from the site.

The previous public-facing version of Facebook's community standards gave a broad-strokes outline of the rules, but the specifics were shrouded in secrecy for most of Facebook's 2.2 billion users.

Moderators working for Facebook sift through millions of reports each week from users about inappropriate posts, groups or pages.

She added: "Everybody should expect that these [policies] will be updated frequently". "This is not a self-congratulatory exercise".

"You should, when you come to Facebook, understand where we draw these lines and what's okay and what's not okay", she told reporters.

