Facebook's Moderation Guidelines Have Leaked and People Aren't Happy

Image credit: RTE
Facebook's moderation and censorship policies have been the subject of much debate in recent months. Whether it's the company's failure to curtail the fake news behemoth, the spate of violent and disturbing content circulated via Facebook Live, or the numerous instances in which perfectly innocent or even educational content has been taken down, none of it has done the platform's credibility any favours.

Past controversy pales in comparison to this latest development. Over the weekend, The Guardian obtained and published a list of guidelines from Facebook's moderation handbook, as part of a wider investigation into the platform's ethics. You can read the full list here, but here are a few of the most worrying highlights:

"Remarks such as “Someone shoot Trump” should be deleted, because as a head of state he is in a protected category. But it can be permissible to say: “To snap a b***’s neck, make sure to apply all your pressure to the middle of her throat”, or “f*** off and die” because they are not regarded as credible threats."

"Some photos of non-sexual physical abuse and bullying of children do not have to be deleted or “actioned” unless there is a sadistic or celebratory element."

"Facebook will allow people to livestream attempts to self-harm because it “doesn’t want to censor or punish people in distress”."

Alongside the other information about Facebook's lenience towards animal cruelty and revenge porn, it paints a rather upsetting picture. Seemingly, the platform has an almost zero-tolerance attitude towards nudity (up to and including 'digital art') but is more than happy to let violent content circulate freely, its only concession being to mark the worst material as 'disturbing'. As you might expect, these revelations are making people angry.

There were already calls for the platform to submit to independent regulation, and they're even louder now. Facebook's only real response has been to reiterate that they're bringing on 3,000 more moderators over the coming months, and that they're still looking at ways to refine their machine learning technology to improve moderation. That response is far from encouraging, given how many prior moderation failures have come down to the AI watchdogs either missing content or flagging it mistakenly.

Now we know that Facebook have drawn grey areas in moderation categories that are pretty much black and white. It's hard to see any circumstance in which a video showing child abuse would be in any way permissible, but Facebook don't want to take that risk for fear of their global sharing figures taking a hit, it would seem.

The tepid response suggests that Facebook have no plans to alter their policies, which makes the whole thing even more disturbing: it demonstrates that Facebook are more concerned with protecting their own interests than with revising guidelines which not only acknowledge that cruel and disturbing content is being shared on their platform, but actively allow it to continue.


Callum is a film school graduate who is now making a name for himself as a journalist and content writer. His vices include flat whites and 90s hip-hop. Follow him @Songbird_Callum

Contact us on Twitter, on Facebook, or leave your comments below. To find out about social media training or management why not take a look at our website for more info: TheSMFGroup.com
Reviewed by Callum Davies on Tuesday, May 23, 2017
