Algorithm Schism - Mislabelling Hate Crime and Inadvertent Racism

I get it, alright, Google, Facebook and all the rest of you? I understand: setting up algorithms and recognition functions saves you a butt-load of time and money. The problem is that even with all the advances in AI, sometimes you really do need a human touch, or else you'll land yourself in a whole heap of trouble. Both Google and Facebook (via Instagram) have learned that lesson the hard way recently.

Image credit: irishtimes.com

Let's start with the less awful one, Instagram's. On Wednesday, rapper and activist Talib Kweli posted an image to his feed: an old photograph of a car with the passenger flying a Confederate flag and a sign reading 'N***er Go Home'. He captioned it 'aka states rights' in an attempt to make a point about racial inequality, and a rather important one given everything that's been going on in the wake of Charleston (people have been burning African American churches down, I kid you not). Someone over at Instagram clearly didn't get the memo, because the image was pulled owing to an apparent 'community guidelines' violation.

Being who he is, Kweli didn't take it lying down, posting the image again on his Twitter and questioning why the photo-sharing platform had deemed it inappropriate. To their credit (and likely as a result of the ensuing crap-storm), Instagram put it back up. The episode highlights the point, though, that sometimes even an ostensibly offensive image is admissible: it's the context that creates the offence, not the content. Of course, a set of codes doesn't know that, and has no way of understanding it.
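For the code-inclined, here's roughly what that blindness looks like. This is a minimal made-up sketch of a context-blind filter, not anything Instagram has published; the names (flag_post, BLOCKED_TERMS) and the blocklist are entirely hypothetical:

# A deliberately naive, context-blind moderation filter.
# Hypothetical sketch only - flag_post and BLOCKED_TERMS are invented
# names, not a description of Instagram's real system.

BLOCKED_TERMS = {"slur_a", "slur_b"}  # stand-ins for a real blocklist

def flag_post(image_text: str, caption: str) -> bool:
    """Flag a post if any blocked term appears anywhere in it.

    Note what's missing: no notion of who posted it, why, or whether
    the offensive content is being documented rather than endorsed.
    """
    combined = f"{image_text} {caption}".lower()
    return any(term in combined for term in BLOCKED_TERMS)

# Kweli's post: the slur is in the photograph itself, and the caption
# is commentary on it. A context-blind filter treats both identically.
print(flag_post("slur_a go home", "aka states rights"))  # True - post pulled

Everything a human moderator would use to reinstate the post, who posted it, what the caption is arguing, what's been in the news, sits outside the function's inputs entirely.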

Equally, a set of codes doesn't necessarily know when it's enacting a huge, ghastly racial slur. The Google Photos app contains an auto-tag function which, again, is a big time-saver, but in one unfortunate recent incident Jacky Alciné uploaded a picture of himself and a (black) friend, only for them both to be tagged as gorillas. Obviously there was no malicious intent (there was nobody there to intend anything); it was just a misclassification based on shape and skin tone, but it highlights the issue even further.
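To illustrate, and only to illustrate (the names, labels and scores below are invented, not a peek inside Google Photos), this is the shape of the failure: an auto-tagger that takes a classifier's top-scoring guess at face value, alongside one cheap safeguard a human might insist on:

# Hypothetical sketch of an auto-tagger, not Google's actual pipeline.
# SENSITIVE_LABELS, the scores and the threshold are all invented.

SENSITIVE_LABELS = {"gorilla", "ape", "monkey"}  # labels a human would veto

def auto_tag(label_scores: dict) -> str:
    """Return the highest-scoring label, with no sanity check at all."""
    return max(label_scores, key=label_scores.get)

def safer_auto_tag(label_scores: dict, threshold: float = 0.9):
    """One cheap mitigation: suppress sensitive or low-confidence labels."""
    label = max(label_scores, key=label_scores.get)
    if label in SENSITIVE_LABELS or label_scores[label] < threshold:
        return None  # leave untagged for a human to review
    return label

scores = {"person": 0.31, "gorilla": 0.42}  # made-up numbers
print(auto_tag(scores))        # "gorilla" - the failure mode
print(safer_auto_tag(scores))  # None - withheld for review

The second version isn't intelligence, it's just a veto list: it doesn't make the model understand anything, it just stops the worst outputs from reaching a user.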

Obviously there was no malice in either case, but both make for striking examples of the shortcomings of recognition and categorisation software compared to a nice, soft human brain. Google have tangled with this kind of thing before, both with their maps and their search engine. It's never pretty, but in both cases the solution was for a human eye to be cast over it. It's not worth getting bent out of shape over, but it's a timely reminder that, at the current standard, even the most modern AI has no concept of tact. Apple learned that when, to their horror, they discovered the homophobic remarks the Russian-language Siri was capable of spouting, given the wrong stimuli.

There's no direct solution to this other than just having better quality control (particularly to make sure kids don't see the wrong thing), but with everyone rushing to win the AI race, it's worth bearing in mind that emotional understanding is still a pretty distant frontier. Cultural sensitivity is even further off.


Callum Davies

Callum is a film school graduate who is now making a name for himself as a journalist and content writer. His vices include flat whites and 90s hip-hop. Follow him @CallumAtSMF

Contact us on Twitter, on Facebook, or leave your comments below. To find out about social media training or management, why not take a look at our website for more info: http://socialmediacambridge.co.uk/.