FB Accidentally Shared A Rape Threat As An Advertisement For Instagram – Good Job, Robots

If you have a really popular Instagram post, Facebook—the company that owns the photo-sharing service—might grab your post and turn it into an advertisement. The ad isn’t shown to just anyone, according to The Guardian: Facebook displays it to your Facebook friends in hopes that they head over to Instagram themselves to see more.

Unfortunately, the robots that run this little operation aren’t particularly discerning, as reporter Olivia Solon found out firsthand in September.

Here’s an ad seen by Solon’s sister on Facebook:

Rather ironically, Solon took to Twitter to shame Facebook and Instagram after their algorithm selected one of her posts—a screenshot of a death and rape threat—to promote Instagram to her Facebook friends.

The Instagram post that ended up as an advertisement on Facebook was originally shared by Solon as a wake-up call about the type of hate mail women receive every day.

“This is an email I received this afternoon,” wrote Solon. “Sadly this is all too common for women on the Internet. I am sure this is just an idiot rather than any kind of credible threat but it’s still pretty vile.”

The Guardian reports that “it’s unclear why Instagram chose to highlight Solon’s hate mail to friends on Facebook,” but posited that the “three likes and more than a dozen sympathetic comments” may have alerted the algorithm that the post was particularly “engaging.”

Instagram apologized, clarifying that “the image was not used in a ‘paid promotion.’”

“This notification post was surfaced as part of an effort to encourage engagement on Instagram,” they explained. “Posts are generally received by a small percentage of a person’s Facebook friends.”

This is far from the first time a Facebook algorithm has humiliated its master. Earlier this month, the website made headlines for allowing ad-buyers to target people who “use the language ‘Jew hater’ or ‘How to burn Jews’ on their [profiles].”

Now the only question is whether the insensitivity of the algorithm means it’s one step closer or one step farther away from being a true human brain.
