Honestly, guys, this shouldn’t be so hard. With billions of dollars, thousands of employees, and some of the finest engineering minds on the planet, Facebook just can’t seem to figure out how to stop profiting from advertising categories that violate its own advertising principles. And a damning new discovery was just made by The Intercept.
This week, the outlet attempted to promote two articles on Facebook and discovered a “detailed targeting” category for pushing ads to users who have an interest in “white genocide conspiracy theory.”
This was a category that Facebook’s system had identified on its own, according to the company, and offered among its predefined suggestions for advertisers. Choosing it would serve an ad to a group it said consisted of 168,000 users.
While Facebook actively offering a convenient way to help spread a racist myth is shocking enough, The Intercept made it really easy for the social network to flag its ads and realise this isn’t a good way to use the platform. Facebook ad campaigns have to be approved, and this one was specifically named “White Supremacy — Test.” Still, the ads made it through the approval process.
The problem isn’t just that Facebook was potentially giving Nazis a helping hand; it’s that the company seems to be so inept at setting policies and training its machines and humans. ProPublica has previously found Facebook enabling advertisers to target “Jew Haters” and to deliberately discriminate based on race for housing listings — a practice that’s illegal.
A Facebook spokesperson told Gizmodo that the “white genocide conspiracy theory” category was primarily used “reasonably,” citing examples like a university lecture on America’s conspiracy culture. They also reminded us that Facebook removed 5,000 problematic targeting options that didn’t meet its standards in August. That point would feel more worthy of praise if not for the fact that the company let 5,000 problematic targeting options sneak their way into the system in the first place.
Read the article by Rhett Jones on Gizmodo.