Following a months-long staffing spree, Facebook’s Independent Oversight Board announced the first six cases it will take on as it deliberates Facebook’s sprawling content moderation policies.
Just to recap, the self-declared purpose of this Oversight Board is to act as an unbiased panel reviewing some of Facebook’s more contentious content moderation decisions. As we’ve previously noted, each of the hires made to the Board thus far has some sort of track record championing human rights. At the end of the day, though, a Board shaped by Facebook’s own hand still hews toward Facebook’s principles, as in its charter, which states it will “pay particular attention to the impact of removing content in light of human rights norms protecting free expression.” More speech solves bad speech has been the unofficial motto of Facebook, and of many other online platforms, to what could be described as limited success. Still, the Board’s ultimate decisions are (supposedly) independent and supersede those of Facebook itself.
The full list of cases (which you can read here) involves posts pulled from Instagram or Facebook proper for violating one of the platform’s community standards. These six run the gamut of controversial posts removed under Facebook’s policies on hate speech, nudity, and “dangerous organisations.”
Read the article by Shoshana Wodinsky in Gizmodo.