Facebook launches oversight board for content moderation


NBC News reports that “Social media users who believe their posts have been unfairly removed from Facebook or Instagram can now file an appeal to Facebook’s Independent Oversight Board, the company announced Thursday.”
Positioned as a “Supreme Court” for Facebook’s content moderation decisions, the external panel of 20 journalists, academics, lawyers, and human rights experts will review, and potentially override, the company’s rulings.

The board has up to 90 days to review cases submitted by users through its website after they have exhausted their content appeal options directly with Facebook. If the board sides with the user, Facebook will restore the content and potentially re-evaluate its policies.

“The Oversight Board wasn’t created to be a quick fix or an all-encompassing solution,” said Helle Thorning-Schmidt, co-chair of the board and former prime minister of Denmark. But it aims to “offer a critical independent check on Facebook’s approach to moderating some of the most significant content issues.”

With Thursday’s announcement, Facebook has launched a model of governance that no other social media outlet has attempted. “The Oversight Board has the potential to revolutionize how we think about the relationship between private corporations and our public rights,” said Kate Klonick, an assistant professor at the St. John’s University School of Law, who has published research on the Oversight Board. “It’s a step toward recognition that these transnational companies control our public rights in a way that governments don’t and that we need to create a participatory and democratic mechanism to inform those companies that those rights are protected….”

“Of all the criticisms that are lodged against Facebook, I think one of the biggest is that we can’t trust them,” Jamal Greene, a Columbia Law School professor and co-chair of the Oversight Board, said in an interview in September. “One of the aims of the Oversight Board is to try to establish an institution that can be trusted…” During test panels, board members at times noted that their decisions could affect Facebook’s commercial model. For example, permitting more images containing certain types of nudity could deter users in parts of the world with stricter cultural norms.

“The reaction has always been ‘Well, that’s not our problem, that’s Facebook’s problem,'” said board member Alan Rusbridger, former editor-in-chief of The Guardian. “So I don’t think anyone is coming into this thinking we’re here to help Facebook continue with life as normal.”