The oversight board says it received more than 1 million appeals from Facebook and Instagram users in 2021. Most asked the board to restore content removed from Meta's apps for violating rules against hate speech, violence and bullying. The board issued decisions and explanations on 20 cases it described as “significant,” and in 70 percent of the cases it reviewed, it overturned Meta's original decision.
“There is clearly a huge demand from Facebook and Instagram users for some way to appeal content moderation decisions made by Meta to an organization independent of Meta,” the board wrote in its report.
The board’s most prominent decision so far concerned whether to reinstate the account of former U.S. President Donald Trump, who was banned from Facebook after he encouraged the insurrection at the U.S. Capitol. The board responded by asking Meta to clarify the rules it used to ban Trump’s account in the first place. “Facebook did not follow clear, published procedures in imposing this penalty,” the board wrote at the time, adding that Facebook had no rules for “indefinite” bans like the one it imposed on Trump.
In addition to decisions that set a kind of precedent for future policy enforcement, the board made broader recommendations to Meta on how the company should handle specific aspects of content moderation and the rules it should set.
In less high-profile cases, the board recommended that Meta tighten Facebook and Instagram’s anti-defamation rules, pressed the company to issue a transparency report detailing how well it is enforcing its COVID-19 rules, and asked it to prioritize fact-checking for governments that share health misinformation through official channels.
The board made 86 policy recommendations in its first year. Meta has implemented some of the board’s recommendations for improving moderation transparency, including giving users more visibility into which hate speech rules they violated and telling them whether AI or human review led to an enforcement decision, while ignoring others. The annual report tracks these outcomes, revealing just how influential the group has been and how often Meta has implemented or disregarded its recommendations.
The board reviews content moderation cases from around the world, sometimes teasing out linguistic and cultural nuances that Meta itself fails to incorporate into its automated or human moderation decisions. Facebook whistleblower Frances Haugen has repeatedly warned that the company struggles to moderate its platforms in non-English-speaking markets. According to the report, half of the board’s decisions involved countries in the Global South, including countries in Latin America and Africa.
Initially, the board only reviewed cases in which users asked for removed Instagram and Facebook content to be reinstated, but a few months later it expanded its scope to consider requests to have content taken down. Still, the board’s purview is limited to questions about individual posts, not the many other features people use on Instagram and Facebook.
The board wrote that it wants to expand its remit to advise Meta on moderation issues affecting accounts and groups on its platforms, not just individual posts. It is currently in “conversations” with the company, which still has the final say over what the semi-independent advisory group can actually do.