Facebook Oversight Board Reports for First Time

by John Lister

Facebook and Instagram's "Oversight Board" received more than a million complaints about content moderation in its first year. But it investigated fewer than 100 and made public rulings in just 20 cases.

The board is made up of independent members who have industry expertise but aren't connected to Meta, the company that owns both Facebook and Instagram. It has the power to rule on content moderation decisions, with the sites having to follow its judgment. It can also make recommendations for policy changes.

Of the 1.1 million appeals from users, most involved content moderation for bullying and harassment, hate speech, and violence and incitement. The vast majority were appeals by users seeking to have their own posts reinstated, rather than challenges from users whose complaints about other people's content had been rejected. (Source: bbc.co.uk)

While the board can theoretically consider any complaint, there's clearly no way it can practically do so. Instead, it selects a tiny proportion that are symbolic of specific, contentious issues.

Trump Ban Upheld

The board originally planned to examine 130 cases, though 51 of these were dropped from investigation after Meta immediately conceded it had made the wrong decision. These included the removal of a photograph of a bare breast used to raise breast cancer awareness.

Of the remaining 79 cases, the board published detailed decisions in 20, ruling against Meta 14 times. The most high-profile was its decision to uphold Facebook's ban on Donald Trump while saying the ban should not be indefinite.

Other decisions on alleged hate speech called for a more nuanced approach that took more account of context. The board also noted several cases where a ban on material that appeared to promote "Dangerous Individuals and Organizations" such as terrorist groups had been misapplied because the posts were actually critical of the groups in question.

Clearer Explanations Urged

The board also made 86 general recommendations, of which two-thirds have so far been implemented. These include giving more detail when explaining to users why their content has been removed; translating the content policies into more languages; and telling users whether their content was removed by automated or human moderation. (Source: oversightboard.com)

What's Your Opinion?

Have you appealed a Facebook moderation decision? Will the oversight board make any real difference? Can any moderation process ever be fair and consistent?


Comments

kitekrazy

It is a private organization and there is the freedom of speech issue.

Chief

Why do you continue to fly the false flag about FB being a private company?

They became a very public concern the second they injected themselves beyond their mission - i.e. they created an editorial board and called them "fact checkers".

George Orwell would be proud.

Draq

Sounds like things are moving in the right direction. Maybe. We'll see.

Now if we could also get more information than "we detected suspicious activity" when accounts get locked for seemingly no logical reason, that'd be great.