
Facebook Jail: Meta gives users who violate content rules up to 7 chances before being blocked

Facebook’s parent company Meta is cracking down on inflammatory content with a new set of rules – but it’s now prepared to give persistent offenders plenty of opportunities to learn from their mistakes.

Under its latest policy, the social media company will explain why a user’s content violated its guidelines for up to seven violations before restricting the account.

After the eighth violation, the user’s account will be suspended and sent to “Facebook Jail,” a term coined by users to describe the ban from the social media platform.

Meta announced the policy change Thursday in response to feedback from its Oversight Board, which flagged the enforcement system’s shortcomings last December.

The new policy is intended to prevent people from being “overly penalized” for minor violations of Meta’s content rules and to result in “quicker and more effective action.”

“As part of the new system, we will focus on helping people understand why we removed their content, which has been shown to be more effective in preventing repeat offenders, rather than limiting their ability to post so quickly,” Monika Bickert, the vice president of content policy at Meta, wrote in a statement.

In the case of serious violations where the content involved terrorism, human trafficking, or other inappropriate content, the account will face immediate action, including account removal, Bickert said.

“The vast majority of users of our apps have good intentions. In the past, some of these people have ended up in ‘Facebook jail’ without understanding what they did wrong or if they were affected by a content enforcement error,” Bickert said.

The company’s previous policies quickly imposed months-long bans on people’s accounts, even when their violations were minor or accidental.

While Meta’s current policy change gives “well-intentioned” users more room to use the platform, the company has had issues with lax enforcement in the past.

In 2019, Brazilian footballer Neymar shared explicit images of a woman who had accused him of rape with his millions of fans before Facebook took them down.

The same policies also allowed accounts to spread false information about political figures such as Hillary Clinton and Donald Trump.

The Oversight Board appointed in 2020 found last December that Meta’s “cross-check” program, which gives preferential treatment to VIP accounts, was “structured to satisfy business interests” and was deeply flawed.

The board has given Meta over 30 recommendations to enforce fair and transparent policies.

The board also said earlier this month that it had changed its rules to allow expedited decisions, issued within two to 30 days, in cases where content policy may have been violated.

In response to the policy change, the oversight body said it “welcomed” the move, but noted that the new system only addresses “less serious violations” and that there is room for Meta to go further.

Meta didn’t immediately return Fortune’s request for comment.

Learn how to navigate and build trust in your organization with The Trust Factor, a weekly newsletter exploring what leaders need to succeed. Sign up here.
