Meta’s Oversight Board, the independent body established to weigh in on the company’s content moderation decisions, has responded to the hate speech policy changes the social media company announced in January.
The Board said that Meta’s new policies were “announced hastily, deviating from standard procedure,” and urged the company to provide more information about its rules.
The Board asked Meta to assess how its new rules affect vulnerable user groups, publish its findings, and report back to the Board every six months.
The Board says it is in discussions with Meta to shape the company’s fact-checking policy in regions outside the United States.
In the weeks before President Donald Trump’s inauguration, Meta CEO Mark Zuckerberg announced an overhaul of the company’s content moderation policies to allow “more speech” on Facebook, Instagram, and Threads.
As part of that overhaul, Meta rolled back hate speech rules that protected immigrants and LGBTQIA+ people on its platforms.
The Board issued 17 recommendations to Meta, including that it assess the effectiveness of its new community notes system, clarify its updated stance on hateful ideologies, and improve how it enforces its harassment policies.
The Board has also asked Meta to honor its 2021 commitment to the UN Guiding Principles on Business and Human Rights by consulting with stakeholders affected by its policy changes. The Board says Meta should have done this before announcing the changes.
The Oversight Board’s ability to shape Meta’s broader policy is limited. However, under the company’s own rules, Meta must abide by the Board’s rulings on individual posts.
The Board could also influence Meta’s content moderation practices if the company refers a policy advisory opinion to it, as it has done in the past.
The Oversight Board criticized several of the content policies Zuckerberg announced earlier this year in decisions on 11 cases involving content on Meta’s platforms, including anti-migrant speech, hate speech against people with disabilities, and the suppression of LGBTQIA+ voices.
The Board said that Meta’s January policy changes did not affect the outcomes of these cases.
In two U.S. cases involving videos of transgender women on Facebook and Instagram, the Board upheld Meta’s decision to leave the content up, despite user complaints. However, the Board recommended that Meta remove the term “transgenderism” from its Hateful Conduct policy.
The Board overturned Meta’s decision to leave up three Facebook posts related to the anti-immigration riots in the U.K. in the summer of 2024, finding that Meta was too slow to remove anti-Muslim and anti-immigration content that violated the company’s policy on violence and incitement.