Meta has announced additional steps to crack down on Facebook accounts that share “unoriginal” content, meaning the repeated reuse of text, images, or videos from other sources.

The company says it has already removed approximately 10 million profiles that were impersonating prominent content creators this year.

It has also taken action against 500,000 accounts engaged in “spammy behavior or fake engagement.”

Those actions included demoting the accounts’ comments and reducing the distribution of their content to prevent the accounts from monetizing.

Meta’s update comes just days after YouTube announced it was also clarifying its policy around unoriginal content, including mass-produced and repetitive videos.

These types of content have become easier to produce with the assistance of AI technology.

Like YouTube, Meta assures users that it won’t penalize them for engaging with others’ content, such as making reaction videos, joining trends, or adding their own take.

Instead, the crackdown focuses on the reposting of other people’s content, whether by spam accounts or by accounts that falsely claim to be the original creator.

The company said accounts that repeatedly reuse others’ content will lose access to Facebook monetization programs for a period of time and will see reduced distribution of their posts.

Facebook will also reduce the distribution of duplicate videos to ensure that the original creator gets the credit and the views.

The company also disclosed that it is testing a system that adds links to duplicate videos, directing viewers to the original content.

The update arrives as Meta faces criticism from users across its platforms, including Instagram, over the wrongful and overly aggressive enforcement of its policies by automated systems.

A petition with nearly 30,000 signatures calls on Meta to fix the problem of wrongly disabled accounts and its lack of human support.

The issue has left users feeling abandoned and has hurt many small businesses. Despite attention from the press and prominent creators, Meta has yet to publicly address the problem.

While Meta’s latest enforcement effort primarily targets accounts that profit from others’ content, concerns about unoriginal content are growing.

As AI technology has proliferated, platforms have been flooded with “AI slop,” a term for low-quality media produced with generative AI.

On YouTube, for example, it’s easy to find an AI voice overlaid on photos, video clips, or other repurposed content, thanks to text-to-video AI tools.

While Meta’s update appears focused solely on reused content, its post hints that it may have AI slop in mind as well.

In a section offering “tips” for creating original content, Meta says creators shouldn’t simply “stitch together clips” or add their own watermark when using content from other sources.

Instead, they should focus on “authentic storytelling,” not short videos that offer little value.

Though Meta doesn’t explicitly say so, these kinds of unoriginal videos are also easier to make with AI tools. Low-quality videos often consist of a series of images or clips (real or AI-generated) paired with AI narration.

Meta also warns creators in the post against reusing content from other apps or sources, a longstanding policy.

It also notes that video captions should be high quality, which may mean cutting back on automated AI captions that creators leave unedited.

Meta says these changes will roll out gradually over the coming months to give Facebook creators time to adjust.

Creators who think their content isn’t being distributed can check the new post-level insights in Facebook’s Professional Dashboard to find out why.

Creators will also be able to see whether they’re at risk of content recommendation or monetization penalties from the Support home screen on their Page or professional profile.

Meta typically discloses information regarding its content takedowns in its quarterly Transparency Reports.

In the most recent quarter, Meta reported that fake accounts represented about 3% of Facebook’s worldwide monthly active users, and that it took action against 1 billion fake accounts between January and March 2025.

In the United States, Meta has recently stopped fact-checking content itself in favor of Community Notes, similar to those on X.

These notes allow users and contributors to determine whether posts comply with Meta’s Community Standards and are accurate.