
Why does my Facebook say review requested?

If you’ve recently seen a message on Facebook that says “Review Requested,” it means that some content you posted, shared, or commented on has been flagged by other users and sent to Facebook’s content moderators for review. This review is to determine if the content violates Facebook’s Community Standards.

Common Reasons for Review Requested

There are a few common reasons why Facebook may flag content for review:

  • The content contains nudity, graphic violence, hate speech, or threats of harm.
  • The content is reported as false news, misinformation, or a scam.
  • The content violates copyrights or trademarks.
  • The content promotes regulated goods like firearms, alcohol, or pharmaceuticals.
  • The content is spammy or repetitive.

Facebook relies heavily on user reports to flag potentially problematic content. Sometimes users report content that doesn’t actually violate policies, but Facebook still reviews it just to be sure. So getting the Review Requested message doesn’t always mean you did something wrong.

What Happens During Review

When content gets flagged for review, here’s what happens behind the scenes at Facebook:

  1. The content is queued up to be examined by a content moderator.
  2. A moderator reviews the content to determine if it violates Facebook’s rules.
  3. If the content is found to be in violation, Facebook may remove or restrict it. You will get a notification explaining this.
  4. If the moderator determines the content is fine, no action is taken and the content stays up.

This review process usually takes less than a day, though it can take longer if there is a backlog.

What You Can Do

If you see the Review Requested message, here are some things you can do:

  • Check Your Notifications – Facebook will notify you if any action is taken on your content. Check your notifications to see any updates on the review status.
  • Edit the Content – You can delete or edit the content before a final decision is made to potentially avoid any restrictions.
  • Appeal Decisions – If your content gets removed and you think it was a mistake, you can submit an appeal to Facebook.
  • Be Careful Next Time – Think carefully about Facebook’s rules regarding hate speech, nudity, harassment, etc., before posting in the future.

Why Facebook Reviews Certain Content

Facebook has faced a lot of criticism for not doing enough to curb harmful content like fake news, election interference, hate speech, and livestreamed violence. In response, Facebook has hired thousands more moderators and expanded the types of content subject to review.

Here are some of the main reasons Facebook reviews certain types of content:

  • Hate Speech and Harassment – To protect marginalized groups from threats, intimidation, and abuse.
  • Graphic Violence – To avoid glorification of violence or videos that encourage criminal acts.
  • Terrorism and Extremism – To prevent use of the platform for radicalization and terrorism recruitment.
  • Misinformation – To stop viral spread of dangerous hoaxes and fake news, especially about elections, health, and public welfare.
  • Scams – To protect users from predatory scams, false advertising, and other deceptive practices.

Facebook’s review process aims to balance free speech with safety. They want to allow for a diversity of views, but prevent real-world harm. However, making these judgment calls is far from an exact science. Facebook will likely continue updating its policies as new concerns arise over its role in shaping public discourse.

Examples of Content Flagged for Review

Here are some examples of specific content that often gets flagged for review:

  • Posts with nudity, like breastfeeding photos or artistic nude images.
  • Comments containing racial slurs, sexist remarks, or threats of violence.
  • Videos promoting fraudulent get-rich-quick schemes or miracle health cures.
  • Pages spreading conspiracy theories like QAnon or anti-vaccine propaganda.
  • Groups that glorify hate groups or organized crime.
  • Events promoting illegal activity like street racing.

These types of content frequently generate user reports and escalate to official review by Facebook’s moderators. The end result can be removal of the content or banning of the user or group responsible, depending on the severity.

Who Reviews the Flagged Content?

Facebook employs thousands of content reviewers located around the world who examine flagged posts and make decisions based on Facebook’s Community Standards and other internal guidelines. According to Facebook:

  • 15,000+ content reviewers assess reports that come in globally every week.
  • Reviewers undergo extensive training on Facebook’s policies.
  • The team includes reviewers covering 50+ languages.
  • Additional staff focus on counterterrorism, child safety, and hate speech.

Facebook reviewers have backgrounds in fields like law enforcement, teaching, journalism, and social work to inform their content decisions. Reviewers have also grown more diverse along with Facebook’s global user base.

In addition to human reviewers, Facebook uses automated systems to scan for policy violations at scale, but humans typically make the final call on content removal and account restrictions. This helps provide nuance and cultural context to guide decisions.

What Factors Determine Content Removal?

When reviewing flagged content, Facebook considers a variety of factors to decide whether to allow it or take it down. These include:

  • Severity of policy violation
  • Context of the situation
  • Type of content (text, photos, video, etc.)
  • Post visibility and engagement
  • User history and prior violations
  • Danger of real-world harm
  • Applicable regional laws and norms

For example, explicit threats may be tolerated less than vague ones, and nudity in an artistic context may be treated differently than pornography. Reviewers take a case-by-case approach, considering both the letter and the spirit of Facebook’s standards.

What Happens if You Violate Policies?

If Facebook determines you have violated their content policies, here are some consequences you may face:

  • Content removal – Individual posts, photos, videos, comments, etc., deleted.
  • Account restrictions – Limits on posting, commenting, sharing for a period of time.
  • Page unpublished – Facebook and Instagram business pages made temporarily unavailable.
  • Account disabled – Accounts suspended either temporarily or permanently.

Less severe violations typically result in content takedowns or temporary restrictions. Repeated or more serious offenses can lead to extended suspensions or permanent bans from the platform.

Can You Appeal Enforcement Actions?

If Facebook restricts your profile or removes your content for a policy violation, you do have options to appeal these enforcement actions:

  • Appeal content removal – Request another review if you think the content was removed in error.
  • Appeal account restriction – Explain why a restriction was unjustified or should be lifted.
  • Request review of disabled profile – Ask for reinstatement of a disabled account, especially if permanent.

To submit appeals:

  • Go to the Facebook Help Center.
  • Find the option to “Report a Problem” or “Submit Appeal.”
  • Select the relevant form and provide details about the enforcement action.
  • Include an explanation of why the action was incorrect or excessive.

Appeals are reviewed by different content specialists who take another look at the context and make a final determination. The process typically takes a few days. But reinstatement of accounts is not guaranteed, especially for severe or repeat violations.

Preventing Content from Being Flagged

Here are some tips to help prevent your Facebook posts from getting flagged and requiring review:

  • Carefully read Facebook’s Community Standards to understand what is and isn’t allowed.
  • Avoid posting content that is gratuitously graphic, obscene, or violent.
  • Don’t make specific threats against others or post content promoting criminal behavior.
  • Fact check information and don’t spread misinformation or false news.
  • Give proper attribution when sharing others’ content and comply with copyright laws.
  • Steer clear of hate speech targeting protected groups based on their identity.

Keep in mind cultural norms and sensitivities too. What you see as harmless humor may be offensive to others. If you ever have doubts about certain content, it’s safest not to post it.

Dealing with False or Unfair Reports

Sometimes people report content just to harass the poster or silence opinions they disagree with. This can result in frustrating reviews of perfectly appropriate content. Here are tips for dealing with potentially false or unfair reports against you:

  • If you know who reported you, politely reach out to understand their concerns.
  • Ask friends/followers to vouch for your content if they feel it was reported unfairly.
  • Provide context on aspects of the content that may have been misunderstood.
  • Highlight your past record of compliance with Facebook’s rules.
  • Offer to make reasonable edits to the content to address objections.
  • Submit an appeal if any enforcement action is taken, explaining why the reports were false.

Maintaining your reputation as a responsible user following Facebook’s guidelines can help combat any false reporting against you. Having allies who can validate your content helps too. With persistence and patience, you can often resolve unfair review requests in your favor.

Changes to Facebook Review Process Over Time

Facebook’s content review process has evolved significantly over the years:

  • 2004-2010 – Minimal moderation relying on user reports and some automation.
  • 2011-2016 – Hired thousands of reviewers to handle reviews mainly for nudity, harassment, threats.
  • 2016-2018 – Expanded hate speech rules and terrorism-related content reviews.
  • 2018-Present – Additional focus on election integrity, misinformation, data privacy, live video.

Recent changes include:

  • Banning certain forms of extremism like white nationalism.
  • Labeling and fact checking false news and misinformation.
  • Restricting reach of groups promoting harmful conspiracy theories.
  • Increasing transparency around advertising policies and political ad spending.

As Facebook usage patterns and societal concerns continue to evolve, we can expect ongoing adjustments to the company’s content moderation practices, policies, and enforcement actions.

Controversies Related to Facebook Reviews

Facebook has faced controversies regarding its content reviews including:

  • Accusations of Bias – Conservatives often accuse Facebook of unfairly targeting right-leaning content for removal or fact checks.
  • Moderator PTSD – Reviewing traumatic content like murder and child abuse takes a mental toll on moderators.
  • Role in Politics – Facebook struggled to balance free speech and misinformation during the 2016 and 2020 US elections.
  • Hate Speech Definition – How to define hate speech related to nationalism, racism, and bigotry remains hotly debated.
  • Too Much Power – Critics argue Facebook has too much control in deciding what content stays up or gets taken down.

Facebook points to efforts to increase transparency, accountability, and oversight of its review process. But major challenges remain in applying policies fairly at Facebook’s scale. Content decisions often involve complex tradeoffs with winners and losers depending on the outcome.

Conclusion

In summary, the “review requested” message on Facebook indicates some of your content was flagged by users and submitted for review by Facebook’s content moderators based on potential violations of their Community Standards. Facebook tries to balance enabling free expression with enforcing responsible posting behavior on its platform. While the review process is imperfect and controversial at times, many users agree that some moderation is necessary to curb real harms like hate, violence, and misinformation. Understanding Facebook’s content policies, thinking carefully before posting, and appealing actions you believe were incorrect can help minimize problems stemming from the review process. With vigilance and constructive feedback to Facebook, we can continue moving towards an online world that is open and safe for all.