
How do I report Community Standards violations on Facebook?

Facebook has established Community Standards to help ensure safety, privacy, dignity and authenticity on the platform. These standards outline what is and isn’t allowed on Facebook, and they apply to everyone who uses Facebook. If you see something on Facebook that you believe violates these Community Standards, you can report it to Facebook for review.

How to Report Content That Violates Facebook’s Community Standards

There are a few ways to report potentially violating content that you come across on Facebook:

  1. On a post: Click the three dots in the upper right corner and select “Find Support or Report Post.”
  2. On a photo: Click the three dots in the upper right corner and select “Find Support or Report Photo.”
  3. On a page: Click the three dots at the top and select “Find Support or Report Page.”
  4. On a profile: Click the three dots at the top and select “Find Support or Report Profile.”
  5. On a group: Click the three dots next to the group name and select “Report Group.”
  6. On a comment: Hover over the comment, click the three dots that appear, and select “Find Support or Report Comment.”
  7. On a Marketplace listing: Click the three dots and select “Report Listing.”
  8. On a Fundraiser: Click the three dots and select “Report Fundraiser.”
  9. On an ad: Click the three dots and select “Hide Ad” or “Report Ad.”

This will bring up a menu where you can select which Community Standard you believe the content violates. You’ll also have the option to block the user or page responsible for the violating content.

Community Standards Facebook Upholds

Facebook’s Community Standards cover the following areas:

  • Violence and Criminal Behavior
  • Safety
  • Objectionable Content
  • Integrity and Authenticity
  • Respecting Intellectual Property
  • Content-Related Requests
  • Suicide and Self-Injury
  • Sensitive Content
  • Sexual Exploitation of Adults
  • Sexual Solicitation

When you report violating content, you’ll select which of these categories best represents the issue. Some key things that Facebook does not allow include:

  • Bullying or harassment
  • Hate speech
  • Graphic violence
  • Nudity or sexual activity
  • Spam
  • Terrorist propaganda
  • Misinformation or fake news
  • Scams
  • Unauthorized sales of regulated goods

What Happens When You Report Content

Once you submit a report, Facebook reviews the content in question to determine if it violates their Community Standards:

  1. The content is first reviewed by Facebook’s automated systems.
  2. Anything identified as potentially violating is escalated to content moderators.
  3. Facebook prioritizes reports based on their severity, so more harmful content is reviewed first.
  4. If the content is found to violate standards, Facebook removes it and takes appropriate action against the account responsible.
  5. For violations that don’t warrant account deletion, Facebook issues a warning and may limit account privileges temporarily.
  6. Facebook informs you when action is taken on content you reported.

Keep in mind that just because something is offensive does not necessarily mean it violates Facebook’s rules. Differences of opinion, satire and political discourse are generally allowed. Facebook aims to encourage expression while also providing a safe environment.

How to Report Violating Content Effectively

To help Facebook most effectively review reports, keep these tips in mind when reporting violating content or accounts:

  • Prioritize dangerous content – Reports involving threats of violence, self-injury, bullying or sexual exploitation are the most urgent for moderators to review.
  • Include details – Provide specific details on why you are reporting something in the description box.
  • Link to content – If reporting a photo, video or comment, include the direct link to the piece of content in your report.
  • Report once – Do not submit multiple reports on the same piece of content.
  • Check Community Standards – Review Facebook’s standards to ensure what you are reporting violates their rules.

What To Do If Your Content Is Removed

If content you posted gets removed by Facebook, here are some steps you can take:

  1. Check the Community Standards to understand what rule your post violated.
  2. Review Facebook’s decision – you can request another look at the removal.
  3. Edit the post to remove the violating material, then re-share it.
  4. Share feedback with Facebook if you believe the removal was a mistake.
  5. Reach out to Facebook support if your account faces restrictions due to multiple violations.

Examples of Violations by Community Standard

The examples below show the kinds of content violations that fall under each of Facebook’s Community Standards categories, so you know what to look out for and report if you encounter it on the platform:

  • Violence and Criminal Behavior – Terrorist propaganda, organized hate, assault, murder
  • Safety – Bullying and harassment, criminal sexual solicitation, fraud
  • Objectionable Content – Hate speech, graphic violence, adult nudity
  • Integrity and Authenticity – Misinformation, spam, fake accounts
  • Respecting Intellectual Property – Copyright or trademark infringement
  • Content-Related Requests – Posts threatening physical harm, hate speech
  • Suicide and Self-Injury – Promoting suicide, eating disorders, self-mutilation
  • Sensitive Content – Graphic violence, nudity, pornography
  • Sexual Exploitation of Adults – Non-consensual intimate imagery, sex trafficking
  • Sexual Solicitation – Inappropriate sexual requests to minors

Conclusion

Facebook provides an important platform for expression and connection, but it also relies on its community members to report content and accounts that violate its standards. If you come across something concerning on Facebook, submit a report to bring it to Facebook’s attention. Understanding Facebook’s policies, providing context in your reports and focusing on dangerous content all help improve the effectiveness of the reporting process.
