How do I report a post on Facebook and get it removed?

Facebook provides users with the ability to report posts that violate its Community Standards. When you report a post, it alerts Facebook’s content moderation team to review the content and determine if it should be removed.

There are a few steps involved in reporting a post on Facebook:

  1. Locate the post you want to report.
  2. Click on the three dots in the upper right corner of the post.
  3. Select “Report Post” or “Find Support or Report Post.”
  4. Choose the reason for reporting the post from the options provided.
  5. Provide any additional details about why you are reporting the post.
  6. Click “Submit” or “Send Report.”

Facebook’s content moderators will then review your report to decide if the post violates any of Facebook’s policies. If it does, Facebook will remove the post.

When should you report a post on Facebook?

There are a few main reasons you may want to report a post on Facebook:

  • The post contains nudity or sexual content.
  • It includes hate speech, threats of violence, or bullying.
  • It contains graphic violence or animal abuse.
  • It’s spam or a scam.
  • The post contains false information or fake news.
  • It infringes on your intellectual property rights.
  • The account posting it is fake or impersonating someone.

Essentially, any post that violates Facebook’s Community Standards should be reported. These standards prohibit content like hate speech, graphic violence, nudity, and harassment. Reporting posts that violate these rules helps keep Facebook safe and welcoming for everyone.

How to locate the post you want to report

To report a post on Facebook, you first need to locate the specific post. Here are some tips on finding the post:

  • Go to the person’s profile page. On desktop, click “Posts”; on mobile, tap “Posts” under their cover photo to see everything they have shared.
  • If it was posted in a Group, go to that Group page and search for the post.
  • If you saw it in your News Feed, you can usually report it right from the feed, or click the post’s timestamp to open the original post and report it there.
  • Use Facebook’s search bar to search for keywords, phrases or the name of the person who posted it.
  • Check your Activity Log for interactions with the post and click “See Post” to view the original.

Once you’ve located the specific post, you can move on to reporting it properly using Facebook’s tools.

How to click the three dots to access the report option

After locating the post you want to report, the next step is accessing the report option. On Facebook, you can report posts by clicking on the three dots found in the upper right hand corner of any post.

Here is how to access the three dots menu:

  1. Click anywhere on the post to open it fully.
  2. Look in the top right corner of the post.
  3. You should see an icon of three dots in a row (an ellipsis). This is referred to as the “three dots” menu.
  4. Click directly on the three dots.
  5. A drop down menu will appear with various options.
  6. “Report Post” or “Find Support or Report Post” should be one of the options.

The exact styling of the three dots icon varies between desktop and mobile, but it always sits in the top corner of the post. Hovering over or tapping the icon makes the drop-down menu appear.

This menu provides quick access to all the possible actions you can take on a Facebook post, including reporting it. The report option may also be listed as “Give Feedback on This Post” depending on the type of post.

Choosing a reason for reporting from the available options

When you select “Report Post” or “Find Support or Report Post”, you will be presented with a list of options for why you are reporting the post.

These options include:

  • It’s annoying or not interesting
  • Spam
  • Scam
  • False news
  • False information that could contribute to voter suppression
  • Hate speech or symbols
  • Violence or dangerous organizations
  • Bullying
  • Harassment
  • Threats
  • Sexual exploitation
  • Regulated goods
  • Unauthorized sales
  • Something else

Carefully review the list and select the option that best represents why you are reporting the post. For example, choose “Hate speech or symbols” for racist, sexist or discriminatory posts. Select “False information that could contribute to voter suppression” for misinformation about voting.

Choosing the most accurate reporting category will help Facebook’s moderators better understand the issue and take appropriate action.

Providing additional details on why you are reporting the post

After selecting the reporting category, Facebook will give you the option to provide additional details on why you are reporting the post.

You should always take advantage of this option and include as many specifics as possible, such as:

  • Screenshots or links showing the concerning content
  • Names or usernames of people involved
  • Descriptions of harassing or threatening behavior
  • Background on a scam or false information being spread
  • Dates when the post was shared or incident occurred
  • Sections of text that violate policies

The more context and evidence you can provide, the easier it will be for Facebook’s reviewers to understand the problem and take action. Even if the content seems obviously bad, include details on exactly what Community Standards it violates.

How Facebook reviews reported posts

Once you complete the reporting process, here is what happens behind the scenes:

  1. Your report is added to a queue for Facebook’s content moderation team.
  2. Moderators review the post using Facebook’s Community Standards as a guideline.
  3. If they decide the post violates standards, they will remove (take down) the post.
  4. In some cases, they may also disable the account that posted it.
  5. You will get a notification letting you know the result of your report.

Moderators try to review all reports within 24 hours. However, it may take longer depending on how many reports Facebook receives.

Posts with the most serious violations, such as terrorism, child exploitation, or imminent real-world harm, get prioritized first. Other reports are dealt with in the order they are received.
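Facebook’s internal review systems are proprietary, but the prioritization described above (severe reports reviewed first, everything else in arrival order) can be sketched as a toy priority queue. Every name here, including `ReportQueue`, the category labels, and the severity ranks, is hypothetical and invented purely for illustration:

```python
import heapq
import itertools

# Toy severity ranks: lower number = reviewed sooner.
# Category names are illustrative, loosely based on the article's list.
SEVERITY = {
    "child_exploitation": 0,
    "terrorism": 0,
    "imminent_harm": 0,
    "hate_speech": 1,
    "harassment": 1,
    "spam": 2,
    "other": 3,
}

class ReportQueue:
    """Toy model: serious violations first, then first-come first-served."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker preserves arrival order

    def submit(self, post_id, category):
        severity = SEVERITY.get(category, SEVERITY["other"])
        heapq.heappush(self._heap, (severity, next(self._counter), post_id, category))

    def next_report(self):
        """Return the (post_id, category) that should be reviewed next."""
        if not self._heap:
            return None
        _, _, post_id, category = heapq.heappop(self._heap)
        return post_id, category

q = ReportQueue()
q.submit("post-1", "spam")
q.submit("post-2", "terrorism")
q.submit("post-3", "spam")
print(q.next_report())  # ('post-2', 'terrorism'): severe reports jump the queue
```

The `(severity, counter)` tuple ordering is what makes equally severe reports come out in the order they were received, matching the article’s description.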

What happens when a post gets removed

If Facebook’s moderators decide a reported post violates its policies, they will remove the post, and it will no longer be visible to other people on Facebook.

Specifically, a removed post:

  • Disappears from the Facebook site and apps for other users
  • Is no longer included in search results
  • Cannot be shared or interacted with
  • Remains visible only to the person who posted it
  • Shows that person a message saying it was removed for violating standards

The original poster may get an email or notification explaining why their post was removed. They have the option to appeal the decision if they disagree that the post violated standards.

Posts may also be deleted by the person who shared them. A deleted post functions the same as a removed one: it becomes inaccessible on Facebook. The difference is that the original poster chose to delete it themselves.

What to do if the post is not removed

In some cases, Facebook may decide not to remove content you reported. The main reasons are:

  • After reviewing the post more closely, they determine it does not actually violate any policies.
  • The post falls into a gray area and Facebook wants to allow freedom of expression.
  • They do not have enough context or details to make a judgment call.

If you feel Facebook made a mistake by not taking action on your report, you have a few options:

  1. Click “Request Review” on the notification saying no action was taken. This prompts moderators to re-review the post.
  2. Report the post again with more details explaining why you think it violates policies.
  3. Direct message or email the Page or person who posted it and explain your concerns about the content.
  4. Block the account posting violating content so you no longer see what they share.

Providing additional context, evidence, and reasoning in a respectful way can sometimes lead to the content being removed if it does in fact violate standards. In other cases, you may need to unfollow or block the poster if Facebook leaves the content up.

Steps to report a post on Facebook

Here is a quick recap of the steps involved in reporting a post on Facebook:

  1. Locate the post – Search for the post on a Page, in a Group or on your News Feed. You can use Facebook search to find keywords or phrases in posts.
  2. Open the three dots menu – Click the three dots icon in the upper right corner of the post and select “Report post” or “Find Support or Report Post.”
  3. Choose a reporting category – Review the options and select the one that best represents your reason for reporting.
  4. Give details – Use the text box to provide additional information to moderators on why the post violates standards.
  5. Submit the report – Click submit or send to officially flag the post for Facebook to review.
  6. Check notifications – Watch for a notification telling you if moderators removed the post based on your report.

Reporting posts is an important way to help keep Facebook safe and limit the spread of harmful content. Following Facebook’s reporting guidelines gives moderators the details they need to properly enforce standards.

How to report a Facebook profile

In addition to reporting individual posts, you can also report entire Facebook profiles that violate Community Standards. Here’s how to report a profile:

  1. Go to the profile page you want to report. For friends, click on their name in your friends list. For public figures, go to their official Page.
  2. Click on the three dots at the upper right corner of their cover photo.
  3. Select “Report” from the dropdown menu.
  4. Choose from categories like “pretending to be someone,” “harassment” or “inappropriate Page.”
  5. Explain in detail which policies the profile violates or what harassing behavior it has shown.
  6. Click submit to send the report for review.

Profiles reported this way are reviewed by Facebook’s Community Operations team in the same way as reported posts. If they find violations, Facebook may disable the profile.

Reporting profiles that impersonate others, harass people or violate privacy laws helps keep bad actors off Facebook. Make sure to include any supporting details you have in the report.

How to report Facebook events, groups, or pages

Inappropriate events, groups and pages can also be reported to Facebook for review using these steps:

To report a Facebook event

  1. Go to the event page
  2. Click on the three dots next to the event name
  3. Select “Report” from the menu
  4. Choose a category that fits the issue, like “prohibited goods” for an event promoting illegal drug sales
  5. Add any details in the text box
  6. Click submit

To report a Facebook group

  1. Go to the group page
  2. Click on the three dots next to the group name
  3. Select “Report Group”
  4. Pick a reason like hate speech, harassment, etc.
  5. Explain the issue in the text box
  6. Click “Submit Report”

To report a Facebook page

  1. Go to the page you want to report
  2. Click on the three dots next to the page name
  3. Choose “Report Page”
  4. Select a reason like “unauthorized sales”
  5. Give details on the problem
  6. Click “Report Page” to submit

Reporting inappropriate events, groups and pages is important for maintaining the integrity of the Facebook community.

You may have to block or unfollow violating accounts

If Facebook decides not to remove content you reported or to disable a violating profile, you may need to take matters into your own hands by blocking or unfollowing those accounts. This prevents the unwanted content from appearing in your feed.

  • To block someone, click on the three dots at the top right of their post or profile and select “Block.”
  • To unfollow someone without blocking them, hover over “Following” on their profile and click “Unfollow.”

While reporting gives Facebook a chance to review content, blocking and unfollowing immediately remove the content from your view. Use these tools as needed to control what you see in your feed.

Conclusion

Reporting posts, profiles and other content is vital to keeping Facebook a safe place for everyone. If you see something that violates Facebook’s rules or harasses others, take the time to properly report it using the detailed steps outlined here. The more context you provide, the more likely moderators can understand the problem and take action, including removing posts or disabling accounts if needed. With billions of people using Facebook, we all must do our part.