What does Facebook do when you report someone?

Facebook gives users the ability to report other users or content that violates their Community Standards. When you report someone or something on Facebook, the report is sent to Facebook’s content moderation team for review. Here’s a quick overview of what happens:

  • You file a report through Facebook’s reporting system, specifying the content, account, or behavior you want to report.
  • The report goes to Facebook’s content moderation team (tens of thousands of human reviewers) for assessment.
  • The moderation team reviews the report to determine if it violates Facebook’s Community Standards.
  • If a violation is found, the moderation team may remove the content, disable the account, or take other actions based on the severity of the violation.
  • In some cases, no action is taken if a violation cannot be verified.
  • The user who submitted the report may receive a notification from Facebook about the outcome.

So in summary, reporting someone on Facebook triggers a human review process by Facebook’s content moderation team. If they find a policy violation, they will take action ranging from content removal to account disablement. But not all reports necessarily lead to enforcement action.

Why Does Facebook Allow Users to Report Others?

Facebook provides users with reporting tools for a few key reasons:

  • To maintain the safety and security of their platform – Reporting helps Facebook identify policy violations that put users at risk.
  • To enforce their Community Standards – Reporting helps Facebook enforce their rules on hate speech, harassment, nudity and other prohibited content.
  • To give users control – Reporting provides a mechanism for users to notify Facebook of issues and request intervention.
  • To improve their moderation – Reporting gives Facebook insights into issues arising in the user community.

Without user reports, Facebook would have a much harder time identifying and acting on content that violates their standards. User reporting helps Facebook moderate at the massive scale of its user base.

What Kinds of Things Can You Report on Facebook?

Facebook allows users to report a wide range of issues through its reporting system, including:

  • Harassing, threatening or bullying behavior
  • Hate speech, threats of violence or criminal behavior
  • Nudity or sexual content
  • Fake or impersonation accounts
  • Spam, scam or false advertising posts
  • Terrorist propaganda or recruitment
  • Self-harm content
  • Violence including animal abuse or human death
  • Sale of regulated goods like firearms or drugs

Essentially any content or behavior that seems to violate Facebook’s Community Standards can and should be reported through the platform’s reporting system. This empowers users to notify Facebook of issues that need attention.

How Do I File a Report on Facebook?

Reporting something on Facebook takes just a few clicks. Here are the basic steps:

  1. Go to the profile, page, or piece of content you want to report.
  2. Click the three-dot icon next to the content or in the upper right of a profile.
  3. Select “Report” or “Report Page” from the dropdown menu.
  4. Choose the option that best describes the issue and add any additional details.
  5. Click submit to file the report with Facebook for review.

You can report posts, comments, photos, videos, pages, profiles, groups and events. Facebook offers reporting options tailored to the specific type of content.

What Happens When You Report Someone on Facebook?

When you report a user or content on Facebook, here is a general overview of what happens behind the scenes:

  1. The report is entered into Facebook’s Community Operations queue for review.
  2. Content is assessed by a Facebook moderator against their Community Standards.
  3. If the content is deemed to violate standards, the moderator takes enforcement action.
  4. That enforcement may include removing the content, restricting posting or commenting abilities, temporarily disabling the account, or, depending on severity, permanently disabling it.
  5. The user who filed the report may then receive a notification that action was taken.
  6. If no violation is found, the reporting user is notified that no action will be taken.

So in summary, a report triggers human review by Facebook’s moderation team, which takes action if it finds that a policy was violated. The reporting user is then notified of the outcome.

What Types of Enforcement Action Does Facebook Take?

When content or an account is found to violate Facebook’s policies after a user report, Facebook has a range of enforcement actions they may take. These include:

  • Removing the content – This could be taking down an individual post, photo, video or comment.
  • Disabling account functionality – Facebook may disable certain features like posting or commenting abilities for a set period of time.
  • Requiring account verification – For repeat violators, Facebook may require the user to submit a government ID and prove account ownership.
  • Temporary disabling of the account – This blocks all access to the account for a set period, from 24 hours up to 30 days.
  • Permanent disabling of the account – For severe or repeat violations, Facebook may permanently disable the account’s access and functionality.

Facebook escalates enforcement actions based on the severity and frequency of violations. In most cases, the goal is to change behavior rather than immediately remove accounts. But for severe abuses, permanent disabling ensures the policy-violating behavior cannot continue.

What Happens When You Get Reported or Disabled on Facebook?

If you are on the receiving end of a report or your account gets disabled, here’s what generally happens:

  • You will receive a notification from Facebook detailing the policy violation and enforcement action being taken.
  • You have the option to appeal the action if you think it was taken in error.
  • The appeal is typically reviewed within 24 hours by a member of Facebook’s team.
  • If your appeal is successful, the enforcement action will be reversed. If not, the action will stand.
  • For account disabling, you have the option to request a review after a set time period, ranging from 24 hours to 30 days depending on severity.
  • If your account is permanently disabled, you can continue appealing the decision, but Facebook may reject these appeals if they have strong evidence supporting the permanent disabling.

So in essence, you will be notified of enforcement actions taken against you. You can appeal these actions and request reviews, but Facebook makes the final determination based on their standards.

What Happens if You Report Someone Anonymously?

Facebook does allow you to submit reports anonymously, without attaching your name or profile to the report. Here’s what happens when you report someone anonymously:

  • The report still goes into Facebook’s queue and will be reviewed by moderators.
  • The report may be given lower priority than a non-anonymous report, however.
  • Facebook can still take the same enforcement actions based on an anonymous report.
  • You will not receive any notifications about the outcome of the report since it is anonymous.
  • The person/content you reported will not know who filed the report.
  • Facebook emphasizes that anonymous reporting should only be done for privacy reasons, not to harass others.

Anonymous reporting is intended for cases where the reporter fears retaliation or privacy violations if their identity is revealed. Facebook tries to prevent abuse of anonymous reporting, but still allows it so that sensitive issues can be reported.

What Types of Reports May Be Ignored or Rejected by Facebook?

While Facebook encourages reporting content that seems to violate policies, some types of reports are likely to be ignored or rejected upon review. These include:

  • False or insincere reports – Facebook may reject reports intended solely to harass others or silence differing opinions rather than flag actual violations.
  • Differing opinions – Having a different opinion from someone is not itself a violation of Facebook policy.
  • Private disputes – Facebook is unlikely to get involved in private disputes or drama between users.
  • Reporting old posts – Resurfacing and reporting old posts that now comply with current standards is often rejected.
  • Things Facebook doesn’t restrict – Some viewpoints, such as general pro- or anti-vaccination opinions, fall outside what Facebook chooses to moderate.
  • Posts you just don’t like – Simply not enjoying or liking a post is not grounds to report it.

Facebook is unlikely to take action on reports that reflect disputes between users, attempts to silence differing views, or moderation of topics outside their current standards.

What Should You Do if Someone Makes a False or Abusive Report Against You?

If someone makes an insincere, false or abusive report against you, here are some recommended steps:

  1. If possible, reach out directly to the person and ask them to withdraw the report.
  2. If Facebook takes action, appeal it thoroughly, citing evidence that contradicts the report.
  3. File your own report against the person who falsely reported you.
  4. Avoid retaliating, as you may then be subject to a counter-report.
  5. If your account is disabled, request a review as soon as the waiting period allows.
  6. Submit an unrelated support request as another way to get a human to review the disabling.

The best recourse is to communicate directly with the person who filed the false report and ask them to withdraw it. Failing that, contesting the report through Facebook’s appeals process and submitting your own report about the person’s abusive behavior may eventually resolve the situation.

What Steps Can You Take if Your Account Gets Disabled?

If your Facebook account ends up disabled, either temporarily or permanently, here are some steps to try:

  • First, read Facebook’s notice thoroughly to understand exactly why they disabled your account.
  • Calmly contest the disabling through Facebook’s appeals process, citing any evidence that contradicts their decision.
  • Wait the specified time period and request an additional review.
  • Submit a new request for review unrelated to the disabling reason (e.g. password reset).
  • If permanently disabled, continue appealing, waiting 30 days between each appeal.
  • Seek assistance through the contact options for disabled accounts provided in Facebook’s notice email.
  • Avoid opening a new account as Facebook may flag this as evasion of their enforcement action.

With patience and persistence, some disabling actions can be overturned or converted to temporary rather than permanent status. However, if evidence clearly shows a severe violation of their policies, reversal of the disabling is very unlikely.

Can You Sue Facebook for Disabling Your Account?

There are generally no grounds to sue Facebook for enforcing their platform policies and disabling your account – even permanently. Here are the key realities around suing Facebook over account disabling:

  • Facebook’s Terms of Service (formerly the Statement of Rights and Responsibilities) gives them the right to terminate accounts at their sole discretion.
  • Courts typically recognize Facebook’s rules as a binding contract users agree to.
  • Facebook as a private platform is not required by law to maintain your account.
  • Only if disabling reflected illegal discrimination might a lawsuit have grounds, and it would still be a challenging case.
  • Users have no property or due process rights to an account on Facebook’s private platform.
  • Filing a suit will almost certainly result in quick dismissal as courts consistently uphold Facebook’s rights here.

While users may be understandably upset over losing access to their account, Facebook’s legal right to enforce their posted rules is well established. Lawsuits attempting to force Facebook to maintain accounts almost never succeed.

Can a Facebook Account Be Disabled Forever?

Yes, Facebook can and does permanently disable accounts for severe or repeat violations of their policies. Here are some key facts around permanent disabling of an account:

  • Permanent disabling removes your access to the account and all content you posted, and prevents you from reactivating it.
  • Common reasons include major privacy violations, hate speech, bullying, and threatening real harm.
  • Accounts tied to actual criminal investigations or government requests may also get permanently disabled.
  • Once your appeal options are exhausted on a permanent disabling, there is little recourse to revive the account.
  • Creating a new account to evade the disabling may result in additional enforcement action.
  • While rare, permanent disabling is within Facebook’s rights – you agree to it as a condition of having an account.

Facebook will exercise permanent disabling only for the most severe abuses or at the request of law enforcement. But it is possible, so understanding their policies is important to maintaining access to your account.

Are There Any Alternatives to Reporting Someone on Facebook?

While reporting is the main mechanism Facebook provides for addressing concerns, there are a few additional options you can consider as alternatives in some cases:

  • Blocking the user – You can proactively block a harassing user from interacting with you.
  • Unfriending or unfollowing – Removing the friend connection or unfollowing the user removes their content from your feed.
  • Muting the user – You can mute an annoying user’s posts without them knowing.
  • Direct communication – Messaging the user your concerns may lead to resolution without getting Facebook involved.
  • Restricting audience – Adjusting your privacy settings can remove someone from seeing certain posts.

In many cases, direct communication or the use of blocking and privacy controls can manage difficult users without involving Facebook through a report. This allows you to curate your own experience.

Key Takeaways on Reporting Someone on Facebook

Here are some key tips to remember about what happens when you report someone on Facebook:

  • Reporting triggers human review by Facebook’s moderation team against their posted standards.
  • Content removal, account disabling, or no action may result from a report.
  • You can report a wide range of concerning content through Facebook’s reporting system.
  • Facebook acts more aggressively on reports regarding threats, violence, nudity or harassment.
  • Those reported can appeal enforcement actions taken by Facebook.
  • Repeated severe violations can lead to permanent disabling of accounts.
  • Lawsuits attempting to reverse Facebook moderation decisions almost always fail.

Understanding the reporting and moderation process on Facebook enables you to effectively notify them of issues while avoiding abuse of the system. Reporting should be thoughtful and reserved for clear violations to have the greatest impact. With great power comes great responsibility!

Conclusion

Facebook’s reporting system empowers users to notify them of potential violations of their Community Standards. This enables Facebook to investigate concerning content and accounts and take action when appropriate. Knowledge of what happens when you report someone on Facebook helps ensure you use this tool effectively. But it’s also important to keep in mind alternatives like blocking or unfollowing if you simply want to curate your own feed.