Are Facebook moderators traumatized?

Facebook relies on an army of content moderators to review posts and decide what content violates policies against hate speech, violence, nudity, and other problematic material. These moderators see some of the worst content on the internet as part of their jobs. There have been reports that this constant exposure to disturbing and potentially traumatizing content has led to mental health issues for moderators. This article will examine whether the claims that Facebook moderators are traumatized by their work are true.

What do Facebook moderators do?

Facebook moderators review posts that have been flagged as potentially violating Facebook’s content policies. This can include posts with hate speech, graphic violence, adult content, and more. Moderators look at the post and surrounding context and decide whether to leave it up, take it down, or escalate it for further review.

Some key facts about Facebook moderators:

  • There are around 15,000 moderators reviewing content for Facebook
  • Many work for third-party contracting firms like Cognizant and Accenture, not directly for Facebook
  • They may review thousands of potentially rule-breaking posts per day
  • Turnover rates among moderators are high

It is an extremely demanding job that requires moderators to look at very disturbing content regularly while making quick judgment calls on whether that content violates policies.

What kind of disturbing content do moderators see?

Facebook moderators are exposed to some of the most graphic, violent, hateful, and disturbing content on the platform. This can include:

  • Violent deaths and injuries
  • Physical and sexual abuse of children
  • Animal cruelty and torture
  • Graphic pornography and nudity
  • Hate speech and threats against protected groups
  • Terrorist propaganda and recruitment videos

Moderators may have to review videos and images of violence, abuse, and other traumatic content as part of deciding whether it violates Facebook’s rules. Even just reading violent or hateful text posts can take an emotional toll over time.

What are the mental health impacts on moderators?

Many moderators have reported suffering mental health consequences from constant exposure to graphic and disturbing content. Some of the mental health issues experienced include:

  • Post-Traumatic Stress Disorder (PTSD) – Moderators can experience trauma symptoms such as flashbacks, nightmares, anxiety attacks, and dissociation.
  • Depression – Constant immersion in negative environments can lead to feelings of hopelessness, sadness, lack of motivation, and other depression symptoms.
  • Sleep disorders – Many moderators suffer from insomnia after spending all day looking at disturbing content.
  • Addiction – Some moderators turn to drugs, alcohol, and other addictive behaviors to cope with the stress.
  • Suicidal thoughts – There have been reports of moderators experiencing suicidal ideation and suicide attempts.

These mental health consequences stem directly from the working conditions and extreme content moderators have to handle. Without proper support, the job can take a significant psychological toll.

What steps has Facebook taken?

After reports surfaced about moderators’ mental health, Facebook took some steps to improve conditions, including:

  • Increasing wages and adding more support staff
  • Providing more counseling and mental health support
  • Reducing the amount of graphic imagery moderators have to see in one sitting
  • Giving moderators more control over exposures, such as blurring images

However, many moderators still feel that Facebook needs to do much more to address the root causes behind their trauma and make meaningful improvements to health and safety practices. More transparency and accountability are needed.

What are moderators calling for?

Moderators and their advocates have made several demands for how Facebook can better address their wellbeing:

  • Form an independent ethical oversight board to hold Facebook accountable
  • Limit shifts involving exposure to graphic content to 4 hours
  • Provide unlimited access to mental health support with trauma-informed care
  • Increase wages and improve benefits like vacation time
  • Allow moderators to permanently opt out of seeing certain content
  • Publicly report on moderator mental health and company improvements

Implementing changes like these could significantly improve working conditions, morale, and overall mental health. But progress has been slow so far.

Examples of moderator stories

To illustrate the toll of the job, here are some examples of real moderators’ experiences:

Selena Scola – Diagnosed with PTSD

Selena Scola worked as a Facebook moderator for 9 months before being fired. She reviewed thousands of disturbing images and videos per day and began experiencing panic attacks, insomnia, depression, and PTSD symptoms. Her mental health deteriorated to the point that she brought a lawsuit against Facebook and the contractor she worked for over the lack of mental health support. Facebook tried to argue that her PTSD was pre-existing, which Scola strongly disputed. The case was eventually settled out of court.

Chris Gray – “No amount of money can make up for that kind of damage”

Gray worked as a Facebook moderator in Ireland for 6 months before quitting due to the mental health impacts. He had no prior mental health issues but began having panic attacks and suicidal thoughts, which he attributes directly to the working conditions and extreme content. He said no support was provided and that his employer tried to downplay the issue, describing content moderators as “resilient” to viewing graphic material. After quitting, Gray became an activist pushing for tech companies and governments to address moderator mental health.

Mitch – Constant nightmares

Mitch (last name withheld) worked for a Facebook contractor as a moderator for over a year. He described having constant nightmares related to the disturbing posts he had to review, including seeing images of dead bodies when he closed his eyes. He said he didn’t receive any warnings or preparation for the nature of the content. He also wasn’t allowed to talk to coworkers or friends/family about the specifics of the job. He suffered panic attacks, paranoia about being harmed, and difficulty focusing.

Research studies on moderator mental health

In addition to anecdotal reports, several research studies have investigated the mental health impacts on content moderators:

University of California, Berkeley (2020)

  • Surveyed 121 Facebook moderators working for Accenture
  • Found very high rates of depression (37%), anxiety (50%), and PTSD symptoms (18%)
  • 12% had suicidal thoughts in the past month
  • Low job satisfaction

Keegan Hankes, Data & Society (2018)

  • Interviews with 17 current and former moderators
  • Most reported psychological damage, depression, and trauma symptoms
  • Participants felt social media companies hid working conditions

Leticia Bode et al., Georgetown (2020)

  • Measured psychological conditions among 114 moderators
  • Found more emotional exhaustion, secondary trauma, and depression compared to the general population
  • Linked outcomes to the volume of disturbing imagery seen and unrealistic expectations

These studies demonstrate that many moderators suffer adverse mental health effects directly related to their work exposure. The rates of PTSD, depression, and anxiety far exceed general population levels.

Is Facebook doing enough?

While Facebook has taken some positive steps, most experts agree there is still significant room for improvement when it comes to supporting moderator mental health. Ongoing criticisms include:

  • Mental health resources are inadequate and hard to access
  • Provided therapists often lack training in trauma-informed care
  • Most initiatives focus on treating issues rather than preventing them
  • Problematic quota expectations for reviews remain in place
  • Safety measures inconsistently implemented across regions/contractors
  • Lack of full transparency around wellbeing and safety practices

Facebook has also strongly resisted attempts to allow independent auditing of its moderator support systems. More external oversight is needed.

Overall, while the situation has improved slightly, Facebook moderators continue facing mentally damaging conditions without enough protection or support from the company that profits off their labor. Much more progress is still required to ethically address moderator wellbeing.

What are other tech companies doing?

Facebook is not alone in employing content moderators whose work can take a mental health toll. Here is a brief overview of how other major tech companies compare in protecting moderator wellbeing:

Company – Key Measures Taken

  • YouTube – Limiting consecutive viewing of graphic material to a maximum of 4 hours; providing wellbeing guides and resiliency programs; increasing mental health staffing
  • Twitter – Enhanced counseling and wellness benefits; allowing moderators to step away from content at any time
  • Reddit – No graphic content quotas; anonymous group chat support sessions
  • TikTok – Restricting viewing time of distressing videos; partnering with mental health organizations

While YouTube, Twitter, Reddit, and others have made improvements, critics argue that tech companies overall still lag behind professions with high trauma exposure, such as counseling and emergency response, when it comes to protecting mental health.

What more should be done?

There are several steps that Facebook, government regulators, and the tech industry as a whole can take to better protect moderator mental health:

  • Form independent ethical review boards to audit and oversee companies’ wellbeing practices.
  • Pass laws to classify content moderators as high-risk workers requiring additional safeguards and support.
  • Limit consecutive time spent viewing graphic content to a maximum of 2-4 hours per day.
  • Hire mental health professionals with trauma-care expertise to support moderators.
  • Provide unlimited paid leave time for moderators needing mental health breaks.
  • Allow moderators to permanently opt out of viewing specific types of disturbing content.
  • Fund research by academics and non-profits into best practices for mitigating psychological harms.

Technology companies bear much of the responsibility for addressing this issue because they profit directly from the labor of content moderators. But government oversight will likely be needed as well to mandate meaningful reforms industry-wide.

Conclusion

In summary, the evidence strongly suggests that Facebook content moderators are at high risk of developing psychological trauma, PTSD, depression, and other issues directly related to their exposure to graphic and disturbing content. While Facebook has taken some positive steps, much more remains to be done to truly protect moderator mental health and prevent harm. Comprehensive solutions will require Facebook and the entire tech industry to make major reforms, potentially aided by regulation. Without proper support, content moderation cannot be considered ethical labor. Companies’ profits should not come at the cost of employees’ mental wellbeing. More progress is urgently needed to address this crisis among tech’s vulnerable hidden workforce.