Does Facebook hire moderators?

Yes, Facebook does hire content moderators to review posts, photos, videos, and other content on its platforms, including Facebook and Instagram. Moderators help enforce Facebook’s Community Standards by identifying and removing content that violates those policies.

Why does Facebook hire moderators?

Facebook hires moderators because with over 3 billion monthly active users across its platforms, it cannot rely on automated systems alone to monitor all user-generated content. Human review is needed to provide nuance in evaluating the context of posts and to make complex judgment calls. Moderators play a crucial role in keeping Facebook’s platforms safe and promoting healthy communities.

Some key reasons Facebook employs content moderators include:

  • Volume of content – With billions of users posting status updates, photos, videos, comments, and more every day, automated systems alone cannot reliably assess everything, so human review is needed to fill the gaps.
  • Nuance – Automated systems have limitations in understanding context and nuance in user posts. Human moderators are needed to evaluate subtle meanings.
  • Complex judgments – Decisions on whether content violates policies or should be allowed often require nuanced and subjective judgments. Humans are needed for complex cases.
  • Local laws and norms – Moderators with local knowledge help enforce policies in line with local laws and cultural norms across different countries.
  • Evolving policies – As Facebook introduces new policies or updates existing ones, human reviewers help provide feedback and ensure enforcement matches policy intent.

By employing a global team of moderators with diverse backgrounds and language abilities, Facebook is able to keep better pace with the vast amount of content generated across its platforms worldwide.

What are the roles and responsibilities of Facebook moderators?

Facebook content moderators are responsible for reviewing posts, photos, videos, and other content reported by users as violating policies. Their main roles include:

  • Reviewing content flagged as potentially violating Facebook’s Community Standards or regional laws.
  • Making judgment calls on whether flagged content should be removed or left up based on Facebook’s policies.
  • Deleting content confirmed to violate Facebook’s rules, such as hate speech, nudity, harassment, terrorism, and graphic violence.
  • Escalating complex cases to specialized teams at Facebook for additional review.
  • Providing feedback to improve Facebook’s automated moderation systems.
  • Searching for policy-violating content that may not have been reported yet.

In addition to evaluating reported content, moderators may also review appeals from users whose content was taken down, assessing whether it was removed in error and should be restored.

Moderators work closely with Facebook policy teams to ensure they stay up-to-date on policy changes and how to enforce new or updated rules. They receive initial and ongoing training on content categories, policy nuances, and how to make consistent judgments.

What are Facebook’s moderator guidelines?

Facebook provides extensive guidelines and training materials to content moderators to help them accurately apply policies and rules. While the full guidelines are not publicly available, some key elements include:

  • Criteria for evaluating different content policy categories like violence, nudity, bullying, terrorism, and graphic content.
  • Guidance on how to judge context and intent when making decisions about content.
  • Culture-specific guidance to help moderators understand local laws, customs, and norms.
  • Decision trees that guide moderators through policy-based reasoning to a take-down or leave-up decision (a simplified sketch appears below).
  • Examples of violating and acceptable content to illustrate policy standards.
  • Requirements for consulting specialized teams on complex cases like newsworthy exceptions or celebrity accounts.

Moderators are expected to be well-versed in the guidelines and apply them consistently when making moderation decisions. They receive ongoing training and testing to ensure they maintain high standards over time.
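
The decision trees in these guidelines are not public, but their general shape can be pictured as a short series of policy questions that lead to an action. Below is a minimal, hypothetical sketch in Python; the categories, questions, and routing rules are illustrative assumptions, not Facebook’s actual guidelines.

    # Minimal, hypothetical policy decision tree. The categories and questions
    # below are illustrative assumptions; Facebook's real guidelines are not public.

    def moderate(post):
        """Walk a simplified decision tree and return 'remove', 'leave up', or 'escalate'."""
        category = post.get("reported_category")
        if category == "graphic_violence":
            if post.get("is_newsworthy"):
                # Newsworthy exceptions go to a specialized team, per the guidelines above.
                return "escalate"
            return "remove"
        if category == "bullying":
            # Context matters: content targeting another person is treated more strictly.
            return "remove" if post.get("targets_other_user") else "leave up"
        # Anything the tree does not cover goes to a specialist for review.
        return "escalate"

    # Example: a reported post that targets another user is removed.
    print(moderate({"reported_category": "bullying", "targets_other_user": True}))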

What tools do Facebook moderators use?

Facebook provides moderators with specialized tools and software to help them efficiently review high volumes of content and take action against policy violations. Some of the tools include:

  • Review platforms – Allow moderators to view reported posts, images, videos, comments, profiles, and more, and to take actions such as delete, ignore, escalate, or restore.
  • Policy guidance – Integrates Facebook’s guidelines, rules, and decision trees so moderators see relevant policy guidance in real time.
  • Queues – Reported content is triaged into categorized queues for moderators (e.g. a harassment queue, a graphic violence queue); a simple sketch of this routing appears below.
  • Search – Allows moderators to proactively search for potentially violating content using keywords.
  • Workflow automation – Streamlines repetitive tasks to help moderators work more efficiently.
  • Reporting dashboards – Provides metrics and reporting to track productivity, accuracy, and quality.

Having robust tools and stable systems optimized for the moderation workflow is essential for enabling Facebook’s human reviewers to handle huge volumes of content and act quickly against policy violations. The tools aim to maximize productivity while also providing consistency in applying standards.
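
As a rough illustration of the queue-based triage described above, here is a minimal Python sketch. The category names, queue names, and routing rule are assumptions for illustration only, not Facebook’s actual tooling.

    # Minimal sketch of category-based triage queues. Category and queue names
    # are illustrative assumptions, not Facebook's actual tooling.
    from collections import defaultdict, deque

    QUEUE_FOR_CATEGORY = {
        "harassment": "harassment_queue",
        "graphic_violence": "graphic_violence_queue",
        "nudity": "nudity_queue",
    }

    queues = defaultdict(deque)

    def triage(report):
        """Route a user report into the queue for its category, or a general queue."""
        queue_name = QUEUE_FOR_CATEGORY.get(report["category"], "general_queue")
        queues[queue_name].append(report)

    # Example: two reports land in different queues.
    triage({"id": 1, "category": "harassment"})
    triage({"id": 2, "category": "spam"})
    print({name: len(q) for name, q in queues.items()})
    # {'harassment_queue': 1, 'general_queue': 1}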

How are Facebook moderators trained?

Facebook provides extensive training to prepare moderators for reviewing content on its platforms:

  • Classroom training – Moderators undergo at least 2 weeks of in-person classroom training on policies, procedures, tools, and practice scenarios.
  • Culturalization training – Moderators receive training on local cultural norms of the market they will be moderating.
  • On-floor training – New moderators observe expert moderators on the review floor and receive coaching before working independently.
  • Quality assurance – Moderators get regular feedback on accuracy to prevent errors and ensure consistent understanding of policies.
  • Ongoing training – Moderators receive regular refresher training as policies and processes evolve.

In some cases, moderators may receive psychological training to help them handle traumatic content they may be exposed to on the job. The extensive training is aimed at setting moderators up for success before they start working independently. Ongoing training and calibration are also critical to maintain high standards over time.

What qualities make a successful Facebook moderator?

Some key qualities and skills of effective Facebook moderators include:

  • Detail-oriented – Able to carefully review nuanced content and make consistent policy-based judgments.
  • Decisive – Can make quick and firm decisions on whether content should be removed or left up.
  • Calm under pressure – Stays focused when handling large volumes of emotionally-charged content.
  • Communication skills – Can clearly escalate complex cases and provide suggestions to improve policies.
  • Cultural awareness – Sensitive to different cultural norms and local context when applying policies.
  • Collaborative – Works well with teammates and cross-functional teams.
  • Ethical and principled – Dedicated to protecting user safety and privacy.
  • Tech-savvy – Learns new tools and systems quickly.

The ideal moderator has a strong sense of responsibility, operational excellence, and empathy for protecting the Facebook community. Curiosity, resilience, and adaptability are also helpful traits in this rapidly evolving role.

What are the challenges of content moderation?

Facebook moderators face a number of challenges inherent to the role:

  • Volume – Moderating billions of posts across multiple platforms requires efficiency and rigorous quality control.
  • Emotional toll – Reviewing hate speech, violence, and other toxic content can negatively impact mental health.
  • Fast-evolving policies – Keeping up with frequent policy changes requires continuous learning agility.
  • Ambiguous content – Borderline content is open to interpretation and requires subjective judgment calls.
  • Mistakes and criticism – Intense public scrutiny of moderation decisions and pressure for accuracy.
  • Stress – Tight performance metrics, quotas, and productivity demands contribute to a high-pressure environment.

Ensuring moderator well-being, providing a supportive work environment, setting reasonable expectations, and allowing for human error all help avoid burnout in this emotionally demanding role.

How does Facebook support its moderators?

Recognizing the challenges of content moderation, Facebook provides support to moderators through:

  • Competitive pay and benefits – Moderators earn at least $15 per hour plus benefits like medical insurance.
  • Wellness resources – Access to counselors and coaching for emotional, mental, and physical health.
  • Shorter shifts – Maximum of 4 hours reviewing sensitive content like violence or nudity.
  • Regular breaks – Flexible but frequent breaks from reviewing content throughout the day.
  • Peer support – Groups and structured time for moderators to share difficult experiences.
  • Feedback channels – Ways for moderators to suggest improvements to policies, tools, or procedures.

While more can always be done, Facebook does seem to be investing in moderator care and burnout prevention as scrutiny of the role increases. Happy, healthy moderators ultimately create better outcomes for end users.

How many content moderators does Facebook employ?

Facebook employs over 15,000 content moderators, either directly or through vendor partners, according to its latest published figures from 2019. The workforce spans full-time employees, contractors, and outsourced vendors.

Some key facts on Facebook’s content moderator workforce:

  • Over 15,000 moderators globally as of 2019.
  • Moderators cover over 50 languages.
  • The workforce has doubled in size each of the past two years.
  • Largest moderation sites are in the U.S., India, Philippines, Ireland, and Poland.
  • 20% of moderators are full-time Facebook employees.
  • 80% are contractors or employed by outsourcing vendors.

Given Facebook’s frequent policy and product changes, the moderator workforce likely continues to expand each year. Investing in content review at this scale highlights Facebook’s recognition that both humans and technology are essential for community integrity.

Facebook Moderator Workforce Size (2019)

Classification             Size
Full-Time Employees        3,000+
Contractors                11,500+
Outsourced Vendor Roles    1,000+
Total Moderators           15,500+
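
A quick check of how the quoted 20% / 80% split follows from these 2019 figures (which are approximate, per the table above):

    # Quick arithmetic check of the approximate 2019 workforce figures above.
    workforce = {
        "full_time_employees": 3_000,
        "contractors": 11_500,
        "outsourced_vendor_roles": 1_000,
    }

    total = sum(workforce.values())
    full_time_share = workforce["full_time_employees"] / total
    contract_share = 1 - full_time_share

    print(total)                     # 15500
    print(f"{full_time_share:.0%}")  # 19%, quoted as roughly 20%
    print(f"{contract_share:.0%}")   # 81%, quoted as roughly 80%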

Where are Facebook moderators located?

Facebook employs content moderators globally across its offices and vendor sites to provide 24/7 coverage across time zones and local language expertise. Some of Facebook’s largest moderation sites include:

  • United States – New York, Phoenix, Austin, Seattle, Washington DC
  • Europe – Dublin, Warsaw, Barcelona, Berlin, Paris
  • Asia – Hyderabad, Gurgaon, Singapore
  • Latin America – Mexico City, Bogota

Smaller moderation teams may also be based in Canada, Turkey, Italy, Kenya, and other countries. Having global sites enables Facebook moderators to cover more languages, cultural contexts, and time zones.

It also allows content to be reviewed in the region it originated from. This helps pick up on local slang, trending memes, breaking news events, and personalities influencing discourse. Moderators familiar with local culture generally make better judgment calls aligned with community norms.

What is Facebook’s hiring process for moderators?

Facebook has a selective, multi-step hiring process to onboard qualified moderators:

  1. Prescreening – Candidates complete an application assessing basic qualifications, languages, and communication abilities.
  2. Testing – Applicants who pass prescreening complete linguistic, cultural, and policy comprehension tests.
  3. Interview – Shortlisted candidates interview with Facebook moderation specialists.
  4. Reference checks – Facebook reviews references to confirm candidates have integrity and community-centric values.
  5. Training – Final hires undergo 2+ weeks of training on tools, policies, and procedures.
  6. Evaluation – New moderators are evaluated during training and onboarding before working independently.

The hiring process is designed to assess hard skills through tests and interviews, as well as soft skills like discretion, work ethic, and temperament that are crucial for the role. Extensive training and quality assurance help set new moderators up for success before they start impacting real users.

What is the work environment like for moderators?

Facebook invests in its moderation sites to create professional, supportive work environments:

  • Modern office facilities with ergonomic workstations and standing desks.
  • On-site counselors and wellness rooms for breaks from difficult content.
  • Peer support networks to share emotional burdens with colleagues.
  • Managers experienced in other trauma-exposed fields, such as emergency response.
  • Workplace culture encouraging openness, trust, and empowering moderators.
  • Emphasis on psychological safety to learn and make judgments without fear.
  • Regular surveys to get moderator feedback on improving their experience.

While the nature of the job makes moderation an inherently high-stress role, Facebook does seem to be trying to enhance working conditions and provide greater support. There is still room for improvement, though, according to many insiders and critics.

What is Facebook’s moderator retention rate?

While Facebook does not share exact turnover rates, various reports estimate the average tenure for outsourced moderators at between six months and two years due to high burnout. For Facebook’s full-time employee moderators, tenure is likely longer given their more integrated role in the business and the higher investment in their well-being.

Some factors impacting retention include:

  • Demanding quota-based performance targets
  • Minimal control over assignments and workflow
  • Lack of a social support system in a high-pressure environment
  • Insufficient mental health resources
  • Unclear career development opportunities

However, Facebook does seem to be trying to improve retention by raising wages, adding benefits, shortening shifts, allowing more breaks, and providing more psychological resources. While moderator burnout remains an industry-wide issue, Facebook will need to continue innovating on workload management and employee support to stabilize its workforce.

Conclusion

Facebook employs over 15,000 content moderators worldwide to support its community integrity efforts. Moderators have the critical yet challenging responsibility of reviewing billions of posts to protect users from harmful content. While moderator well-being remains an area needing improvement, Facebook is investing more in moderator support, facilities, compensation, and benefits. Content moderation at this immense scale is complex and ever-evolving, requiring ongoing coordination between Facebook’s policy teams, AI systems, and human reviewers working tirelessly to keep the platforms safe.