Content Moderation Company Found Responsible for Worker’s Psychological Damage: A Turning Point in the Industry

A Spanish court ruling is putting pressure on Meta’s content moderation subcontracting model.

In a groundbreaking ruling, a court in Spain has held a Barcelona-based subcontractor for Meta, the parent company of Facebook and Instagram, responsible for causing psychological damage to one of its workers. The decision marks the first time a content moderation company has been found accountable for the mental disorders suffered by an employee.

According to local press reports, the case was brought by a 26-year-old Brazilian worker who has been undergoing psychiatric treatment for the past five years after exposure to extreme and violent content on Facebook and Instagram, including images of murders, suicides, terrorist attacks, and torture. As a result of this exposure, the worker developed a range of psychological problems, including panic attacks, avoidance behaviors, excessive worry about illness, disturbed sleep, difficulty swallowing, and a pronounced fear of death.

The court found that the worker’s mental health problems were not a common illness but a direct result of their work, rejecting the subcontractor’s attempt to dismiss responsibility by classifying the condition as a general ailment. It is high time that Meta and other social media platforms recognized the scale of this problem rather than denying its existence; only when they confront this grim reality will things start to change.

The ruling has been hailed as a major victory for workers who develop mental health problems as a consequence of their job. The law firm representing the worker, Espacio Jurídico Feliu Fins, emphasized the need for Meta and other companies to revise their approach to content moderation.

🔍 Related Q&A:

Q: Is content moderation outsourcing a common practice in the industry? A: Yes, the outsourcing of content moderation to third-party subcontractors is a common practice in the industry. These subcontractors typically employ low-paid workers who act as human filters for violent and disturbing content.

Q: Are there other instances where content moderators have suffered psychological harm? A: Yes, there have been several cases in which content moderators developed mental health issues after exposure to violent and graphic images. In 2020, for example, Meta settled a $52 million class-action lawsuit in the United States brought by content moderators who alleged that their work had caused post-traumatic stress disorder.

Q: What steps are companies taking to support content moderators’ mental health? A: Companies like Meta claim to have certain provisions in place to support content moderators’ mental health. These include on-site support with trained practitioners, access to private healthcare from the first day of employment, and technical solutions to limit exposure to graphic material.

The outsourcing of content moderation to third-party companies has been an ongoing concern for years. Stories of underpaid workers being exposed to traumatic content have raised ethical questions about the responsibility of social media platforms. Although Meta has declined to comment on the recent ruling against its subcontractor in Spain, the company has defended its approach to content moderation by highlighting the support and screening tools it provides to subcontractors.

However, support services and screening tools can be undermined by the stringent productivity quotas that subcontractors impose. In practice, these quotas may leave content moderators little room to access adequate support while meeting their employers’ demands.

Moreover, reports from Barcelona-based newspapers have indicated that a significant percentage of workers at Meta’s subcontractor, CCC Barcelona Digital Services, have taken time off work due to psychological trauma caused by reviewing toxic content. Workers have criticized the level of support provided by their employer as insufficient. Additionally, exacting performance targets and shift work can further contribute to the development of mental health issues among content moderators.

🔍 Further Reading:

  1. The Dark Side of Content Moderation: The Impact on Workers’ Mental Health
  2. Social Media Giants and the Moral Responsibility of Content Moderation
  3. AI Solutions for Content Moderation: Balancing Efficiency and Psychological Well-being
  4. The Psychological Toll of Watching Violence on Social Media

It is crucial that companies like Meta prioritize the mental well-being of their content moderators and ensure a supportive and safe work environment. Legal rulings that hold third-party content moderation companies responsible for safeguarding workers’ mental health may mark a turning point for the industry. Content moderation will remain a difficult job, but the health and well-being of the people who protect users from harmful content must come first.

In conclusion, the recent court ruling in Spain underscores the need for social media platforms to reevaluate their strategies and acknowledge the toll content moderation takes on workers’ mental health. Outsourcing content moderation should not come at the expense of employees’ well-being. As the industry moves forward, it must strike a balance between protecting users and protecting the mental health of content moderators.

💬 Readers, what are your thoughts on the responsibilities of social media companies towards content moderators? Share your opinions below! And don’t forget to hit that share button to spread awareness of this important issue.