

Big Tech trauma runs throughout supply chains


A man opens the Facebook page on his computer to fact check coronavirus disease (COVID-19) information, in Abuja, Nigeria March 19, 2020. REUTERS/Afolabi Sotunde

Increased scrutiny of the conditions under which content moderators work, combined with pushback from the workers themselves, can bring about change

Christy Hoffman is the General Secretary of the UNI Global Union.

Over the past several years, horrific stories about Big Tech’s manufacturing partners have made clear the human cost of producing our iPhones and other electronics. But recent scandals and worker testimonies show that trauma runs throughout tech’s supply chains - including among the people who screen the content we scroll through on our devices.

For example, TikTok moderators in Colombia employed by Paris-based Teleperformance, the world’s largest outsourced customer service and content moderation company, said they were exposed to “murder, suicide, paedophilic, pornographic content, accidents, cannibalism,” according to a recent report.

These claims come on the heels of a Forbes investigation detailing accounts from moderators in the United States who said they had to watch extreme child sexual abuse content over and over - even during training.


A Meta customer service agent, also employed by Teleperformance, reported similar trauma, as well as death threats that caused chronic insomnia, according to another news report.

Unfortunately, this problem is not new. A 2019 investigation by Greek newspaper Kathimerini uncovered similar problems at a facility providing moderation for Facebook.

UNI Global Union, its member unions and other organisations have repeatedly engaged with outsourcing firms about their treatment of content moderators, calling on them to strengthen internal controls, implement strong due diligence processes, and work with labour unions at the local and global levels to improve a wide range of conditions, including health and safety.

Workers are also pushing back: last year, a content moderator in Ireland became the first to testify before a parliamentary committee about her working conditions.

And earlier this year, Daniel Motaung, who worked as a content moderator in Nairobi, sued Sama, Facebook’s main outsourcing contractor in East Africa, over trauma from watching a beheading video.

Companies such as TikTok and Meta must also do more to ensure that workers are not being unnecessarily exposed to troubling, exploitative images. They must do more to ensure that workers have access to adequate psychological counselling when they are exposed, and they must do more to ensure that workers have the right to organise for better conditions.

Content moderators are watching hours and hours of questionable content every day, and the psychological toll is well documented. But the companies, and their clients, have not done enough to mitigate these harms.

With increased scrutiny from lawmakers, unions and tech workers, that can change. Because while these content moderators do not work in factories making the devices we all rely on, they do make the platforms running on those devices safer for all of us.

It is time Big Tech firms stepped up their efforts to make the companies to which they outsource this vital work safer for their workers.


Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.




