In solidarity with content moderators

Image: Still from The Moderators, film by Adrian Chen and Ciaran Cassidy.

Content moderation and content creation make each other possible. Without content moderation on YouTube, it would be much harder for Creators to make a living, because the platform would be overrun with content that would drive advertisers (and probably viewers) away.

But as the quantity of content on the platform has grown over the years, the need for moderators has grown with it — and so have the challenges the company faces in managing the moderation process and ensuring that moderators are adequately trained, supported, and protected from the psychological trauma that almost inevitably results from watching hours and hours of horrible video content every day.

As usual with all things YouTube, reporter Julia Alexander and her colleagues at The Verge have some of the most comprehensive coverage of the most recent developments, including a new US lawsuit from a former YouTube moderator who developed post-traumatic stress disorder (PTSD) from doing the work, and the latest news about YouTube’s plans to use automated systems instead of people to handle more content moderation:

“The terror queue” (2019)

“Former YouTube content moderator sues the company after developing symptoms of PTSD” (Sep 2020)

“A new lawsuit may force YouTube to own up to the mental health consequences of content moderation” (Sep 2020)

“YouTube is about to age-restrict way more videos: AI moderation measures will be used to automatically age-restrict certain content” (Sep 2020)

FairTube stands in solidarity with content moderators working for YouTube and for all online platforms. It is not enough to achieve fairness and transparency for creators and safety for users; the entire “ecosystem” must be healthy, fair, and safe for all participants. Safety for users and fairness for creators cannot come at the cost of horrible psychological risks and burdens for content moderators, whose work is crucial to making the entire ecosystem financially sustainable. We hope that platform operating companies, with support from experienced moderators and from the expert practitioners and researchers who have worked in this space for years, can quickly develop strategies, processes, and structures that ensure content moderators are appropriately guided and protected while they do their difficult but crucial work.