Content moderation is the internet’s not-so-secret dirty work. Around the world and around the clock, content moderators scrub the internet of horrific content, most of them for low pay and with little or no health care coverage. The material they are exposed to leaves them vulnerable to a range of mental health issues, including post-traumatic stress disorder. Yet their work is largely hidden from users and de-emphasized by the technology industry.
This Note explores potential solutions to the labor and employment issues inherent in content moderation work. It suggests a path forward that both empowers and protects workers and leaves technology companies less vulnerable to litigation, bad press, and governmental regulation, arguing that an approach combining corporate and worker-driven social responsibility is the most promising.