
Facebook Content Moderators Seek Millions in Damages

A moderator’s job can seem thankless. Moderators sit on social media sites or forum pages and tag or even remove content deemed inappropriate or outside acceptable parameters. The work must get dull and irritating, but it must also have its moments of humor or interest.

Yet, a group of moderators from Facebook decided that their work was causing them serious problems due to the graphic and disturbing nature of much of the material they were forced to view and remove.

In May 2020, Facebook agreed to pay $52 million to its content moderators. The suit was brought by moderators employed through third-party contractors, who alleged that the company had failed in its duty to protect them from the severe psychological injuries that can result from handling such content: videos of terrorist acts (including beheadings), child sexual abuse, severe animal cruelty, and an abundance of other violent images and material.

Rather than simply deleting spam messages or inflammatory comments, they were looking at videos and photos that would upset them for hours or even weeks afterward. One moderator went on record to explain that she had developed post-traumatic stress disorder (PTSD) from the work she was doing for eight or more hours per day.

Speaking out meant breaking the confidentiality agreement she had signed when she took the job at Facebook, but she felt it was necessary given the conditions. The class-action lawsuit will not, however, make any of the content moderators rich: the payout works out to only around $1,000 for each moderator.

If a moderator does receive a formal diagnosis of PTSD or another condition directly related to the work done for Facebook, the payout might go as high as $50,000.

What does this mean for other social media companies? Is this case the foundation for further claims against similar firms? Most legal experts agree that it has likely opened the door to future claims.

Many might wonder whether PTSD can really develop from managing inappropriate content. One of the content moderators who broke the confidentiality agreement shared their experience to prove the point. After being forced to watch a video of a teenage girl being gang-raped, this moderator sought counseling and was told that a counselor was available once every quarter of the year.

Calling this a total lack of support for the moderators, that moderator joined the lawsuit. As The Washington Post reported, the moderators claimed that they had nothing to protect or support them from the “trauma associated with spending eight hours a day reviewing horrific images to keep them off social media platforms.”

The payments agreed to by Facebook go only to U.S.-based content moderators working for third-party firms. These firms, however, also handle moderation for Facebook subsidiaries such as WhatsApp and Instagram, and they employ more than 10,000 workers.

So, will the tens of thousands of others doing similarly challenging work be able to bring similar cases? In a word: yes. The suit and its outcome may trigger comparable claims against sites like Twitter and YouTube, both of which use the services of content moderators. However, all of the social media giants have bulked up their global workforce of content moderators, with workers everywhere from Ireland to India. Most of this work is contracted through third parties, and those workers receive far lower wages and no protection against harmful content.

Would earning less than $20 per hour be a fair trade for developing paranoia, nightmares, and the inability to sleep? This lawsuit says no, and similar cases are likely to arise until protections and support are in place for this challenging work, which is meant to make social media a safe space for all.