Technology news site The Verge has published a detailed report on Facebook and YouTube's content moderation operations, in which moderators are asked to sign a form that says:

“I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may impact my mental health, and it could even lead to Post Traumatic Stress Disorder (PTSD). I will take full advantage of the weCare program and seek additional mental health services if needed. I will tell my supervisor/or my HR People Adviser if I believe that the work is negatively affecting my mental health.”

However, this policy does not apply to everyone, and here is why.


According to the report, only new joiners were asked to sign the form at the start of their employment, while existing employees received it as an update. The policy is being implemented by a contractor called Accenture.

Accenture is a multinational company that provides content moderation services to social media platforms such as Facebook, Twitter, and Google. Content moderation is outsourced because of the enormous volume of material posted to these platforms; it is simply not possible for the companies to moderate all of it on their own.

However, the nature of the job is gruesome: moderators have to view highly sexual and violent content, which can result in severe mental health issues. The report comes as Facebook faces lawsuits from former content moderators in California and Ireland over a range of mental health issues.

Cases of PTSD and other mental health issues have been on the rise among content moderators. In 2019, The Verge published a behind-the-scenes report on Facebook moderators.

One moderator quoted in that report said he now “sleeps with a gun by his side” as a result of the job.

Mental health experts say that merely acknowledging the psychological strains of the job does not mitigate its risks.