Facebook has begun to take over content moderation, a job it previously outsourced, to protect the mental health of its workers and its users.
The coronavirus crisis has had a significant impact on technology companies, forcing them to cancel events – the MWC was the first on a considerable list – revise forecasts and postpone launches. In addition, companies such as Facebook, Google and Twitter have joined forces to combat disinformation about the disease. But the crisis has also had unexpected consequences: Facebook, for one, has begun to hire content moderators – a job it previously outsourced – to safeguard the mental health both of those who will now have to deal with increasingly harsh posts and of those who make them.
According to Casey Newton in The Interface, Mark Zuckerberg explained the new measures in a call with the US media. When it comes to moderation, the company will hire full-time workers, who qualify for mental health services and programs that outsourced staff could not otherwise access. At the same time, at a moment like this, someone needs to be attentive to the content users may flag as troubling.
The role of moderators on the social network has always been difficult, both because of the nature of the work and because of the content they are exposed to. The company was widely criticized after a report by The Verge revealed the conditions under which these employees worked and, although Mark Zuckerberg's company promised to take action, it never directly assumed this role.
The problem is that too much content is produced for Facebook to moderate it all directly, at least without a huge investment in facilities, and in training and hiring employees in each country.
Added to this is moderators' exposure to explicit videos of content as harsh and violent as deaths, accidents or animal abuse, an emotional burden that takes a toll on their mental health. The report that revealed this situation spoke of drug use, constant surveillance that timed bathroom breaks, and cases of post-traumatic stress that led some employees to come to work armed.
Now all of this should change. The coronavirus has forced work to move home, and privacy requirements prevent subcontractors – who, among other measures, required their workers to leave their phones at the entrance – from allowing moderation to be done remotely. Facebook has promised to continue paying these companies for as long as the situation lasts.
Zuckerberg himself acknowledged that isolation could lead to increased depression and similar problems, and some people will turn to Facebook to vent. It is therefore important that someone is listening, but equally important that those listening have the tools they need to cope with what they hear.
Thus, over the next few weeks, attention will focus on this type of content, without ignoring the rest. Information on the disease will also be provided, and advertisements seeking to profit from the situation have been banned. Artificial intelligence, already used to detect prohibited images and speech, will play a big role, but the rest will be in the hands of Facebook and its moderators. Those that remain.