How companies should help their content moderation experts

Tuesday, February 20, 2018

At a glance, content moderation might seem like an easy task. You'd think that workers only have to look through images, videos, and comments, then hit either a Reject or an Accept button. Sure, that may be the standard, repetitive process at many companies, but it's hardly that easy, especially for those who have to sit down and do the dirty work of sifting through every piece of content.

But who's in charge of keeping this illegal and offensive content away from you and your children?

Who indeed?

Companies with an online presence use content moderation. Small businesses may fold some form of it into their digital marketer's or social media manager's list of tasks. Larger firms with an international customer base (such as Facebook and YouTube), on the other hand, either build an in-house team for it or outsource content moderation entirely.

Tasked with going over every piece of content submitted to a company's website or social media pages, the content moderator ultimately helps the company maintain a good online reputation. That's a critical responsibility that, in a nutshell, entails deciding whether a video or a comment:

•     Fairly represents your company;
•     Is appropriate for your demographic;
•     Is indeed true; and
•     Doesn't attack or purposely offend anyone else.

Content moderators are thrust into a position that carries both responsibility and risk. Sure, some companies only require their employees to verify the information on customer identification cards or legal documents, but other moderation activities, unfortunately, are riskier. Many content moderators are regularly exposed to the darker side of the Internet, braving the mundane and the depraved so you don't have to.

Seeing how crucial their responsibilities are, what must companies do for their content moderators?

•     Psychological tests during recruitment

Aside from a battery of tests to verify applicants' skills and proficiencies, companies must also include psychological assessments. While mental health shouldn't be the sole measure of someone's competence in the industry, it can indicate whether an applicant is ready to take on work that may put them in harm's way. After all, there may be underlying psychological concerns or traumatic experiences that can easily be triggered by images or videos.


•     In-house psychologists

In-house psychologists must be readily available to help content moderators deal with the negative themes they may encounter, including murder, abuse of any form, sexual assault, exploitation of children, suicide, gore, and so on. Doing so looks after the well-being of your employees and helps address any legal concerns that might arise from having to deal with such content.

One-time or repeated exposure to disturbing images and videos can lead to depression, PTSD, anxiety, and other psychological concerns, and it is a major reason why large content moderation accounts have high employee attrition rates. Moderators often become desensitized, but that doesn't mean they're truly okay. What's more, failing to provide for the mental health of your employees can lead to legal cases against your business.


•     A fair assessment of openness and careful discernment

One of the challenges in content moderation is recognizing that while claims can be verified, opinions are inherently subjective. What one group wholeheartedly believes to be true may be utterly unacceptable to another. As such, a content moderator removing user-generated content (UGC) that some consider offensive may be misconstrued as committing an act of censorship.

Objectivity is key in content moderation. In some cases, posts that appear to break community guidelines may have been uploaded to raise awareness of, say, the reality of war-torn countries, police brutality, or sexual assault.

When hiring and training moderators, companies must ensure they are discerning and objective. Openness to differing cultures, modes of expression, art, and the like must be inherent; otherwise, moderation will impede freedom of speech. At the same time, moderators must not be too lax and should always consider the protection of users.

Discernment is especially important for content moderation that seeks to prevent or safeguard against fraud. These tasks often involve legal documents, so watchful and skilled moderators are needed to separate legitimate applications and submissions from those of people who only wish to take advantage of businesses.