But who's in charge of keeping these illegal and offensive content away from you and your children?
Who indeed?
Many content moderators are regularly exposed to the darker side of the Internet, braving the mundane and the depraved so you don't have to.
Tasked with reviewing every piece of content submitted to a company's website or social media pages, the content moderator's true responsibility is ultimately to help the company maintain a good online reputation. That's a critical responsibility that, in a nutshell, entails deciding whether a video or a comment:
• Fairly represents your company;
• Is appropriate for your demographic;
• Is indeed true; and
• Doesn't attack or purposely offend anyone else.
Seeing how crucial their responsibilities are, what must companies do for their content moderators?
• Psychological tests during recruitment
Aside from the battery of tests to verify applicants' skills and proficiencies, companies must also include psychological assessments. While mental health shouldn't be the only measure of one's competence in the industry, it can indicate a moderator's readiness to take on content moderation, since the work may put them in harm's way. After all, there may be underlying psychological concerns or traumatic experiences that can easily be triggered by images or videos.
• In-house psychologists
In-house psychologists must be readily available to help content moderators deal with the various negative themes they may encounter, including murder, abuse of any kind, sexual assault, exploitation of children, suicide, gore, and more. This helps look after the well-being of your employees and helps address any legal concerns that might arise from having to deal with such content.
One-time or repeated exposure to disturbing images and videos can lead to depression, PTSD, anxiety, and other psychological concerns. This is largely why large content moderation teams have high employee attrition rates. Moderators often become desensitized, but that doesn't mean they're truly okay. What's more, failing to provide for the mental health of your employees can lead to legal action against your business.
• A fair assessment of openness and careful discernment
One of the challenges in content moderation is recognizing that while claims can be verified, opinions are inherently subjective. What one group wholeheartedly believes to be true may be utterly unacceptable to another. As such, a content moderator removing user-generated content (UGC) that may be considered offensive can be misconstrued as committing an act of censorship.
Objectivity is key in content moderation. In some cases, posts that may appear to be breaking community guidelines may be uploaded to bring awareness to, say, the reality of war-stricken countries, police brutality, sexual assault, and more.
When hiring and training moderators, companies must ensure that they are discerning and objective. Openness to different cultures, modes of expression, art, and the like must be inherent; otherwise, moderation will impede freedom of speech. At the same time, moderators must not be too lax, and should always consider the protection of users.
Discernment is especially important when content moderation aims to prevent or safeguard against fraud. Many times, this will involve legal documents, so watchful and skilled moderators are needed to separate legitimate applications and submissions from those of people who only wish to take advantage of businesses.