


How To Ethically Navigate Content Moderation In A User-Generated World

Forbes Technology Council

Alex Ross is cofounder & COO at Hire Horatio CX. Horatio CX manages the CX & other customer needs for today's fastest-growing brands.

Content moderation is defined as “the practice of monitoring user-generated content and approving (or removing) it based on company policy and guidelines.”

According to data from the Transparency Market Research group, the content moderation solutions market is expected to reach $11.8 billion by 2027. As global regulations continue to change and interconnectedness becomes increasingly important for brands, content moderation solutions will modernize and grow more complex and nuanced. Many CX experts predict that content moderation services will need to keep maximizing human potential by complementing agents with advanced, “smart automation technologies.”

Human agents working on content moderation can get bogged down, and research has shown they can even be negatively impacted by some of the material they have to screen out. In 2021, a court approved a $52 million settlement requiring Facebook to compensate its vendor-based content moderators in a class-action lawsuit over psychological harm.

The ethics of content moderation are an important consideration for companies that rightly value the employee experience of their agents. With agents’ well-being as a priority, here are a few methods to navigate content moderation services in a smart and ethical manner:

Content moderation technology should support specific and different contexts.

A 2022 report from the Harvard Kennedy School Shorenstein Center on Media, Politics and Public Policy argues for “ethical scaling for content moderation.” Moreover, the report delves into several challenges facing AI-assisted content moderators, one of which is “the need for cultural contextualization in detection systems [which is currently] a widely acknowledged limitation since there is no catch-all algorithm that can work for different contexts.”

The authors conclude that “lack of cultural contextualization has resulted in false positives and over-application.” This is true, and it behooves CX leaders to seek out modern, continually evolving technologies that treat nuanced content moderation as a priority. Cultural context, and the different ways content is understood across geographies, should remain front of mind when evaluating content moderation tools.
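To make the idea of contextualization concrete, here is a minimal sketch in Python of how locale-specific policies could route the same piece of content differently. The locale codes, thresholds and scores below are hypothetical illustrations for this article, not a description of any vendor's actual system.

    # Hypothetical locale-aware routing: the same toxicity score can lead to
    # different outcomes depending on the regional policy applied.
    from dataclasses import dataclass

    @dataclass
    class ModerationPolicy:
        locale: str                    # e.g., "en-US", "pt-BR" (illustrative)
        auto_remove_threshold: float   # score at or above which content is removed automatically
        human_review_threshold: float  # score at or above which content goes to a human agent

    POLICIES = {
        "en-US": ModerationPolicy("en-US", auto_remove_threshold=0.95, human_review_threshold=0.70),
        "pt-BR": ModerationPolicy("pt-BR", auto_remove_threshold=0.90, human_review_threshold=0.60),
    }
    DEFAULT_POLICY = ModerationPolicy("default", auto_remove_threshold=0.98, human_review_threshold=0.50)

    def route(locale: str, toxicity_score: float) -> str:
        """Decide what happens to a piece of content under its locale-specific policy."""
        policy = POLICIES.get(locale, DEFAULT_POLICY)
        if toxicity_score >= policy.auto_remove_threshold:
            return "auto_remove"
        if toxicity_score >= policy.human_review_threshold:
            return "human_review"  # escalate to an agent familiar with this locale
        return "approve"

    print(route("en-US", 0.65))  # approve
    print(route("pt-BR", 0.65))  # human_review

The point is not the specific numbers but the structure: a single global threshold is exactly the kind of “catch-all algorithm” the report warns against.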

Provide support for human agents.

Even more important than keeping the technology supported and evolving is spending capital and allocating proper resources so that human agents feel supported and are offered genuine respite from the challenges that often come with content moderation work. Your team’s emotional well-being should be tended to continuously.

According to a 2021 research report from the University of Virginia, one “emotional toll [for human content moderators is] a simple lack of appreciation where moderators did not feel sufficiently valued for their work contributions.” Weekly check-ins and a deliberately structured way of reporting unsettling situations are necessary for your team. Beyond this, it is critical that on-site counseling or psychological resources be available to team members. There should also be mental health screenings and transparency about the nature of the work when hiring content moderation workers.

Set boundaries for your team.

Content moderation should come with a strict time allowance. Allowing moderators sufficient time and space to decompress is fundamental, and offering green spaces or other activities that keep balance in the day-to-day is well-advised. A call to action from a 2018 public health summit suggested that while these tips may sound commonplace, they are fundamental to making your team’s well-being a priority.
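As a rough illustration of what a “strict time allowance” could look like operationally, here is a small hypothetical sketch that only assigns review work to agents who remain under a daily exposure cap. The cap, agent names and handling-time estimates are assumptions made for the example, not recommended values.

    # Hypothetical daily exposure cap: sensitive-content review is only assigned
    # to agents with remaining capacity; otherwise the item is deferred.
    from collections import defaultdict

    DAILY_CAP_MINUTES = 240              # illustrative cap: 4 hours of review per day
    assigned_minutes = defaultdict(int)  # agent_id -> minutes of review assigned today

    def assign(agent_ids, item_minutes):
        """Assign an item to the least-loaded agent who still has capacity, else defer it."""
        eligible = [a for a in agent_ids
                    if assigned_minutes[a] + item_minutes <= DAILY_CAP_MINUTES]
        if not eligible:
            return None  # defer the item rather than push anyone past the cap
        agent = min(eligible, key=lambda a: assigned_minutes[a])
        assigned_minutes[agent] += item_minutes
        return agent

    print(assign(["agent_1", "agent_2"], 30))  # assigns to whichever agent has less time logged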

Conclusion

A 2021 Virginia Tech research paper on The Psychological Well-Being of Content Moderators noted, “Human abilities still exceed state-of-the-art AI for many data processing tasks, including content moderation. While AI will continue to improve, the use of human computation enables companies to deliver greater system capabilities today.”

Bearing this in mind, it is up to companies big and small to tend to the emotional well-being of their human agents when taking on the task of content moderation. A community approach, where agents never feel alone in the daunting work of moderation, is always preferable. Empowering teams to be proactive about their mental health, and giving them adequate time and resources to process these tasks, is necessary as content moderation continues to evolve and become increasingly important in today’s digital world.


Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives. Do I qualify?

