Content Moderators Must Be Protected And Work In A Safe Environment

Content moderators at Facebook, Inc. (NASDAQ:FB) and its moderation contractor Cognizant Technology Solutions Corp. (NASDAQ:CTSH) are breaking their NDAs to expose shocking working conditions, and one investigative reporter has portrayed them as traumatized workers in “sweatshops in America.”

One content moderation company, WebPurify, with offices in Irvine, California, and Hyderabad, India, says it is starting to see a shift as companies more carefully consider the working conditions of their moderation partners.

WebPurify provides content moderation expertise for many of the world’s most respected organizations and Fortune 500 companies, including top e-commerce platforms, children’s sites, dating sites, social sharing platforms, and gaming apps.

Joshua Buxbaum, the co-founder of WebPurify, says that content moderators are providing an essential service to the online community and deserve better.

“A safe and supportive work environment has always been a core value of our business, so we are pleased to see companies beginning to focus on work conditions vs. solely looking at cost,” he says.

WebPurify’s content moderators take pride in what they do and recognize that they are on the front lines protecting the public.

A company’s image hangs in the balance as well.

Buxbaum says his team often sees terrible things and bears the brunt of it to protect strangers on the other side of the screen.

Still, they’re not too proud to take advantage of the various mental health programs WebPurify has in place.

He says moderators are often rotated to less severe projects to take a break from seeing the “bad stuff” every day.

“We realize this is a unique and hazardous job and at WebPurify we’ve built the safest environment we could to protect our moderation team best,” he says.

Buxbaum points out that the cost of moderation is a hard pill for some businesses to swallow because the expense doesn’t translate directly into profit.

AI alone isn’t the answer to sparing moderators the risk of psychological trauma: used without live human moderation, the technology often falls short at reviewing user-generated content and keeping brands protected.
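WebPurify doesn’t detail its tooling, but a minimal sketch can show why the hybrid approach works. In the hypothetical Python below (the `ai_score` and `human_review` functions and both thresholds are illustrative assumptions, not WebPurify’s actual system), an AI classifier handles clear-cut content automatically and escalates the ambiguous middle band to a human moderator:

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical thresholds; a real system would tune these per content category.
AUTO_REMOVE = 0.95   # AI is confident the content violates policy
AUTO_APPROVE = 0.05  # AI is confident the content is safe

@dataclass
class ModerationResult:
    decision: str  # "approved" or "removed"
    reviewer: str  # "ai" or "human"

def moderate(content: str,
             ai_score: Callable[[str], float],
             human_review: Callable[[str], bool]) -> ModerationResult:
    """Automate clear-cut cases; escalate the ambiguous middle to a person."""
    score = ai_score(content)  # estimated probability of a policy violation
    if score >= AUTO_REMOVE:
        return ModerationResult("removed", "ai")
    if score <= AUTO_APPROVE:
        return ModerationResult("approved", "ai")
    # This middle band is where AI alone falls short and where
    # a trained human moderator makes the final call.
    violates = human_review(content)
    return ModerationResult("removed" if violates else "approved", "human")
```

The thresholds decide how much content reaches human reviewers, which is exactly the cost-versus-quality trade-off Buxbaum describes: widening the automated bands cuts costs, but it also widens the range of content a brand is trusting AI to judge alone.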

The result is that some companies seek the cheapest solution while disregarding both the quality of the reviews and the working conditions of their moderators.

Buxbaum says he’s pleased with the shift over the past year: as the media focuses on this issue, companies are now asking potential moderation partners the critical questions.

“We’re happy to answer detailed questions about our mental health program for moderators, host a tour of our facilities, or provide information about our quality control measures,” he says.

He adds that there’s an irony here: a moderation team exists to protect a company’s reputation, yet by disregarding the working conditions of those moderators, some companies let their reputations suffer tremendously.

Article by WebPurify
