Content Moderators Must Be Protected And Work In A Safe Environment

Content moderators at tech leaders Facebook, Inc. (NASDAQ:FB) and Cognizant Technology Solutions Corp (NASDAQ:CTSH) are breaking their NDAs to expose shocking working conditions, while one investigative reporter portrayed content moderators as being traumatized at “sweatshops in America.”


One content moderation company, WebPurify, with offices in Irvine, California, and Hyderabad, India, says it is starting to see a shift as companies carefully consider the working conditions of their moderation partners.


WebPurify provides content moderation expertise for many of the world’s most respected organizations and Fortune 500 companies, including top e-commerce platforms, children’s sites, dating sites, social sharing platforms, and gaming apps.

Joshua Buxbaum, the co-founder of WebPurify, says that content moderators are providing an essential service to the online community and deserve better.

“A safe and supportive work environment has always been a core value of our business, so we are pleased to see companies beginning to focus on work conditions vs. solely looking at cost,” he says.

WebPurify’s content moderation team takes pride in what it does and recognizes that it is on the front lines protecting the public.

A company’s image hangs in the balance as well.

Buxbaum says his team often sees terrible things and bears the brunt of it to protect strangers on the other side of the screen.

However, team members are not too proud to take advantage of the various mental health programs WebPurify has in place.

He says moderators are often rotated to less severe projects to take a break from seeing the “bad stuff” every day.

“We realize this is a unique and hazardous job and at WebPurify we’ve built the safest environment we could to protect our moderation team best,” he says.

Buxbaum points out that the cost of moderation is a difficult pill for some businesses to swallow because the expense doesn’t translate directly to profits.

AI alone isn’t the answer to sparing moderators the risk of psychological trauma; used without live moderation, the technology often comes up short when reviewing user-generated content and keeping brands protected.

The result is that some companies seek the most inexpensive solution while disregarding both the quality of the reviews and the working conditions of their moderators.

Buxbaum says he’s pleased with the shift over the past year; as the media focuses on this issue, companies are now asking potential moderation partners the critical questions.

“We’re happy to respond when clients ask about the details of our mental health program for our moderators, request a tour of our facilities, or ask for information about our quality control measures,” he says.

He adds that there is an irony here: the purpose of a moderation team is to protect a company’s reputation, yet by disregarding the working conditions of those moderators, some companies see their reputations suffer tremendously.

Article by WebPurify



