Honolulu Star-Advertiser

Saturday, December 14, 2024



Scarred Facebook moderators to get $52M for mental health

ASSOCIATED PRESS / MARCH 29, 2018

The logo for Facebook appears on screens at the Nasdaq MarketSite, in New York’s Times Square.

Facebook Inc. agreed to pay $52 million to settle a lawsuit by content moderators who alleged they suffered psychological scars from repeated exposure to disturbing material including images of child sex abuse and terrorism, according to a statement issued by lawyers for the plaintiffs.

The settlement will cover more than 10,000 current and former content moderators in California, Arizona, Texas and Florida, who can receive $1,000 for medical screening as well as additional payments for treatment if required, the lawyers said.

Facebook will also provide on-site coaching and tools that allow the moderators more control over how they view the images to mitigate their exposure to the material, according to the proposed settlement.

“We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago,” Steve Williams, one of the attorneys representing the moderators, said in Tuesday’s statement. “The harm that can be suffered from this work is real and severe.”

Facebook didn’t immediately respond to a request for comment on the settlement.

In the wake of the 2016 presidential election, Facebook rushed to expand efforts to police its platforms, trying to keep political misinformation, graphic violence, terrorist propaganda and revenge porn off the sites. This has entailed both new technology and thousands of new workers. As of last year, Facebook had about 15,000 content reviewers, almost all of whom worked not for Facebook itself but for staffing firms such as Accenture and Cognizant.

The settlement comes after many content moderators were sidelined by the pandemic because of security and privacy concerns over the work being done from home. Facebook has been slowly bringing the moderators back online, but over the long term the company is investing more in artificial intelligence than in human reviewers.
