Content Moderator Sues Facebook for PTSD
A former Facebook content moderator is suing the site's operators, claiming the work mentally harmed him. Daniel Motaung says the low-paid work left him with post-traumatic stress disorder.
Motaung is suing Facebook's owner Meta along with Sama, the contracting company that hired him for the work. He says he was misled by a job ad that implied content moderation was a small part of a wider customer service role.
He was recruited in South Africa and relocated to Nairobi, Kenya, where he was paid the equivalent of $2.20 an hour. He says the relocation made it harder for him and his fellow workers to leave the job if they no longer wanted to continue.
Sama denies these claims, saying "It is completely inaccurate to suggest that Sama employees were hired under false pretences or were provided inaccurate information regarding content moderation work." (Source: bbc.co.uk)
Horrific Content
Motaung says his work involved reviewing videos posted to the site to see if they broke Facebook's content rules. He says he regularly saw extremely disturbing content including a beheading and material involving children.
TIME investigated claims made by other workers at the facility. It found content moderators have a target of averaging just 50 seconds to review each video, regardless of its length. Senior staff then spot-check reviews and moderators must have made the "correct" decision at least 84 percent of the time. (Source: time.com)
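To put those targets in perspective, here is a rough back-of-the-envelope calculation. The eight-hour, uninterrupted shift is an assumption for illustration only, not a figure from the TIME report:

```python
# Rough illustration of what a 50-second average review target implies.
# The eight-hour, break-free shift is an assumption for illustration;
# only the 50-second target and 84 percent threshold come from the report.

SECONDS_PER_REVIEW = 50        # reported average review target per video
SHIFT_HOURS = 8                # assumed shift length
ACCURACY_THRESHOLD = 0.84      # reported minimum rate of "correct" decisions

shift_seconds = SHIFT_HOURS * 60 * 60
videos_per_shift = shift_seconds // SECONDS_PER_REVIEW
allowed_mistakes = int(videos_per_shift * (1 - ACCURACY_THRESHOLD))

print(f"Videos reviewed per shift: {videos_per_shift}")                   # 576
print(f"Mistakes allowed before dropping below 84%: {allowed_mistakes}")  # 92
```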
15-Second Deadline
Internal guidelines suggested many reviews didn't even get this brief period. The guidelines said that if there were no signs of unsuitable material in data such as the title, thumbnail and comments, and if the video hadn't already been reported or flagged, then the moderator should only watch the first 15 seconds of the video.
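That guideline boils down to a simple triage rule. The sketch below is purely illustrative; the function and field names are hypothetical and do not reflect Facebook's or Sama's actual tooling:

```python
# Hypothetical sketch of the 15-second triage rule described above.
# All field and function names are invented for illustration.

def seconds_to_review(video: dict) -> int:
    """Return how long a moderator is asked to watch, per the guideline."""
    metadata_suspicious = any([
        video.get("title_flagged", False),      # unsuitable signs in the title
        video.get("thumbnail_flagged", False),  # ...or the thumbnail
        video.get("comments_flagged", False),   # ...or the comments
    ])
    already_flagged = video.get("reported", False) or video.get("flagged", False)

    if not metadata_suspicious and not already_flagged:
        # No warning signs: watch only the first 15 seconds.
        return min(15, video["duration_seconds"])
    # Otherwise the full video goes to review.
    return video["duration_seconds"]

print(seconds_to_review({"duration_seconds": 300}))                    # 15
print(seconds_to_review({"duration_seconds": 300, "reported": True}))  # 300
```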
It's another sign of the logistical difficulty of making sure online material doesn't breach site rules. The sheer volume of content posted poses serious challenges for effective human review, while artificial intelligence doesn't yet seem reliable enough to filter content on its own.
What's Your Opinion?
How should Facebook moderate content to avoid harmful material getting online? Will automated moderation ever be reliable enough? Is there any way to have human moderators without risking their mental health?
Comments
Daniel Motaung will not get a penny.
If he isn't an outright fraudster, he certainly is not intelligent enough to be a content moderator.
He relocated from South Africa to Nairobi for the equivalent of $2.20 an hour? That is 2721 miles and 55 hours driving time!
Contracting companies have only one purpose: to remove any liability from an employer for anything, and they are very good at it.
"Sama denies these claims, saying "It is completely inaccurate to suggest that Sama employees were hired under false pretences or were provided inaccurate information regarding content moderation work.""
Game over. Daniel Motaung will lose.
Facebook clearly searched the whole world and found the best content moderators in South Africa and Nairobi. /sarc off
All Facebook has to do is look like they are doing something; they don't have to be successful at it.
Will automated moderation ever be reliable enough?
The question is could automated moderation ever keep up with the total insanity of humans?
No, probably not.
Is there any way to have human moderators without risking their mental health?
No, but millions of people view this content every day :)
Moderation Buzz
I agree with much of buzzallnight's post. But being young and inexperienced, or even stupid, is not a reason to suggest someone is a fraud. That's not a fair comment on Motaung. I suspect job opportunities in that part of the world are few and far between, so again, it is unfair to criticize Motaung for taking the moderation job. I give him credit for trying. I can't imagine some of the horrific content he must have seen that he never, ever expected to have to filter.
I hope Motaung is successful in his lawsuit. I hope he gets million$ in settlement. Then Facebook will realize it's cheaper to hire enough people to do the work in the first place, rather than overwork and underpay an inadequate number of workers to do such important work. Fuck Facebook. They are more the problem these days than they are the solution.