YOU can spend your day in your pyjamas surfing the internet and earning up to six figures — so long as you don’t mind watching suicide, rape and murder.
Facebook announced Wednesday that it will hire an additional 3,000 “content moderators” around the world in a bid to keep the worst of the web off its social media platform. That’s on top of the existing 4,500 employees who already work to identify hate speech, pornography, child exploitation and other violent and disturbing acts.
The move follows a string of gruesome episodes that were livestreamed on Facebook, including a father in Thailand who hanged his 11-month-old baby girl before killing himself. The video lingered on the site for roughly 24 hours before it was finally removed.
“This is important,” CEO Mark Zuckerberg said in a post announcing the move. “Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.”
But in trying to keep the web’s underbelly at bay, Facebook and other social media companies are subjecting a growing group of workers to trauma and emotional distress. Critics say the job can be debilitating, leaving moderators with “permanent videos in their heads” and a need to avoid everyday items that have become potential triggers, like knives and computers.
“They’re exposed to the worst things you can imagine,” said Ben Wells, a lawyer who is representing two former Microsoft moderators who claim they developed post-traumatic stress disorder. “If you can imagine the worst person in the world and what they would want to watch, that’s what they’re exposed to, whether it’s on a daily basis or very frequently. And some of this stuff you just cannot un-see and there are things you can’t get rid of. It can be debilitating.”
http://www.news.com.au/technology/online/social/the-new-facebook-job-that-no-one-really-wants/news-story/42926607ca30e772e922cf3a0cabab6b