Following outrage over a growing number of violent videos posted by users on its network, Facebook is hiring 3,000 content reviewers worldwide over the next year to monitor and remove content more quickly.
Recently, the social media giant has attracted strong criticism for violent and objectionable content. Users have streamed shootings, murders, rapes, hate speech and assaults on Facebook. These posts remained on the site as recorded videos, often for days before being taken down.
In a Facebook post Wednesday, Facebook CEO Mark Zuckerberg announced that the social network will invest in people and tools to help remove objectionable content more quickly.
“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook – either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community,” he said.
Recently, a man broadcast footage of himself killing his 11-month-old daughter, an elderly man was murdered in Cleveland in a video later uploaded to Facebook, and a teenage girl was sexually assaulted in a livestreamed video.
Zuckerberg added, “If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down.”
Facebook will keep working with local community groups and law enforcement to help those in need, either because they’re about to harm themselves, or because they’re in danger from someone else.
Zuckerberg said, “Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.”
The move would significantly expand the company's current workforce of 4,500 content reviewers.
by RTT Staff Writer
For comments and feedback: firstname.lastname@example.org