Facebook is adding 3,000 more moderators to review content that comes across its Live feeds. The move comes in the wake of a rise in murders and assaults that have been broadcast live and left up on users' profile pages.
Facebook's chief executive says the social network will invest in people and tools to remove such content more quickly.
But instead of scrutinising content before it is uploaded, Facebook relies on reporting tools used by the social network's 1.86 billion users, together with a team of Facebook reviewers who assess reported posts and remove them from the site retroactively.
Zuckerberg said: “Over the next year, we’ll be adding 3,000 people to our community operations team around the world – on top of the 4,500 we have today – to review the millions of reports we get every week, and improve the process for doing it quickly.
“If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down.”
Whether these tools and the extra manpower will be enough to curb the rise of objectionable and extremist content on Facebook remains to be seen. Most Facebook users can simply press a button and immediately begin broadcasting live whatever is in front of their smartphone's camera.
For now, technology companies including Facebook have been unable to detect objectionable content in real time, leaving retroactive reporting and removal as their primary tools.