
Metro Views

Vicky Mochama: The voice of Metro News.

Vicky Mochama asks: Who will watch out for Facebook's new offensive content watchers?

News that the social network is hiring 3,000 people to monitor for the kind of horrific content that has been making headlines points to a need for improved mental health supports.


In response to a spate of murders, sexual assaults and suicides streamed on Facebook Live, Mark Zuckerberg announced that the company will hire 3,000 more people to monitor content on the site.

It’s a necessary step that should be applauded.

But who will watch out for the watchers?

There is plenty that is troubling in the world, from gender-based violence to mental health issues, and these existing societal problems are finding an audience on Facebook.

It falls to a couple of thousand people at the companies we all use, like Facebook, Twitter and Instagram, to see the worst so the rest of us don't have to.


Facebook already has 4,500 staff to moderate the site. That’s nearly a quarter of its workforce dedicated to reviewing the posts, photos, comments and live videos of over 1.23 billion daily users.

The deluge isn’t just the celebratory parts of people’s daily lives — brunches, birthdays and bar mitzvahs — but also their traumas and terrors.

In Thailand, a man murdered his baby daughter before killing himself, all streamed live. A sexual assault in Chicago and a murder in Cleveland were both posted live to Facebook. A Nunavut man streamed his desire to die by police; he died hours later. And in Manitoba, a community is reeling after a teen girl was killed, and video appearing to show young people attacking the victim was shared repeatedly on the social network.

The volume of content is overwhelming. And increasingly, the people who delete objectionable content are overseas in places like the Philippines.

Facebook isn't the only company struggling to cope, and its employees won't be the only ones burdened by the work of keeping the Internet relatively clean.

While there is technology that identifies child pornography, its results still have to be verified by human beings.

In a case filed last December, two men are suing Microsoft for the PTSD and related mental health issues they are experiencing after moderating content for the company. Their work in keeping violent images, especially child abuse, off the Internet has, they allege, made them unable to work and be fully functioning members of their families.

Their complaint alleges that the company’s mental health supports weren’t sufficient.

There isn't yet technology that can automatically remove the kind of violent and damaging live-streamed video that has been making headlines. Companies like Facebook have to rely entirely on human staff.

Whatever Facebook does about Facebook Live, it shouldn't come at the expense of its employees' mental health.

Facebook does have a process to support content reviewers. But as it hires more people — and potentially, more overseas support in countries with less robust health services — it must ensure that the issues it’s fighting online don’t end up doing more harm in real life.
