
Facebook’s live video problem

May 19, 2017 | Workers' Compensation

Posting videos on Facebook was already a popular activity, but the social media giant decided to up the ante by giving users the ability to broadcast live video. Users can document everything from the simplest of tasks to the most significant events in their lives. The possibilities are endless.

Therein lies the problem.

Apparently, a 4,500-member community operations team is not large enough to review potentially objectionable content. Employees in the understaffed department are already under fire for inappropriately censoring content and failing to remove extreme video footage.

Mark Zuckerberg wants to add another 3,000 employees, a decision based not on significant growth, but on a troubling trend. Users are live-streaming acts of murder, rape and suicide for all Facebook users to see.

Facebook does not scrutinize content before it is uploaded. Instead, the company relies on users to report inappropriate posts. From there, moderators review the footage and remove it if it violates community standards that forbid nudity, hate speech or glorified violence.

While details about the positions are scarce, the primary responsibility is all too clear. Moderators face tasks described, at best, as grueling. At worst, the work could result in severe psychological trauma.

That possibility brings up the issue of workers’ compensation.

A job that involves watching entire videos of violent crimes and deaths can affect people, if not desensitize them. With Facebook Live being a relatively new offering, the long-term impact on workers' mental health is unknown.

Few comparisons to Facebook’s job openings exist. Recently, moderators at Microsoft sued their employer, alleging that viewing images showing “indescribable sexual assaults” and “horrible brutality” led to post-traumatic stress disorder (PTSD).

Microsoft disputes the claims.

Repeated exposure to extreme content may also lead to "secondary trauma," a condition similar to PTSD that arises when a witness views images of what happened and puts themselves in the victim's place.

Beyond the psychological toll, Facebook moderators also carry a heavy burden of judgment. On a daily basis, they must distinguish the appropriate from the inappropriate. One wrong decision and their employer could be accused of infringing on freedom of speech.

Automation is not yet a valid option. Algorithms cannot determine intent, and artificial intelligence is nowhere near that level of sophistication.

A Facebook spokeswoman claims that every employee, regardless of their responsibilities, is offered psychological support and wellness resources. Programs to support people in these difficult jobs are in place and are evaluated annually.

When it comes to the misuse of Facebook Live, Mark Zuckerberg has admitted, “We have a lot of work to do.”

So do his growing number of moderators. But at what cost to their mental health and ability to work?