Investigation finds Facebook mods fail to remove illegal content such as extremist and child porn


That Facebook is fighting against a tide of objectionable and illegal content is well known. That the task of moderating such content is a difficult and unenviable one should come as news to no one. But an investigation by British newspaper The Times found that even when illegal content relating to terrorism and child pornography was reported directly to moderators, it was not removed.

More than this, the reporter involved in the investigation found that Facebook’s algorithms actively promoted the groups that were sharing the illegal content. Among the content Facebook failed to remove were posts praising terrorist attacks and Islamic State, and others calling for more attacks to be carried out. Failure to remove illegal content once reported is, under British law, a crime in itself.

The reporter created a fake Facebook profile and used it to very quickly find content that few people would regard as acceptable — beheadings, child abuse, pedophile cartoons and more. The Times says that when Facebook was first told about the content, the social network’s response was that much of it did not violate community guidelines. It was not until the reporter informed Facebook that the content was being reported as part of a newspaper investigation that moderators removed additional posts.


In a boilerplate statement, Facebook’s Vice President of Operations Justin Osofsky said:

We are sorry that this occurred. It is clear that we can do better, and we’ll continue to work hard to live up to the high standards people rightly expect of Facebook.

All of the content reported by The Times has now — according to Facebook — been removed.

Reuters says that London’s Metropolitan Police declined to say whether Facebook was under investigation for its failure to remove the content, but said: “Where material breaches UK terrorism laws, the Counter Terrorism Internet Referral Unit (CTIRU) will, where possible, seek the removal of the content by working with the relevant internet hosting company.”

Image credit: Fotos593 / Shutterstock


