Facebook has repeatedly been shown to serve as a breeding ground for pedophiles and those who trade in child sexual abuse material. Despite the social network's claims to the contrary, there have been countless examples of Facebook failing to take preventive measures or respond appropriately when such illicit content involving children was reported to it. The same has happened in the latest of these incidents.
The BBC reportedly identified dozens of images on the social network that were not just indecent but depicted scenes of child sexual abuse. On one of the Facebook pages where the content was available, users were arranging to swap the obscene material.
More: Child Pornography Case: 73-year old Pedophile gets 300 years sentence
The BBC identified pages dedicated to men sexually interested in children, where pictures of underage children in extremely questionable (read: sexualized) poses were posted alongside lewd comments; groups oriented around child sexual abuse, bearing names such as "Hot XXXX Schoolgirls," where stolen pictures of real children were shared; and a video related to child pornography accompanied by a request to share it.
According to Damian Collins, chairman of the Commons media committee, Facebook's content moderation system is apparently neither effective nor efficient. He said so because even after the social network's community managers were notified about the content, over 80% of the images were not deleted: just 18 out of 100 reported images were removed.
Not only that, but in a surprising turn of events, Facebook contacted the UK's National Crime Agency (NCA) and reported the journalists who had discovered the images, citing a violation of Crown Prosecution Service (CPS) and Association of Chief Police Officers (ACPO) guidelines.
“When provided with examples of the images, Facebook reported the BBC journalists involved to the police and canceled plans for an interview,” revealed BBC in its post.
The BBC was certainly not expecting this reaction to such a serious and sensitive issue. According to Collins, the intention was not to violate the law but to "help clean up the network" of this type of content. To do so, the journalists set out to test Facebook's claim that users should use the Report button to notify the company about illegal, obscene, and offensive content.
More: Police Dogs Trained To Sniff Out Hard Drives Coming For Pedophiles
The BBC used this button to alert the social network to 100 pictures falling under the above categories. However, Facebook replied that the 82 images it left undeleted did not breach its community standards.
Source: BBC Tech | Investigation: BBC UK