Facebook guidelines give discomforting insight into moderation policy

Recently, the Guardian published leaked internal guideline files from Facebook, which reveal that the social media giant does not delete videos portraying death, self-harm or abuse, since doing so would violate its policy of not censoring its community.

Leaked guidelines suggest Facebook is apparently understaffed

The leaked guidelines suggest that the moderators at Facebook, who are responsible for reviewing every video posted by its ever-growing community of users, cannot practically determine which videos should be deleted and which should be kept.

Facebook has more than two billion users, while the staff that reviews such content numbers only around 4,500 individuals. The moderators are clearly outnumbered, given the sheer volume of videos and other content posted on Facebook every day.

Videos to create awareness

According to Facebook officials, not every video is supposed to be deleted in the first place. If someone posts a video portraying abuse, for example, it may be retained so that Facebook can use it to raise awareness of the issue.

One of Facebook's rules holds that content should not be deleted merely because it might be disturbing. Facebook further stated that if a video contains child abuse, the image of the child and any content that is sexually suggestive or shows sadism will be removed.

Nevertheless, content that involves no sexual abuse of a child and is not indicative of sadism will be retained so that the child may be identified and rescued. Facebook also permits the live broadcast of videos in which people attempt suicide or otherwise harm themselves, so that it can either help save the individual through its support community or allow the individual to express his or her distress.

Similarly, videos of abortion will be shared or retained as long as they contain no inappropriate content. Explicit language will not be censored either, since Facebook sees the use of such language as a medium through which people vent their frustrations.

In every case, there will be mechanisms to shield viewers from content that might be inappropriate for them. However, any content or language that attempts to disparage the president of the United States will be censored, since, by law, the president is given special protection.

The sheer volume of content makes it impossible for Facebook moderators to decide

According to Facebook's head of global policy management, the volume of content being posted makes it impossible for the staff to decide which content should go and which should stay. The fact that the appropriateness of a given piece of content is a subjective matter complicates things further: a video may be suitable for one user but not for another.

Nonetheless, Facebook said that it is doing everything it can to make sure its community feels safe and secure. It added that the community should also feel empowered to take responsibility for reporting inappropriate content.

