Facebook’s team is currently working on a brand-new tool aimed at reducing incidences of harassment on the social network by targeting one of its sources: impersonation.
This new feature will automatically alert users if it detects that their account is being impersonated by someone, with their profile picture and username being exploited.
Users will quickly receive an alert about any such activity and will be asked to confirm whether the account in question is impersonating them or is not an impersonator at all. The alerting procedure will be automated, but the flagged accounts will be manually reviewed by the team at Facebook.
The team started testing this procedure in November last year, and it has already gone live for around 75% of the world. Facebook plans to expand its scope very soon, according to its Head of Global Safety, Antigone Davis.
It must be noted that impersonation of user accounts is not an especially widespread issue on Facebook. Nonetheless, it remains one of the key sources of harassment on the platform, despite the network’s strict policies against it.
Davis further told Mashable that the company “heard feedback prior to the roundtables and also at the roundtables that this was a point of concern for women and it’s a real point of concern for some women in certain regions of the world where it [impersonation] may have certain cultural or social ramifications.”
She also noted that the new feature can be seen as a step forward in Facebook’s ongoing efforts to make female users around the world feel safer on the platform. To that end, Facebook held a series of roundtable conferences around the world with users, NGOs, and activists to collect feedback and identify ways the social network could address privacy and safety issues more comprehensively.
As a result of these efforts, two other safety-related features are currently in the pipeline and undergoing testing at Facebook: new, more effective ways of reporting non-consensual explicit images, and a photo checkup tool. Facebook has banned the exchange of non-consensual intimate images since 2012; the feature currently being tested is aimed at making the reporting process more compassionate towards victims of abuse.
This means that not only will users have the option of reporting nudity on Facebook, but they will also be able to flag the picture in question as inappropriate and identify themselves as its subject. Doing so will surface links to outside resources, such as support groups for victims of abuse and information about possible legal options. It will also trigger the review process that normally follows a report of nude content.
According to Davis, in countries like India, where the new feature is currently being tested, helping users become familiar with these tools is also a challenge. The Photo Checkup feature aims to bridge that gap by walking users through a step-by-step review of the privacy settings on their photos. The Photo Checkup tool is currently live in South America, Africa, India, and Southeast Asia.