Tech giant Facebook has launched tools to make it easier for users to report "revenge porn."
Facebook said its new photo-matching tool will automatically prevent images from being shared again once they have been banned, and the accounts of people who share revenge porn may be disabled.
Revenge porn refers to the sharing of sexually explicit images online, without the consent of the people depicted, to extort or humiliate them. The practice disproportionately affects women, who are sometimes targeted by former partners.
Facebook has been sued in the United States and elsewhere by people who claim the company should have done more to prevent the practice. In response, the social networking company banned images "shared in revenge" in 2015 and gave users the ability to report posts that violate its terms of service.
In a new update, users of the world's largest social network will see a new option to report a picture specifically as a "nude photo of me," Facebook said in a statement. The company is also launching an automated process to prevent the repeated sharing of banned images: photo-matching software will keep the pictures off the core Facebook network as well as its Instagram and Messenger services.
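The article does not say how Facebook's photo-matching works, but systems like this are commonly built on perceptual hashing: an image is reduced to a compact fingerprint, and a re-uploaded copy (even if slightly resized or recompressed) produces a fingerprint within a small Hamming distance of the banned original. The sketch below is purely illustrative, a minimal "average hash" on a grayscale pixel grid; all function names are hypothetical, and real deployments use far more robust algorithms.

```python
# Hypothetical sketch of photo-matching via an "average hash" — NOT
# Facebook's actual algorithm, just an illustration of the general idea.

def average_hash(pixels):
    """pixels: 2-D list of grayscale values (0-255).
    Each pixel is compared to the image mean; the bits form a fingerprint."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_banned(candidate_hash, banned_hashes, threshold=5):
    """Flag an upload whose fingerprint is within `threshold` bits
    of any fingerprint in the banned set."""
    return any(hamming_distance(candidate_hash, h) <= threshold
               for h in banned_hashes)

# A slightly altered re-upload still matches the banned original:
banned = average_hash([[10, 200], [30, 220]])
reupload = average_hash([[12, 198], [28, 222]])
print(is_banned(reupload, {banned}))  # → True
```

The distance threshold is the key design trade-off in such systems: too strict and trivially edited copies slip through; too loose and unrelated images are wrongly blocked.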
The company met with representatives of more than 150 women's safety organizations last year and decided it needed to do more, Antigone Davis, Facebook's global head of safety, said in a phone interview. Davis added that a specially trained group of Facebook employees would review each reported image.