Facebook has increased its efforts to battle so-called revenge porn, announcing a tool that will identify any offending images and prevent them from being re-shared in the future across the social network’s platforms.
In a blog post from Facebook’s Head of Global Safety, Antigone Davis, the social media giant said the new measures would be implemented across Facebook and its messaging service ‘Messenger’, as well as Instagram.
Facebook said the new tools address concerns about the sharing of intimate images of people without their permission, commonly known as revenge porn.
“We’ve designed our tools to help people in these situations,” it said.
Unlike its previous reporting tools, Facebook says the new system includes photo-matching technology, which blocks further attempts to share an image once it has been reported and removed.
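Facebook has not detailed how its photo-matching works, but systems like this commonly rely on perceptual hashing: a compact fingerprint is computed from a reported image, and later uploads whose fingerprints fall within a small Hamming distance of a banned fingerprint are blocked. The sketch below illustrates that general idea with a simple average hash built on the Pillow library; the function names and the distance threshold are illustrative assumptions, not Facebook's implementation.

```python
# Illustrative sketch of perceptual-hash matching (not Facebook's actual system).
# Requires Pillow: pip install Pillow
from PIL import Image

HASH_SIZE = 8          # 8x8 grid -> 64-bit fingerprint
MATCH_THRESHOLD = 10   # max Hamming distance treated as "same image" (assumed value)

def average_hash(path: str) -> int:
    """Compute a 64-bit average hash: shrink, grayscale, threshold on mean brightness."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_banned(path: str, banned_hashes: set) -> bool:
    """Block an upload if its fingerprint is close to any previously reported image."""
    candidate = average_hash(path)
    return any(hamming_distance(candidate, h) <= MATCH_THRESHOLD for h in banned_hashes)

# Usage (hypothetical filenames): once a reported image is verified and removed,
# its hash is stored; later uploads are checked against that set before publishing.
# banned = {average_hash("reported_image.jpg")}
# print(is_banned("new_upload.jpg", banned))
```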
“Specially trained representatives” will review reported images; once a report is verified, the image will be removed and the offending account disabled, with an appeals process available.
The announcement does not explain how Facebook will tackle revenge porn shared within private groups, where the subject may be unaware the images have been posted.
In March, the private Facebook group ‘Marines United’ caused controversy after nude photos of female Marines were shared there.
Facebook shut down the group but could not stop the images from being re-shared on sites outside its control. Other branches of the military, including the Navy, were later revealed to be involved in similar scandals.
Facebook said the new tools were developed in partnership with safety experts and form part of its broader effort to build a global community. The National Network to End Domestic Violence, the Center for Social Research, the Revenge Porn Helpline (UK) and the Cyber Civil Rights Initiative all provided input.
No date has been announced for when the new tools will be implemented.