When someone’s intimate images are shared without their permission, it can be devastating. To protect victims, it has long been our policy to remove non-consensual intimate images (sometimes referred to as revenge porn) when they’re reported to us, and in recent years we’ve used photo-matching technology to keep them from being re-shared. To find this content more quickly and better support victims, we’re announcing new detection technology and an online resource hub to help people respond when this abuse occurs.

Finding these images goes beyond detecting nudity on our platforms. By using machine learning and artificial intelligence, we can now proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram. This means we can find this content before anyone reports it, which is important for two reasons: victims are often afraid of retribution and so are reluctant to report the content themselves, and in other cases they are unaware the content has been shared at all. A specially trained member of our Community Operations team will review the content found by our technology. If the image or video violates our Community Standards, we will remove it, and in most cases we will also disable the account that shared intimate content without permission. We offer an appeals process if someone believes we’ve made a mistake.
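
As a rough illustration only, and not a description of Facebook’s actual systems, a proactive-detection flow like the one described above can be thought of as a model that scores newly shared content and routes anything above a confidence threshold to a human review queue. In the Python sketch below, the classifier, the threshold and the review queue are all hypothetical placeholders.

```python
# Minimal sketch of a proactive-detection flow, under the assumptions stated
# above. The model, threshold and review queue are illustrative placeholders,
# not Facebook's production systems.

from dataclasses import dataclass, field
from typing import List


@dataclass
class ReviewItem:
    content_id: str
    score: float  # model confidence that the content violates the policy


@dataclass
class ReviewQueue:
    """Stand-in for the queue a specially trained human reviewer works through."""
    items: List[ReviewItem] = field(default_factory=list)

    def enqueue(self, item: ReviewItem) -> None:
        self.items.append(item)


REVIEW_THRESHOLD = 0.85  # hypothetical confidence cut-off


def score_content(image_bytes: bytes) -> float:
    """Placeholder for a trained classifier; returns a confidence in [0, 1]."""
    raise NotImplementedError("stand-in for the machine-learning model")


def triage(content_id: str, image_bytes: bytes, queue: ReviewQueue) -> None:
    """Score newly shared content and, if it looks violating, send it to humans.

    Nothing is removed automatically here: removing content or disabling an
    account happens only after human review, as described above.
    """
    score = score_content(image_bytes)
    if score >= REVIEW_THRESHOLD:
        queue.enqueue(ReviewItem(content_id=content_id, score=score))
```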

This new detection technology is in addition to our pilot program, run jointly with victim advocate organizations. The program gives people an emergency option to securely and proactively submit a photo to Facebook. We then create a digital fingerprint of that image and use it to stop the image from ever being shared on our platform in the first place. After receiving positive feedback from victims and support organizations, we will expand this pilot over the coming months so more people can benefit from this option in an emergency.

“We are thrilled to see the pilot expand to incorporate more women’s safety organizations around the world, as many of the requests that we receive are from victims who reside outside of the US.” – Holly Jacobs, Founder, Cyber Civil Rights Initiative (CCRI)
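
The pilot described above depends on the idea of a digital fingerprint: a compact representation of the submitted photo that can be matched against later uploads. As a hedged sketch only, the snippet below uses an open-source perceptual hash (Pillow with the imagehash package) to show how a stored fingerprint can match a re-uploaded copy even after resizing or re-compression; the post does not say which matching technology Facebook uses, and the distance threshold here is an assumption.

```python
# Illustrative sketch of fingerprint-based matching, assuming an open-source
# perceptual hash (imagehash) rather than Facebook's actual technology.

from typing import List

import imagehash
from PIL import Image

# Hypothetical Hamming-distance threshold below which two hashes are treated
# as the same image.
MATCH_DISTANCE = 8


def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash that is robust to resizing and re-compression."""
    return imagehash.phash(Image.open(path))


def matches_known_fingerprint(upload_path: str,
                              stored: List[imagehash.ImageHash]) -> bool:
    """Return True if an upload matches any previously submitted image."""
    upload_hash = fingerprint(upload_path)
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(upload_hash - known < MATCH_DISTANCE for known in stored)
```

In a real system of this kind, the stored fingerprints would come from the securely submitted photos, and a match would block the upload before it is ever shared.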

We also want to do more to help people who have been the targets of this cruel and destructive exploitation. To do this, we’re launching “Not Without My Consent,” a victim-support hub in our Safety Center that we developed together with experts. There, victims can find organizations and resources to support them, including steps they can take to remove the content from our platform and prevent it from being shared further, and they can access our pilot program. We’re also going to make it easier and more intuitive for victims to report when their intimate images have been shared on Facebook. And over the coming months, we’ll build a victim-support toolkit to give people around the world locally and culturally relevant information and support. We’ll create this in partnership with the Revenge Porn Helpline (UK), Cyber Civil Rights Initiative (US), Digital Rights Foundation (Pakistan), SaferNet (Brazil) and Professor Lee Ji-yeon (South Korea).

Our work fighting this abuse and supporting victims wouldn’t be possible without the help of international experts. Today, on the sidelines of the 63rd session of the U.N. Commission on the Status of Women, we’ll hold an event with Dubravka Šimonović, the U.N. Special Rapporteur on violence against women, that brings together some of these victim advocates, industry representatives and nonprofits. We’ll discuss how this abuse manifests around the world; its causes and consequences; the next frontier of challenges that need to be addressed; and strategies for deterrence. We’re looking forward to this event and are thankful for these partnerships as we continue to team up on this important issue.

Source: https://newsroom.fb.com/news/2019/03/detecting-non-consensual-intimate-images/