Meta launches tool to stop revenge porn from spreading on Facebook and Instagram – but users who fear being victimized must submit their images and videos to a hashing database to make a case

  • The tool comes from a global website called StopNCII.org, which stands for ‘Stop Non-consensual Intimate Images’.
  • People worried that their intimate images or videos have been, or may be, posted on Facebook or Instagram can make a case through the website
  • This is done by uploading the images or videos to the website
  • Meta says the content is converted into digital fingerprints, allowing it to identify and detect matching content, and claims that no human eye ever sees the images

Meta on Thursday launched a new tool that blocks the spread of revenge porn on Facebook and Instagram, but requires people to submit their sexually explicit photos and videos to a hashing database.

When someone finds that their intimate images or videos have been posted, or may be posted, on a social media platform, they can open a case through a global website called StopNCII.org, which stands for ‘Stop Non-Consensual Intimate Images’.

Each photo or video submitted receives a digital fingerprint, or unique hash value, which is used to trace and track any copy that is shared, or that someone attempts to post, without the individual’s permission.

The website was built with 50 global partners, and handing intimate images and videos to a third-party website may not sit well with many users. Meta, however, says the service does not access the original images and will not store copies of them.

A Meta spokesperson told DailyMail.com in an email that ‘only the person submitting the case to StopNCII.org has access to their images/videos’ and that all of the hash calculation ‘takes place in the browser’, which means ‘the images never leave the person’s device.’

“Only cases are submitted to StopNCII.org and the hashes of the person’s images or videos are shared with participating tech companies such as Meta,” the spokesperson said.
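The article doesn’t name the hashing algorithm StopNCII.org uses, but the idea of a one-way ‘digital fingerprint’ can be sketched with a standard cryptographic hash. The Python example below uses SHA-256 as an illustrative stand-in; the function name and sample bytes are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a one-way 'digital fingerprint' of an image file.

    SHA-256 is an illustrative stand-in here: the real service's
    algorithm isn't specified in the article.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# A platform only needs the fingerprint to match copies;
# the original bytes can stay on the user's device.
original = b"\x89PNG fake image bytes for illustration"
reupload = b"\x89PNG fake image bytes for illustration"
unrelated = b"\x89PNG a different image"

assert fingerprint(original) == fingerprint(reupload)   # exact copies match
assert fingerprint(original) != fingerprint(unrelated)  # other images don't
```

One caveat: a cryptographic hash like this only matches byte-identical copies. Systems built for this task generally rely on perceptual hashes (Meta has open-sourced one called PDQ) that also match images after resizing or re-encoding.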

“Only the hash, not the images themselves, is shared with StopNCII.org and participating technology platforms,” Antigone Davis, Meta’s global head of safety, wrote in a blog post.

‘This feature prevents further dissemination of that NCII material and keeps those images safe in the possession of the owner.’

StopNCII.org builds on a pilot program launched in Australia in 2017, which asked members of the public to submit images of themselves so that hashes could be generated and used to locate matching images on Facebook and Instagram.

That groundwork is now being used to stop revenge porn.

Meta originally planned to build the tool into Facebook itself to prevent people’s intimate images or videos from spreading, but under that process the sensitive media would have been reviewed by human moderators before being converted into unique digital fingerprints, NBC News reported.

Knowing this, the social media firm opted to bring in a third party, StopNCII, which specializes in image-based abuse, online safety and women’s rights.

StopNCII.org is for adults over the age of 18 who think an intimate image of them may be shared, or has already been shared, without their consent.

According to a 2019 report by NBC, Meta identifies approximately 500,000 cases of revenge porn every month.

To combat that influx, Facebook employs a team of 25 people who vet reports and photos, working in conjunction with an algorithm developed to identify nude images.

But with StopNCII.org, human moderators are replaced by hashes that can trace and identify images once potential victims submit a case and the resulting hashes are shared with Meta.

What is revenge porn on Facebook?

The vast majority of revenge porn falls under Facebook’s rules on nudity.

In March 2015, however, the social network brought out specific community guidelines to address the growing problem of revenge porn.

The section titled ‘Sexual Violence and Exploitation’ deals specifically with this topic.

The guidelines say: ‘We remove content that threatens or promotes sexual violence or exploitation.

‘This includes sexual abuse and sexual assault of minors.

‘For the safety of victims and survivors, we also remove images or videos depicting incidents of sexual violence and images shared as revenge without the permission of the people in the images.

‘Our definition of sexual abuse includes solicitation of sexual material, any sexual material involving minors, threats to share intimate images and offers of sexual services.

‘Where appropriate, we refer this material to law enforcement.’
