Meta has launched a service to protect users against revenge porn. The project asks users to submit their own nude photos.

Monitoring Desk

CALIFORNIA: Meta, which owns Facebook and Instagram, announced the creation of StopNCII.org, a service intended to help stop the non-consensual spread of intimate images.

Meta co-developed the project with the Revenge Porn Helpline and has also received support from 50 non-profit organizations. According to the developers, the service will be able to protect users from so-called revenge porn.

The company called StopNCII.org “the first global initiative of its kind aimed at safely helping people concerned about their intimate images.” The service uses technology that converts nude photos and videos into a unique digital code, known as a hash.

Images are processed directly on users’ devices and are not transferred to other services. Participating companies can then search for matching images by their hash and remove them.

StopNCII.org is based on technology developed as part of a Facebook and Instagram pilot in 2018. The developers clarified that if an intimate photo is cropped or has filters applied to it, its hash will change and the service will not be able to recognize it.
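The behavior described above can be illustrated with a minimal sketch. This example uses a generic cryptographic hash (SHA-256), not StopNCII.org’s actual algorithm, which the article does not specify; it shows the two properties the article describes: only the hash, never the image, leaves the device, and any modification to the image produces a different hash.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Compute a fixed-length digest of the raw image data.
    # Only this hash would be shared, never the image itself.
    return hashlib.sha256(image_bytes).hexdigest()

original = b"raw image bytes"
# Any edit, such as cropping or applying a filter, changes the bytes.
edited = original + b" with filter applied"

print(image_hash(original) == image_hash(original))  # True: same image, same hash
print(image_hash(original) == image_hash(edited))    # False: edited image, different hash
```

A platform holding only the hash can detect exact re-uploads of a flagged image, but, as the article notes, an altered copy slips past this kind of matching.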

Earlier, Apple planned to introduce software on the iPhones of US users that could scan photos on the device. The software was expected to analyze images for material depicting abuse of minors.

The company has previously scanned photos uploaded to iCloud for child abuse.

In September, Apple delayed the launch of the project in order to improve it.