Instagram May Add a ‘Nudity Protection’ Filter to Safeguard Users


Instagram appears to be testing a new feature that would cover photos in Direct Messages that may contain nudity, safeguarding users from exposure to unwanted content.

The “nudity protection” setting was spotted by Alessandro Paluzzi, a developer known for reverse engineering apps and finding early versions of upcoming updates.

The new nudity protection option would enable Instagram to activate the nudity detection element in iOS, which scans incoming and outgoing messages on a user’s device to detect potential nudes in attached images.

If nudity protection is enabled, Instagram will automatically blur any image in Direct Messages that the on-device detection flags as potentially containing nudity. The app will then notify the user that they have received an image that may contain nudity and offer a button to reveal the content if they choose.
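The report does not say which on-device API Instagram would hook into, but Apple does ship a framework for exactly this kind of check: SensitiveContentAnalysis (iOS 17 and later), whose SCSensitivityAnalyzer flags nudity entirely on the device. The Swift sketch below is a hypothetical illustration of the flow described above; handleIncomingImage, showImage, and showBlurredPlaceholder are made-up names, and Instagram's actual implementation may differ.

```swift
import SensitiveContentAnalysis
import CoreGraphics

/// Hypothetical handler for an incoming Direct Message image.
/// Runs Apple's on-device sensitivity check and decides whether the
/// photo is shown directly or hidden behind a blur and a warning.
func handleIncomingImage(_ image: CGImage) async {
    let analyzer = SCSensitivityAnalyzer()

    // If the user has not enabled any sensitivity checks in iOS settings,
    // the policy is .disabled and the image is shown as normal.
    guard analyzer.analysisPolicy != .disabled else {
        showImage(image)
        return
    }

    do {
        // The analysis happens entirely on the device; the image is
        // never uploaded for inspection.
        let analysis = try await analyzer.analyzeImage(image)

        if analysis.isSensitive {
            // Blur the photo and surface a "may contain nudity" notice
            // with a button that lets the user reveal it if they choose.
            showBlurredPlaceholder(for: image,
                                   message: "This photo may contain nudity.",
                                   revealAction: { showImage(image) })
        } else {
            showImage(image)
        }
    } catch {
        // If analysis fails, err on the side of caution and keep the blur.
        showBlurredPlaceholder(for: image,
                               message: "This photo could not be checked.",
                               revealAction: { showImage(image) })
    }
}

// UI helpers are placeholders; a real client would blur with an overlay
// view or CIFilter and present the notice inside its message cell.
func showImage(_ image: CGImage) { /* render the photo */ }
func showBlurredPlaceholder(for image: CGImage,
                            message: String,
                            revealAction: @escaping () -> Void) { /* render blurred cell */ }
```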

According to the screenshot shared by Paluzzi, nudity protection is an option that can be turned on and off in iOS settings.

In Paluzzi’s screenshot, Instagram makes an effort to reassure users that the company “can’t access the photos” and that it is simply “technology on your device [that] covers photos that may contain nudity.”

This wording suggests that Instagram does not download and examine images sent in Direct Messages; instead, on-device iOS technology accesses the messages and filters images based on their content.

For its part, Apple has sought to assure users that it does not download the images and that the filtering is done by artificial intelligence (AI) and data matching, which does not trace or track the particulars of a user’s online interactions.

Nonetheless, the news of the nudity protection feature is a significant step for Instagram’s parent company, Meta, which has been working to increase protection for younger users.

Meta has faced serious questions about its efforts to keep younger users safe on its platforms. In June, Meta was served with eight different lawsuits that contend the company deliberately adjusted its algorithm to hook young people.

Earlier this month, Meta was fined a record $402 million for letting teenagers set up accounts on Instagram that publicly displayed their phone numbers and email addresses.


Image credits: Header photo licensed via Depositphotos.
