Instagram is reportedly testing a new feature designed to prevent users from being exposed to unsolicited nudes sent to their Instagram direct messages. Reportedly called 'Nudity Protection', the feature is said to be in development and could roll out to users in the coming weeks. The company confirmed to several publications that it is testing the feature after app developer Alessandro Paluzzi published an early image of it.
The tool under development is meant to protect users from unwanted photos in their DMs. According to the preview shared by Paluzzi, Nudity Protection is described as "technology on your device" that covers photos that may contain nudity in chats. The preview also states that Instagram won't have access to these photos.
A Meta spokesperson, speaking to The Verge, said the technology will not allow Meta or any third parties to access these photos. The company confirmed it is working closely with experts on the tool and said the new feature will preserve people's privacy while giving them control over the messages they receive.
The company has also confirmed that it will be an opt-in feature, meaning the photo-sharing app won't enable it for users by default.
Meta said it will share more details about the feature in the coming weeks. Beyond this, no other information about the feature is available.
A report published by the Center for Countering Digital Hate, a British NGO, found that Instagram's tools failed to act on at least 90 per cent of image-based abusive DMs sent to women on a daily basis. Even messages that were sexual in nature were not blurred out.