Apparently, the days of unwanted nudes on Instagram are numbered. Evidence suggests that the social network is working on a tool that detects and covers these images when they are sent through the platform's chat. Upon receiving the photo, the user can choose whether or not to view its content.
Tests of the new feature were discovered by mobile developer Alessandro Paluzzi, who posted an image of the tool on his Twitter account.
According to the screenshot, in addition to covering the nude that was sent, the feature processes images locally, without involving Instagram's servers.
The tool will also offer protection tips to help users who have received or otherwise interacted with sensitive content, whether intentionally or not.
It is worth mentioning, however, that those who prefer to do without this protection have nothing to worry about: the tool will be optional and can be enabled or disabled in the platform's settings.
Instagram messages hide abusive content
This is not the first time that Instagram has taken steps to protect its users from sensitive or offensive content in the social network’s DMs.
In addition to separating conversations with friends from messages sent by strangers, the platform has, since last year, used a filter that detects words, emojis, and other text elements deemed undesirable in a chat.
When Instagram identifies a message with abusive content, it relocates the conversation to "Hidden Requests", a subfolder of "Message Requests" that makes the messages harder to reach. Although the folder sits apart from the main tabs, its conversations can still be viewed, deleted, blocked, or even accepted; it simply prevents their contents from being revealed by accident.
In addition, users who want to erase the folder's contents all at once, without having to read the Directs, can open it and tap the "Delete All" option.