Instagram to Test Features That Blur Messages Containing Nudity in Move to Boost Teen Safety

Instagram, owned by Meta, said on Thursday it will test features that blur messages containing nudity, in an effort to protect teens and deter sextortion scammers. The move comes amid growing concern about harmful content on social media platforms, particularly its impact on young users' mental health. The company plans to use on-device machine learning to analyze images sent through Instagram's direct messages for nudity, with the protection feature enabled by default for users under 18. Meta will also notify adult users to encourage them to turn the feature on.
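To make the announcement concrete, here is a minimal sketch of the kind of flow described above: a locally run classifier scores incoming images, and flagged images are blurred before display for users who have the protection enabled. The classifier stub, the threshold, and all function names are illustrative assumptions, not Meta's actual implementation.

```python
# Illustrative sketch only; assumes Pillow (pip install Pillow).
from dataclasses import dataclass

from PIL import Image, ImageFilter


@dataclass
class User:
    age: int
    nudity_protection: bool = False


def apply_default_protection(user: User) -> User:
    # Reported behavior: on by default for under-18 users;
    # adults are instead nudged to enable it.
    user.nudity_protection = user.age < 18
    return user


def nudity_score(image: Image.Image) -> float:
    # Stub standing in for an on-device ML classifier. In a real system a
    # trained model would run locally, so the image never leaves the device.
    return 0.0  # placeholder score


def deliver_image(image: Image.Image, recipient: User,
                  threshold: float = 0.8) -> Image.Image:
    # Blur flagged images before display; the recipient could still
    # choose to reveal the original.
    if recipient.nudity_protection and nudity_score(image) >= threshold:
        return image.filter(ImageFilter.GaussianBlur(radius=24))
    return image
```

Running the detection on the device, rather than on Meta's servers, is what allows the feature to work in encrypted chats, as the next paragraph notes.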

Because the image analysis happens on the user's device, Meta said the nudity protection feature will work even in end-to-end encrypted chats, and the company will not have access to the images unless users choose to report them. Unlike Messenger and WhatsApp, Meta's other messaging services, direct messages on Instagram are not yet encrypted, though the company has said it plans to roll out encryption for the service. Meta also said it is developing technology to help identify accounts that may be engaging in sextortion scams, and that it is testing new pop-up messages for users who may have interacted with such accounts.

In January, Meta announced plans to hide more content from teens on Facebook and Instagram, with the aim of limiting their exposure to sensitive topics such as suicide, self-harm, and eating disorders. The company's actions come in response to mounting legal and regulatory scrutiny, including a lawsuit by the attorneys general of 33 US states alleging that it misled the public about the risks of its platforms, and inquiries from the European Commission about how it protects children.
