Meta Introduces Stricter Messaging Restrictions And Enhanced Parental Controls For Teens

Meta, the parent company of Facebook and Instagram, has announced the implementation of new direct messaging (DM) restrictions for teens on its platforms. These restrictions aim to enhance the safety and privacy of young users.

Key Takeaway

Meta is introducing stricter messaging limitations for teens and empowering parents with enhanced controls, underscoring its commitment to improving the safety and privacy of young users amidst regulatory scrutiny and legal challenges.

New Messaging Restrictions for Teens

Under the new limitations, Instagram will prevent adults from messaging users under the age of 16 (and in certain regions, under 18) if the teen does not follow them. This expands the existing restrictions, which previously applied only to adult accounts. Meanwhile, on Messenger, teens will only receive messages from their Facebook friends or contacts.

Enhanced Parental Controls

Meta is also introducing more robust parental controls, empowering guardians to approve or deny changes that teens make to default privacy settings. Previously, guardians only received notifications when teens modified these settings and had no way to intervene. With the new controls, guardians can block teens from making certain changes, such as switching their account from private to public or adjusting settings related to direct messaging.

Focus on Teen Safety

Meta is committed to prioritizing the safety of teen users, as evidenced by its efforts to prevent exposure to unwanted and inappropriate content. The company plans to introduce a feature that will discourage teens from viewing such images in their DMs, even in end-to-end encrypted chats.

Regulatory Scrutiny and Legal Challenges

While Meta continues to roll out these measures, it faces regulatory scrutiny and legal challenges related to child safety and privacy. The company recently received a formal request for information from EU regulators regarding its efforts to prevent the sharing of harmful content. Additionally, Meta is defending against a civil lawsuit in New Mexico and a federal lawsuit filed by more than 40 US states, both alleging that its platforms harm minors' mental health.
