
EU Investigates Meta’s Child Safety Practices on Facebook and Instagram: What You Need to Know

Meta, the parent company of Facebook and Instagram, is currently under investigation by the European Commission (EC) for its handling of child safety on social media platforms. The EC is looking into whether Meta violated the Digital Services Act (DSA) by potentially fueling social media addiction among children and failing to implement robust safety and privacy measures.

Key points from the investigation include:

– Concerns about how Facebook and Instagram algorithms may stimulate behavioral addictions in children and create ‘rabbit-hole effects.’
– Questions about Meta’s age-assurance and verification methods.
– Evaluation of whether Meta effectively addresses risks from platform interfaces that may exploit the vulnerabilities of minors.

This scrutiny is essential to protect children’s well-being and rights while using online platforms. The investigation will also assess Meta’s measures to block minors from accessing inappropriate content, provide age verification tools, and offer simple yet robust privacy settings by default.

The EU’s DSA, fully in effect since February 17, 2024, requires large online platforms like Meta to take additional steps to combat illegal content and ensure public safety online. Meta has stated its commitment to creating safe online experiences for young people, pointing to the tools and policies it has developed over the past decade.

Meta, for its part, says the industry faces this challenge together and that it looks forward to sharing details of its work with the European Commission. Stay tuned for updates on how Meta plans to enhance child safety on Facebook and Instagram.
