Facebook, Instagram Face EU Probe Over Suspicions of Failing to Impose Child Protection Policies

This picture taken on October 5, 2020 in Toulouse, southwestern France, shows logos of US social networks Facebook and Instagram on the screens of a tablet and a mobile phone. LIONEL BONAVENTURE/AFP via Getty Images

The European Union opened new investigations on Thursday targeting Facebook and Instagram.

Regulators suspect the platforms are not adequately safeguarding children online, which would violate the bloc's strict digital rules for social media platforms.

Meta Faces EU Digital Scrutiny for Child Protection Compliance

Parent company Meta Platforms is facing another round of scrutiny under the EU's Digital Services Act (DSA), a set of regulations aimed at cleaning up online platforms and protecting internet users. This is the latest development in Meta Platforms' ongoing challenges.

According to News 10, the European Commission is also investigating Meta's age verification tools, which are meant to keep underage users off Facebook and Instagram and shield young users from inappropriate content.

Users must be at least 13 years old to create an account on the platforms. The investigation is also examining whether the company is adhering to DSA regulations that mandate a strong emphasis on privacy, safety, and security for minors.

EU Investigates Meta's Algorithms, Rabbit-Hole Effect on Children

The EU's primary concerns are algorithms that could foster behavioral addictions in children and produce what is known as the "rabbit-hole effect." It has also expressed interest in Meta's age assurance and verification methods.

The rabbit-hole effect describes the phenomenon in which a user who is exposed to harmful content then receives algorithmic recommendations for increasingly extreme variations of similar content.

A study conducted by Stanford University in 2023 and published in the peer-reviewed journal Science Advances examined the impact of YouTube algorithms on user behavior. The study revealed that the algorithms seldom lead users into deep "rabbit holes."

However, the study found that YouTube, a platform owned by Google, does suggest alternative or extremist content to approximately 3% of users who actively seek it out, AP News reported.