EU Investigates Meta Over Child Safety

Photo by: Freepik
By Diego Valverde | Journalist & Industry Analyst - Fri, 05/17/2024 - 08:36

The European Commission has formally launched an investigation into Meta, the parent company of Facebook and Instagram, citing concerns regarding the platforms' potential to induce addictive behaviors among children and negatively impact their mental health. This investigation marks a significant enforcement action under the Digital Services Act (DSA), which mandates stringent online safety measures, particularly for protecting minors.

On April 30, 2024, the European Commission had already opened formal proceedings against Meta over deceptive advertising, the handling of political content, notice-and-action mechanisms, data access for researchers, and the absence of an effective third-party real-time civic discourse and election-monitoring tool ahead of the European Parliament elections. The current inquiry stems from concerns that Facebook and Instagram may exploit the vulnerabilities of younger users, fostering addictive behaviors and exposing them to harmful content. EU Commissioner for the Internal Market Thierry Breton emphasized that Meta's compliance with DSA mandates is under scrutiny. "We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram," says Breton.

The DSA, which became enforceable in February 2024, aims primarily to ensure that large online platforms adhere to robust safety standards, mitigating risks associated with disinformation, online scams, and child exploitation. According to the EU website, the act's introduction follows growing apprehension about the influence of powerful digital companies, often described as "too big to care" about societal well-being and public safety.

According to the EU announcement, the investigation will focus in particular on the so-called "rabbit hole" effects induced by algorithmic content recommendations. These effects can immerse users, especially minors, in a continuous stream of negative content, such as unrealistic body images, which can exacerbate mental health issues. The Commission will also closely examine the efficacy of Meta's age-verification mechanisms, given how easily such controls can be circumvented.

Meta has expressed its commitment to creating safe, age-appropriate online experiences. A spokesperson highlighted the company's decade-long efforts in developing over 50 tools and policies aimed at protecting young users. "We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission," the spokesperson said.

Against this backdrop, it should be noted that the commission is also exploring the integration of the European digital identity wallet for enhanced age verification. This tool, currently in its testing phase, aims to streamline identity verification across the EU, facilitating safer online interactions. Businesses are advised to monitor these developments closely as they may necessitate adjustments in compliance strategies and user verification processes.

If the European Commission finds Meta's measures insufficient, it can impose fines of up to 6% of the company's global turnover (around US$2.4 billion). Immediate actions could also include on-site inspections and interviews with executives; no public deadline has been set for completing the investigation.

