Meta slapped with child safety probe under sweeping EU tech law
Facebook parent company Meta on Thursday was hit with a major investigation from the European Union into alleged breaches of the bloc's strict online content law over child safety risks.
The European Commission, the EU's executive body, said in a statement that it is investigating whether the social media giant's Facebook and Instagram platforms "may stimulate behavioural addictions in children, as well as create so-called 'rabbit-hole effects.'"
The commission added it is concerned about age verifications on Meta's platforms, as well as privacy risks linked to the company's recommendation algorithms.
"We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them," a Meta spokesperson told CNBC by email.
"This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission."
The commission said its decision to initiate an investigation comes off the back of a preliminary analysis of a risk assessment report provided by Meta in September 2023.
Thierry Breton, the EU's commissioner for internal market, said in a statement that the regulator is "not convinced [that Meta] has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms."
The EU said it will carry out an in-depth investigation into Meta's child protection measures "as a matter of priority." The bloc can continue to gather evidence via requests for information, interviews, or inspections.
The initiation of a DSA probe allows the EU to take further enforcement steps, including interim measures and noncompliance decisions.