Brussels, Belgium — The European Union has opened fresh investigations into Meta Platforms, the parent company of Facebook and Instagram, over allegations that it has failed to protect children online in accordance with the bloc's stringent digital regulations. Announced on Thursday, the probes could result in fines of up to 6 percent of Meta's annual global revenue if violations are confirmed.
The EU’s concern centers on the algorithmic systems employed by Facebook and Instagram to recommend content, which may inadvertently exploit the vulnerabilities and inexperience of younger users. The European Commission suspects that these algorithms could promote addictive behavior among children and expose them to increasingly disturbing material, a phenomenon often referred to as the “rabbit hole” effect.
“We are not convinced that Meta has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms,” stated European Commissioner Thierry Breton in a social media post, referencing the EU’s landmark Digital Services Act (DSA), which requires tech companies to tackle illegal and harmful content more robustly.
The Commission also questions the adequacy of Meta’s age-assurance and verification methods, expressing concerns that children might access inappropriate content due to insufficient safeguards.
In response, a Meta spokesperson emphasized the company’s commitment to child safety online. “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them,” the spokesperson said. “This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”
This is not the first time Meta’s platforms have come under EU scrutiny. Facebook and Instagram were previously investigated over allegations of inadequate measures to counter foreign disinformation campaigns, particularly in the lead-up to the EU elections.
The outcome of these investigations could have significant implications for tech companies operating in the EU and might set precedents for regulatory approaches to online child safety worldwide. For businesses and investors in Asia, where social media usage is extensive, these developments signal a growing global emphasis on digital accountability and could influence future regulatory practices in the region.
Reference: “Facebook and Instagram face EU investigation over child safety risks,” cgtn.com