- The EU is investigating Meta over concerns its platforms are fueling addiction among minors.
- Regulators said they would also look into the company’s age-verification tools.
- It comes as Meta faces growing legal pressure in the US over the impact of its platforms on kids.
Meta is facing more questions over whether it’s doing enough to protect kids on Facebook and Instagram.
The EU said on Thursday it had opened an investigation into the social media giant over concerns that Meta’s platforms are creating “rabbit-hole” effects and fueling harmful addiction among children.
The European Commission, the bloc’s regulatory body, said it would also investigate whether Meta’s age-verification tools are stopping minors from accessing inappropriate content.
“We have concerns that Facebook and Instagram may stimulate behavioural addiction and that the methods of age verification that Meta has put in place on their services is not adequate,” said competition commissioner Margrethe Vestager.
Meta could be fined up to 6% of its global annual turnover if found to have violated the EU’s Digital Services Act, which regulates the protection of minors online.
The investigation comes as the company is confronted with growing legal pressure in the US over the potential impact of its platforms on young people.
A lawsuit filed in California by 33 states last October claimed that the company deliberately designed Facebook and Instagram to be addictive to teenagers.
Unredacted documents from that lawsuit later claimed Meta knew it had millions of underage users on its platforms, and that the company “routinely continued to collect” children’s personal information.
The lawsuit also accused Meta CEO Mark Zuckerberg of ignoring calls from Meta executives to tackle child safety concerns on Instagram.
At the time, Meta told Business Insider that the complaint “mischaracterizes our work using selective quotes and cherry-picked documents.”
The company also faces a second European investigation into its handling of Russian disinformation.
“We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them,” said a Meta spokesperson.
“This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”