The European Union has issued a warning that Meta may be failing to prevent children under the age of 13 from accessing its platforms, including Facebook and Instagram. The preliminary finding raises concerns about children's exposure to inappropriate content and exposes the company to potentially significant financial penalties.
EU Investigation Findings
Following an investigation under the Digital Services Act, EU regulators determined that Meta's safeguards were ineffective. They noted that children could easily bypass age restrictions by entering false birth dates. Additionally, reporting tools for underage users were found to be difficult to access and use.
Henna Virkkunen, the European Commission's executive vice-president for tech sovereignty, security and democracy, stated, "Terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users, including children."
Meta's Policies and Enforcement
Meta's own policies require users to be at least 13 years old. However, EU officials said enforcement measures were insufficient and failed to adequately assess the risks posed to younger users. If the findings are confirmed, the EU could impose a fine of up to six percent of Meta's global annual turnover, though the company could avoid penalties by implementing corrective measures.
Meta's Response
Meta has rejected the claims, stating it already has systems in place to detect and remove underage users. The company said it will continue to engage with EU authorities on the matter.
Broader EU Efforts
The investigation, launched in May 2024, is part of broader EU efforts to regulate Big Tech and improve online safety for children. Regulators are also examining issues such as platform design, including features they describe as potentially "addictive," and their impact on users' wellbeing.
The EU is considering further measures, including a bloc-wide age limit on social media use, as pressure mounts following similar restrictions introduced in other countries.