The European Commission has issued a preliminary finding that Meta Platforms may be violating the Digital Services Act (DSA) by failing to put sufficient safeguards in place to prevent users under 13 from accessing Facebook and Instagram. The Commission’s investigation focuses on whether Meta has effectively identified, assessed, and mitigated risks to minors, particularly regarding minimum age enforcement.
According to the Commission, Meta’s systems allow users to enter false birthdates during signup without meaningful verification, and post-registration detection and removal of underage accounts appear limited. Tools for reporting concerns about underage users are said to be hard to find, and such reports are acted upon inconsistently. Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy at the European Commission, criticized Meta’s enforcement as doing “very little” to uphold its own minimum age rules.
If these preliminary findings are confirmed, the Commission could adopt a non-compliance decision and impose fines of up to 6% of Meta’s global turnover, along with periodic penalty payments to compel compliance. Meta has disputed the findings, stating that Instagram and Facebook are intended for users aged 13 and older and that it already employs measures to detect and remove underage accounts. Meta also highlighted the complexity of age verification as an industry-wide challenge and said additional measures are forthcoming.
Why it matters
This enforcement action marks a significant moment in the European Union’s efforts to protect children online through the Digital Services Act. It signals increased scrutiny of global platforms’ age verification systems and their actual effectiveness in safeguarding minors. The Commission’s stance reinforces that platform terms of service must translate into effective, practical measures rather than mere written policies.
The potential fine underscores the financial risks for large tech firms that fail to comply with EU digital safety standards, while raising the bar for how social media companies must handle child protection in Europe.
Background on enforcement and age verification
The Commission’s findings come amid broader EU digital safety initiatives focusing on minors. Earlier in 2025, the Commission preliminarily found that TikTok exposed teenagers to risks linked to addictive platform design. Alongside enforcement, the Commission recently recommended that EU member states adopt a voluntary age verification app that uses privacy-preserving cryptographic methods to confirm users meet age thresholds without exposing personal data.
This voluntary verification tool reflects ongoing efforts to develop robust, privacy-compliant age assurance mechanisms. Platforms not integrating this app will need to prove their alternatives are equally effective. However, some member states prefer national age verification models, indicating potential fragmentation in the enforcement landscape.
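The cryptographic details of the Commission’s blueprint are not spelled out here, but the basic idea behind privacy-preserving age assurance can be sketched: a trusted issuer checks a user’s birthdate once and hands the platform only a signed statement that an age threshold is met. The short Python sketch below is a simplified, hypothetical illustration using a shared-secret HMAC; the actual EU app is expected to rely on stronger privacy-preserving techniques, and the names and keys here are invented for the example.

    import hmac
    import hashlib
    import json
    from datetime import date

    ISSUER_KEY = b"demo-shared-secret"  # hypothetical trust anchor shared by issuer and platform

    def issue_age_attestation(birthdate: date, threshold: int = 13) -> dict:
        # Issuer side: sees the birthdate, but signs only the yes/no result.
        today = date.today()
        age = today.year - birthdate.year - ((today.month, today.day) < (birthdate.month, birthdate.day))
        claim = json.dumps({"over_threshold": age >= threshold, "threshold": threshold})
        tag = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
        return {"claim": claim, "tag": tag}  # the token carries no birthdate

    def platform_accepts(token: dict) -> bool:
        # Platform side: verifies the issuer's signature and reads only the boolean claim.
        expected = hmac.new(ISSUER_KEY, token["claim"].encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, token["tag"]):
            return False
        return json.loads(token["claim"])["over_threshold"]

    token = issue_age_attestation(date(2014, 6, 1))
    print(platform_accepts(token))  # False for an under-13 user; the birthdate never reaches the platform

In practice, the issuer role would be played by an approved verification service, and the stated goal of the EU approach is exactly this: confirming the age threshold without exposing personal data to the platform.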
Assessment of Meta’s risk reports and suggested improvements
The Commission criticized Meta’s internal risk reports as incomplete, insufficiently transparent, and lacking a serious evaluation of risks related to underage access. External studies estimate that roughly 10% to 12% of EU children under 13 use Facebook or Instagram, a figure Meta reportedly understates in its assessments.
The DSA requires very large online platforms, a designation that covers Facebook and Instagram, to carry out annual risk assessments and adopt mitigation strategies. Per the Commission’s 2025 Guidelines on the Protection of Minors, these measures should include making minors’ accounts private by default, modifying recommender systems to limit exposure to harmful content, and disabling features that promote excessive use.
The Commission expects Meta to adopt clear methodologies for risk evaluation, enhance age verification beyond self-declared birthdates, improve detection and removal of underage accounts, and strengthen reporting and moderation tools. Meta may use the EU’s age verification app or an equivalent system that meets the same standards.
Meta now has the opportunity to respond formally to the preliminary findings before the Commission reaches a final decision, an outcome that could shape future industry practices on child safety and risk assessment under the DSA.