The structure and recommendations of the document remain largely unchanged from the previous draft, but the final version clarifies what qualifies as a service ‘accessible to minors’ and softens some of the wording on platform defaults, allowing slightly more flexibility in implementation.
The scope has been clarified to explain what qualifies as a service ‘accessible to minors,’ based on indicators such as user demographics, marketing practices, and platform design. A simple disclaimer in the terms of use is no longer sufficient. The guidelines now take a broader perspective on children's rights, requiring platforms to consider not only protection but also participation, non-discrimination and access to information.
Risk assessments must classify platform features as low, medium or high risk and be repeated at least once a year or after significant changes are made. Although publication is not mandatory, providers are encouraged to share summaries, omitting sensitive operational details.
In terms of age verification, the final version introduces a more structured framework. Age verification is required for adult content and other high-risk environments; age estimation is recommended only where verification would be disproportionate; and self-declaration alone is explicitly recognised as insufficient. The guidelines do not prohibit national measures but set a high bar for their compliance with the EU benchmark. This is clearly aimed at limiting fragmentation, even though several Member States, notably France, Denmark and Greece, have pushed strongly for more prescriptive national rules.
The Commission has also presented a prototype age verification app as a reference standard. Platforms may use other methods, but only if they meet equivalent standards of accuracy, reliability and privacy.
The guidelines relax some expectations around default settings, taking a slightly more flexible approach than the previous draft, but they still expect geolocation, tracking and automatic playback to be disabled by default and recommend age-appropriate user controls.
In terms of governance, the guidelines maintain proportionality, allowing smaller platforms to apply lighter approaches, but emphasise the importance of oversight and, where appropriate, the participation of children.
The Commission will use these guidelines to assess compliance with Article 28 of the DSA. They will serve as a reference point for checking whether platforms meet the required standards and may underpin enforcement at national level. However, they remain voluntary, and following them does not automatically guarantee compliance.