PPPA-AGEVER-01-2020: “Outline and trial an infrastructure dedicated to the implementation of child rights and protection mechanisms in the online domain”

Age Verification and Child Protection: An Overview of the Legal Landscape

Age verification has long been one of the methods used to protect children from possible harms in the offline world; ID checks at shop checkouts are a familiar example. However, its effectiveness in the digital world has been debated for quite some time, and the question of how, and by which technical means, age verification should be conducted has not yet been resolved. As technology advances, there is now hope that product designers will be able to develop a privacy-preserving, flexible and tailor-made technical solution that could be adopted in the EU to protect children online, which is fundamentally the essence of the euCONSENT project.

The logical first step for the euCONSENT project was to undertake a legal mapping exercise to obtain an overview of the existing age verification requirements in the EU and the UK. Grouping the laws under three broad categories, our study conducted a rapid review of the age verification requirements currently in place in the EU and the UK within the areas of: (i) online content, as envisaged in the AVMSD for the protection of minors; (ii) online services, focusing on online gambling; and (iii) the online sale of age-restricted goods, with a focus on tobacco and alcohol. The key findings from the study are now available on the project website.

There are various levels of age verification requirements mandated by different laws in the EU in the online context. In terms of content and information society services, the two primary instruments that focus on children’s protection online are the Audiovisual Media Services Directive (‘AVMSD’) and the General Data Protection Regulation (‘GDPR’). They set the legal framework, establish principles, and regulate the measures operators must adopt to ensure that children are protected and able to exercise their rights in the digital world. The AVMSD requires operators to take necessary measures to protect children from harmful content that ‘may impair the physical, mental and moral development’ of minors. The most harmful content, such as pornography and gratuitous violence, is subject to the strictest measures. The AVMSD has recently been amended yet again in response to changes in how viewers, especially children, consume audiovisual content, and the revisions reflect the changing nature of technology and newer delivery channels for online content. The material scope of the Directive was therefore extended to cover video-sharing platforms alongside linear and non-linear services, the latter two having already been regulated under the previous version of the AVMSD. The revised Directive aligned these two services and amended the two-tier approach under which non-linear services were subject to relatively lighter rules. A significant change is the addition of video-sharing platforms to the remit of the AVMSD: although they do not usually have editorial responsibility for content published on their platforms, the new regime now requires them to take appropriate measures to protect individuals, and in particular children, from harmful content.

Whilst the AVMSD’s material scope has been extended to include video-sharing platforms, it does not explicitly intend to regulate all social media platforms. However, a platform would fall within the scope of a video-sharing platform service and be obliged to comply with the AVMSD ‘where the principal purpose of the service or of a dissociable section thereof or an essential functionality of the service is devoted to providing programmes, user-generated videos, or both’. Although the European Commission has published guidance on how the video-sharing platform definition and the essential functionality test should be applied in practice, the determination of what constitutes a ‘video-sharing platform’ is subject to the interpretation of the relevant authorities at national level. In practice, this could mean differences of interpretation between Member States. For instance, during our informal discussions with a number of national media regulatory authorities, it was apparent that there could be divergences in the way video-sharing platforms are defined by different Member States. Given the short time since the new AVMSD’s adoption, few decisions or guidance documents have been issued by regulators to guide stakeholders on the definition and scope of video-sharing platforms. As more guidance is issued under the revised AVMSD, this matter is expected to be clarified.

Our research revealed that although the deadline to transpose the AVMSD has passed, only 17 countries had transposed the revised AVMSD into their national legal systems as of June 2021. Those 17 countries regulate the protection of minors across linear services, non-linear services, and video-sharing platforms. The transposition process is still ongoing in the remaining countries.

In terms of how age verification currently applies in practice, whilst there are exceptions, many countries require stricter measures to be implemented for content that is categorised as 18 or above, including pornography and gratuitous violence. However, in most countries, content that falls within other categories usually does not require stricter measures such as age verification systems. Instead, Member States appear to leave discretion to parents and children by requiring providers to adopt other measures to protect and inform viewers, for example labelling and age rating.

Our research also found that there is no consensus on the definition and scope of ‘harmful material’ among Member States. This is unsurprising, as each Member State could, and does, interpret differently what constitutes harmful content that ‘may impair the physical, mental and moral development of minors’. Except for certain content that is universally deemed harmful, the definition and scope of harmful material naturally depend on the prevailing standards, culture, and other norms of the relevant society. Therefore, content that is classified as harmful in one Member State, triggering strict measures such as age verification, could fall under categories that do not constitute harmful material in another. This requires stakeholders to develop tailor-made and flexible solutions that can be adapted to the different requirements of each Member State when implementing any age verification system.

In terms of gambling services, the study found that all EU Member States and the UK have set age limits to protect minors and vulnerable persons from the harmful effects of gambling. However, not all countries make a distinction between offline and online gambling. Some Member States recommend specific methods for verifying age and identity, yet there is no standard age verification system agreed for use in practice. Our findings show that 24 jurisdictions have adopted 18 as the minimum age for all types of online gambling. In other jurisdictions, including Belgium, Estonia and the UK, age limits range from 16 to 21 depending on the type of gambling activity. Greece has adopted 21 as the standard age limit for all types of gambling.
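To illustrate the kind of flexible, jurisdiction-aware design this variation calls for, the sketch below encodes the gambling age limits above as a simple rule table with per-country overrides and a common default. This is a minimal illustrative sketch, not part of the study: the table structure and function names are our own, and the activity-specific figures shown for Belgium, Estonia and the UK are hypothetical placeholders, since the study reports only that their limits range from 16 to 21 by activity type.

```python
# Illustrative sketch of a jurisdiction-aware minimum-age lookup for
# online gambling. Only the 18-year default (24 jurisdictions) and the
# Greek limit of 21 come from the study; the activity-level values for
# BE, EE and UK are HYPOTHETICAL placeholders, not verified legal data.

DEFAULT_MIN_AGE = 18  # minimum age adopted by 24 jurisdictions

JURISDICTION_RULES: dict[str, dict[str, int]] = {
    "GR": {"*": 21},                     # Greece: 21 for all gambling
    "BE": {"lottery": 18, "casino": 21}, # placeholder values
    "EE": {"lottery": 16, "casino": 21}, # placeholder values
    "UK": {"lottery": 16, "casino": 18}, # placeholder values
}

def minimum_age(country: str, activity: str) -> int:
    """Return the minimum age for a gambling activity in a jurisdiction,
    falling back to a wildcard rule, then to the common default of 18."""
    rules = JURISDICTION_RULES.get(country, {})
    return rules.get(activity, rules.get("*", DEFAULT_MIN_AGE))

# An age verification service could gate access using such a lookup:
assert minimum_age("GR", "casino") == 21
assert minimum_age("FR", "casino") == 18  # falls back to the default
```

The point of the sketch is the shape of the problem rather than the numbers: a single pan-European age verification system has to resolve the applicable age limit per jurisdiction and per activity before it can decide what to verify.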

In relation to the online sale of the age-restricted goods tobacco and alcohol, all Member States and the UK have imposed age limits for the sale of these products. The age limit for the sale of alcohol ranges from 16 to 20 years across the EU and the UK, with most countries adopting 18. Some countries, including Austria, Belgium, Finland, Germany, and Sweden, have imposed different age limits depending on the beverage’s alcohol content. Poland has banned the online sale of alcohol altogether. Sweden and Finland (for drinks with high alcohol content) and Lithuania (for all alcoholic drinks) have the highest age restriction, prohibiting sale to individuals below the age of 20. Germany and Belgium (for drinks with low alcohol content) and Luxembourg (for all alcoholic drinks) have the lowest, prohibiting sale to individuals below the age of 16.

As to the sale of tobacco, Member States have adopted 18 as the age limit for the purchase of tobacco and tobacco-like products. The online sale of tobacco is prohibited in several Member States, including Belgium, Bulgaria, France, Lithuania, Luxembourg and Poland. Countries such as the UK and Malta require age verification systems to be implemented for the online sale of such products. In contrast, some others have no explicit restrictions or regulations in place for the online sale of such products.

It was clear from the study that a range of national and EU laws require age assurance and/or age verification measures to be developed and adopted to ensure their effective implementation. Most countries have not established specific criteria or further guidance on how age verification should be implemented in practice. Consequently, there is no common set of rules on age verification among Member States, which we found to be a gap that needs addressing.

It is important to bear in mind that any age verification system developed to complement these laws must also respect the full range of children’s rights as enshrined in the UN Convention on the Rights of the Child. To this end, it is essential to acknowledge that the primary goal of an age verification system is not to age-gate children out of digital platforms. At the same time, as children are more vulnerable than adults, they require special protection in both the online and offline worlds. The main objective should therefore be to create a safe environment in which children can flourish and take advantage of the full benefits and opportunities the internet presents. It is also absolutely essential that age verification systems are privacy-preserving with respect to all users, and that they treat children’s rights as an integral part of their design.
