How can children’s experiences in a digital world be made age-appropriate, or at the very least not age-inappropriate or harmful? Efforts to achieve this on the part of policymakers, businesses and parents/caregivers are primarily driven by a concern to protect children from harm facilitated by digital technologies. Child protection measures, in turn, are typically designed as a matter of responsible (or reputational) business practice, to address public and parental concerns about children’s access to and safety in the digital environment, and/or in response to regulatory requirements (whether legislation, co-regulation or self-regulation).
But what do we know about children’s own concerns and experiences: how they access potentially harmful content, negotiate parental control tools, respond to parental mediation, or find workarounds for filters and age restrictions? It turns out, not that much. Today we are launching a new report reviewing the available evidence on two key measures to protect children online – age assurance and parental control tools – viewed from the perspective of children’s everyday lives.
We found that the existing evidence is limited, particularly in relation to age verification and how it affects children in diverse circumstances. Below we summarise key findings.
The use of age assurance in digital contexts
- The evidence shows that mechanisms for age assurance are rarely implemented for the sale of age-restricted goods.
- Existing barriers to accessing age-restricted content, goods and services are mostly ineffective.
- Children can use workaround strategies to circumvent age assurance systems.
- Depending on their technical features, some measures might pose risks to children’s safety or privacy or can be more invasive than the situation requires.
- Accessing restricted goods became easier during COVID-19 lockdowns, as many purchases were made online and delivered in a socially distanced manner.
- Parents want flexibility in deciding when a service is appropriate for their children’s circumstances and needs.
- Age ratings are seen as informative by parents but not necessarily relevant to their specific circumstances.
The use of parental control tools for online safety
- Children value having control over their internet use and some tools help them achieve this.
- Parental control tools can respond to the anxieties of digital parenting, as some parents worry about how to protect their children effectively online.
- Parental control measures are insufficient on their own; they work best in a context of supportive and enabling parenting.
- Children find unjustified restrictions frustrating.
- Fostering communication and trust between parents and children is crucial for enabling the positive use of parental control measures.
- Measures need to account for the child’s evolving capacities in a way that enables learning and development.
- Family diversity matters to how age assurance is used. The nature of, and need for, technical mediation differs depending on family circumstances, parenting practices and cultural norms.
- Measures often demand technical skills that some parents lack.
- Restricting children’s internet use reduces their opportunities, digital skills and learning, but it does not necessarily limit the online risk of harm.
- There is a risk of exacerbating existing vulnerabilities or compounding disadvantage, especially in relation to vulnerable children and families.
Implications for child rights
A child rights approach to the digital environment will enable children to enjoy their civil rights and freedoms while also protecting them from harm. Future development of age verification and parental control tools needs to address the following concerns:
- Children’s right to protection is currently prioritised over their other rights, such as participation, learning or privacy.
- Measures do little to enable children’s right to be heard; they tend to be developed from the viewpoint of industry’s or parents’ interests and rarely consider children’s needs or voices.
- Children’s increasing capacity to make their own choices remains largely unsupported. We found little evidence that the available measures possess the granularity that can support these changing needs.
- Many age assurance measures do not respect children’s rights to privacy or autonomy and can sometimes enable an inappropriate or undesirable degree of parental surveillance.
- Such measures may discriminate against some children. Designs that assume the presence of an engaged adult disadvantage children whose circumstances are different, and it remains unclear how other vulnerable groups, such as disabled children, are affected.
- There appears to have been little attention to how child protection measures can have a positive effect on child rights, for example by creating a richer and more diverse digital ecology that caters for children’s rights and interests.
Child protection measures are designed to solve a wide range of intersecting problems relating to the content, contact, conduct and contract risks children may encounter online. They are subject to continual innovation as societal expectations, regulatory frameworks and the digital environment co-evolve. They offer considerable promise for child safety and protection online, provided they overcome the existing challenges.
For design considerations and more findings, read the report.
Image credit: Photo by Priscilla Du Preez on Unsplash