
Child Development and Protection in the Digital Age

Updated: Apr 18, 2021



The proliferation of emerging technologies in recent decades means that children and adolescents are more exposed to digital technologies than ever before. This includes access to mobile devices, social media platforms, online chats, video gaming, robotic assistive technologies for children with autism, and the wholesale shift to online education brought on by the COVID-19 pandemic. Today’s children are growing up in an age of digital technology with minimal or no regulatory frameworks in place to ensure their safety, integrity, and holistic development. This article reviews how children and artificial intelligence are defined and explores how children can be protected in, and benefit from, their interaction with the digital landscape and artificial intelligence.


How do we define children?

Children and adolescents are a special group of people, defined as persons below the age of 18 by the United Nations Convention on the Rights of the Child (UN CRC), the most widely ratified international human rights treaty in the world. The UN CRC is premised on the recognition that childhood is a special and important phase of human development. All children are therefore accorded specific rights to ensure their growth and development free from violence, abuse, and neglect. Childhood encompasses infancy, early childhood, and adolescence, which means children’s risks and their readiness to benefit from digital platforms vary with age and developmental level: what is appropriate for a seven-year-old differs from what is appropriate for a fifteen-year-old teenager. Because childhood is characterized by rapid physical and cognitive growth, any abuse or neglect connected with exposure to digital technologies may have a lifelong negative impact on children’s wellbeing. At the same time, exclusion due to lack of access to digital technology is also a missed opportunity for their development. How we positively harness the role that digital technologies and artificial intelligence can play in the development of children and adolescents is therefore more important than ever.



How are children impacted by digital technology?

Access to social media, online chats, and games has provided opportunities for exploration and connection, but it also carries risks such as cyberbullying, online grooming, and online sexual abuse. The increasing number of hours that children spend looking at a screen raises concern among child development specialists, and psychologists warn against using screen-time as a reward, i.e. allowing children screen-time only once they complete a chore or activity.


Artificial intelligence systems powered by machine learning, such as Alexa, Siri, or facial recognition systems, raise issues of privacy and identity protection for children, who may not be fully aware of, or able to understand, the dangers and repercussions of their interactions. The use of socially assistive robotics for children with special needs can provide access to supplemental therapy, since therapists are not available around the clock. A month-long study showed that children with autism spectrum disorder and their families found the experience of having robots living in their home to be positive and useful. While research on the benefits of socially assistive robotics for child development remains limited, it nevertheless presents a unique opportunity for supporting children with special needs and their families. The question is how to ensure equitable access to such technology, protect children from possible misuse such as breaches of privacy, and strike a balance between using artificial intelligence and preserving the human component necessary for the development and regulation of emotions. The risks and harms of digital tools thus present a dilemma of how to strengthen child protection.



What are the opportunities for promoting child learning and preventing digital harm?

The digital age presents ample opportunities for children and adolescents. Artificial intelligence can contribute to their learning through play. Games such as emoji scavenger hunt, shadow art, quick draw, and chatbots can be fun ways to incorporate artificial intelligence into children’s learning activities, with adequate parental supervision to ensure their benefit and developmental appropriateness for the child’s age. This requires parental digital literacy and time, which may not always be available as parents juggle the challenges of working from home. Appropriate exposure and training, such as learning to code, can also enable children to participate in and eventually lead the creation of future artificial intelligence products.
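To give a concrete sense of what beginner-level coding might look like, the snippet below sketches a tiny rule-based chatbot of the kind a child could build in an introductory lesson with an adult; the keywords and facts are invented for this example and are not drawn from any particular curriculum or product.

```python
# A tiny rule-based "chatbot" a child might build in a first coding lesson:
# it looks for a keyword in the message and replies with a matching fact.
FACTS = {
    "octopus": "An octopus has three hearts and blue blood!",
    "cheetah": "A cheetah can sprint faster than cars drive on a city street.",
    "moon": "The Moon is slowly drifting away from the Earth.",
}

def reply(message: str) -> str:
    """Return a fact if the message mentions a known keyword."""
    for keyword, fact in FACTS.items():
        if keyword in message.lower():
            return fact
    return "I don't know that yet. Try asking about an octopus, a cheetah, or the Moon!"

if __name__ == "__main__":
    print("Ask me about an animal or the Moon (type 'bye' to stop).")
    while True:
        message = input("> ")
        if message.strip().lower() == "bye":
            print("Bye! Keep exploring.")
            break
        print(reply(message))
```

Simple exercises like this matter less for the code itself than for helping children see that the “intelligence” in a chatbot is something people design, which supports a healthier understanding of the AI systems they interact with every day.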


There are other opportunities to harness the positive impact of technologies. Specifically, artificial intelligence can be used to fight online child sexual abuse by detecting and reporting it. According to the Internet Watch Foundation, the widespread lockdowns brought an increase in individuals searching for child sexual abuse material online, as well as a significant increase in public reports of such material in 2020 compared to 2019. It is important to understand that on online platforms, images of child sexual abuse can recirculate and remain accessible for a very long time, perpetually re-traumatizing the child victims. To fight online child sexual abuse, software and tools such as NetClean ProActive and Safer by Thorn can be deployed to automatically detect, report, and remove such images and videos. A recently released guide for tech companies provides practical suggestions for taking action to protect children, based on voluntary principles to counter online child sexual exploitation and abuse.
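Tools of this kind commonly rely on some form of perceptual hashing: each known image of abuse material is reduced to a compact fingerprint, and new uploads are compared against a database of those fingerprints so that matches can be flagged, reported, and removed. The sketch below illustrates only the general idea, using a simple average-hash computed with Pillow; the hash database and matching threshold are hypothetical, and this is not a description of how NetClean ProActive or Safer work internally.

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit perceptual fingerprint.

    The image is shrunk to an 8x8 greyscale grid; each bit records whether
    a pixel is brighter than the average. Visually similar images produce
    similar bit patterns even after resizing or recompression.
    """
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)

def hamming_distance(h1: int, h2: int) -> int:
    """Count how many bits differ between two fingerprints."""
    return bin(h1 ^ h2).count("1")

# Hypothetical set of fingerprints of known abuse material; in practice such
# databases are maintained by hotlines, NGOs, and law enforcement.
KNOWN_HASHES = {0x8F3A61B0C4D2E970}  # placeholder value only

def should_flag(path: str, threshold: int = 5) -> bool:
    """Flag an upload whose fingerprint is close to a known fingerprint."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_HASHES)
```

Real deployments match uploads against millions of verified hashes maintained by child-protection organisations and increasingly add machine-learning classifiers on top to detect previously unseen material.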


How can we safeguard children through legislative provisions and regulations?

There have been some advances in the development of regulatory frameworks, primarily in the developed world or the Global North. In the UK, the Age Appropriate Design Code, also called the Children’s Code, came into force on 2 September 2020. It is a code of practice for data protection that requires companies developing apps, online games, and web and social media sites likely to be accessed by children to conform to basic standards of protection by design and by default. It is premised on the recognition that, as per the UN CRC, children should be given special treatment and protection with regard to data privacy. The standards of age appropriate design include the best interests of the child, data protection impact assessments, age appropriate application and default settings, restraint in the use of nudge techniques and in potentially detrimental uses of data, as well as parental controls.


More globally, on 23 June 2020 the International Telecommunication Union (ITU) launched updated Guidelines on Child Online Protection (COP) on developing a safe and empowering online environment for children and adolescents. The guidelines contain recommendations targeted specifically at children, parents and caregivers, industry, and policymakers, and can be used by national governments to create safe digital spaces for children to play, learn, and grow. COVID-19 has amplified the need for such guidelines to inform national policies and local actions that ensure child safeguarding. With greater recognition of public-private partnerships, the role of industry in designing products and tools with children’s needs and rights in mind is becoming crucial, given that one in three internet users globally is a child or young person, according to a report by UNICEF and ITU. The exclusion of two-thirds of the world’s children and young people, as well as the stark gap in internet access between high-income and low-income countries (87% compared to 6% coverage, respectively), constitutes a digital divide that may exacerbate existing inequalities and inequities. UNICEF is currently developing policy guidance on Artificial Intelligence for Children to help governments and businesses protect children and provide equitable opportunities for youth to shape artificial intelligence.


How can parents and communities be empowered to protect children from online harm?

The potential risks and harm that children face can be mitigated by raising awareness and educating children and families, empowering them with practical tools to understand and mediate the concerns that exist in the online environment. This means that parents and caregivers, particularly in low- and middle-income countries, will require additional support for digital literacy and online child protection. Given that secondary education has now gone online due to the pandemic, schools and education systems also need to be engaged to install and maintain safeguarding software and programs, and to equip teachers to ensure greater safety.


You can join a movement to protect children from harm online and sign the Child Online Safety Universal Declaration created by the Broadband Commission Working Group on Child Online Safety. We all have a role to play, from developing policies and products that promote the protection of children to engaging various institutions in strengthening child online safety, including by applying artificial intelligence and data analytics tools to prevent online networks and services from being used to collect and distribute child sexual abuse material. Taking concerted action on child rights protection also involves partnering with researchers to generate evidence, so that we better understand children’s digital experiences and enhance their protection in today’s fast-paced digital landscape. Our commitment to children will be reflected in our actions and in the tangible results we achieve towards a safer environment for children to play, learn, and grow. Because EVERY child is worth it.


By Venera Urbaeva

Venera Urbaeva is a child protection and public health professional with over 14 years of experience in the humanitarian and development field.


