Keeping our children safe with AI
Last year marked the 30th anniversary of the UN Convention on the Rights of the Child. Yet we are still failing to truly safeguard our children. The rise of technology has enabled more crimes to be committed both in the real world and online, with children falling victim to the worst predators.
According to the UN Office on Drugs and Crime, there has been a clear increase in the number of children being trafficked in recent years, with children now accounting for 30 per cent of all detected victims. UNICEF has indicated that, of the roughly 1.8 billion photos uploaded to the internet each day, around 720,000 are believed to be illegal images of children. The National Center for Missing and Exploited Children in the US has reported that the number of reports of URLs containing Child Sexual Abuse Material (CSAM) has risen dramatically, from 3,000 in 1998 to 18.4 million in 2020. The COVID-19 pandemic has also significantly heightened the threat and risk of sexual exploitation of children, as both children and sex offenders found themselves confined indoors and online for extended periods of time.
The potential of AI to support law enforcement and related authorities in preventing a wide range of forms of violence, exploitation and abuse is immense. Recently, for instance, facial recognition has been used to identify missing children, while deep learning has helped police to identify child abuse images on confiscated devices. Join us to explore how law enforcement and concerned authorities can use these and other applications of AI to safeguard our children, and help us to identify the red line between the need to ensure the safety of our children and the use of potentially invasive technologies by law enforcement. The webinar will also launch a new UNICRI project, supported by the Ministry of Interior of the United Arab Emirates, to further explore these issues.