NEWS CENTER - Pointing out that "online abuse" against children has expanded into new areas with the development of technology, James Stevenson said, "AI-generated child sexual abuse material (AI-Gen CSAM) is a major concern. Fake images and videos can be created without a child’s awareness or involvement and then used to threaten and extort the child through claims the material is real."
As artificial intelligence (AI) technology advances and spreads into ever more areas of life, the risks it poses have become a source of widespread concern. Its growing potential to be used for harmful purposes has led experts to call for stricter regulation. Children sit at the top of the risk group. AI technologies that can converse like humans can significantly affect children's behaviour and decision-making, and children under the age of 10 in particular tend to perceive AI applications as real people, leaving them vulnerable to manipulation and a wide range of dangers.
The University of Edinburgh's Childlight Global Child Safety Institute drew attention to this situation in "Into The Light", its report on online child abuse published in 2023, which found that at least 300 million children across 157 countries had been exposed to "online sexual abuse". According to the report, which draws on 36 million data points, surveys and 125 academic studies from official institutions in 157 countries, one in every eight children has been exposed to "online sexual abuse".
The institute is preparing to release its 2024 report in the autumn, and its data analyst James Stevenson spoke to Mezopotamya Agency (MA) about the dangers awaiting children.
DANGER FOR THE WORLD
In the latest report, rates of "online child abuse" were recorded at 11 per cent in Western European countries, 11 per cent in the USA, 7 per cent in the UK and 7.5 per cent in Australia. Pointing out that this is a worldwide problem, Stevenson said: "One reason for the higher numbers is simply that more people – including both children and offenders – have access to the internet. These countries also tend to host many of the companies that power the internet, so when abusive material is shared online, it is often stored on servers based in those countries, even if the abuse itself happened elsewhere."
THE ROLE OF WARS AND CRISES
Stating that much of the world has recently been in a state of conflict, Stevenson emphasised that this has driven an increase in "online child abuse". Noting that wars leave children increasingly exposed to exploitation through technology, he said: "Recent research from Childlight has revealed a disturbing rise in child sexual abuse material (CSAM) hosted in countries affected by conflict and instability. Ukraine and the Holy Land are among the most concerning examples. As wars and conflicts force communities into chaos, predators are taking advantage of the disruption, targeting children who are spending more time online for education, socialising and play."
DATA BY COUNTRY
Referring to "online child abuse" data from several countries before and after the outbreak of war, Stevenson continued:
“Childlight highlights that cases of CSAM uploaded in Ukraine, as measured by the National Center for Missing & Exploited Children, doubled to 105,117 in 2023 from 52,328 the year before. In Israel over the same period, the figure more than trebled, from 22,250 to 71,189 cases, while in the Palestinian territories it increased by 148%, from 28,452 to 70,472. In contrast, reports of CSAM hosting across the world as a whole rose by 13%, from 32,059,029 to 36,210,368. Statistics show a 195% rise in reported cases between 2020 and 2023 for Venezuela, which has faced ongoing turmoil in recent years. Similar findings are identified in cases of famine and natural disasters, with reports of CSAM uploads related to Afghanistan up by 81,789 cases (a rise of 170%) between 2020 and 2023, and a further rise of 1,009 cases (157%) recorded elsewhere over the same period. Conflict and crises create the perfect conditions for technology-facilitated child sexual exploitation and abuse (TF-CSEA) to thrive in the shadows, yet this remains an overlooked emergency. If we fail to act, more children will be coerced, extorted and exposed to lasting harm. Governments, humanitarian agencies and tech companies must urgently work together to close the gaps that allow predators to exploit the most vulnerable in society.”
“Data on child sexual abuse is often scattered across different agencies like police, health and social services, making it challenging to pull together reliable and accurate numbers. For TF-CSEA, we have found that using ‘big data’ from global organisations helps us get a more consistent picture of the harm children are facing through technology. Having good data is essential if we want to prevent abuse from happening in the first place, and not just respond to it once the harm is already done. This public health approach is crucial in order to protect children, both online and offline, from sexual abuse.”
A HIDDEN PANDEMIC
Drawing attention to the increasing prevalence of "online abuse", Stevenson said: "Technology-facilitated child sexual exploitation and abuse (TF-CSEA) is a hidden pandemic. We need an urgent global response, with every country, community and organisation playing its part. In the same way that it is better to vaccinate against a disease than to treat the symptoms, we must prioritise prevention – stopping child sexual abuse before it happens and avoiding the lifelong consequences for children, families and society. A preventative, public health approach has been successfully applied to everything from the eradication of smallpox to reducing violence. It is time we applied the same evidence-based approach to ending child sexual abuse, both online and offline. If we treat it as a public health emergency, we can create a safer world for children everywhere. But we must act now, because children can’t wait.”
CHILD PROTECTION SHOULD BE THE FOUNDATION
Pointing out that technology companies also bear responsibility for preventing "online abuse", Stevenson said: "Technology companies must adopt a safety-by-design approach – building their platforms and products with child protection at their heart, not as an afterthought. That means assessing the risks of features like end-to-end encryption or livestreaming before they’re launched and putting safeguards in place to prevent abuse. Their responsibilities don’t stop there. Technology companies must actively prevent illegal content, like child sexual abuse material (CSAM), from appearing on their platforms in the first place, not just remove it once it’s reported and the harm is done."
NEW THREAT: ARTIFICIAL INTELLIGENCE
Pointing out that "online abuse" against children has expanded into new areas with the development of technology, Stevenson continued: "AI-generated child sexual abuse material (AI-Gen CSAM) is a major concern. Fake images and videos can be created without a child’s awareness or involvement and then used to threaten and extort the child through claims the material is real. The sheer volume of AI-Gen CSAM is adding substantially to the already heavy workload of law enforcement and hotlines. In terms of platforms, any app or game that allows users to connect with one another – even those designed for children – can be misused. Offenders are actively targeting platforms with large child audiences, including games that many parents may assume are safe. That’s why it’s crucial for tech companies to build in strong safety measures from the start, and for parents to stay informed about what their children are using and who they’re connecting with.”
MA / Berivan Kutlu