AFP launches call to action to help combat child sexual abuse

The AFP and Monash University are today calling on Australians to provide photographs of themselves to support a much-needed initiative to help protect children from sexual abuse and bring perpetrators to justice.

My Pictures Matter is a world-first crowdsourcing project that will support the development of ethical artificial intelligence (AI) to detect child sexual abuse material in videos or photos shared on the dark web or seized during criminal investigations.

The AI tool will be a significant breakthrough for investigators who often have to manually look through tens of thousands of files to find evidence of suspected child abuse material.

This means the AI tool will be able to quickly detect child abuse material on websites or offenders’ electronic devices and triage it for investigators.

The project is an initiative of the AI for Law Enforcement and Community Safety (AiLECS) Lab, a collaboration between the AFP and Monash University. To work at scale, it requires childhood photographs from everyday Australians.

AFP Deputy Commissioner Lesa Gale said it was the first time the AFP had issued a nationwide call to action to help combat online child exploitation and abuse.

“By having access to ordinary, everyday photographs, the AI tool will be trained to look for what is different and identify unsafe situations, flagging potential child sexual abuse material,” Deputy Commissioner Gale said.

“But for it to work in the most effective way, we need about 100,000 pictures of Australians aged 0 to 17, and of all ethnicities.

“This initiative was officially launched in June 2022, but for this foundational phase to succeed, we need 10,000 pictures. Currently, we have fewer than 1,000 images, but we hope to be able to use the AI tool within 12 months.

“The AFP and Monash University are asking adults to provide pictures of themselves in their youth – not images of their children – because consent is important. This enables development of technology that is both ethically accountable and transparent.

“We also do not want to source images from the internet because children in those pictures have not consented for their photographs to be uploaded or used for research.

“This project is so important because it may not only save a child from being abused a day longer, but it also will help our members, who day in, day out, are required to view material depicting children being sexually abused as part of their investigations.”

The AI could be used in a number of ways. If a device seized under a warrant is suspected of containing child abuse material, its files can be processed by the AI tool to rapidly assess and flag potential child exploitation material for investigator review.

Another example is deploying the tool to target websites suspected of containing child exploitation. It will not be used to scour the internet but will instead target devices or websites that are already suspected of containing child abuse material and subject to an active police investigation.

The algorithms being developed will enable the AI to detect and triage child sexual abuse material, allowing police to intervene faster and remove children from harm.

My Pictures Matter Project Head and AiLECS Lab Stream Lead for Ethics, Transparency and Community Voice, Dr Nina Lewis, said the project’s dataset was de-identified and securely held by Monash University.

“Consent isn’t just saying ‘yes’, it means understanding what you’re agreeing to,” Dr Lewis said.

“We can reassure people that this dataset is wholly owned and managed by Monash University, with use by our AFP colleagues subject to the same transparency and accountability measures that apply to all researchers on the team working to combat child abuse.

“People are also free to withdraw their childhood photos from the dataset if they change their mind.”

Deputy Commissioner Gale said she was deeply concerned about the amount of child exploitation circulating online, and would release her own childhood photo to help encourage the community to have confidence in the My Pictures Matter initiative.

“The creation of child abuse material is a horrendous crime. The victims in these images are children and they are being used as a commodity for the sexual gratification of others, including those who try to make money from the abuse,” Deputy Commissioner Gale said.

“The children in this material are not actors, they are real children being abused.

“As someone who is dedicated to protecting our most vulnerable, I will do my small part to encourage the crowdsourcing of images, and I hope other Australians will do so as well.

“I will be using the hashtag #MyPicturesMatter to encourage community support. For those who want to participate or want more information about the project, please visit https://mypicturesmatter.org/.”

The campaign comes as the AFP supports National Child Protection Week, which this year runs from September 3-9.

Photographs used in this call to action have been supplied by people who support their use in the media. Photographs submitted to Monash University are used solely for the My Pictures Matter initiative.

In the past financial year, the AFP-led Australian Centre to Counter Child Exploitation received more than 40,000 reports of online child exploitation. Each report can contain large volumes of images and videos of children being sexually assaulted or exploited.

The AFP works with partners across Australia and the globe to combat online child exploitation.
