An expert on internet crimes against children warned lawmakers and others about the dark side of artificial intelligence, noting child predators are using the technology for “nefarious” purposes.
The 2024 Legislative Council Study Committee on the Regulation of Artificial Intelligence in Wisconsin met yesterday in Wausau, where members heard from several speakers on AI applications in law enforcement. The committee is tasked with making recommendations for state legislation focused on the use and development of AI.
Jacob Jansky is the director of the Internet Crimes Against Children Task Force and the Human Trafficking Bureau within the Wisconsin Department of Justice’s Division of Criminal Investigation. He acknowledged AI’s great potential for improving efficiency and reducing the paperwork burden for police officers and other professionals, but centered his remarks on its use in creating falsified imagery.
“Humanity right now is in a good place with AI, because it’s being used so frequently in cybersecurity, things like that … We are a little more advanced than those who would use it for nefarious means,” he said, adding that “it’s only up to the creativity of those who want to use it nefariously for when that tide turns, and it will turn quickly and drastically.”
He showed a series of images depicting the faces of young children and an older man that, despite being generated by AI, were virtually indistinguishable from real photographs. He said “the good news is” most AI companies don’t allow users to create sexually explicit images with their software, especially images depicting children.
“It’s when we get into some of the off-platform stuff, some of the code that has been written and put out into the open source where some of that comes into play with child pornography,” he said.
He explained that some child predators are using AI to place themselves into “obscene” imagery involving children, which law enforcement officials are tracking to identify those using the technology for these purposes. But once such materials are passed on to others, they can become difficult to trace, posing a challenge for law enforcement.
While many AI programs learn from information on the internet, some users can build AI software locally on a single computer and train it on child pornography, Jansky said. This application uses an AI model called Stable Diffusion, which can create unique photorealistic images from text and image prompts.
“We actually had one case, in fact it was a novel case pretty much in the United States, right here in Wisconsin, where someone had their own processor or stable diffusion platform, and they were actually training it to make [child sexual abuse material] … You couldn’t tell the difference between a real image and a fake image,” he said.
Watch the video at WisconsinEye and see previous coverage on AI.
Register for a Sept. 26 WisPolitics event focused on state government policy around AI.