Generative AI has made it possible for any image of a child — a school photo, a soccer picture, a video from church — to be weaponized into sexual abuse material within seconds, by anyone, at almost no cost.
Readily available AI tools can take a single photograph of a real child and generate synthetic images depicting that child in sexual situations. To most viewers, the results are indistinguishable from authentic photographs. They spread across private chats, social media, gaming platforms, and school networks.
Victims are overwhelmingly girls and young women. The images are used for harassment, extortion, coercion, and — increasingly — as currency in online communities that trade such material.
Alabama's statutes, like those of most states, were written before generative AI existed. Laws against child sexual abuse material assume a real image of a real act. AI-generated material occupies a legal gray zone that predators exploit every day.
Schools and law enforcement across the state are encountering cases but often lack the tools, training, and statutory authority to respond effectively.
Every Alabama child with a public school yearbook photo, a social media tag, or a youth sports roster is a potential target. No age group is safe: middle school girls are the most common victims identified to date.
Families typically discover the abuse only after images are already spreading. By the time parents learn what has happened, the material is often on multiple platforms and devices across the country.
The psychological harm is real even when the images are synthetic. Survivors describe violation, loss of control, and permanent uncertainty about where their likeness exists online.
The tools are getting better, cheaper, and easier to use every month. Waiting another legislative session is not a neutral choice — it is a decision to let another full generation of Alabama children be targeted under laws that cannot protect them.
Take It Down Alabama is the coalition working to close the gap.