artificial iconoclasm


content warning: child sexual abuse, and how it relates to AI.


"child sexual abuse material", or CSAM, refers to photographic or video material that is produced by the sexual abuse of children. in order to produce such material, a child is sexually abused in the process.

…or at least, that's what i thought when i first learned of the term. because recently, i have seen a lot of people calling AI-generated images “CSAM”. and, for some reason, that's really been bothering me. now, i could say it's because no child is actually being abused in the creation of such an AI-generated image; that fake things and real things are inherently different.

but even past that, it just felt disrespectful to me. incorrect for more than purely intellectual reasons. i thought about it for a while, and realized why.

when you claim that AI-generated images are “CSAM”, you are not advocating for victims of childhood sexual abuse. instead, what you are doing is reducing the existence of those victims to nothing but an image. while you chase after the image in this iconolatry, you leave real victims behind, to suffer alone.

i was a victim of childhood sexual abuse. i am not a photograph, a video, nor an AI-generated image. i am a real person, with a real past, and real pain; not a robot slopping out what the combination of a fully clothed child and a naked adult would look like. i see no difference between how the predators of my past objectified and ignored me and how these self-appointed advocates do the same.

both reduce my existence to nothing but the image, frozen on a screen.


i'm not sure what a good term for AI-generated images of children in sexual situations would be, and it should be pretty obvious that i think it's a bad thing for people to make stuff like that. whatever a good name for it would be, it ought to make clear that the image isn't real. now more than ever, we must remember to distinguish between what's real and what's fake.