A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious purposes.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage: officers led McCorkle, still in his theater uniform, away in handcuffs.

  • HelixDab2@lemm.ee · 3 months ago

    CSAM possession is illegal because possession directly supports creation

    To expound on this: prior to this point, the creation of CSAM required that children be sexually exploited. You could not have CSAM without children being harmed. But what about when no direct harm has occurred? Is lolicon hentai ‘obscene’? Well, according to the law and case law, yes, but it’s not usually enforced. If we agree that drawings of children engaged in sexual acts aren’t causing direct harm–that is, children are not being sexually abused in order to create the drawings–then how much different is a computer-generated image that isn’t based on any specific person or event? It seems to me that whether a pedophile might eventually decide they want more than LLM-generated images is not relevant. Treating a future possibility as a foregone conclusion is exactly the rationale behind Reefer Madness and the idea of ‘gateway’ drugs.

    Allow me to float a second possibility that will certainly be less popular.

    Start with two premises: first, pedophilia is a characteristic that appears to be an orientation. That is, a true pedophile–a person exclusively sexually attracted to pre-pubescent children–does not choose to be a pedophile, any more than a person chooses to be gay. (My understanding is that very few pedophiles are exclusively pedophilic though, and that many child molesters are opportunistic sexual predators rather than being pedophiles.) Secondly, the rates of sexual assault appear to have decreased as pornography availability has increased. So the question I would have is, would wide availability of LLM-generated CSAM–CSAM that didn’t cause any real, direct harm to children–actually decrease rates of child sexual assault?

    • RandomlyNice@lemmy.world · 2 months ago

      With regard to your last paragraph: Pedophiles can indeed be straight, gay, or bi. Pedophiles may also never become molesters, and molesters of children may not be pedophilic at all. It seems you understand this. I mentioned ITT that I read a newspaper article many years ago that was commissioned to show that access to CP would increase child abuse; it seemed to show the opposite.
      If persons could use AI to generate their own porn of their own personal fantasies (whatever those might be) and NOT share that content, what then? Canada allows this for text (maybe certain visuals? Audio? IDK). I don’t know about current ‘obscenity’ laws in the USA; however, I do recall reading about an art exhibit in NY which featured an upside-down urinal that was deemed obscene, then later deemed a work of art. I also recall seeing (via an internet image) a sculpture of what seemed to be a circle of children with penises as noses. Porn? Art? Comedy?

      • HelixDab2@lemm.ee · 2 months ago

        My understanding was that ‘pure’ pedophiles–ones that have no interest at all in post-pubescent children or any adults whatsoever–tend to be less concerned with sex/gender, particularly because children don’t have defined secondary sex characteristics. I don’t know if this is actually correct though. I’m not even sure how you could ethically research that kind of thing and end up with valid results.

        And honestly, not being able to do solid research that has valid results makes it really fuckin’ hard to find solutions that work to prevent as many children from being harmed as possible. In the US at least, research about sex and sexuality in general–much less deviant sexualities–seems to be taboo, and very difficult to get funding for.

    • 2xsaiko@discuss.tchncs.de · 3 months ago

      Hard to say. I generally agree with what you’ve said, though. Also, lots of people have other fantasies that they would never enact in real life for various reasons (e.g. it’s unsafe, illegal, or both; edit: I should also absolutely list non-consensual here). I feel like pedophilia isn’t necessarily different.

      However part of the reason loli/whatever is also illegal to distribute (it is, right? I assume it is at least somewhere) is that otherwise it helps people facilitate/organize distribution of real CSAM, which increases demand for it. That’s what I’ve heard at least and it makes sense to me. And I feel like that would apply to AI generated as well.

      • HelixDab2@lemm.ee · 3 months ago

        It’s obvs. very hard to get accounts of what pedophiles are doing; the only ones you can survey are the ones that have been caught, which isn’t necessarily a representative sample. I don’t think there are any good estimates of the rate of pedophilic tendencies.

        the reason loli/whatever is also illegal to distribute

        From a cursory reading, it looks like possession and distribution are both felonies. Lolicon hentai is pretty widely available online, and prosecutions appear to be very uncommon compared to the availability. (Low priority for enforcement, probably?)

        I’m not sure that increasing the supply of CSAM would necessarily increase demand for CSAM in people that aren’t already pedophiles, though. To put it another way, I’m sure that increasing the supply of gay porn would increase consumption of gay porn, but I’m pretty sure it’s not going to make more people gay. And people that aren’t gay (or at least bi) aren’t going to be interested in gay porn, regardless of how hard up (heh) they might be for porn, as long as they have any choices at all. There’s a distinction between fetishes/paraphilias and orientations, and my impression has been that pedophilia is much more similar to an orientation than a paraphilia.

        • 2xsaiko@discuss.tchncs.de · 3 months ago

          I’m not sure that increasing the supply of CSAM would necessarily increase demand for CSAM in people that aren’t already pedophiles though.

          No, but allowing people to organize increases demand, because then those who want CSAM have a place to look for it and ask for it where it’s safe for them to do so, and maybe even pay for it to be created. It’s rather the other way around: the demand increases the supply, if you want to put it like that. I’m not saying lolicon being freely available turns people into pedophiles or anything like that, at all.

          • HelixDab2@lemm.ee · 3 months ago

            I guess where I come down is that, as long as no real people are being harmed–either directly, or because their likeness is being used–I’d rather see it out in the open than hidden. At least if it’s open, you have a better chance of knowing who is immediately unsafe around children, and can easily use that to exclude people from positions where they’d have ready access to children (teachers, priests, etc.).

            Unfortunately, there’s also a risk of pedophilia being ‘normalized’ to the point where people let their guard down around them.