Teen boys use AI to make fake nudes of classmates, sparking police probe::Parents told the high school “believed” the deepfake nudes were deleted.

    • wildginger · 1 year ago

      They are children being horny about classmates.

      Being sexually aroused by people your own age and wishing to fantasize about it is not enabling pedophilia, you literal psychopath.

      • r3df0x ✡️✝☪️@7.62x54r.ru · 1 year ago

        Circulating porn of minors is a crime and enables pedophiles. Not to mention teenage girls could easily commit suicide over something like this.

        • Fades@lemmy.world · edited · 1 year ago

          So do yearbooks and any other kind of photos that depict children, for that matter.

          You can’t keep moving the goalposts. By your logic, young people should never date or take photos together, because it could enable pedophiles somewhere, somehow.

          These are children with brains still in development. They are discovering themselves, and you want to label them pedophiles forever because they didn’t make a conscious effort to research how their spanking material could potentially enable a pedo (because we all know pedos can only be enabled by things produced by kids… yeah, that’s the real threat).

          Instead of suggesting a way to help the victims, you are advocating for the creation of yet more victims.

          What a pathetic, brain-dead stance you are defending.