Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet but actively promoted to users by social media companies that are unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms along with who paid for them and where and when they were posted, shows that the company has previously taken down several of these ads, many ads that explicitly invited users to create nudes, and some of the accounts buying them, were still live until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

  • Belastend@lemmy.world · 2 months ago

    An ex-friend of mine Photoshopped nudes of another friend, for private consumption. But then someone found that folder, and suddenly someone has to live with the thought that these nudes, created without their consent, were used as spank bank material. It’s pretty gross, and it ended the friendship between the two.

    • Scrollone@feddit.it · 2 months ago (edited)

      You can still be wank material with just your Facebook pictures.

      Nobody can stop anybody from wanking to your images, AI or not.

      Related: Louis CK

      • Belastend@lemmy.world · 2 months ago

        That’s already weird enough, but there is a meaningful difference between nude pictures and clothed pictures. If you wanna whack one to my FB pics of me looking at a horse, OK, weird. Don’t fucking create actual nude pictures of me.