• popcar2@programming.dev · 8 months ago

    Aside from the fact that it’s too late, I’m doubtful this would work in the first place. At most it could get companies to skip your image if they detect you’ve used the tool, which may be the goal for some, but it won’t stop AI art generators from existing or compensate artists in any way.

    The best-case scenario for artists is that this does work, and the people making models just fall back on the billions of images created before this tool existed for training.

    • SUPAVILLAIN@lemmygrad.ml · 8 months ago (edited)

      Worse: I expect some gray-hat to outright black-hat programmer has already started working on a “Deglaze” tool, to be packed into Meta’s next swing at generative models.

  • olicvb@lemmy.ca · 8 months ago

    Glaze got bypassed for Stable Diffusion quite fast, and I bet Nightshade will be too.
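
    (For context on why the bypass came quickly: Glaze’s cloak is a subtle, high-frequency adversarial perturbation, so generic “purification” of an image, such as resampling, blurring, or diffusion-based denoising, tends to wash the cloak out before the image is used for training. Below is a minimal sketch of that general idea, assuming Pillow is installed; the `purify` function, filenames, and parameters are hypothetical illustrations, not the actual bypass code.)

    ```python
    from PIL import Image, ImageFilter

    def purify(path_in: str, path_out: str, factor: int = 2) -> None:
        """Wash out fine-grained pixel perturbations via resample + blur."""
        img = Image.open(path_in).convert("RGB")
        w, h = img.size
        # Downscale then upscale: resampling averages away high-frequency noise.
        small = img.resize((max(1, w // factor), max(1, h // factor)), Image.LANCZOS)
        restored = small.resize((w, h), Image.LANCZOS)
        # A light Gaussian blur removes remaining high-frequency artifacts.
        restored = restored.filter(ImageFilter.GaussianBlur(radius=1))
        restored.save(path_out)

    # Hypothetical usage: strip a cloak from one image before training.
    purify("glazed.png", "purified.png")
    ```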