For those who think copyright approaches to generative AI are going to be a solution: they won't be.
Facebook and Adobe already legally hold the rights to all the images they need, which means they will build GenAI models and there's nothing you can do about it.
Copyright attacks will only kill the open source versions, which will mean everyone gets pushed to these proprietary models instead, benefiting only the rich corpos.
This is the best summary I could come up with:
Previously, Meta’s version of this technology—using the same data—was only available in messaging and social networking apps such as Instagram.
Images include a small “Imagined with AI” watermark logo in the lower left-hand corner.
We put Meta’s new AI image generator through a battery of low-stakes informal tests using our “Barbarian with a CRT” and “Cat with a beer” image synthesis protocol and found aesthetically novel results.
(As an aside, when generating images of people with Emu, we noticed many looked like typical Instagram fashion posts.)
The generator appears to filter out most violence, curse words, sexual topics, and the names of celebrities and historical figures (no Abraham Lincoln, sadly), but it allows commercial characters like Elmo (yes, even “with a knife”) and Mickey Mouse (though not with a machine gun).
It doesn’t seem to do text rendering well at all, and it handles different media outputs like watercolors, embroidery, and pen-and-ink with mixed results.
Image-based AI doesn’t bother me nearly as much, although I strongly believe it should be required to carry digitally embedded markers tagging it as AI-generated.
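As a rough illustration of what the simplest form of such a marker could look like, here is a minimal sketch using plain PNG text chunks via Pillow. The key name `ai_generated` and the file paths are made up for the example; real proposals (e.g. C2PA content credentials or invisible watermarks) use signed manifests or pixel-level signals precisely because plain metadata like this is trivially stripped, including by most social platforms on upload.

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def tag_as_ai_generated(src_path: str, dst_path: str) -> None:
    """Embed a simple provenance marker in a PNG's text metadata.

    Toy example only: text chunks are easily removed or forged.
    Robust schemes sign the claim (C2PA) or watermark the pixels.
    """
    img = Image.open(src_path)
    meta = PngInfo()
    meta.add_text("ai_generated", "true")        # hypothetical key name
    meta.add_text("generator", "example-model")  # placeholder value
    img.save(dst_path, pnginfo=meta)

def is_tagged_ai_generated(path: str) -> bool:
    """Check for the marker; its absence proves nothing."""
    img = Image.open(path)
    return getattr(img, "text", {}).get("ai_generated") == "true"
```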
Also, it is somewhat hilarious to me that all those people who willingly gave their content to Meta are now reaping what they sowed.
Text-based AI is the real danger, though: it pollutes the Internet at large with AI-generated misinformation and is much harder to tag as AI-generated.