- cross-posted to:
- [email protected]
- [email protected]
Article by IEEE Spectrum: The writers tested the two AI image generators MidJourney and Stable Diffusion, examining their ability to generate imagery that closely resembles copyrighted material, which demonstrates that the generators' training data must have contained copyrighted works. The implemented safeguards were largely unsuccessful at curbing the output of potentially infringing images.
Can you explain the double standard a bit more? I don’t understand it. Are you saying the double standard is that AI companies sell a product that can be used to infringe copyright, yet claim that people who use this product to infringe cannot monetize it?
What I mean is this: AI companies are arguing that people should not be able to earn money from the works they created (for example, by selling licenses to their copyrighted works), insofar as paying for training data is concerned, while at the same time they are charging money for the creation of works with AI.
To put it differently: “Artists should not earn money from the creation of artworks. We should earn money from the creation of artworks.”