- cross-posted to:
- technology
A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.
The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”
This seems like the most likely scenario, tbh. I’m not sure whether personal likeness IP is a bad thing per se, but one thing is sure: it’s not being done to “protect the kids”.
It is. It means that famous people (or their heirs, or maybe just the rights-owner) can make even more money from their fame without having to do extra work. That should be opposed out of principle.
The extra money for the licensing fees has to come from somewhere. The only place it can come from is working people.
It would mean more inequality; more entrenchment of the current elite. I see no benefit to society.
Not necessarily. I’m optimistic that this could lead to empowering status and personality as the main resources and push money out of society.
How so? Fame is already a monetizable resource. The main changes that I see are that 1) no opportunity to show one’s face and make one’s voice heard needs to be missed for lack of time, and 2) age no longer needs to be a problem.