Rep. Joe Morelle, D-N.Y., appeared with a New Jersey high school victim of nonconsensual sexually explicit deepfakes to discuss a bill stalled in the House.
I would have thought that deepfakes are defamation per se. The push to criminalize this is quite the break with American First Amendment traditions.
If I understand correctly, this would put any image hoster, including Lemmy, in hot water, because Section 230 immunity only applies to civil suits and not federal criminal prosecution.
The text of the bill exempts service providers from liability as long as they make a good-faith attempt to remove the material as soon as they are aware of its existence. So if someone posts AI-generated revenge porn on your instance, as long as you take it down when notified, you won’t be in trouble.
Which part says that?

Section 2252D
(a) Offense.—Whoever, in or affecting interstate or foreign commerce, discloses or threatens to disclose an intimate digital depiction—
“(1) with the intent to harass, annoy, threaten, alarm, or cause substantial harm to the finances or reputation of the depicted individual; or
“(2) with actual knowledge that, or reckless disregard for whether, such disclosure or threatened disclosure will cause physical, emotional, reputational, or economic harm to the depicted individual,
…
(d) Limitations.—For purposes of this section, a provider of an interactive computer service shall not be held liable on account of—
“(1) any action voluntarily taken in good faith to restrict access to or availability of intimate digital depictions; or
“(2) any action taken to enable or make available to information content providers or other persons the technical means to restrict access to intimate digital depictions.
So the law requires intent and carves out exceptions for service providers that try to remove it.
You can read the whole text here.
The lower part just says that overeager removal of depictions does not create liability. Say OnlyFans bans the account of a creator because some face-recognition AI thought their porn depicted a celebrity. The creator has no recourse for lost income.
As to the upper part, I am not sure what “reckless disregard” means in this context. I don’t think it means that you only have to act if you happen to receive a complaint. If you see nudes of some non-porn celebrity, it’s most likely a fake. It seems reckless not to remove them immediately. What if there are not enough mods to look at each image? Is it reckless to keep operating?
(d) Limitations.—For purposes of this section, a provider of an interactive computer service shall not be held liable on account of—
“(1) any action voluntarily taken in good faith to restrict access to or availability of intimate digital depictions; or
“(2) any action taken to enable or make available to information content providers or other persons the technical means to restrict access to intimate digital depictions.
I appreciate your reading of the text. I am not a lawyer, so it isn’t always clear how to read the legal language crafted into these bills. Since the quoted part of the law is under the criminal-penalty section of the bill, I read it as releasing the service provider from criminal liability if they try to stop the distribution. I see your point about how you read it, and that makes sense to me.
Yes, expressions can have meanings that are unclear to non-experts, like “reckless disregard.” It means specific things in the context of specific laws, and I can’t guess how it should be interpreted here.
shall not be held liable on account of any action taken …
… to restrict access.
… to make available the technical means to restrict access.
I took some words out to improve readability.
I believe the second one is for, e.g., someone making a database of banned material so that it can be filtered automatically on upload. Or if someone uses those images to train an AI to recognize fakes. For that purpose it will be necessary to “disclose” (i.e., distribute) the images to the people working on it, perhaps an outside company.
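For illustration, here is a minimal sketch of what filtering against such a database on upload could look like. The `BANNED_HASHES` set and `handle_upload` function are hypothetical names, and real systems use perceptual hashes (e.g., PhotoDNA or PDQ) rather than exact cryptographic hashes, so resized or re-encoded copies still match:

```python
import hashlib

# Hypothetical registry of hashes of known banned images. In practice it
# would be populated from a shared database. A plain SHA-256 only catches
# byte-identical files; perceptual hashes also catch altered copies.
BANNED_HASHES: set[str] = set()

def handle_upload(image_bytes: bytes) -> bool:
    """Return True if the upload is accepted, False if it is filtered."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in BANNED_HASHES:
        # Rejecting a match is the kind of good-faith action to restrict
        # access that subsection (d)(1) appears to shield.
        return False
    return True
```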
It’s not; people know it’s a deepfake most of the time and don’t claim it’s real.
It might also be harassment.
If it’s not defamation or harassment, then I’m not sure what the problem is. As broad as this is, it looks unconstitutional to me.