@[email protected] to TechnologyEnglish • 5 months agoAsking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violationwww.404media.coexternal-linkmessage-square230fedilinkarrow-up1907arrow-down119
arrow-up1888arrow-down1external-linkAsking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violationwww.404media.co@[email protected] to TechnologyEnglish • 5 months agomessage-square230fedilink
@GlitzyArmrest (English) • 12 points • 5 months ago
Is there any punishment for violating TOS? From what I’ve seen it just tells you that and stops the response, but it doesn’t actually do anything to your account.
@NeoNachtwaechter (English) • 1 point • 5 months ago
Should there ever be a punishment for making a humanoid robot vomit?