@[email protected] to Technology (English) • 7 months ago
ChatGPT provides false information about people, and OpenAI can't correct it (noyb.eu)
cross-posted to: [email protected], technology, aicompanions, fuck_ai, news
@NeoNachtwaechter (English) • 7 months ago
"LLMs don't actually store any of their training data."
Data protection law covers all kinds of data processing. For example, input is processing, and output is processing too (Article 4 of the GDPR). If you really want to rely on excuses, you would need far better ones than that.
@[email protected] (English) • 7 months ago
Right, so keep personal data out of the training set and use it only in an easily readable and editable context. The model will still "hallucinate" details about people if you ask it for them, but those people are fictitious.
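The "context, not weights" approach described in that reply can be sketched roughly as follows: personal data lives in an editable record store and is injected into the prompt at inference time, so correcting a false fact means editing a record rather than retraining a model. All names and structures here are illustrative assumptions, not any real system's API.

```python
# Hypothetical sketch: personal data is kept in an editable store and
# only enters the model via the prompt context, never the training set.
records = {
    "jane_doe": {"name": "Jane Doe", "birth_year": 1980},
}

def build_prompt(person_id: str, question: str) -> str:
    """Assemble a prompt whose personal facts come only from the store."""
    record = records[person_id]
    facts = "; ".join(f"{k}: {v}" for k, v in record.items())
    return f"Using only these facts ({facts}), answer: {question}"

# Correcting false information is a one-line record edit, not model surgery.
records["jane_doe"]["birth_year"] = 1981
prompt = build_prompt("jane_doe", "When was this person born?")
```

Under this design, a GDPR rectification request maps onto an ordinary database update, which is exactly the kind of correction the article says OpenAI cannot perform on model weights.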