This is a real problem. I do not think it should be used as a reference in scientific papers (at least the way it works now), but there should still be a way to at least let people know where you got that information from.
This is more useful with things like Bing Chat that use external sources to craft their answers, so you get “real” citations. But yeah, ChatGPT’s answers are currently impossible to verify.
I wonder if a seed could be implemented in the future to regenerate the same answers, something like the rough sketch below.
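For what it’s worth, here’s a minimal sketch of what that might look like, assuming an API exposed a seed-style reproducibility knob (the model name and the `seed` argument here are illustrative assumptions, not a description of how ChatGPT actually works today):

```python
# Hypothetical sketch: if the API exposed a deterministic seed, the same
# prompt + seed + model version should reproduce the same answer, so a
# reader could re-run the query instead of trusting an unverifiable transcript.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def reproducible_answer(prompt: str, seed: int = 42) -> str:
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
        seed=seed,       # assumed reproducibility parameter for this sketch
        temperature=0,   # remove sampling randomness as far as possible
    )
    return response.choices[0].message.content

# In principle, a paper could then cite "prompt + seed + model version"
# and anyone could regenerate the same output to check it.
print(reproducible_answer("Summarize the causes of the 1918 flu pandemic."))
```

Even then you’d only be able to verify *what the model said*, not where the information came from, so it wouldn’t replace real citations.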