- cross-posted to:
- [email protected]
I asked ChatGPT a relatively straightforward question and it sent me down a series of rabbit holes. Yes, I eventually found my way out, but goddamn, I might have wasted less time by skipping the chatbot and just reading the tutorial instead.
Learning programmer here. Is this actually a hard question?
Not really. I think mine is correct, if the inference pattern is a thing. The prompt is a bit off, though. GPT is not very good at just “getting right to it”. It's a language model, trained on how we communicate, so some priming is very helpful. Also, don't ask questions, and there's no need to be nice, but being collaborative, as if working with a co-worker, helps a lot. Don't leave spaces or extra returns at the end of your input.
It is difficult in the sense that it uses obscure and rarely used concepts, but it's not really algorithmically difficult. I still find it impressive that it makes the connection between those concepts and the question.
I am not a Python expert; how does this look?
https://chat.openai.com/share/6496db95-e69e-467f-84ad-2f80daf28915
Saying “try harder” to ChatGPT is so tacky. I can’t explain why.
Because they are doing the exact opposite of trying harder?