• SolidGrue
    2 years ago

    I asked Chat-GPT a relatively straightforward question and it sent me down a series of rabbit holes that, yes, I eventually found my way out of, but gottdamn, I would have wasted less time skipping the chatbot and just reading the tutorial instead.

    • manitcor
      2 years ago

      not really, i think mine is correct if the inference pattern is a thing. the prompt is a bit off. GPT is not very good at just “getting right to it”. It’s a language model, trained on how we communicate, so some priming is very helpful. Also, don’t ask questions and don’t bother being nice, but being collaborative, as if working with a co-worker, helps a lot. Don’t leave spaces or extra returns at the end of your input.
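
      A minimal sketch of that prompting style, assuming the OpenAI Python client; the model name, the priming text, and the task string here are illustrative, not from the original comment:

      ```python
      from openai import OpenAI

      client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

      # Priming: frame the exchange as working with a co-worker, and phrase
      # the task as a collaborative instruction rather than a question.
      primer = (
          "You are pairing with me on a Python networking bug. "
          "We are debugging together; be direct and concrete."
      )
      task = "Walk through why this socket times out and propose a fix."

      response = client.chat.completions.create(
          model="gpt-4o-mini",  # illustrative model name
          messages=[
              {"role": "system", "content": primer},
              # rstrip() drops trailing spaces/newlines, per the comment's advice
              {"role": "user", "content": task.rstrip()},
          ],
      )
      print(response.choices[0].message.content)
      ```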

    • @benni
      2 years ago

      It is difficult in the sense that it uses obscure and rarely used concepts, not really algorithmically difficult. But I find it impressive that it makes the connection between these concepts and the question.

  • @NewNewAccount
    2 years ago

    Saying “try harder” to ChatGPT is so tacky. I can’t explain why.

    • @average650
      2 years ago

      Because they are doing the exact opposite of trying harder?