• BigFig · 52 points · 6 months ago

      What do you do when ChatGPT just makes shit up, or answers a yes-or-no question incorrectly? You’d have no way of knowing it was wrong.

      • @gridleaf · 15 points · 6 months ago

        ChatGPT is most useful when you may not know the right answer, but you know a wrong answer when you see one. It’s very useful for technical issues. Much quicker for troubleshooting than searching page after page for a solution.

        • @TheDarkKnight · 2 points · 6 months ago

          It’s actually great at troubleshooting Linux stuff, weirdly enough lol

            • Avid Amoeba · 8 points · 6 months ago

              Yeah, that makes sense. The success rate might fall off a cliff in more complex software projects, e.g. applications whose designs go beyond 10 UML boxes, with hundreds of thousands of lines, especially ones not written in JS/Python.

      • @Evotech · 1 point · edited · 6 months ago

        Bing AI provides references in the “more precise” mode.

      • @[email protected] · 1 point · 6 months ago

        While this is an important thing to understand about AI, it’s an overstated issue once understood. For most questions I ask AI, it doesn’t matter if it’s correct as long as it pulls some half-useful info to get me on track (e.g. programming). For other questions, I only ask it if I need to figure out where to look next, which it will usually do just fine.

        The first page of my search results is all AI generated garbage articles anyway, at least I know what I am getting with GPT and can take it as such.

        • @Womble · 2 points · 6 months ago

          Yup, as long as you are aware that it could be wrong and look at it critically, LLMs at GPT scale are very useful tools. The best way I’ve heard it described is as having a lightning-fast intern who often gets things wrong but will always give it a go.

          So long as you’re calibrated to “how might this be wrong” when looking at the results it is exceptionally useful.

      • Otter · 1 point · 6 months ago

        Not the other commenter:

        I usually have an idea about the thing I’m asking, and if not then I’ll look up the topics mentioned after some guided brainstorming

        I’ve also found that asking the same question again, after resetting the chat, can give you an idea of what is happening
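
        The reset-and-re-ask tactic described above is essentially a consistency check: ask the same question in several fresh sessions and see how often the answers agree. A minimal, self-contained sketch of that idea (the hard-coded answers stand in for hypothetical responses from separate sessions; no API calls are made):

        ```python
        from collections import Counter

        def consistency_check(answers):
            """Given answers to the same question from several fresh
            sessions, return the most common answer and the fraction
            of sessions that agreed with it."""
            counts = Counter(a.strip().lower() for a in answers)
            answer, n = counts.most_common(1)[0]
            return answer, n / len(answers)

        # Hypothetical answers from three fresh chat sessions:
        best, agreement = consistency_check(["Yes", "yes", "No"])
        # best == "yes", agreement == 2/3
        ```

        High agreement doesn’t prove correctness (the model can be confidently wrong in every session), but low agreement is a cheap signal that the answer needs fact-checking.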

    • @thorbot · 14 points · 6 months ago

      I’m curious what you use it for, because I try to use it daily for IT-related queries and it gets less than half of what I ask correct. I basically have to fact-check almost everything it tells me, which kind of defeats the purpose. It does shine when I need really abstract instructions, though. The other day I asked it how to get into a PERC controller on some old server; Google had nothing helpful, but ChatGPT laid out the instructions to get in there and rebuild a disk perfectly. So while it has some usefulness, I generally can’t trust it fully.

      • @Womble · 1 point · edited · 6 months ago

        The point you have to remember is that it is trained on bulk data in a very inefficient manner: it needs to see thousands of examples before it starts to form any sort of understanding of something. If you ask it “how do I do {common task} in {popular language}”, you will generally get excellent results, but the further you stray from that, the more error-prone it becomes.

        Still, it is often good for getting you on the right track when you are unsure where to start, and it is fantastic for learning a new language. I’ve been using it extensively while learning C#, where I know what I want to code but not exactly how to use existing features to do it.

      • @cybersandwich · -3 points · edited · 6 months ago

        But generally you can’t (shouldn’t) trust web search results fully either. At the end of the day, the onus is on you as the user to do your due diligence.

        I’ve seen ChatGPT give me wrong information, and sometimes it would be bad to execute the code or command it generated, but I know enough to ask “are you sure that’s correct?”. Hell, you can just challenge it each time, or open a new session and ask it “what does this code do: [insert the code it generated here]”.

        You shouldn’t just paste a search result command from stack overflow into your terminal either. And at least with chatgpt you can ask it to explain the command or code in detail and it will walk you through what each step does.

        Also, pasting that command from Stack Overflow into ChatGPT and adding your specific context around it is HUGE. That’s why I say they are different products/use cases, but they work well in concert. They just don’t work well mashed into one product the way Bing and Google have been doing.

        edit: I guess lemmy escapes certain characters and it ate my post.

    • @jacktherippah · 5 points · 6 months ago

      ChatGPT is not a search engine. It takes random shit from the Internet and stitches it together. It can often get things wrong in my experience. It’s best to always fact check.

    • BaroqueInMind · 4 points · 6 months ago

      I thought ChatGPT can’t search the internet and is using a LLM snapshot from 2021?

      And I thought Bing’s ChatGPT model is allowed to search the internet live?

      Doesn’t that make Bing’s version of ChatGPT superior?

      • @[email protected] · 1 point · 6 months ago

        This was recently updated for paid users. You can now browse the internet, upload files and images, and they’ve also unlocked APIs by giving it tokens. It’s getting closer to being fully multi-modal quite quickly.

        • BaroqueInMind · 1 point · 6 months ago

          Mine can’t search the internet.

          I feel like you are lying, because I cannot see where you can enable that feature.

    • @[email protected] · 3 points · edited · 6 months ago

      Keyword searches worked fine and pulled up exactly what I wanted for years, I swear to god. Somewhere in the last decade though websites have gamed the system and now I can’t find anything no matter how I word my search. It’s depressing.

    • @eran_morad · 1 point · 6 months ago

      I prefer that Stack looks the same as it did way back when. And Stack is usually where I find my answers.

    • @Supervisor194 · 0 points · 6 months ago

      I use ChatGPT every day too. Because Google is being such a shit about YouTube I am in the process of moving away from Google altogether. I use DuckDuckGo for search, which indirectly uses Bing. It’s mostly OK. Sometimes I’m forced to try Google, it usually doesn’t help. But for programming, yeah, StackOverflow feels downright regressive now.

      I’m honestly kind of surprised about this news, considering how horrible Google’s results are now.

      • BaroqueInMind · 2 points · 6 months ago

        I thought ChatGPT can’t search the internet and is using a LLM snapshot from 2021?

        And I thought Bing’s ChatGPT model is allowed to search the internet live?

        Doesn’t that make Bing’s version of ChatGPT superior?

        • @Chreutz · 1 point · 6 months ago

          GPT4 on ChatGPT was recently (last week ish) updated to include data up to April 2023.

        • @Supervisor194 · 1 point · edited · 6 months ago

          I’ve found this to be kind of subjective. Bing/Bard is more current than ChatGPT but yet I just find ChatGPT to be better. It’s snappier and more conversant with context. It seems to understand you when you chide it for not quite doing what you asked it to do, and it responds in kind. I mostly use it for programming to be fair, but even for other stuff, ChatGPT just somehow feels more… real? I can’t quite put my finger on it.

          There was a short time where Bing chat was kind of frighteningly real. Took them five seconds to nerf that shit and it’s never been anywhere near the same.

          Edit: I expect this answer to be out of date within 3 months. Things keep moving.