Researchers say that the model behind the chatbot fabricated a convincing bogus database, but a forensic examination shows it doesn’t pass for authentic.

    • @PetDinosaurs
      1 year ago

      It’s just modeling humans. I was only a lab TA for two semesters, and I caught so many fake data sets.

  • @[email protected]
    1 year ago

    So… Software designed to make things up made something up when asked to make something up? Okay…

  • @[email protected]
    1 year ago

    There was someone on the radio the other day talking about doing research with it for their show. They started by asking a simple math question, which it got wrong, and the conversation devolved into ChatGPT inventing anecdotes when asked whether a Nobel medal had ever been brought to space. It ended up saying it didn’t know why it kept inventing anecdotes instead of finding reliable info!

    All to say, it doesn’t know what is and isn’t reliable information, so it builds answers based on what it guesses you might want to read.

  • @BetaDoggo_
    1 year ago

    Automation tool does automation.