The 200-year-old company may soon go public on the back of AI-powered education products.

  • FaceDeer
    2 days ago

    How dare you say something insufficiently negative about the stuff everyone hates.

    • Lemminary
      2 days ago

      The downvotes are for the naïveté of the statement. Many people here use LLMs every day and have stated so in other threads. We just don’t think this is necessarily a proper use case given that you’re dealing with factual information. You can see as much in other comments on this thread pointing out the hallucinations.

      • FaceDeer
        1 day ago

        Whereas I use LLMs every day, have actually written code that uses them, and I understand that they’re perfectly fine dealing with factual information when used in the proper framework. You’d be using retrieval-augmented generation (RAG) in an application like this.

        The “but hallucinations!” objection goes in the same bin as “they can’t do fingers.” It’s an old concern that’s had a lot of work done to resolve it, but one the general public hasn’t bothered to keep up with.
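A minimal sketch of the RAG pattern mentioned above, assuming a toy bag-of-words retriever and an illustrative prompt format (a real system would use embeddings, and the actual LLM call is omitted):

```python
# Toy RAG sketch: retrieve the most relevant passages, then ground the
# prompt in them so the model answers from retrieved text rather than
# from memory. Corpus, query, and prompt wording are illustrative only.
import math
import re
from collections import Counter

CORPUS = [
    "The company was founded over 200 years ago.",
    "RAG supplies retrieved documents to the model as context.",
    "Hallucinations drop when answers are grounded in retrieved text.",
]

def _vec(text):
    # Bag-of-words term counts; a real retriever would use embeddings.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def _cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=2):
    # Rank corpus passages by similarity to the query, keep the top k.
    q = _vec(query)
    ranked = sorted(CORPUS, key=lambda d: _cosine(q, _vec(d)), reverse=True)
    return ranked[:k]

def build_prompt(query):
    # Instruct the model to answer only from the retrieved context.
    context = "\n".join(retrieve(query))
    return f"Answer using ONLY this context:\n{context}\n\nQuestion: {query}"
```

The grounding happens in `build_prompt`: the model is told to answer only from the retrieved passages, which is what pushes hallucination rates down.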

        • Lemminary
          1 day ago

          “they can’t do fingers.” It’s an old concern

          Have you seen those gorilla hands, though? Yes, there are five fingers there but everyone got fucking man hands. lmao

          It seems RAG helps mitigate hallucinations but doesn’t eliminate them yet. Not to mention it’s quite expensive and has trouble retrieving information based on abstract concepts. It sounds promising, but it’s not the silver bullet I’m being sold.