Judge rebukes law firm using ChatGPT to justify $113,484.62 fee as “utterly and unusually unpersuasive”

  • @[email protected]
    25 · 10 months ago

    I’ve asked ChatGPT to explain maths to me before. I can’t remember exactly what it was, but it was a problem where I knew the answer and was trying to work out the starting value.

    It told me the answer and I asked for the explanation. It went something like this (not actual, just a tribute):

    • Step 1: 1 + 1 = 2
    • Step 2: 2 * 2 = 4
    • Step 3: 4 / 8 = 0.5

    Me: Uh, the answer is supposed to be 9,000,000.

    ChatGPT: Sorry, it seems you are right. Here’s the corrected version:

    • Step 1: 1 + 1 = 2
    • Step 2: 2 * 2 = 4
    • Step 3: 4 / 8 = 9,000,000
    • @EdibleFriend
      15 · 10 months ago

      nods knowingly as if he doesn’t suck ass at math and understands the mistake

    • @[email protected]
      3 · 10 months ago

      Sometimes GPT says it’s using the correct values but somehow gets the wrong answer anyway. The opposite happens frequently too, and that’s when I realized I was pushing it too hard.

      Don’t ask it to calculate the ratio between the surface areas of the Moon and the Earth. Instead, ask it what the relevant radii are and calculate everything yourself.
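      For example, with the commonly quoted mean radii (about 1,737 km for the Moon and 6,371 km for Earth), that ratio is a two-line calculation; a quick sketch:

      ```python
      # Surface area of a sphere is 4 * pi * r**2, so the ratio of two
      # spheres' surface areas is just the squared ratio of their radii.
      R_MOON = 1737.4   # mean radius, km
      R_EARTH = 6371.0  # mean radius, km

      ratio = (R_MOON / R_EARTH) ** 2
      print(f"Moon/Earth surface-area ratio: {ratio:.4f}")  # roughly 0.074, about 7.4%
      ```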

    • @[email protected]
      2 · 10 months ago

      So it did it correctly but you told it to hallucinate? Or did it just fail from the get-go?

      It really isn’t great at math, but I’ve had okay results for equations involving common integrals and trigonometry. It’s quite easy to spot the mistakes, and it can lead you to the answer even when the explanation is wrong. Pretty much like how it can hallucinate while programming but still end up being useful.

      WolframAlpha is still my go-to if I’m lazy, though. But I haven’t paid for it in ages.
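      A free symbolic tool also makes those checks easy. A minimal SymPy sketch of verifying a worked integral (the integrand is just an arbitrary trig example, not one from this thread):

      ```python
      import sympy as sp

      x = sp.symbols("x")

      # Arbitrary trig integrand, chosen only to illustrate checking an
      # LLM's worked integral against a computer algebra system.
      integrand = sp.sin(x) * sp.cos(x)

      antiderivative = sp.integrate(integrand, x)            # sin(x)**2/2
      definite = sp.integrate(integrand, (x, 0, sp.pi / 2))  # 1/2

      print(antiderivative, definite)
      ```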

      • @[email protected]
        6 · 10 months ago

        It was a question where I knew the answer and needed to find the correct value to put into the equation to get that end result. I wish there were a better way to search previous chats, because it would help if I could remember the context. Anyway, the first time around it got the maths right from the starting value to the ending value, but it didn’t actually answer the question, because the ending value wasn’t the one I asked for. It might as well have given me a random answer.

        I pointed out that it hadn’t answered the question, and that’s when it just changed the last step to make it the answer I was looking for. It was supposed to adjust the starting value so that the calculation produced the correct outcome.

        • lad
          2 · 10 months ago

          Yeah, I think what you needed was the solution to something of an inverse problem, which for some calculations can be quite hard to compute. Were you able to find the starting value without ChatGPT’s help?
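          When the forward calculation can be written as a function and is monotonic in the starting value, a plain root finder handles that kind of inverse problem directly. A minimal sketch; the forward() function below is a made-up placeholder, since the actual equation from the thread isn’t known:

          ```python
          from scipy.optimize import brentq

          # Hypothetical forward calculation mapping a starting value to the
          # end result (stand-in only: 10 periods of 5% compound growth).
          def forward(start):
              return start * 1.05 ** 10

          target = 9_000_000  # the known end result

          # Solve forward(start) == target by finding the root of the
          # difference on an interval known to bracket the solution.
          start_value = brentq(lambda s: forward(s) - target, 0.0, 1e9)
          print(start_value)  # about 5.5 million for this stand-in forward()
          ```

          brentq only needs an interval where the difference changes sign; for a non-monotonic forward calculation there may be several valid starting values, so the bracket matters.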

          • @[email protected]
            2 · 10 months ago

            I can’t remember! It was probably a year ago, but knowing me I probably tried random starting numbers until I got the answer I wanted.