• MxM111
    1 year ago

    Why do you think that algorithms that consume and process information do not increase the credence of statements in any way? They would be completely useless if that were the case; people would not use them. It is precisely because they give you something that they are helpful. Don’t you agree?

    If the subjective opinion of a non-specialist were as credible as the output of ChatGPT, there would be no point in such a tool.

    As for the false dichotomy, how can it be “false” when I ask you to compare two things? Fine, you do not trust either of them, but can you not even make a judgement about which one you trust less (or mistrust more)?

    • partial_accumen
      1 year ago

      Why do you think that algorithms that consume and process information do not increase the credence of statements in any way?

      Because it processes based upon pattern math, not reason. You seem to be confusing the two.

      They would be completely useless if that were the case; people would not use them. It is precisely because they give you something that they are helpful. Don’t you agree?

      I don’t agree. People use them as an input to their otherwise human-controlled research or process. Even if you’re asking ChatGPT to write a business letter for you, you don’t simply copy/paste the raw output without proofreading it for context and tone. If you were using ChatGPT in this way, as I’ve said a number of times, you’d be fine. Instead, you’re taking raw ChatGPT output with zero external confirmation and claiming it’s fact.

      If the subjective opinion of a non-specialist were as credible as the output of ChatGPT, there would be no point in such a tool.

      You’re setting these up as binary conditions when they aren’t, but neither the opinion of a non-specialist nor the output of ChatGPT rises to the bar of “trusted fact”.

      As for the false dichotomy, how can it be “false” when I ask you to compare two things? Fine, you do not trust either of them, but can you not even make a judgement about which one you trust less (or mistrust more)?

      It’s false because you’re setting the conditions that these are the only two choices. They aren’t. I have the choice of doing actual research for fact-finding.

      I’m seeing something in the pattern of this writing. Can I ask if you’re using ChatGPT output to aid in your argument?

      • MxM111
        1 year ago

        Because it processes based upon pattern math, not reason. You seem to be confusing the two.

        Any process in the human brain can also be described as patterns and math. GPT has an ANN as part of it, and I do not think you can say that it does not reason. It reasons differently, but it still reasons. It can solve problems, for example, that it was never trained on (only on similar problems). You can literally ask it something like: “John earns $3 per day on weekdays and $5 per day on weekends. How many days did he work in total if he earned $11 for the week and worked half as many days on the weekend as during the week?” It will solve it. I just made up that problem; it had never seen it, yet it solves it correctly. You are not calling that “reason”? I think half of the people in this community would have trouble solving it. And GPT can solve much more difficult puzzles.
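        To be clear about the arithmetic, the made-up puzzle has exactly one solution, which a quick brute-force sketch in Python (my own illustration, not something from GPT) confirms:

```python
# Enumerate candidate (weekday, weekend) day counts for the puzzle:
# $3 per weekday, $5 per weekend day, $11 earned for the week,
# and half as many weekend days as weekday days.
solutions = [
    (weekday, weekend)
    for weekday in range(8)   # at most 7 days in a week
    for weekend in range(3)   # at most 2 weekend days
    if 3 * weekday + 5 * weekend == 11 and 2 * weekend == weekday
]
print(solutions)  # [(2, 1)] -> 2 weekday days + 1 weekend day = 3 days total
```

        So the only consistent answer is 3 days of work in total.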

        I don’t agree. People use them as an input to their otherwise human controlled research or process.

        You are not making sense. If ChatGPT gave you NOTHING of any validity whatsoever, you would not use it. It is useful only if its statements have at least some validity.

        It’s false because you’re setting the conditions that these are the only two choices.

        You are not making sense here either. I am asking you to compare two things; it is irrelevant that there are more things. I can ask you to compare the weight of an apple with the weight of an orange; it does not matter that there is also a potato. You can still compare them. It is not a false dichotomy.

        • partial_accumen
          1 year ago

          I don’t agree. People use them as an input to their otherwise human-controlled research or process.

          You are not making sense. If ChatGPT gave you NOTHING of any validity whatsoever, you would not use it. It is useful only if its statements have at least some validity.

          Again, you’re saying it’s binary: all or nothing. I’ve said many times now that that’s not the measure. The measure is whether it is a trusted source for answers. It isn’t. It can be part of a path to a trusted answer, but raw ChatGPT output isn’t.

          It’s false because you’re setting the conditions that these are the only two choices.

          You are not making sense here either. I am asking you to compare two things; it is irrelevant that there are more things. I can ask you to compare the weight of an apple with the weight of an orange; it does not matter that there is also a potato. You can still compare them. It is not a false dichotomy.

          A false dichotomy is a logical fallacy.

          If you need the definition, it’s this:

          “Logical fallacies are deceptive or false arguments that may seem stronger than they actually are due to psychological persuasion, but are proven wrong with reasoning and further examination. These mistakes in reasoning typically consist of an argument and a premise that does not support the conclusion.” source

          You’re trying to move the goalposts. We started out exploring the question “Is ChatGPT a trusted source?” Now you’re trying to redefine it as “Is ChatGPT better than nothing?” I’m not engaging with that question, and you’ve stopped engaging with the original one.

          I don’t see any productive conversation going forward. I thank you for your time and attention during this discussion. I don’t believe either one of us was persuaded to the other’s position, but I appreciate your involvement. Have a great day!

          • MxM111
            1 year ago

            We started this exploring the question of “Is ChatGPT a trusted source?”

            I think I see where our misunderstanding is. I have never stated that it is a “trusted source”, only a source that has positive (and, in my opinion, reasonably high) credence; that is, it increases the validity of a statement. But it does not make the statement true! This is precisely why I:

            1. Stated that I am not a specialist (so that you know MY credence is low)

            2. Provided the source of the statements (so that you know their credence is better than mine, but not at the level of actual scientific papers, or even Wikipedia)

            That means that you (the reader of my post) should NOT take my statements as true, only as possibly or likely true, and you should not assume that I think I am absolutely right. However, I do believe that ChatGPT output carries quite noticeable credence, and I posted this on a discussion board precisely to discuss those statements, with the full expectation that if they are wrong, people will point that out. So far, despite multiple attacks on my post, nobody has mentioned any factual error in the ChatGPT output.