• @polygon6121
    17 months ago

    The model makes decisions thinking it is right, but for whatever reason it can’t see a firetruck or stop sign, or it misidentifies the object… almost like how a hallucinating human would perceive something from external sensory input that is not there.

    I don’t mind giving it another term, but “being wrong” is misleading. You are correct, though, in the sense that it depends on the given case…

    • @[email protected]
      17 months ago

      No, the model isn’t “thinking”; no model in use today has anything resembling an internal cognitive process. It is making a prediction. A COVID test predicts whether the COVID-19 virus is inside you or not. If its prediction contradicts your biological state, it is wrong. If an object-recognition algorithm does not predict that a firetruck is present, how is that not wrong in the same way?
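      To put the same point in code: a minimal sketch (the labels and the detector output here are made up for illustration) of why a missed detection is just a prediction error, judged by comparison against ground truth:

```python
# Hypothetical example: a detector's output is only a prediction; it is
# "wrong" when it disagrees with ground truth, the same way a diagnostic
# test is wrong when it disagrees with your biological state.
detections = ["car", "pedestrian"]                 # what the model predicted
ground_truth = ["car", "pedestrian", "firetruck"]  # what was actually there

# Objects present in reality that the model failed to predict:
missed = [obj for obj in ground_truth if obj not in detections]
print(missed)  # ['firetruck'] -- a plain prediction error, no cognition involved
```

      There is no inner perception anywhere in that comparison; the mismatch is all there is.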

      • @polygon6121
        17 months ago

        Predicting? Ok, if you say so.