• @Spesknight
    93 · 1 year ago

    Computers don’t do what you want, they do what you tell them to do.

      • @[email protected]
        20 · 1 year ago

        I wouldn’t call them passive, they do too much work. More like aggressively submissive.

        • @[email protected]
          19 · 1 year ago · edited

          Maliciously compliant, perhaps.

          They do what you tell them, but only exactly what and how you tell them. If you leave any uncertainty, chances are it will fuck up the task.

    • "no" banana
      14 · 1 year ago

      Must’ve been chatGPT’s fault

      • @[email protected]
        8 · 1 year ago

        My experience is that if you don’t know exactly what code the AI should output, it’s just Stack Overflow with extra steps.

        Currently I’m using a 7B model, so that could be why?

  • IWantToFuckSpez
    23 · 1 year ago · edited

    Yeah, but have you ever coded shaders? That shit’s magic sometimes. Also a pain to debug: you have to look at colors, or sometimes millions of numbers through a frame analyzer, to see what you did wrong. You can’t print messages to a log.