• @Spesknight
    93 points · 8 months ago

    Computers don’t do what you want, they do what you tell them to do.

      • @[email protected]
        20 points · 8 months ago

        I wouldn’t call them passive, they do too much work. More like aggressively submissive.

        • @[email protected]
          19 points · edited · 8 months ago

          Maliciously compliant perhaps

          They do what you tell them, but only exactly what you tell them, and exactly how you tell them. If you leave any uncertainty, chances are it will fuck up the task.

    • no banana
      14 points · 8 months ago

      Must’ve been ChatGPT’s fault

      • @[email protected]
        8 points · 8 months ago

        My experience is that if you don’t know exactly what code the AI should output, it’s just Stack Overflow with extra steps.

        Currently I’m using a 7B model, so that could be why?

  • IWantToFuckSpez
    23 points · edited · 8 months ago

    Yeah, but have you ever coded shaders? That shit’s magic sometimes. It’s also a pain to debug: you have to look at colors, or sometimes sift through millions of numbers in a frame analyzer, to see what you did wrong. You can’t print messages to a log.