“Many developers say AI coding assistants make them more productive, but a recent study that set out to measure their output found no significant gains. Use of GitHub Copilot also introduced 41% more bugs, according to the study from Uplevel.”

study referenced: Can GenAI Actually Improve Developer Productivity? (requires email)

  • @AstridWipenaugh
53
3 months ago

    AI has simply weaponized copy/pasting stack overflow answers without reading them.

    • El Barto
2
3 months ago

      What do you mean by weaponizing?

      • @AstridWipenaugh
1
3 months ago

Used metaphorically like this (code assistants aren’t literal weapons, obviously), it means making a thing readily available to the masses. In context, my comment means that AI has streamlined the task of copy/pasting from Stack Overflow: you don’t even have to search for what you’re trying to do; the AI will use answers it has scraped from the site to fill out your code for you.

        • synae[he/him]
4
3 months ago

It’s not just making things available to the masses; for that intent, I think “democratizing” would be a better word.

          “Weaponizing” something like this specifically has a connotation of danger or harm to some (unspecified) target.

In this context of copy/pasting Stack Overflow answers without understanding them (a practice Considered Harmful), it was previously something that required intent to do. Now your editor will instantly suggest solutions, and all you have to do is press the tab key to accept unknown lines of code.

        • El Barto
          4
          edit-2
          3 months ago

This is the first time I’ve heard the term “weaponizing” used the way you describe it. I would have said “AI obliterated the chore of finding the right answer on SO” or something similar. To me, weaponizing means using something against a person or entity.

          But regardless, thanks for explaining.