• hperrin@lemmy.ca · 29 points · 14 days ago

    I mean, if it codes like it draws, yeah, that’s what I would expect.

    • Jankatarch · 12 points · 14 days ago

      Yeah pretty accurate description lmao.

      Mine would be: “imagine a website where people post their drawings as early as three months after they learned what a pen is, and a good 89% of the training data came from that website.”

  • bravesirrbn ☑️ · 23 points · 14 days ago

    Earlier this year, Google found that 90 percent of software developers across the industry are using AI tools on the job, up from a mere 14 percent last year.

    Yes, because their management chains are basically forcing them to. And/or the definition of “using AI tools” is so watered down it just means not disabling 100% of the features that were shoved into every single IDE.

    • Jared White ✌️ [HWC]@humansare.social · 13 points · 14 days ago

      Yeah, that stat is wildly suspect. I would expect the real figure is closer to 70%, based on a number of other surveys and reports. And even that is an inflated number (a result of peer pressure and top-down enterprise pressure).

  • magic_lobster_party@fedia.io · 15 points · 14 days ago

    I suspect code hallucinated by AI also tends to be more difficult to fix. If the developers can’t understand the code, then it’s likely the AI itself won’t “understand” it either.

    Another crisis in software will come.

    • MoonRaven@feddit.nl · 14 points · 14 days ago

      As a senior software developer, I can confirm you are correct. If, during code review, I suspect the developer generated the code and doesn’t know what it does, I ask them to explain it. If they can’t, they won’t be able to maintain it.