• @Holyginz · 3 months ago

    That’s stupidly short-sighted. There will likely always be a need for coders in some form or fashion. As it stands right now, AI is nowhere near good enough to replace programmers.

    • @[email protected] · 3 months ago

      This is a non-sequitur. You can’t go from “as it stands right now AI is nowhere near good enough […]” (true statement) to “there will likely always be a need for coders in some form or fashion” (citation needed).

      To get to the latter, you need the claim “AI will never perform at a human level with respect to programming or general intelligence”. We have no evidence for this claim. Maybe LLMs aren’t the right architecture for achieving AGI, but we already have LLMs performing at a human college level on most tasks.

      A lot of people have the perspective that AI needs to be perfect to replace humans, but that’s not true. It doesn’t even necessarily need to match the best of us: if you can hire a fleet of 128 junior-dev-level AIs for the cost of 1 human junior dev, why would you ever hire a junior dev?

      Once juniors can’t find positions, only hobbyists will become senior devs, so there’ll be a massive reduction in senior developers in the world. Plus there’s no sign that junior-dev is the max level for LLM-AIs, nor is there a sign that LLMs are the absolute end for what AI has to offer.

      • @[email protected] · 3 months ago

        > we already have LLMs performing at human college level for most tasks.

        This is a dubious claim for which there is also very little evidence. There are examples of LLMs performing some tasks, but that hardly demonstrates any consistent “college level” ability. Passing a bar exam does not mean an LLM can be a lawyer; all it means is that it can pass a bar exam. Not to mention that the phrase “college level” is extremely ambiguous, which makes it impossible to debate. It also ignores all the times LLMs perform in completely wrong ways, or just produce incoherent garbage (though Pepperoni Hug Spot was pretty amazing!).

        I would absolutely hire one junior dev over a fleet of junior-level LLMs of any size! I can talk to and train a junior. I can understand their motivation. I can watch them grow and learn the needs of the product, company, stakeholders, customers, etc. Even one LLM coder means constant code review of an agent that does things seemingly at random and whom you cannot talk to or understand. 128 of them would be a nightmare!

        LLMs for code are a neat tool and I do use them on occasion. They can sort of help summarize documentation or maybe help generate examples, but they also fuck up a lot. They can be a good teaching tool for a junior, so long as they don’t just follow blindly. But to replace even a junior coder? Absolutely not, and there’s no evidence they’ll ever get there. Note that I’m not saying it’s not possible, just that there is no evidence right now.

        You mention lack of evidence that “there will always be a need for coders”, but there’s just as much lack of evidence that AGI of any form (LLM or otherwise) is at all possible. Heck, we’ve been less than 10 years away from AGI since the 1970s…

        All these discussions around AI are getting ridiculous. They always point to some mythical future AI that may never exist, and they imply that we should be making decisions affecting real people’s lives today based on that mythical future AI. Remember that past performance is no indication of future gains: just because there has been tremendous acceleration in the last couple of years does not mean it will continue at that pace.

        • @[email protected] · 3 months ago (edited)

          You also don’t know what is meant by “junior-level LLM”. A junior-level LLM would, by definition, require the same level of code review as a junior developer. You have this weird human bias that doesn’t actually exist in capitalism. Capitalists don’t prefer to watch people grow or develop; they want consistent, scalable output. They want the ability to throw money at something and get more output per dollar spent. Coding is notoriously difficult to scale: you get diminishing returns as you have to coordinate more and more people.

          LLMs and AI in general are different in that they scale vertically (from a consumer’s perspective). You buy more API credits, you can make more requests to a model with a larger context window and more accuracy. It’s a capitalist’s wet dream. Division of labor and reducing complexity of specific jobs has been the goal of capitalists since forever.

          This is why we went from tailoring as a profession to repetitive factory work. Anyone can do a factory job, it takes no ramp up time, meaning there’s a massive labor market to reduce the cost of the labor. It’s worse from the worker’s perspective, but better for the capitalist, and that’s all that matters.

          Capitalists don’t understand and don’t care about code quality. If you’ve ever worked a corporate job you’ve felt this friction, the constant battle between developers who care about quality output and capitalist stakeholders who care about quantity and speed. AIs already blow humans out of the water in terms of quantity and speed.

          Even if 2024 is the be-all and end-all of AI, even if we literally never have another breakthrough and it doesn’t improve at all (nobody in the world actually believes this, including you), the current LLMs will still radically transform the way capitalists interact with coders. A ton of simple, junior-level contracting work will be gone: those super-small businesses barely had enough to hire a developer to begin with, and they’ll strongly prefer 20x the output at 75% of the quality of a junior developer for 1/100th of the money.

          It just takes time for capitalists to react to technological changes, and for easier ways of interacting with AI to become more mainstream (e.g. Dave vs the GPT API). I have used AutoGPT before, which is the same idea: it can build a simple web app on its own from a single English prompt. Oftentimes that’s all small contracts need, some dumb, simple CRUD app. It takes about $1.00 of GPT-4 API tokens to put together a blogging site with SSO login, a Postgres DB, a React frontend with a backend, and a basic Bootstrap UI. That would have been about 10 hours for a junior at $20 an hour, so roughly $200 for a human versus $1 for an AI that does it in 30 minutes. With modern-day 2024 technology.
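          The back-of-the-envelope numbers in that last paragraph can be written out explicitly. This is just a sketch of the commenter’s arithmetic; the hourly rate, hour count, and token cost are the comment’s own rough assumptions, not measured figures.

          ```python
          # Cost comparison using the rough figures quoted in the comment above.
          # All three inputs are assumptions from the comment, not benchmarks.
          JUNIOR_HOURLY_RATE = 20.0   # USD/hour for a junior dev (assumed)
          JUNIOR_HOURS = 10           # hours a junior might need for a simple CRUD app (assumed)
          AI_API_COST = 1.00          # USD of GPT-4 API tokens for the same task (assumed)

          human_cost = JUNIOR_HOURLY_RATE * JUNIOR_HOURS  # 20 * 10 = 200 USD
          cost_ratio = human_cost / AI_API_COST           # how many times cheaper the AI run is

          print(f"human: ${human_cost:.2f}, AI: ${AI_API_COST:.2f}, ratio: {cost_ratio:.0f}x")
          ```

          On these figures the AI run comes out roughly 200x cheaper, which is the order-of-magnitude gap the comment is pointing at rather than a precise measurement.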

        • @[email protected] · 3 months ago (edited)

          You’re missing the entire point I was making by attacking the veracity of some of the tests and metrics we use to measure aptitude. By every measure we have, LLMs perform at a college level (or higher). Maybe you disagree with those measures; cool. That’s completely irrelevant to the actual point I was making.

          YOU made the claim. You said “there will likely always be a need for coders”. You have no idea how the burden of proof works. You made a claim, I said there was a lack of evidence. You can’t then go back and say “you lack evidence to reject my claim I made without evidence!”

          The irony of this conversation is that any top level LLM (Opus, GPT4, Gemini advanced) wouldn’t have made such a rudimentary error in logic. Without even getting into the discussion of whether it “understands” what it’s saying, functionally, it wouldn’t have strung together the same incoherent message you put together.

  • @Syringe · 3 months ago

    We’re not there yet. Right now, AI development still requires someone to stitch it all together. It’s a VERY useful tool when you’re trying to dig through twenty years of docs and issue threads (looking at you, Drupal), and it’s a massive time saver, but at best it just sort of guesses at what you want based on patterns you’ve already established in code. All this will do is allow them to produce more code, cheaper. It might be true that there won’t be developers who don’t use AI in the future, but as long as business reqs are created and interpreted by people, someone is going to need to be in charge of it to make sure the machine is interpreting things the way it’s supposed to.

  • @[email protected] · 3 months ago

    Except that learning to program is good for developing logic in kids.

    • Syl ⏚ (OP) · 3 months ago

      It’ll be interesting to watch these guys struggle later on trying to debug the generated code.

      • TGhost [She/Her] · 3 months ago

        And the women, they just make the soup, eh sweetie 😆?

        Ha ha, just to wind you up 🫠 🙃

      • @[email protected] · 3 months ago

        It’ll be different. There will also be AIs to help with debugging. I doubt the code will be optimized in any way, but as far as I know, code optimization isn’t really in fashion anymore anyway.