I saw another article today saying how companies are laying off tech workers because AI can do the same job. But no concrete examples… again. I figure they are laying people off so they can pay to chase the AI dream. Just mortgaging tomorrow to pay for today’s stock price increase. Am I wrong?
I’d say more like 20% more productive for most developers. Maybe it suits your coding style better than most?
Most of the time spent developing software isn’t writing code, but understanding the problem you’re trying to solve and translating that into an algorithm. I see more utility in generating tests, since a lot of developers don’t have good testing skills.
That 20% is just way too optimistic for anything serious enough that it would normally prompt hiring software engineers.
If the project currently requires human developers as paid employees, it will continue to require them. So in introducing today’s AI, you either pay for the employees plus the language model expenses, or you pay reduced employee expenses plus the language model expenses and then have to fund a complete, unavoidable refactor/rewrite down the line, while also figuring out how to adapt the business model back to employing the original number of engineers on top of that lump sum.
If the project was never going to employ anyone, then yeah, using a language model can be more productive. It’s never going to need the stability and cohesiveness a serious application doing serious things would require.
Otherwise, it’s just going to add work, and the extra effort multiplies with the complexity and seriousness of the application.
And while it does this, it consumes ridiculously more energy and resources than a human would, especially unsustainable ones that humans don’t generally require in such immense amounts.
It’s going to be a net negative for a good while. If we ever survive the burning of our resources with these current models, maybe we get to something actually serious and usable, but I doubt those two can ever work together.
I don’t know what tools you’re using, but that “translating the problem into an algorithm” part is exactly what the AI is very good at.
I basically only architect stuff now, then fine-tune the AI prompts and results.