Wait, people really thought web frameworks would replace Devs? Which frameworks? 😂
People thought COBOL would let managers write code.
What.
It’s by design very verbose and “English”-like: instead of x = y * z it would go “MULTIPLY y BY z GIVING x”. The idea was that it would read almost like natural language, so that non-tech staff could understand it.
Except that it’s not the syntax that makes programming hard, it’s the thought process, right?
Yes. COBOL can be excused because it was the first time anyone was going down that path. Everything that comes later, less so.
Yeah. And a lot of non-programmers became programmers thanks to Cobol.
I think we’re seeing this effect with AI code copilots, as well. It can’t replace a programmer, but it can elevate a potential programmer to proficient (in at least some tasks) faster than was possible before.
I know it theoretically means I earn less than I might have, but for my whole career there’s been so much more to be done than there are of us to do it, anyway.
Yeah. They really need to get off my lawn with this nonsense. We’ve seen this enough times to know that it’ll be great, but still won’t solve all our problems.
i mean syntax is part of it, but it can only help you so much. like no matter how you talk about mathematics, you have to somehow understand what multiplication is, but it certainly does help to write “5x3” rather than “5+5+5”
But/and also, just because you might know what multiplication is, you still might not know how to use that to make audio louder! (You might say “well just add to the loudness!”, or if you actually know it’s not that easy you might say “just multiply it by 2!”, but the computer doesn’t simply take “audio”, it takes some form of bytes in an array encoding said audio, let’s say PCM to make it easier, and you still need to know how to loop over that and multiply every value by 2 to double the physical volume.)
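For what it’s worth, here’s a minimal sketch of that naive version in Python (assuming the audio has already been decoded into a list of signed 16-bit PCM sample values; the names are made up for illustration):

```python
# Naive "make it louder": double every PCM sample value.
# Assumes `samples` is a list of ints decoded from signed 16-bit PCM.
def double_volume(samples):
    louder = []
    for s in samples:
        louder.append(s * 2)  # double the amplitude of each sample
    return louder

quiet = [0, 1000, -2000, 3000]   # a tiny fake buffer of sample values
print(double_volume(quiet))      # [0, 2000, -4000, 6000]
```

Which is exactly the version the next reply picks apart.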
oh but don’t forget clipping, and the fact that you’ve now increased the audio variance means the 10¢ tinny speakers at checkout can’t power it, so now you have to work around perceived loudness and normalize to speech frequencies, and when you get to the shop to install new firmware you see a granny wearing glasses asking “what does the self checkout menu say?” and now you have a new problem.
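The clipping part at least is easy to show: a doubled sample can land outside the signed 16-bit range, so any real version has to clamp, which you hear as distortion. A sketch, not anyone’s actual product code:

```python
INT16_MIN, INT16_MAX = -32768, 32767

def double_with_clipping(samples):
    # Doubling can push values past what signed 16-bit PCM can hold,
    # so clamp each result back into range.
    return [max(INT16_MIN, min(INT16_MAX, s * 2)) for s in samples]

print(double_with_clipping([20000, -30000, 100]))  # [32767, -32768, 200]
```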
Man alive, don’t get the managers working with audio. “Doubling the stream” might work if you’re using a signed audio format rather than an unsigned one, and the format uses the same endianness as the host computer. Neither of which is guaranteed when working with audio.
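To make that concrete, here’s a small Python sketch of the endianness half of the problem (the sample values are invented; with unsigned 8-bit PCM you’d have the extra wrinkle that silence sits at 128 rather than 0, so blindly doubling also doubles that offset):

```python
import struct

# Four little-endian signed 16-bit samples, as they might arrive off disk.
raw = struct.pack('<4h', 0, 1000, -2000, 3000)

# Wrong: reading them as big-endian flips the byte order of every sample.
wrong = [s for (s,) in struct.iter_unpack('>h', raw)]

# Right: unpack with the signedness and endianness the format actually uses,
# *then* scale.
right = [s * 2 for (s,) in struct.iter_unpack('<h', raw)]

print(wrong)  # [0, -6141, 12536, -18421] -- garbage
print(right)  # [0, 2000, -4000, 6000]
```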
But of course, the ear perceives loudness in a logarithmic way (the decibel scale), so for it to be perceived as “twice as loud”, it generally needs an exponential increase. Very high and low frequencies need more, since we’re less sensitive to them and don’t perceive increases so well.
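Rough numbers, for the curious (leaning on the common rule of thumb that a ~10 dB increase is what listeners report as “about twice as loud”):

```python
import math

def gain_for_db(db):
    # Decibel change -> amplitude (sample value) multiplier.
    return 10 ** (db / 20)

def db_for_gain(gain):
    # Amplitude multiplier -> decibel change.
    return 20 * math.log10(gain)

print(gain_for_db(10))  # ~3.16: roughly what "twice as loud" takes, per the rule of thumb
print(db_for_gain(2))   # ~6.02: the naive "multiply every sample by 2" only buys ~6 dB
```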
Exactly!
And, of course, AI doesn’t help with the thought process at all, but it did make the syntax much simpler to deal with, once again.
So - once again - people who don’t understand what you just pointed out, now believe we don’t need programmers anymore. Just like the last several times that we “didn’t need programmers anymore”, for basically the same reason.
I understand that we rinse and repeat the same nonsense for networking, systems administration, etc, every few years. Some people genuinely believe that the computers are someday going to magically start understanding themselves for us.
It’s very common. Every few years there is some no-code platform claiming no developers are needed anymore in any sector, not just web dev. Invariably these only work if you stay on the narrow path, and of course the customer asks for something outside the easy path right after the first demo, so a lot of work by devs is needed to make it happen.
AI is just one more like that, but with hype on steroids.
And very old. Part of the sales pitch for the COmmon Business-Oriented Language was that anyone could learn to code in almost plain English.
Also, the stuff they wind up making is the kind of stuff that people with no coding experience make. Cooking up an ugly website with terrible performance and security isn’t much harder than making an ugly presentation with lots of WordArt. But it never was, either.
Between COBOL and LLM-enhanced “low code” we had other stuff, like that infamous product from MS that produced terrible HTML. At this point I can’t even recall what it was called. The SharePoint editor maybe?
Even SQL was originally called SEQUEL, Structured English QUEry Language. They got sued for the name and changed it to SQL. It was also pitched to retrieve data with plain language.
FrontPage?
Microsoft Frontpage?
The first complete program I ever wrote was in Basic. It took an input number and rounded it to the 10s or 100s digit. I had learned just enough to get it running. It was using strings and a bunch of if statements, so it didn’t work for more than 3 digit numbers. I didn’t learn about modulo operations until later.
In all honesty, I’m still pretty proud of it, I was in 4th or 5th grade after all 😂. I’ve now been programming for 20+ years.
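For anyone curious what the modulo version looks like (sketched in Python rather than BASIC, and only worrying about positive whole numbers):

```python
def round_to(n, place=10):
    # Round a positive integer to the nearest multiple of `place`
    # using modulo instead of string slicing and a pile of if statements.
    remainder = n % place
    if remainder * 2 >= place:
        return n + (place - remainder)
    return n - remainder

print(round_to(1234))       # 1230
print(round_to(1234, 100))  # 1200
print(round_to(1250, 100))  # 1300
```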
Even natural-language languages like Inform 7 require a little programming knowledge for when it hates you.
Hey, a shout out to an interactive fiction dude/dudette!
I programmed in TADS many years ago, but I want to learn and use Inform, because I want a Z-code game like my timeless heroes at Infocom.
I’m not really into writing interactive fiction; I just tried it a little since it seemed neat. It turns out that I’m not great at coming up with things to write about, which makes it hard to actually write. Inform 7 makes some decisions that complicate using it with a programming background; I’m considering trying to write my own language for similar purposes (but different paradigms).
Ruby on Rails was probably the peak of the hype wave. It had a tutorial that any manager could follow to build a simple data-driven website in minutes.
Is that a “framework”? Anyhow it was first released a year after you claimed this all happened.
Ruby was the really hot one.
.Net accomplished very similar outcomes and caused a lesser version of the same hyperbole, a few years earlier.
Yes, both are called frameworks.
Of course, I’m going from an old person’s memory, so who knows or cares? You can learn from my experiences, or not.
If you check my post history, you’ll see plenty of evidence that I am, as claimed, a cranky old software developer.
I don’t see why anyone would want to pretend to be me. It’s not that much fun.
Well, forget for a moment everything you know about webpages, and now you want a form where the user can create an account. The sales person tells you that the user has entered the data for us, so it just needs to be sent with a request to the backend, which always looks the same. And then it just needs to be put into an INSERT INTO, which also always looks the same.

All of that stuff can clearly be auto-generated by the framework. And 70% of the boilerplate code does exactly that, so that obviously means 70% of the workload of your devs disappears, which means you can get rid of 70% of your developers.

It just makes it really easy to scam people when they don’t know the technical side…
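That boilerplate really is that mechanical, which is what makes the pitch sound plausible. A minimal hand-rolled sketch of the “always looks the same” part (Python with the standard sqlite3 module; the table and field names are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")

def create_account(form):
    # The whole "backend" from the sales pitch: take the submitted fields
    # and run the always-the-same INSERT INTO.
    conn.execute(
        "INSERT INTO users (name, email) VALUES (?, ?)",
        (form["name"], form["email"]),
    )
    conn.commit()

create_account({"name": "Ada", "email": "ada@example.com"})
print(conn.execute("SELECT * FROM users").fetchall())  # [('Ada', 'ada@example.com')]
```

The 30% that isn’t boilerplate is, of course, the part the customer asks about right after the demo.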