BREAKING NEWS: Since the invention of calculators, fewer people are using the abacus!
What a dumb comparison. Calculators are just tools that perform the same mechanical action as abaci, which were themselves just tools to speed up the mechanical work of calculation.
Writing, drawing, and research are creative, not mechanical, and offloading them to a tool is very different from offloading calculations to integrated circuits.
Most of what’s being offloaded to AI is boilerplate work. People underestimate how much of what we do every day is boilerplate, and it’s the perfect kind of work to offload so humans can focus more on the creative stuff.
First off, no: even boilerplate gets done incorrectly sometimes. Software that ingests words and outputs words can’t check, say, official forms for correctness. Or test reports. You need a different type of reasoning for that.
And then, even if we assume that AI can do these tasks correctly, boilerplate isn’t just being offloaded, it’s being created. Sure, we’ve had bullshit generators before. But now our bullshit machines are faster, and they spew out more believable bullshit. Google has been ruined by generated slop. That’s work that wasn’t performed before, doesn’t improve our lives, and yet is being done.
Generating boilerplate to get past the blank-page phase is not the same as trying to make it check forms for correctness. Which is why I didn’t suggest it should be used for that, so I don’t know what the strawman accomplishes besides making an irrelevant point.
Many of you are very, very anti-AI. We get it. But that also means you have next to no experience with it, because you don’t practice enough to understand how to use it correctly, and it leads to y’all pulling nonsense criticisms out of your ass.
Not a good analogy, except there is one interesting parallel. My students who overuse a calculator in stats tend to do fine on basic arithmetic, but it does them a disservice when they try anything more elaborate. Granted, it should be able to follow PEMDAS, but for whatever weird reason, it sometimes doesn’t. And when there’s a function that requires a sum and maybe multiple steps? Forget about it.
Similarly, GPT can produce clichéd copywriting, but good luck getting it to spit out anything complex. Trust me, I’m grading that drivel. So in that case, the analogy works.
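To make the order-of-operations point above concrete, here’s a minimal sketch (the 2 + 3 × 4 example is mine, not the commenter’s): a basic four-function calculator that applies each key press immediately ends up evaluating left to right, while PEMDAS requires the multiplication first.

```python
# Illustrative sketch: the same keystrokes, evaluated strictly left to right
# (the way a basic calculator applies each key press), versus with proper
# order of operations (PEMDAS).

def left_to_right(tokens):
    """Apply each operator immediately, ignoring precedence."""
    result = tokens[0]
    for op, value in zip(tokens[1::2], tokens[2::2]):
        if op == "+":
            result += value
        elif op == "-":
            result -= value
        elif op == "*":
            result *= value
        elif op == "/":
            result /= value
    return result

keys = [2, "+", 3, "*", 4]
print(left_to_right(keys))  # 20 -- what naive key-by-key entry gives
print(2 + 3 * 4)            # 14 -- what PEMDAS requires
```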
You think it won’t ever spit out anything complex?
LLMs by their very nature drive towards clichés and the most common answers, since they’re synthesizing data. Prompts can attempt to sway them away from that, but they’re ultimately regurgitation machines.
Actual AI might be able to eventually, but it would require a lot more human-like experience (and honestly, the chaos that gives us creativity). At that point it’ll probably be sentient, and we’d have bigger things to worry about, lol
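A minimal sketch of that “most common answer” pull, using made-up next-token scores (the numbers are illustrative assumptions, not from any real model): the softmax step concentrates probability on the likeliest continuation, and lower sampling temperatures sharpen that concentration further.

```python
# Why sampling gravitates toward the most common continuation: softmax over
# next-token scores puts most of the probability mass on the likeliest option,
# and low temperatures sharpen that even more.
import math

def softmax(logits, temperature=1.0):
    scaled = [x / temperature for x in logits]
    m = max(scaled)                        # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores: one "cliche" continuation vs. three rarer ones.
logits = [4.0, 2.0, 1.0, 0.5]
for t in (1.0, 0.5):
    probs = softmax(logits, temperature=t)
    print(t, [round(p, 3) for p in probs])
# At temperature 1.0 the top option already takes ~82% of the mass;
# at 0.5 it takes nearly all of it, so the "most common answer" wins.
```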
Then enjoy your AI slop! I’m not stopping you.
This. It’s a tool; embrace it and learn the limitations… or get left behind and become obsolete. You won’t be able to keep up with people who do use it.
dude you figuring out how to make the AI shit out something half-passable isn’t making you clever and superior, it’s just sad
The invention of the torque wrench didn’t severely impede my ability to retrieve stored information, or everyone else’s ability, which would affect me by proxy.
The tech four years ago was impressive, but for me it’s only done two things since becoming widely available: thinned the soup of Internet fun things, and made some people, disproportionately executives at my work, abandon a solid third of their critical thinking skills.
I use AI models locally to turn around little jokes for friends; you could say I’ve put more effort into machine learning tools than many daily AI users. And I’ll be the first to call the article OP described a true, shameful indictment of us as a species.