That GitHub Copilot and friends are useful? I would argue that their utility is rather subjective, but there are indications that it improves developer productivity.
I’m unsure if you’ve used tools like GH Copilot before, but it primarily operates through “completions” (“spicy autocorrect” in its truest form) rather than a chatbot-like interface. It’s mostly good for filling out boilerplate and code that has a single obvious solution; not game-changing intelligence by any means, but useful in relieving the programmer of various menial tasks.
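To give a made-up flavour of what that looks like (the `User` class and its methods below are invented for illustration, not anything Copilot actually emitted): you type the decorator, the class name, and the fields, and the tool offers the obvious rest as ghost text.

```python
from dataclasses import dataclass

# You write the decorator, the class name, and the two fields; a
# completion tool will typically offer the rest of this "single obvious
# solution" boilerplate as a suggestion you accept with Tab.
@dataclass
class User:
    name: str
    email: str

    @classmethod
    def from_dict(cls, data: dict) -> "User":
        return cls(name=data["name"], email=data["email"])

    def to_dict(self) -> dict:
        return {"name": self.name, "email": self.email}
```

Nothing you couldn’t type yourself in thirty seconds, which is roughly the shape of the value proposition.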
May I ask, what evidence are you hoping to see in particular?
https://awful.systems/comment/1286383
I look forward to the money that I’ll make cleaning up the mess you provide people with
I know I’m six months late to the party but how do you like “promptcritical”?
prompt critical is already a term of art in nuclear energy, and it’s a state you’d very, very much want to avoid (unless that was the intention, of course)
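(rough version for the curious: a reactor is prompt critical when the prompt neutrons alone are enough to sustain the chain reaction,

$$k_{\text{eff}}\,(1 - \beta) \ge 1,$$

where $\beta$ is the delayed-neutron fraction, on the order of 0.0065 for U-235. past that point the power excursion runs far too fast for any control system to catch, hence the “very, very much want to avoid”)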
heh, yeah, that’s my background.
yar I thought of that at the time too but with “gendercritical” having been used by ghouls I felt like the well might’ve been poisoned. still don’t really have a good one :|
oof, yeah, you’re right I reckon!
It really chafes how awful dipshits can turn chunks of language into superfund sites.
might have to just stick with tradition and go with “prompthater”
noprompter
oh that’s a good one
I too want a taxi driver that doesn’t know how to drive a car but can adjust the little TV content in the back.
Psh I mean all he has to do is step on the gas pedal and the car does all the work anyways right? I’m glad he doesn’t have to think too much about it, so he has more time to get the thermostat just right.
“hi you seem to be fearing for your life! here’s a calming advert”
The moral equivalent of “peril-sensitive shades” will be the killer app for augmented reality headsets.
I mean…yea? That’s kind of the point. It’s not driving, it’s the copilot. You’re the one driving, and it will get the thermostat right because you’re busy operating the vehicle and want to keep your attention on the road. That seems useful to me.
If you already have an idea of the code you want to write and start typing it, Copilot can help autocomplete so you can focus on actually solving whatever problem you’re working on instead of searching for the correct syntax online. I understand shitting on AI is fun and there are plenty of valid criticisms to be made, but this is actually kind of useful.
how could we possibly be critical of the technology that at best replicates basic editor functionality (templating, syntax completion), outputs wildly incorrect code, and burns rainforests?
I’m not saying you can’t be critical of it, but templating and syntax completion is in fact useful. Suggesting incorrect code is obviously bad, but all of this stuff is still relatively new and I’m sure it’ll get better with time. Can’t we at least try to be a little optimistic about what this stuff is capable of when we give our criticisms, instead of having knee jerk reactions that make this out to be the harbinger of the apocalypse?
Side point to address the linked article: yes, computing systems use energy. If our energy grid is overly reliant on the burning of fossil fuels that release harmful emissions, that doesn’t mean we need to stop the advancement of our computers. It means we need to stop using so much fossil fuel in our grid.
syntax completion has existed since 1957, and templating (or macros that implement templates) has been in editors long enough that I don’t think anyone remembers when they got added (but here’s TECO (1962) anyway). the implementations of these things that run inside your editor are lightweight, predictable, and don’t increase carbon emissions by 30%, and it’s really weird that you’re in this thread cosplaying as a programmer but somehow don’t know basic shit about how code’s written, actually??? why is that, I wonder.
come the fuck off it. so much of computer science involves studying algorithmic efficiency, something you just tried to talk past. it’s how we know that the regular expressions and push-down automata we implement in editors to do fast efficient syntax completion are a better fucking idea than using some shit that does the same thing less reliably and consumes so much fucking energy doing it that it increases greenhouse emissions by 30 fucking percent
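here’s a toy sketch (buffer contents and names invented, obviously) of the kind of thing editors have done forever: no model, no datacentre, just a regular expression over the buffer and a prefix match.

```python
import re

# Lightweight, predictable completion: scan the buffer for identifiers
# with a regular expression, then complete the word under the cursor by
# prefix match. That's the whole trick, and it runs instantly.
IDENTIFIER = re.compile(r"[A-Za-z_][A-Za-z0-9_]*")

def complete(buffer: str, prefix: str) -> list[str]:
    """Return identifiers in `buffer` that start with `prefix`."""
    words = set(IDENTIFIER.findall(buffer))
    return sorted(w for w in words if w.startswith(prefix) and w != prefix)

buffer = "def parse_config(path):\n    config_data = load_file(path)\n    return config_data"
print(complete(buffer, "con"))  # -> ['config_data']
```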
FPGA systems researchers: with this clever trick we can make the chip have 36% lower surface area and use 14% less energy!
Worst people you know: haha gpus go brrrrrrrrrrr
also the worst people I know: LLMs will replace human labor! now tell me why my copilot-generated HDL isn’t working
What is the exact point of taking this attitude? Anybody who cares to look knows exactly what’s wrong with this stuff. It’s an astonishingly, and I mean “astonishing” as in “actually beyond ordinary human comprehension” as in “literally awe-inspiring”, wasteful means (whether your energy source is fossil fuels or solar!) of doing - at the absolute outside best - extraordinarily basic shit. Every single day the window of useful applications and potential improvements narrows incredibly rapidly, and the people who are fundamentally steering the whole programme are proven liars and scam artists, and proven beyond any shadow of a doubt at that?
Who cares if it’s relatively new, or if there’s room for mild-mannered optimism? What practical teeth does that argument have? What purpose does it actually serve beyond satisfying a basically shallow political impulse to moderate perceivedly heightened emotive responses to these incredibly stark facts?
The only actually reasonable response to this farrago is full-throated opposition to every element of the whole show which is either a lie or covering for a lie, which is virtually every single element. If all that you’re left with is “hey, transformers are pretty cool, and I look forward to seeing how they contribute in their own partial way to our collective technical means of saving the planet, and incidentally anti-trust legislation should put people like Altman behind bars for the rest of their lives” then so be it! That’s a far more even-handed and fundamentally sensible response than blithely insisting that the occasional trinket has room for improvement - in fact if you’re liberal-minded it’s the essential output of any sensible thoughts on how to maintain a democratic society.
this should be a post
Now where have I heard something like this before? I’m trying to think of something, but I just can’t quite seem to remember…
@LargeMarge @self you seem very keen to find a problem for your beloved solution. Did you used to be a Blockchainer?
Nope, I just feel like there’s a lot of reactionary content out there about AI. It’s still in its infancy and a lot of the tech bros behind these companies are full of shit and overhype it, which is exactly why I was also skeptical about ChatGPT passing the bar exam when it initially happened. But even with that said, it’s still a tool that can be applied in useful ways, such as giving suggestions for code or correcting grammar as you type.
There’s just no nuance in these discussions and you’re a perfect example of that
the incredible nuance of pretending that basic editor features can’t be done without AI and ignoring a 30% increase in greenhouse gasses (and the entire field of algorithmic complexity) because something something fossil fuels something something progress. you fucking shithead.
it’s time for you to take your leave. but with your time spent not posting in this thread, if you’re actually worried about reactionaries (and we can tell you’re not), might I recommend looking up what your best boy Sam Altman’s been doing with Peter Thiel and the rest of his fash friends? it’s a real easy search to do, but you won’t
and of course it was another throwaway account
We’ll engrave it on LLMs’ tombstone, right next to blockchain and its “it’s still early”.
“Ah but see, there is no agency, there is merely emergent behaviour! It is none of our choices that drive this, but merely the ideas some have had that drive this engine of our doom. Alas, we can do nothing about this outcome!”
I have no idea what you mean by this comment. All I’m saying is that an autocomplete feature when writing code is useful, which is largely what this was designed for.
holy fuck shut the fuck up
near-verbatim sealioning in tyol 2024. who says satire is dead
And devs wonder why people think they are hostile shut-ins that have no social skills…
SMFH
that’s great thanks
wut
That wasn’t sealioning. They were asking for a use case.