Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid!
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post, there’s no quota for posting and the bar really isn’t that high
The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
Promptfondler proudly messes with OSS project (OpenAI subreddit)
To be clear, nothing in the post makes me think they actually did what they’re claiming: the non-specific ‘fixes’, never explicitly saying what the project is other than that it’s ‘major’ and ‘used by many’, the explicit ‘{next product iteration} is gonna be so incredible you guys’ tone of the whole thing. It’s just the thought of random LLM enthusiasts deciding en masse to play programmer on existing OSS projects that makes my hair stand on end.
Here they are explaining their process
Amazing how much tech hype nowadays is ‘the next version will be great!’. Parts of this always have existed, and there is also the other part of tech hype ‘You don’t get it, this isn’t just tech, this allows you to be A Platform!’. Vast fields of new possibilities, always just out of reach. Fusion is 17.6 years away people!
E: Related to that, also see how people always need to shift to the next big thing. The next codebase will fix your problems, no the next new AI system will be better, dump the old and learn the new thing. (Don’t forget to not notice you are not actually doing things, just learning new systems over and over).
crickets
there are errors and you read them into the ai. Someone alerted that a build wasn’t working.
Somebody set up us the bomb. Main screen turn on.
Figuring out a codebase from first principles.
what is the function of #include
That person is getting paid twice as much as you now for using chatgpt to code. And they still don’t know what #include means.
It’s you!
https://youtu.be/qE6emvdmg-M
Oh man this is just as good/bad as I remember.
I swear to god it feels like 50 years ago, but I still remember everyone responding to criticism of ChatGPT with “GPT-4 is going to be much better, just wait”.
That was this year, the third of April in the year of our lord covid 2020
Don’t forget to read OPs “take” on the Halting Problem.
I was mentally prepared to get irrationally angry, but fortunately this is such incoherent word salad that it’s not even wrong.
At least they get called on being a word salad merchant in that sub, the response in r/openai is basically rapturous.
Amazing that when there’s pushback against his ideas, his go-to ad hominem is to start calling people professors.
(there was a talk/article about how to recognize a crank; I forgot the link/source, but iirc ‘reacting badly to pushback’ and ‘always trying to solve the biggest open problems first’ are two of the things on the list)
I need a shorter name for the whole genre of person that’s on way too many uppers and won’t stop using ChatGPT all night to make all their decisions, cause we keep running into them and for some reason all of them are obsessed with CS woo
maybe we’re really just witnessing what happens when your “nootropic” habit gets out of hand and you’re still in debt from gambling on crypto, so you get high as shit and convince yourself you’re a genius because whenever you read the tea leaves they tell you exactly what you expected to hear. this is, unfortunately, how cults tend to form.
“Presumably-outsourced non-thinker” but unfortunately it’s not as snappy as promptfans and promptfondlers
Prompt-deciduous?
They’re in a delirium supported by ML, so Promptlerius?
Like OP, I also learned about Gödel’s incompleteness theorems and was struck with a sense of profundity despite not having the mathematical grounding to come to any meaningful conclusions from this. Unlike OP, I don’t ramble about the things I don’t really understand.
(I do, however, ramble about protein structure and biochemistry, which is very cool, and also my jam)
I don’t even get what the guy is claiming to do here. Hope he didn’t try sending his fixes upstream.