the writer Nina Illingworth, whose work has been a constant source of inspiration, posted this excellent analysis of the reality of the AI bubble on Mastodon (featuring a shout-out to the recent articles on the subject from Amy Castor and @[email protected]):
Naw, I figured it out; they absolutely don’t care if AI doesn’t work.
They really don’t. They’re pot-committed; these dudes aren’t tech pioneers, they’re money muppets playing the bubble game. They are invested in increasing the valuation of their investments and cashing out, it’s literally a massive scam. Reading a bunch of stuff by Amy Castor and David Gerard finally got me there in terms of understanding it’s not real and they don’t care. From there it was pretty easy to apply a historical analysis of the last 10 bubbles, who profited, at which point in the cycle, and where the real money was made.
The plan is more or less to foist AI on establishment actors who don’t know their ass from their elbow, causing investment valuations to soar, and then cash the fuck out before anyone really realizes it’s total gibberish and unlikely to get better at the rate and speed they were promised.
Particularly in the media, it’s all about adoption and cashing out, not actually replacing media. Nobody making decisions and investments here particularly wants an informed populace, after all.
the linked mastodon thread also has a very interesting post from an AI skeptic who used to work at Microsoft and seems to have gotten laid off for their skepticism
I’ve got this absolutely massive draft document where I’ve tried to articulate what this person explains in a few sentences. The gradual removal of immediate purpose from products has become deliberate. This combination of conceptual solutions to conceptual problems gives the business a free pass from any kind of distinct accountability. It is a product that has potential to have potential. AI seems to achieve this better than anything ever before. Crypto is good at it but it stumbles at the cash-out point so it has to keep cycling through suckers. AI can just keep chugging along on being “powerful” for everything and nothing in particular, and keep becoming more powerful, without any clear benchmark of progress.
Edit: just uploaded this clip of Ralph Nader in 1971 talking about the frustration of being told of benefits that you can’t really grasp https://youtu.be/CimXZJLW_KI
this is also the marketing for quantum computing. Yes, there is a big money market for quantum computers in 2023. They still can’t reliably factor 35.
shit, I forgot about quantum computing. If you don’t game, do video production, or render 3D models, you’re upgrading your computer to keep up with the demands of client-side rendered web apps and the operating system that loads up the same Excel that has existed for 30 years.
Lust for computing power is a great match for AI
i’ve literally upgraded computers in the past decade purely to get ones that can take more RAM, because the web now sends 1000 characters of text as a virtual machine written in javascript rather than anything so tawdry as HTML and CSS
the death of server-side templating and the lie of server-side rendering (which practically just ships the same virtual machine to you, but with a bunch more shit tacked on that doesn’t do anything) have really done fucked-up things to the web
as someone who never really understood The Big Deal With SPAs (aside from, like, google docs or whatever) i’m at least taking solace in the fact that like a decade later people seem to be coming around to the idea that, wait, this actually kind of sucks
deleted by creator
the worst part is I really despise this exact thing too, but have also implemented it multiple times across the last few years cause under certain very popular tech stacks you aren’t given any other reasonable choice
this is why my tech stack for personal work has almost no commonality with the tech I get paid to work with
deleted by creator
React doesn’t have to suck for the user (lemmy is fast) but …
this is the thing.
6 degrees of transpiler separation.
when your web page is actually an app written in JS, the commercial temptation to load it up with as many trackers as will fit is overwhelming
Every day, we pay the price for embracing a homophobe’s 10-day hack comprising a shittier version of Lisp.
The internet document transfer protocol needs a separation of page and app
deleted by creator
The comp.basilisk.faq answers that question!
Quantum computing marketing feels that way because people have slapped the quantum label on everything from health stickers to car parts.
But about half the field is pretty dour about whether and when that will ever become a reality.
Ironically, the use case that’s shown the most promise for quantum foundations over the past few years is photonics-based neural networks for AI. Because the end result is what matters and the network itself acting like a black box is generally fine, most of the measurement problem goes away, and analog processing of ML workloads has already been showcased. MIT just the other week announced a DIY kit for researchers to replicate their work on an A100 equivalent running in photonics. In that space, the pace has been the opposite of the general-purpose quantum computing field’s.
@[email protected] @self @trisweb I didn’t know masto picked up lemmy posts like this
@[email protected] @[email protected] @self Yep, it’s all the same protocol. It’s pretty weird though; no indication of what platform the post really came from or how it was intended to be viewed. I could see that being useful first-class information for the reader on whatever platform they’re reading from.
Trying to remember how I even got this post. Did you boost it from your masto account?
@trisweb @[email protected] @self yeah I figured the activitypub protocol used some kind of content type definition to control where stuff was appropriately published… I never got around to actually reading the docs.
I have no idea how it came to your feed. I found it because you boosted it!
as an open source federated protocol, ActivityPub and all the apps built on top of it are required to have a layer of jank hiding just under the surface
ActivityPub is a protocol for software to fail to talk to each other
@self has tapped Lemmy with carefully aimed hammers in a few places so that we federate both ways with Mastodon, which has been pretty cool actually
deleted by creator
seems to be on circumstances.run, which i’m on. that’s treehouse/glitch with authfetch
deleted by creator
@dgerard @self Oh interesting, so is this Lemmy instance special in this regard?
so, not very? If your Mastodon has authfetch enabled, then it doesn’t work properly; if it doesn’t, then it does. I’m on circumstances.run, which has authfetch on - it receives comments from awful.systems but doesn’t seem to pass them back.
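For anyone wondering what authfetch actually is: it’s Mastodon’s authorized-fetch (a.k.a. secure) mode, which rejects unsigned GET requests for ActivityPub objects, so a server that doesn’t sign its outgoing fetches can end up with exactly this kind of one-way federation. A minimal sketch of what a signed fetch involves (placeholder key ID, key, and URL; not Mastodon’s or Lemmy’s actual code):

```python
# Minimal sketch of a signed ActivityPub GET, the kind of request a server
# running authorized fetch ("authfetch") expects. Illustrative only.
import base64
from datetime import datetime, timezone
from urllib.parse import urlparse

import requests
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding


def signed_get(url: str, key_id: str, private_key_pem: bytes) -> requests.Response:
    """Fetch an ActivityPub object, signing (request-target), host and date."""
    parsed = urlparse(url)
    date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    signing_string = "\n".join([
        f"(request-target): get {parsed.path}",
        f"host: {parsed.netloc}",
        f"date: {date}",
    ])
    key = serialization.load_pem_private_key(private_key_pem, password=None)
    signature = base64.b64encode(
        key.sign(signing_string.encode(), padding.PKCS1v15(), hashes.SHA256())
    ).decode()
    return requests.get(url, headers={
        "Date": date,
        "Accept": "application/activity+json",
        "Signature": (
            f'keyId="{key_id}",algorithm="rsa-sha256",'
            f'headers="(request-target) host date",signature="{signature}"'
        ),
    })
```

Without a Signature header like that, a secure-mode server just answers 401 and the object never makes it across, which is roughly what a one-way federation failure looks like.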
@trisweb @self meanwhile on lemmy
@fasterandworse @self Fediverse problems. This could… use improvement. But it’s cool that it works!
The problem is that previous benchmarks have been so completely blown out of the water that new ones keep having to be established.
If GPT-3 scores around 30% in a standardized test and GPT-4 scores 95% in that same test, how useful will that test be to evaluating GPT-5?
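To put those made-up numbers into a trivial sketch (hypothetical scores, not real benchmark results):

```python
# Toy illustration of benchmark saturation, using the hypothetical scores
# from the comment above -- not real benchmark numbers.
scores = {"GPT-3": 0.30, "GPT-4": 0.95}

ceiling = 1.00
previous_jump = scores["GPT-4"] - scores["GPT-3"]  # 0.65: the gap the test could show
headroom = ceiling - scores["GPT-4"]               # 0.05: all it has left to measure

print(f"improvement the test measured last time: {previous_jump:.0%}")
print(f"room it has left to measure anything:    {headroom:.0%}")
# Once the remaining headroom is a small fraction of the last jump, the old
# test can no longer distinguish a marginal successor from a dramatically
# better one, hence the need for new benchmarks.
```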
Whatever happened to the benchmark that featured in popular coverage of AI for decades, the Turing test? It disappeared pretty quickly from the conversation over the past two years.
You may not like the technology for whatever reason (and I’d encourage introspection on just how much of that attitude is the result of decades of self-propagandizing via sci-fi that’s since been shown to have poorly anticipated reality). But don’t make the mistake of conflating those feelings with an analysis of where the trend is going over the next few years.
If you think this is the peak of AI, you’re in for quite the surprise.