Google I/O is the apex of AI per minute
We got the bad ending, that’s for sure. :(
Last year they were called CUDA cores. Now they are called AI TOPS… wtf is an AI TOP… did we all become bottoms?
TOPS is a measurement of performance (Trillions of Operations Per Second); a CUDA core is not, it’s a physical processing unit on the chip.
TOPS is the AI version of FLOPS, the figure we typically use to measure graphics performance.
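Rough back-of-the-envelope, if it helps: the headline TOPS number is basically (ops per cycle per compute unit) × (number of units) × (clock) ÷ 10^12. A toy Python sketch with made-up numbers (the per-cycle throughput, unit count, and clock below are assumptions for illustration, not any real chip’s spec):

```python
# Toy TOPS calculation: Trillions of Operations Per Second.
# All numbers are made up for illustration, not a real chip's spec.
# Note: marketing often counts a multiply-accumulate as two ops.

ops_per_cycle_per_unit = 256   # e.g. INT8 ops one compute unit finishes per clock (assumed)
num_units = 32                 # number of such units on the chip (assumed)
clock_hz = 1.8e9               # 1.8 GHz clock (assumed)

ops_per_second = ops_per_cycle_per_unit * num_units * clock_hz
tops = ops_per_second / 1e12   # "trillions" = 10^12

print(f"{tops:.1f} TOPS")      # -> 14.7 TOPS for these made-up numbers
```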
OK but what’s a BOTTOM?
Billions of Operations that Treat TOP as Operational Masters
Us, the ones getting fucked
you
Does that make you my TOP?
Ya gonna pump Trillions of Operations per second inta me, hun?
What I’m hearing is we use TOPS to measure bots
Don’t underestimate! I was brushing my teeth and my electric toothbrush ran out of AIs. ☹️
I feel called out
Is this the new standard for measuring bullshit?
^AI no ^AI
I think this is literally how many times they said “AI” in their presentation, and then they normalized it per length of presentation.
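That metric is easy enough to reproduce at home. A throwaway Python sketch of the idea (the transcript string and runtime below are placeholders, not real keynote data):

```python
import re

# Placeholder transcript and runtime; substitute a real keynote transcript.
transcript = "AI this, AI that, and more AI, now with AI agents..."
runtime_minutes = 110  # assumed keynote length

# Count whole-word "AI" mentions, then normalize per minute of presentation.
mentions = len(re.findall(r"\bAI\b", transcript))
print(f"{mentions} mentions, {mentions / runtime_minutes:.2f} AI per minute")
```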
So… The answer is yes.
Yes.
AMD again being more efficient than Nvidia but will sell like 10% as much
AI-generated presentations in five years are going to favour AI even more. With more AI in the AI training sets, we can AI-generate better presentations on the benefits of AI.
Excellent
This makes me want to buy Intel instead of AMD for the first time in a couple of decades.
I mean, if the new gen of GPUs has accelerators it makes sense; actually, out of curiosity, does any of the new Intel stuff have any of that? I am still on the old i5 chips
I’m having a hard time understanding your question, but I’ll try my best:
if the new gen of GPUs has accelerators
-
GPUs are pretty much nothing but [graphics] accelerators, although they are increasingly general-purpose for parallel computation and have a few other bits and pieces tacked on, like hardware video compression/decompression.
-
If you typo’d “CPU,” then the answer appears to be that Intel desktop CPUs with integrated graphics are much more common than AMD CPUs with integrated graphics (a.k.a. “APUs”) because Intel sprinkles them in throughout their product range, whereas AMD mostly leaves the mid- to top-end of their range sans graphics because they assume you’ll buy a discrete graphics card. The integrated graphics on the AMD chips that do have them tend to be way faster than Intel integrated graphics, however.
-
If you mean “AI accelerators,” then the answer is that this functionality is inherently part of what GPUs do these days (give or take driver support for Nvidia’s proprietary CUDA API), and CPUs (from both Intel and AMD) are also starting to come out with dedicated AI cores.
does any of the new Intel stuff have any of that? I am still on the old i5 chips
“Old i5 chips” doesn’t mean much – that just means you have a midrange chip from any time between 2009 and now. What matters is the model number that comes after the “Core i5” part, e.g. “Core i5 750” (1st gen, from 2009) vs. “Core i5 14600” (most recent gen before the rebranding to “Core Ultra 5”, from just last year).
As far as “it makes sense” goes, to be honest, an Intel CPU would still probably be a hard sell for me. The only reason I might consider one is if I had some niche circumstance (e.g. I was trying to build a Jellyfin server and having the best integrated hardware video encode/decode was the only thing I cared about).
What I really had in mind when I say it makes me want to buy Intel (aside from joking about rejecting “AI” buzzword hype) is the new Intel discrete GPU (“Battlemage”), oddly enough. It’s getting to be about time for me to finally upgrade from the AMD Vega 56 I’ve been using for over seven(!) years now, so I’ll be interested to see how the Intel Arc B770 might compare to the AMD Radeon RX 9070 (or whichever model it’s competing against).
-
These companies are putting NPUs into their processors, and nobody will ever build the software to use them because everything is done on GPUs. It’s a dog and pony show.
There are 3,000 minutes in 50 hours. You need:
- 2220 AIs if you have an Intel GPU
- 5670 AIs if you have an AMD GPU
- 4470 AIs if you have an Nvidia GPU
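For anyone checking the math: those figures just come from multiplying an assumed “AI per minute” rate for each vendor’s keynote by the 3,000 minutes in 50 hours. A quick sketch (the per-vendor rates are read back off the joke numbers above, not measured):

```python
# 50 hours of gaming = 3,000 minutes.
minutes = 50 * 60

# "AI per minute" rates implied by the numbers above (assumed, not measured).
rates = {"Intel": 0.74, "AMD": 1.89, "Nvidia": 1.49}

for vendor, ai_per_minute in rates.items():
    print(f"{vendor}: {round(ai_per_minute * minutes)} AIs needed")
# Intel: 2220, AMD: 5670, Nvidia: 4470
```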
well, say i’m playin’ on my legacy gaming rig with a northwood p4 and a geforce 3. how many hours per ‘ai’?
∞ hours
The problem here is that we replaced all the world’s actual intelligence with artificial intelligence and now we’re not sure what to do with it, but I really want some Brawndo…