• Lost_My_MindM · +121 · 22 hours ago

    Ok, so they bought billions of dollars of RAM/storage, to put inside servers that haven’t been bought yet, to put inside data centers that haven’t been built yet, in order to run AI that doesn’t work yet, in order to chase profits that are impossible to achieve.

    And now, despite driving RAM prices up to absurd levels, you’ve begun to realize the same thing all of us knew before day one. NOBODY WANTS THIS SHIT!!!

    • dylanmorgan@slrpnk.net · +44/−1 · 20 hours ago

      None of that RAM (or the GPUs) has been purchased. All that is just letters of intent or even flimsier agreements; there are no contracts, and no actual money is changing hands.

      • Canaconda@lemmy.ca · +48/−1 · 19 hours ago

        Doesn’t matter, they’ve captured the entire supply chain which was their goal.

        This is not about AGI… it’s about monopolizing the future of computing.

      • humanspiral@lemmy.ca · +8 · 15 hours ago

        The good news for RAM prices will come when OpenAI either makes money reselling its “contracts”, or its letters of intent get cancelled (by one side or the other) and production shifts back to DDR5 instead.

        • Napster153 · +7 · 13 hours ago

          We’d hope so, but you just know they’ll still try to rob us blind somehow. Intelligence is second from the bottom to these people.

      • InputZero · +19 · 18 hours ago

        All that is just letters of intent or even flimsier agreements; there are no contracts, and no actual money is changing hands.

        Not quite. While none of it has been manufactured yet, the pre-production work, procurement, and scheduled machine time are exactly what will make retooling for consumer RAM take forever. TSMC or whoever can’t just flip a switch and produce a different product; changing over production that complicated takes weeks to months. Money will change hands, and work has already been done and agreed upon.

    • Canaconda@lemmy.ca · +16 · 19 hours ago

      Yup and this is all going according to plan.

      1. Corner the Market

      2. Raise prices

      3. Sell High - The bubble will not burst until they’re ready to leave us holding the bag. The burst will be triggered by the sell-off.

      4. Buy low - All these assets will be liquidated during bankruptcy for pennies on the dollar, to the same shareholders as before.

      5. Rinse

      6. Repeat

      7. Fuck you (and me)

      • jello8_@lemmy.today · +1 · 11 hours ago

        You forgot to throw a couple of bailouts in there, and some regulatory capture mandating AI slop in your cars or something for “safety.”

        • Tollana1234567@lemmy.today · +1 · 8 hours ago

          They’re peddling AI so hard as surveillance tech to governments because they think that’s where the constant revenue stream is.

      • Buelldozer@lemmy.today · +10 · 18 hours ago

        Congratulations, you’ve just covered how the Computer / Tech Industry has worked since mainframes were invented. It’s a constant cycle of $NewThing that almost works, desperate effort by a lot of companies to make it work right / better, market cornering, BoomTime for a lucky few companies, then someone figures out how to do it cheaper or re-focus the market on something slightly different, then BustTime.

    • bridgeenjoyer@sh.itjust.works · +21/−2 · 20 hours ago

      I agree with you completely, but,

      I wouldn’t say “no one wants this” though. The oligarchs have poured in billions and bought off every media company to constantly talk up AI companies, so your general normie thinks it’s “the future”. Almost every single (normie) person I know (except one who is anti-AI, and he’s a geek) is using some form of slopbot for tons of things: easy Excel formulas (that anyone can do), turning pictures black and white (that literally any photo program has been able to do for 30+ years), summarizing documents (because people are idiots now and have no reading comprehension), etc. The normies LOVE it and eat up the slop. Especially if they were bad at computers before; now they think they’re on the level of Woz because they told a chatbot to make slop code.

      The company I’m in can’t go 3 seconds without bringing up “AI innovation” and “being future ready”.

      It’s only here on Lemmy that people dislike it. The rest of the world is already addicted, and we are screwed.

      • cynarM · +22 · 19 hours ago

        I’ve seen quite a few people who make casual use of it. The key point is that it is currently free to them. As soon as it starts costing money, a lot will bail on it.

        • MrKoyun · +1 · 16 hours ago

          People don’t need to be excited about it to want and use it.

      • ragas@lemmy.ml · +3 · 15 hours ago

        Hmm, it’s different in my bubble. Most of the people I know use AI sparingly and generally don’t trust the results without checking.

        • Tollana1234567@lemmy.today · +1 · 8 hours ago

          I used it for the first time a month ago. It doesn’t even give correct info; it just assumes whatever it sees on other sites, and it has no “checks” to tell whether a source is a comment, a post, or a blog rather than official info.

      • OpenStars@piefed.social · +2 · 15 hours ago

        thinks it’s “the future”

        Sort of, yeah. The thing is… it IS the future, whether we like it or not… it’s just not the PRESENT.

    • Buelldozer@lemmy.today · +7/−9 · edited · 18 hours ago

      NOBODY WANTS THIS SHIT!!!

      That’s a popular take, especially around here, but AI does have some pretty nice use cases; just not as many as the TechBros would have you believe.

      Here are some examples I’ve personally seen in the last 14 days:

      1. It’s good at transcribing meetings, including picking out who is talking, reconstructing an agenda after the fact, and highlighting action items.
      2. It’s darn good at writing even moderately complex scripts in any of the common languages (PowerShell, Python, R, etc.).
      3. In the right hands (fingers?) it’s getting increasingly good at finding and exploiting security flaws.
      4. It’s amazing at slicing and dicing data if the person using it knows what they’re doing.

      Does all of the “Agentic” Woo Woo shit work? No, it absolutely doesn’t but it is clearly getting better as time goes on.

      IMO this whole AI thing has some very strong parallels to the early '80s computer industry. Right now it often requires specialist knowledge for good results, which makes it clunky to use; it is somewhat slow; there’s very little interoperability; and it requires enormous amounts of power. Hell, even this “over-buying hardware” schtick fits right in; the same thing happened with SRAM and then several times with DRAM as the industry matured.

      However, the industry is also making progress at an almost insane speed; not only is the output getting demonstrably better, but the negatives are being addressed. In the past 30 days I’ve seen prototype ASIC-esque hardware that works in a standard desktop PC and processes nearly 10,000 tokens a second entirely locally.

      The only reason you’re not seeing that kind of kit on the market yet is that the models are still changing too much, and no one wants to commit hundreds of millions to making cards that would be outdated before they could ship. We’re probably only 18-24 months away though.

      I’ve also seen 10x improvements in memory usage (TurboQuant) and literally dozens of little tweaks and tricks to reduce footprint and speed processing. Just like what was going on in the PC industry in the '80s and '90s.
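      Quantization is the main trick behind memory reductions like that: storing each weight in fewer bits. Here’s a minimal sketch of generic symmetric int8 quantization (an illustration only, not TurboQuant’s actual algorithm, which isn’t described here):

```python
def quantize_int8(weights: list[float]) -> tuple[list[int], float]:
    """Symmetric int8 quantization: one shared scale plus 1 byte per
    weight, instead of 4 bytes of float32 (a generic sketch, not
    TurboQuant's actual method)."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize_int8(q: list[int], scale: float) -> list[float]:
    return [v * scale for v in q]

weights = [0.42, -1.27, 0.005, 0.9]
q, scale = quantize_int8(weights)
restored = dequantize_int8(q, scale)

# Storage drops 4x (ignoring the single shared scale), and each
# value is recovered to within one quantization step.
assert all(abs(a - b) <= scale for a, b in zip(weights, restored))
```

      Real inference stacks quantize per-tensor or per-channel and go down to 4 bits or lower, which is where figures like 10x start to become plausible once activation and cache savings are counted too.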

      So sure, Fuck AI (mostly) as it exists today but it won’t be long before it’s as ubiquitous as tablets and smartphones.

      • MrKoyun · +10 · 16 hours ago

        And we’re fucking the world up to… transcribe meetings?

        • OpenStars@piefed.social · +3 · 15 hours ago

          No, it’s to make the rich richer.

          Many people do not think about what or why they are doing what they do, or what its end outcome will be.

      • Lost_My_MindM · +9 · 18 hours ago

        I don’t think you get why I don’t want AI.

        All the things you mentioned that AI is good at? That’s a bad thing to have. The better the technology gets, the worse all of our lives become.

        AI will steal all jobs. ALL jobs. Even the prostitutes’. Whatever your job is, within 10 years AI will do it better than you at a fraction of your cost. Basically for free. And you can’t get another job, because ALL jobs are AI now. Build a robot, slap some AI in it, connect it to the main server, and it now has access to every AI unit’s databases.

        And then what about us? Well, the wealthy become the overlords, and we become the slaves.

        • SummerReaper · +1 · 8 hours ago

          I think the only industry that’s actually safe right now is psychology. Therapy and mental health are bigger now than ever. Plus, it requires a real, comprehensive understanding of the human experience that’s simply impossible for AI to deliver with positive results.

          There will probably be attempts, though; I do think they’ll be ruled highly illegal.

        • lemmy_outta_here · +2 · 17 hours ago

          I actually agree with 99% of what you wrote, but you are a bit optimistic in one regard: they will want some sex slaves, but most of us will be food.

          • Zink@programming.dev · +3 · 14 hours ago

            Whoa whoa, has “eat the rich” been one of those situations where the hyphen/comma is in the wrong place?

            It’s really “Eat, the rich!”

      • aesthelete · +3 · 18 hours ago

        So sure, Fuck AI (mostly) as it exists today but it won’t be long before it’s as ubiquitous as tablets and smartphones.

        In order for it to be this ubiquitous it has to run locally or on commodity hardware IMO. The true lasting effects from this hype cycle are likely the capabilities that are being driven into smaller language models that don’t have out of control resource requirements.

        • Buelldozer@lemmy.today · +2 · 17 hours ago

          In order for it to be this ubiquitous it has to run locally or on commodity hardware IMO.

          I agree, which is why I shared that I recently saw a prototype ASIC-esque PCI card. The local hardware is coming, the models just need to settle down some before anyone will commit to building that hardware.

          In the '90s and '00s you needed a zillion dollars of custom Silicon Graphics workstations and months of processing to do the FX for movies like “The Terminator”. In 2020 you could replicate it in a few hours with commodity hardware.

          LLMs and AI will follow the same path; it’ll just take more than 5 years to get there.

          • aesthelete · +1 · 10 hours ago

            Yeah if you can run them locally using a small board, that’ll last.

        • boonhet@sopuli.xyz · +1 · 18 hours ago

          In order for it to be this ubiquitous it has to run locally or on commodity hardware IMO.

          LLMs, as they are, can already run on smartphones, which are pretty ubiquitous themselves.

          A flagship phone has 12-16 GB of RAM these days, I believe; a low-end phone, 4 GB.

          Here are the sizes of some different parameter count versions of Qwen 3.5, a popular Chinese open-weight LLM:

          27B: 17 GB - not yet possible to run on current flagship phones, but once the RAM crisis ends, I could see this happening.

          9B: 6.6 GB

          4B: 3.4 GB

          2B: 2.7 GB

          0.8B: 1 GB.

          For any recently manufactured device, there will be versions of multiple popular LLMs that will run on the RAM size they have available.
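          The arithmetic behind those sizes is simple: parameter count times bits per weight, plus some runtime overhead. A back-of-the-envelope sketch (the 20% overhead factor and the bit-widths below are illustrative assumptions, not the exact quantizations behind the file sizes above):

```python
def model_ram_gb(params_billions: float, bits_per_weight: float,
                 overhead: float = 1.2) -> float:
    """Rough RAM needed to run an LLM: weight bytes plus an assumed
    ~20% for KV cache, activations, and runtime buffers."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 9B model at 4 bits per weight: ~5.4 GB, plausible on a
# 12-16 GB flagship phone.
print(round(model_ram_gb(9, 4), 1))    # 5.4

# A 0.8B model at 8 bits: ~1 GB, within reach of a 4 GB budget phone.
print(round(model_ram_gb(0.8, 8), 1))  # 1.0
```

          By the same arithmetic (without the runtime overhead), the 17 GB file for the 27B model works out to roughly 5 bits per weight.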

          • aesthelete · +1 · 10 hours ago

            Most people don’t have a smartphone with that much RAM. But ultimately, yeah, eventually it’ll run on readily available hardware or it’ll go in the dustbin.

            There’s already ollama and stuff. It’ll stick around.

            • boonhet@sopuli.xyz · +1 · 8 hours ago

              I mean, fairly low-end phones are 4 GB now. They could likely afford to run a model that fits in 1 GB of RAM. Different models for different classes of phone, even from the same manufacturer, will likely be a thing.