I’ve had this conversation too many times to not make a meme out of it.

  • pulsewidth · 7 hours ago

    The issue around them is more about getting built on time: they have tight contracts with the hyperscalers that let the hyperscalers withhold interval payments, or pull out of the contracts entirely, if delivery dates slip.

    They’re bespoke too - which is why the hyperscalers are commissioning new ‘AI datacenter’ builds instead of approaching existing datacenters. AI racks draw up to a megawatt each. That’s insane. The power delivery has to be custom designed and built by UPS vendors and power companies.

    https://blog.se.com/datacenter/2025/10/16/the-1-mw-ai-it-rack-is-coming-and-it-needs-800-vdc-power/
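
    As a rough sketch of why the voltage bump in that article matters: for a fixed rack power, a higher DC bus voltage means proportionally less current, so thinner busbars and lower resistive losses. The 1 MW rack figure is from the article; the 54 V and 400 V comparison points below are just illustrative assumptions.

    ```python
    # Back-of-envelope: ideal DC bus current for a given rack power (I = P / V).
    # Figures are illustrative assumptions, not vendor specs.

    def bus_current_amps(power_watts: float, bus_volts: float) -> float:
        """Ideal DC bus current drawn by a rack at the given bus voltage."""
        return power_watts / bus_volts

    rack_power_w = 1_000_000  # hypothetical 1 MW AI rack, per the linked article

    for volts in (54, 400, 800):
        amps = bus_current_amps(rack_power_w, volts)
        print(f"{volts:>4} VDC bus -> {amps:,.0f} A")

    # Output (rounded):
    #   54 VDC -> ~18,519 A
    #  400 VDC ->  2,500 A
    #  800 VDC ->  1,250 A
    ```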

    Yes, they could be pivoted away from AI to host ‘something else’, but that won’t help the companies that built them get paid, because tenants would only use a small fraction of the power delivery, and the $20,000 AI GPUs have pretty limited use-cases. It will be a massive oversupply problem, forcing those datacenters’ hosting prices to drop drastically just to get any businesses into their tenancies. That will push the hosting companies (which are up to their gills in loans) under - they’re the ones taking the big risks on AI, not Meta/Google/MS/etc.