• mechoman444 · 1 day ago

    Idk man, my electricity bill doesn’t care whether the data center powering ChatGPT was “speculatively built” or not. 500 million people actively using these tools daily is not fake demand, that’s just… demand. The money flowing into infrastructure is large and fast and messy, sure, but acting like the whole thing is vapor because the ROI timeline is uncertain is the same logic people used when they called Amazon a scam in 2001. Sometimes the buildout comes before the profit model, and that’s just how new infrastructure works. The RAM prices suck though, not gonna lie.

    • x00z · 22 hours ago

      Electricity bills have already been rising sharply around AI datacenters. I read an article about a town in the Netherlands that can’t build any more houses right now because a datacenter under construction is going to take the electricity those homes would need.

      • mechoman444 · 18 hours ago

        Aside from the profits driven by demand for LLMs, these data centers are an environmental disaster.

        The scale of these facilities is insane. A flagship LLM can require around 1 TB of RAM to serve effectively, and even a mid-tier model needs roughly 140 GB. For comparison, a top-end gaming PC tops out around 64 GB, and the vast majority of systems run perfectly well on 16 GB.
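        The arithmetic behind those figures is roughly parameter count times bytes per parameter. A quick sketch, weights only (the 70B and 500B parameter counts below are illustrative assumptions, not specific models):

```python
def model_ram_gb(params_billion: float, bytes_per_param: float = 2) -> float:
    """Weights-only RAM estimate: parameter count times bytes per parameter.

    bytes_per_param: 2 for fp16/bf16, 4 for fp32, 1 for int8 quantization.
    Ignores KV cache, activations, and runtime overhead, which add more on top.
    """
    # billions of parameters x bytes per parameter = gigabytes
    return params_billion * bytes_per_param

# A ~70B-parameter model in fp16: about 140 GB just for the weights.
print(model_ram_gb(70))    # 140.0
# A ~500B-parameter model in fp16: about 1 TB.
print(model_ram_gb(500))   # 1000.0
```

        Quantizing to int8 (bytes_per_param=1) halves those numbers, which is why home users can sometimes run smaller models that datacenters serve at full precision.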

        I was simply pointing out that these massive data centers exist because that demand exists.