• @SomeGuy69 · -9 points · 3 months ago

      Compared to the 24GB we had in 2020, that’s barely an upgrade. I’m not joking: AI stuff needs more VRAM, and 32GB is too little for the price they’ll ask. Right now VRAM is more important than raw compute; you can’t use larger models when your VRAM bottlenecks.

      Edit: oh I get it. I mentioned AI my bad.

  • @Valmond · 12 points · 3 months ago

        We had GPUs with 24GB in 2017; go buy a pro one if you do AI and need that much RAM.

        I mean, those cards are for gaming, right?

    • @SomeGuy69 · 0 points · 3 months ago

          The RTX 3090 with 24GB VRAM was the best you could get as a consumer GPU. Not sure which 2017 GPU you mean, but it was certainly in a completely different price range and not for gaming. It would be like comparing it to an A100; that makes no sense.

          A lot of people can’t afford two PCs, or a GPU that costs 10 times as much, so obviously the GPU needs to be good for gaming and other stuff. I was only voicing my opinion, after all. If you don’t need more VRAM, that’s fine. I, however, don’t have an unlimited wallet, and the VRAM should be higher for the price they’re asking. But again, you can disagree if you think it’s fine.

          • @[email protected]
            link
            fedilink
            5
            edit-2
            3 months ago

            The Titan Z in its full glory was the Tesla K80, and it shipped with 24GB of RAM in 2014.

            Titan RTX had 24 ramsies in 2018.

            If RAM is what you need and you don’t wanna download it, you “can” buy an H100 with 80GB of RAM (2022, Hopper architecture); they sell it on a PCIe card too. It costs some real cash-moneys tho.

            In the realm of still-imaginable GPU prices, Quadro usually offered more RAM; e.g. the currently sold RTX 6000 Ada comes with 48 gigirammers.

            On consumer gaming cards Nvidia has been horrible with RAM for a few gens now. I don’t care about AI (on my desktop I use it maybe once per year, and on my servers I don’t even run it on GPUs because of how little I need it), but there’s def not enough gRAM for gaming imho; you need some gigs as buffer for other shit.

        • @SomeGuy69 · -3 points · 3 months ago

              I know there have been non-gaming GPUs with that much VRAM, but that’s beside the point.

              • @[email protected]
                link
                fedilink
                4
                edit-2
                3 months ago

                Titan is considered consumer, Quadro workstation, Tesla enterprise/datacentre.

                I didn’t give it much thought, but I would consider AI (anything needing more than low- or mid-range consumer VRAM) the domain of Quadro.

            • @SomeGuy69 · -4 points · 3 months ago

                  Gaming. GAMING! No one gamed on a Titan card.

                  • @[email protected]
                    link
                    fedilink
                    3
                    edit-2
                    3 months ago

                    About as many people game(d) on Titans as did/do on a 3090 or 4090, I suppose.

                    And Titans were def cheaper than those two, plus they were the top of the desktop consumer gaming line, just like the two *90s mentioned above are.

                    The full name was “GeForce GTX Titan” (or later Titan Black/Titan Z), and they were a tiny step above the 780 (or later the 780 Ti). Analogous to the current gen, that would be the 4080 and 4090; they just called the ‘90’ a Titan back then.

                    Edit: the Titan’s MSRP was 999 dollsigns 10 years ago; the 4090’s was 1,599 (the former was about right, the latter was bullshit).

      • @Valmond · 2 points · 3 months ago

            No problem, but I mean if you’re just tinkering around, you could get by with even less memory, as long as the model stays in it and you sample small pieces in small batches.
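            That “model stays in memory” rule of thumb is just arithmetic: weights at some bytes per parameter, plus a buffer for activations. A minimal sketch of the check (the byte counts and the fixed overhead below are illustrative assumptions, not exact framework figures):

```python
# Rough check: do a model's weights fit in a card's VRAM?
# The fixed activation/buffer overhead is an illustrative assumption.

def fits_in_vram(params_billions: float, bytes_per_param: int,
                 vram_gb: float, overhead_gb: float = 2.0) -> bool:
    """True if weights plus a fixed activation/buffer overhead fit in VRAM."""
    weights_gb = params_billions * 1e9 * bytes_per_param / 1024**3
    return weights_gb + overhead_gb <= vram_gb

# A 7B-parameter model in fp16 (2 bytes/param) needs ~13GB for weights alone,
# so it fits on a 24GB card, while a 13B fp16 model won't fit in 16GB:
print(fits_in_vram(7, 2, 24))   # True
print(fits_in_vram(13, 2, 16))  # False
```

            Quantizing to 8- or 4-bit shrinks the bytes per parameter, which is exactly why smaller cards can still hold larger models at reduced precision.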

            We all had P-series GPUs and had to buy up because the trainee’s model didn’t fit in 16GB (they probably had too much money), so I don’t remember which card it was that had the 24GB.

        • @SomeGuy69 · 0 points · 3 months ago

              For just tinkering around, one could use SD1.5 with a 4GB VRAM GPU and stop after a few minutes. I’ve spent quite some time on AI image generation, on average 4 hours per day for over a year now. Newer models, especially for AI video generation, will need more VRAM, but since I don’t do this commercially, I can’t just pay 30k for a GPU.