• @QuadratureSurfer · 15 points · 1 year ago

      I’ve got it running with a 3090 and 32GB of RAM.

      There are some runtimes (llama.cpp, for example) that let you split a model between system RAM and VRAM; it will just be slower than running it entirely in VRAM.
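
      A minimal sketch of what that looks like with llama-cpp-python, assuming a CUDA-enabled build; the model path and layer count here are placeholders, not a recommendation:

      ```python
      # Hybrid RAM/VRAM inference sketch (llama-cpp-python, CUDA build assumed).
      # The GGUF path is hypothetical; tune n_gpu_layers until VRAM is full.
      # Any layers not offloaded run on the CPU out of system RAM.
      from llama_cpp import Llama

      llm = Llama(
          model_path="./models/model-q4_k_m.gguf",  # placeholder local model file
          n_gpu_layers=32,  # layers offloaded to the GPU; -1 offloads everything
          n_ctx=2048,       # context window
      )

      out = llm("Explain hybrid CPU/GPU inference in one sentence:", max_tokens=64)
      print(out["choices"][0]["text"])
      ```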

      • Deceptichum · 16 points · 1 year ago

        Yeah, but damn does it get slow.

        I always find it interesting how text is so much slower than image generation. I can do a 1024x1024 image in probably 20s, but I get like 1 word a second with text.
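
        Part of that gap is structural: a diffusion image takes a fixed number of denoising passes (on the order of 20–50) no matter what, while text needs one full forward pass through the model for every single token. A toy greedy-decoding loop makes the per-token cost explicit; GPT-2 via Hugging Face transformers is used here purely for illustration:

        ```python
        # Toy autoregressive decoding: each new token costs one full forward pass.
        # Generating twenty tokens below means twenty passes through the model.
        import torch
        from transformers import AutoModelForCausalLM, AutoTokenizer

        tok = AutoTokenizer.from_pretrained("gpt2")
        model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

        ids = tok("Local text generation is slow because", return_tensors="pt").input_ids
        with torch.no_grad():
            for _ in range(20):                   # one pass per generated token
                logits = model(ids).logits        # forward pass over the sequence
                next_id = logits[0, -1].argmax()  # greedy pick of the next token
                ids = torch.cat([ids, next_id.view(1, 1)], dim=1)

        print(tok.decode(ids[0]))
        ```

        Real runtimes cache attention keys and values so each step avoids recomputing the whole prefix, but the steps themselves are inherently sequential, which is why tokens trickle out one at a time.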

          • ferret · 5 points · 1 year ago

            Languages are complex and, more importantly, much less forgiving of error.

    • DarkThoughts · 1 point · 1 year ago

      Hopefully we see more purpose-built hardware for this, like expansion cards with pretty much just tensor cores and their own RAM.

      • Deceptichum · 1 point · 1 year ago

        I’d love to see some consumer-level AI stuff. Sadly, it all seems to be designed for server farms, and by the time it ages out into consumer prices it’s so obsolete there’s no point in getting it.

        • @raldone01 · 1 point · 1 year ago

          Do they want consumer AI cards to exist, though?

          Think about the data!

          • Deceptichum · 1 point · 1 year ago

            Card makers? They only want money; if there’s enough consumer-level demand, they’ll make them.

      • @topinambour_rex · 1 point · 1 year ago

        Graphics cards without video outputs have existed for a while.