@Wilshire to Technology (English) • 5 months ago
The first GPT-4-class AI model anyone can download has arrived: Llama 405B (arstechnica.com)
cross-posted to: [email protected]
@sunzu: So there is no way 24GB and 64GB can run it?
@raldone01: My specs, because you asked:

CPU: Intel(R) Xeon(R) E5-2699 v3 (72) @ 3.60 GHz
GPU 1: NVIDIA Tesla P40 [Discrete]
GPU 2: NVIDIA Tesla P40 [Discrete]
GPU 3: Matrox Electronics Systems Ltd. MGA G200EH
Memory: 66.75 GiB / 251.75 GiB (27%)
Swap: 75.50 MiB / 40.00 GiB (0%)
@sunzu: OK, this is a server. 48GB of VRAM and 67GB of RAM? For the model alone?
@raldone01: Each card has 24GB, so 48GB of VRAM total. I use ollama; it fills whatever VRAM is available on both cards and runs the rest on the CPU cores.
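The split ollama does automatically can be sketched with rough arithmetic: estimate the per-layer size of a quantized model, put as many layers as fit into VRAM, and leave the remainder in system RAM. The numbers below (4-bit-ish quantization at ~0.56 bytes/param, a 126-layer 405B model, uniform layer sizes) are illustrative assumptions, not ollama's actual accounting:

```python
# Rough sketch of GPU/CPU layer-split accounting, in the spirit of what
# ollama does automatically. All figures are illustrative assumptions.

def split_layers(total_params_b, n_layers, bytes_per_param, vram_gb):
    """Return (layers_on_gpu, layers_on_cpu) assuming uniform layer sizes."""
    model_gb = total_params_b * bytes_per_param   # params in billions -> GB
    per_layer_gb = model_gb / n_layers
    gpu_layers = min(n_layers, int(vram_gb / per_layer_gb))
    return gpu_layers, n_layers - gpu_layers

# A 405B model at ~4.5 bits/param (~228GB) on 2x24GB = 48GB of VRAM:
# only a fraction of the layers fit on the GPUs, the rest run on CPU.
print(split_layers(total_params_b=405, n_layers=126,
                   bytes_per_param=0.5625, vram_gb=48))   # -> (26, 100)

# A 70B model at the same quantization (~39GB) fits entirely in 48GB:
print(split_layers(total_params_b=70, n_layers=80,
                   bytes_per_param=0.5625, vram_gb=48))   # -> (80, 0)
```

This ignores KV cache and activation memory, which also compete for VRAM, so real offload counts come out a bit lower.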
@raldone01: What are you asking exactly? What do you want to run? I assume you have a 24GB GPU and 64GB of host RAM?
@sunzu: Correct. And how does RAM speed factor into this, tbh?
@raldone01: My memory sticks are all DDR4, 32GB @ 2133 MT/s each.
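RAM speed matters because CPU-side token generation is memory-bandwidth-bound: each generated token streams the CPU-resident weights from RAM roughly once. A back-of-envelope upper bound, assuming quad-channel DDR4-2133 on one socket, ~70% achievable bandwidth, and ~180GB of weights left on the CPU after VRAM is filled (all of these are rough assumptions, and NUMA effects are ignored):

```python
# Back-of-envelope upper bound on CPU-side tokens/sec, assuming generation
# is limited by streaming the CPU-resident weights from RAM once per token.

def tokens_per_sec(mt_per_s, channels, cpu_weights_gb, efficiency=0.7):
    """Peak bandwidth x efficiency, divided by bytes read per token."""
    peak_gb_s = mt_per_s * 8 * channels / 1000  # 8 bytes/transfer per channel
    return peak_gb_s * efficiency / cpu_weights_gb

# DDR4-2133, quad channel, ~180GB of weights on CPU (illustrative numbers
# for a heavily offloaded 405B 4-bit model): well under 1 token/sec.
print(round(tokens_per_sec(2133, 4, 180), 2))  # -> 0.27
```

This is why faster RAM (or more channels) helps CPU offload roughly linearly, while it does little for layers that already sit in VRAM.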