@Wilshire to Technology (English) • 6 months ago
The first GPT-4-class AI model anyone can download has arrived: Llama 405B (arstechnica.com)
61 comments • cross-posted to: [email protected]
Blaster M (English) • 6 months ago
As a general rule of thumb, you need about 1 GB per 1B parameters, so you’re looking at about 405 GB for the full size of the model. Quantization can compress it down to 1/2 or 1/4 that, but “makes it stupider” as a result.
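The arithmetic in the comment above can be sketched in a few lines. This is an illustration of the stated rule of thumb only (roughly 1 GB per 1B parameters, i.e. about one byte per weight); the function name and the mapping of "1/2 or 1/4" to 4-bit and 2-bit quantization are assumptions for the example, not figures from the thread.

```python
def model_size_gb(params_billions: float, bits_per_weight: int = 8) -> float:
    """Estimate model memory footprint in GB.

    Assumes the comment's rule of thumb: ~1 GB per 1B parameters at
    8 bits per weight, scaling linearly with the bit width used.
    """
    return params_billions * bits_per_weight / 8


# Llama 405B under this rule of thumb:
full = model_size_gb(405)         # 8-bit baseline: 405 GB
half = model_size_gb(405, 4)      # 4-bit quantization: 202.5 GB
quarter = model_size_gb(405, 2)   # 2-bit quantization: 101.25 GB
print(full, half, quarter)
```

Note that lower bit widths trade memory for accuracy, which is what the comment means by quantization "making it stupider".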