LughM to [email protected] · English · 1 year ago
NVIDIA's Eos supercomputer can train a 175 billion parameter GPT-3 model in under four minutes
www.engadget.com
@blackfire · English · 1 year ago
So it was a perf test on a 1B-token dataset, not the full 3.7T tokens that GPT-3 is trained with. I mean, great, they're showing improvement, but this is just a headline grabber; they haven't done anything actually useful here.
@[email protected] · English · 1 year ago
Just checking in to say they are still there - so many rascals showing off rigs these days