Posted by @[email protected] to No Stupid Questions • NSFW • edited 8 months ago • 118 upvotes, 4 downvotes • 97 comments
@A_Very_Big_Fan • 8 months ago:
So you do unironically think it takes that amount of equipment and power to output to a single device lmao
@mojofrododojo • 8 months ago:
I can’t tell if you’re fucking dense or can’t read. AN LLM RUN ON A PHONE WILL DO YOU FUCKALL GOOD. You uninformedly think you can run an AI worth a damn on your phone - and the corpus to teach it? Fuck off, you stupid git. Good luck with your HAL 9. You’re gonna walk through the apocalypse with a moron. Which fits, you’ll be equals.
@A_Very_Big_Fan • edited 8 months ago:
Here’s a Raspberry Pi doing a variety of tasks with various LLMs, like programming and accurately describing a picture. There’s a literal mountain of evidence of what these models can do. It’s been fun making you rage :3