Distro Focuses — posted by Hellfire103 to linuxmemes (English) on lemmy.ca · 17 hours ago · 180 comments · +772 / −8
Do you use lynx for web browsing?

*Links 🇩🇪

??

WFH: Links means "left" in German. Same pronunciation. A bilingual play on words, if you will.

Oh lol. What is "right" in German?

Rechts

🧐

@Zachariah: Who doesn't? I mean, version 2.9.2 just came out in May.

@[email protected]: I tried living in the terminal, but I had no one to talk to.

@Zachariah: We're in your terminal: https://github.com/LunaticHacker/lemmy-terminal-viewer

@[email protected]: Sadly, based on the latest issues submitted and my own experience, the app no longer works: https://github.com/LunaticHacker/lemmy-terminal-viewer/issues/11

😍

Lucy :3: If you have a decent GPU or CPU, you can just set up ollama with ollama-cuda/ollama-rocm and run llama3.1 or llama3.1-uncensored.

@[email protected]: I have a Ryzen 5 laptop; not really decent enough for that workload. And I'm not crazy about AI.

Lucy :3: I bet even my Pi Zero W could run such a model*
* with 1 character per hour or so

@[email protected]: Interesting. Well, it's something to look into, but I'd like a place to communicate with like-minded people.
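For anyone curious, a minimal sketch of the ollama setup Lucy mentions, assuming an Arch-based distro (the `ollama-cuda`/`ollama-rocm` split is how Arch packages the GPU backends; other distros name or ship these differently):

```shell
# Pick the build that matches your GPU (one or the other, not both):
sudo pacman -S ollama-cuda   # NVIDIA
# sudo pacman -S ollama-rocm # AMD

# Start the ollama server, then pull the model and chat in the terminal:
sudo systemctl enable --now ollama
ollama run llama3.1
```

`ollama run` downloads the model on first use (several GB for llama3.1), so expect a wait before the prompt appears.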