Love it. Very clear and to the point. Thank you.
Very nice rough answer, I'll check it out. Thank you!
Of my many projects: for VS Code (to actively exclude Microsoft from my infrastructure), I'm currently using a containerized code-server with the vscode-llama extension and a Docker client. On my Win 10 laptop (sorry, moving off Windows at end-of-support for Win 10) I mapped the extension's FIM "on" toggle to a nested chain: a wsl call that wraps a docker run --rm command, which spins up an ephemeral llama-cpp container to handle the AI calls. The Docker engine runs inside WSL, with ports forwarded between the host and the WSL VM.
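Roughly, the "on" half of the toggle boils down to something like this (sketch only: the image tag, model file, paths, port, and container name are placeholders, not my exact values, and the line breaks are just for readability; in practice it's one long command string bound to the toggle):

```bash
# "FIM on": editor toggle -> wsl.exe -> docker run --rm -> ephemeral llama.cpp server
# Placeholders: container name, host model path, model file, port, image tag.
wsl.exe -- docker run --rm -d \
  --name llama-fim \
  -p 8012:8012 \
  -v /srv/models:/models:ro \
  ghcr.io/ggml-org/llama.cpp:server \
  -m /models/your-fim-model.gguf \
  --host 0.0.0.0 --port 8012

# "FIM off" is just stopping it; --rm removes the container automatically,
# which is what keeps the whole thing ephemeral.
wsl.exe -- docker stop llama-fim
```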
Why? A portable, cross-platform, and partially ephemeral dev environment with minimal dependencies.
I might even get an OpenTofu (Terraform) module for this eventually.
I have a short attention span 💀.
"Leads to"? Try "is".
Oh shoot! No, but I should have. This is perfect!