wuphysics87 to [email protected] • 2 months ago

Can you trust locally run LLMs?
I’ve been playing around with ollama. Given that you download the model yourself, can you trust that it isn’t sending telemetry?
@SupraMario • 2 months ago

And if you don’t want to do that… run it in a VM and unplug your NIC / disconnect Wi-Fi.
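Before going the VM route, you can also just watch what connections the ollama process actually opens. A minimal sketch, assuming the third-party psutil package is installed (matching on the process name “ollama” is an assumption about how the binary shows up on your system):

```python
# Spot-check sketch: list any inet connections held by processes whose
# name contains "ollama". No remote addresses = nothing phoning home at
# the moment you look (this is a snapshot, not proof over time).
import psutil  # third-party: pip install psutil

for proc in psutil.process_iter(["pid", "name"]):
    name = proc.info["name"] or ""
    if "ollama" not in name.lower():
        continue
    try:
        conns = proc.connections(kind="inet")
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        continue
    for c in conns:
        remote = f"{c.raddr.ip}:{c.raddr.port}" if c.raddr else "(none)"
        print(f"pid={proc.info['pid']} {name}: {c.status} -> {remote}")
```

Keep in mind that ollama legitimately listens on 127.0.0.1:11434 by default for its own API, so a local listener is expected; it’s unexpected remote addresses you’d be looking for.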