Can you trust locally run LLMs?
wuphysics87 to [email protected] • 1 month ago
I’ve been playing around with ollama. Given that you download the model, can you trust that it isn’t sending telemetry?
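Concretely, I’d settle for a way to spot-check it, e.g. watching the process’s sockets while a prompt runs. A rough Python sketch of that idea (assuming psutil is installed and the server process is literally named “ollama” — adjust if yours isn’t):

```python
# Rough sketch: flag any non-loopback connection held by an ollama process.
# Assumes psutil (pip install psutil); may need elevated privileges on some
# OSes to inspect another user's process. A spot check, not a guarantee.
import psutil

LOOPBACK = ("127.0.0.1", "::1")

for proc in psutil.process_iter(["pid", "name"]):
    if "ollama" not in (proc.info["name"] or "").lower():
        continue
    try:
        conns = proc.connections(kind="inet")
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        continue
    for c in conns:
        # raddr is empty for listening sockets; anything remote and
        # non-loopback is worth a closer look.
        if c.raddr and c.raddr.ip not in LOOPBACK:
            print(f"pid {proc.info['pid']}: {c.laddr} -> {c.raddr} ({c.status})")
```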
@SupraMario • 1 month ago
And if you don’t want to do that… run it in a VM and unplug your NIC / disconnect wifi.
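If a full VM feels heavy, a container with no network stack is a lighter version of the same idea. A hedged sketch, assuming Docker, the official ollama/ollama image, and a model you already pulled into the named volume during an earlier, networked session (model name here is just an example):

```python
# Start the ollama server with --network none so even a misbehaving process
# has nowhere to send anything, then talk to it over the container's own
# loopback via docker exec. Assumes the model is already in the volume.
import subprocess

VOLUME = "ollama-models"   # named volume holding previously pulled models
MODEL = "llama3"           # substitute whatever you pulled while online

subprocess.run(
    ["docker", "run", "-d", "--rm", "--name", "ollama-offline",
     "--network", "none", "-v", f"{VOLUME}:/root/.ollama", "ollama/ollama"],
    check=True,
)

subprocess.run(
    ["docker", "exec", "ollama-offline", "ollama", "run", MODEL, "Say hello."],
    check=True,
)
```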