Firefox introduces AI as experimental feature (lemmy.fish)
Posted by @[email protected] to [email protected] (English) • 16 hours ago • 39 comments
minus-square@[email protected]linkfedilink2•edit-26 hours agoIt gives you many options on what to use, you can use Llama which is offline. Needs to be enabled though about:config > browser.ml.chat.hideLocalhost.
minus-square@[email protected]linkfedilink3•4 hours agoand thus is unavailable to anyone who isn’t a power user, as they will never see a comment like this and about:config would fill them with dread
minus-square@[email protected]linkfedilink2•edit-22 hours agoLol, that is certainly true and you would need to also set it up manually which even power users might not be able to do. Thankfully there is an easy to follow guide here: https://ai-guide.future.mozilla.org/content/running-llms-locally/.