potentiallynotfelix@lemmy.fish to Firefox@lemmy.ml · English · 24 hours ago
Firefox introduces AI as experimental feature (lemmy.fish)
TheMachineStops@discuss.tchncs.de · edited · 15 hours ago
It gives you many options for what to use; you can use Llama, which runs offline. It needs to be enabled first, though: about:config > browser.ml.chat.hideLocalhost.
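For anyone who wants to try it, a minimal sketch of that pref as a user.js entry (the pref name is from the comment above; the false value is my assumption, so check the actual default in about:config):

```js
// user.js -- dropped in the Firefox profile directory, read at startup.
// The pref name comes from the comment above; setting it to false is an
// assumption (i.e. that the default true hides the localhost option in
// the AI chatbot provider picker).
user_pref("browser.ml.chat.hideLocalhost", false);
```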
Swedneck@discuss.tchncs.de · 13 hours ago
And thus it is unavailable to anyone who isn't a power user, as they will never see a comment like this, and about:config would fill them with dread.
TheMachineStops@discuss.tchncs.de · edited · 11 hours ago
Lol, that is certainly true, and you would also need to set it up manually, which even power users might not manage. Thankfully there is an easy-to-follow guide here: https://ai-guide.future.mozilla.org/content/running-llms-locally/.
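If you follow that guide and end up with a model serving locally, the rough wiring looks something like this. This is a sketch under my own assumptions: the localhost port and the browser.ml.chat.provider pref value are not from the thread, so verify them yourself.

```js
// user.js -- hypothetical setup after finishing the guide linked above.
// Assumes a local model server (e.g. a llamafile) is already listening
// on http://localhost:8080; the port and the provider pref value are
// assumptions, not details given in the thread.
user_pref("browser.ml.chat.hideLocalhost", false);              // show the localhost option
user_pref("browser.ml.chat.provider", "http://localhost:8080"); // point the AI sidebar at the local server
```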