this post was submitted on 31 Dec 2024
82 points (72.0% liked)

Firefox

18166 readers

A place to discuss the news and latest developments on the open-source browser Firefox

founded 5 years ago
[–] jeena@piefed.jeena.net 27 points 1 month ago (3 children)

Thanks for the summary. So it still sends the data to a server, even if it's Mozilla's. Then I still can't use it for work, because the data is private and they wouldn't appreciate me sending their data to Mozilla.

[–] KarnaSubarna@lemmy.ml 21 points 1 month ago (1 children)

In such a scenario, you need to host your choice of LLM locally.

[–] ReversalHatchery@beehaw.org 5 points 1 month ago (1 children)

does the addon support usage like that?

[–] KarnaSubarna@lemmy.ml 7 points 1 month ago (1 children)

No, but the “AI” option available on the Mozilla Lab tab in settings allows you to integrate with a self-hosted LLM.

I have had this setup running for a while now.
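For anyone wanting to reproduce this: as far as I can tell, once the chatbot is enabled in Firefox Labs, it can be pointed at a localhost endpoint via about:config. The preference names below are from my understanding of current builds and may change between releases; the port assumes an Open WebUI instance on its default 3000:

```
browser.ml.chat.hideLocalhost = false
browser.ml.chat.provider = http://localhost:3000
```

With those set, the sidebar chat talks to the local instance instead of one of the hosted providers.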

[–] cmgvd3lw@discuss.tchncs.de 4 points 1 month ago (1 children)

Which model are you running? How much RAM?

[–] KarnaSubarna@lemmy.ml 4 points 1 month ago* (last edited 1 month ago)

My (docker based) configuration:

Software stack: Linux > Docker Container > Nvidia Runtime > Open WebUI > Ollama > Llama 3.1

Hardware: i5-13600K, Nvidia 3070 ti (8GB), 32 GB RAM

Docker: https://docs.docker.com/engine/install/

Nvidia Runtime for docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html

Open WebUI: https://docs.openwebui.com/

Ollama: https://hub.docker.com/r/ollama/ollama
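Roughly, the stack above boils down to two containers. This is a minimal sketch using the upstream default image names, volumes, and ports from the linked docs; the model tag is an assumption based on the 8 GB card mentioned above, and the NVIDIA Container Toolkit must already be installed:

```shell
# Ollama with GPU access (requires nvidia-container-toolkit on the host)
docker run -d --gpus=all -v ollama:/root/.ollama \
  -p 11434:11434 --name ollama ollama/ollama

# Pull a model that fits in 8 GB of VRAM (assumed tag)
docker exec -it ollama ollama pull llama3.1:8b

# Open WebUI, reachable on port 3000, talking to Ollama via the host gateway
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Open WebUI autodetects the Ollama endpoint in most setups; otherwise it can be set in its admin settings.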

[–] LWD@lemm.ee 12 points 1 month ago

Technically it's a server operated by Google, leased by Mozilla. Mistral 7b could technically work locally, if Mozilla cared about doing such a thing.

I guess you can basically use the built-in AI chatbot functionality Mozilla rushed out the door, enable a secret setting, and use Mistral locally. But what a missed opportunity from the Privacy Browser Company.

[–] Hamartiogonic@sopuli.xyz -2 points 1 month ago* (last edited 1 month ago)

According to Microsoft, you can safely send your work related stuff to Copilot. Besides, most companies already use a lot of their software and cloud services, so LLM queries don’t really add very much. If you happen to be working for one of those companies, MS probably already knows what you do for a living, hosts your meeting notes, knows your calendar etc.

If you’re working for Purism, RedHat or some other company like that, you might want to host your own LLM instead.