submitted 6 months ago by KarnaSubarna@lemmy.ml to c/firefox@lemmy.ml
[-] KarnaSubarna@lemmy.ml 7 points 6 months ago

No, but the “AI” option available on the Mozilla Labs tab in settings allows you to integrate with a self-hosted LLM.

I have had this setup running for a while now.
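For anyone wanting to try the same, the custom-provider hookup is done through about:config after enabling the AI Chatbot feature in Labs. The pref names and the localhost URL below are my best recollection and may differ by Firefox version, so treat them as assumptions to verify:

```
// about:config prefs (names assumed; verify in your Firefox version)
browser.ml.chat.enabled = true          // enable the chatbot sidebar
browser.ml.chat.hideLocalhost = false   // allow localhost providers
browser.ml.chat.provider = http://localhost:3000  // e.g. a local Open WebUI instance
```

With these set, the sidebar chatbot talks to your local instance instead of a hosted provider.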

[-] cmgvd3lw@discuss.tchncs.de 4 points 6 months ago

Which model are you running? How much RAM?

[-] KarnaSubarna@lemmy.ml 4 points 6 months ago* (last edited 6 months ago)

My (docker based) configuration:

Software stack: Linux > Docker Container > Nvidia Runtime > Open WebUI > Ollama > Llama 3.1

Hardware: i5-13600K, Nvidia 3070 ti (8GB), 32 GB RAM

Docker: https://docs.docker.com/engine/install/

Nvidia Runtime for docker: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html

Open WebUI: https://docs.openwebui.com/

Ollama: https://hub.docker.com/r/ollama/ollama
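A minimal sketch of wiring that stack together with plain docker run commands, based on the linked docs. The container names, ports, volume names, and model tag are assumptions, not from the comment above; an 8B model is assumed since larger ones won't fit in 8 GB of VRAM:

```shell
# Ollama with GPU access (requires the NVIDIA Container Toolkit installed on the host)
docker run -d --name ollama --gpus=all \
  -v ollama:/root/.ollama -p 11434:11434 \
  ollama/ollama

# Pull a quantized Llama 3.1 8B model into the container (tag is an assumption)
docker exec ollama ollama pull llama3.1:8b

# Open WebUI on port 3000, pointed at the Ollama container on the host
docker run -d --name open-webui -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

After this, Open WebUI is reachable at http://localhost:3000 and can be used directly or set as the browser's chatbot provider.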

this post was submitted on 31 Dec 2024
80 points (71.5% liked)

Firefox


/c/firefox

A place to discuss the news and latest developments on the open-source browser Firefox.


Rules

1. Adhere to the instance rules

2. Be kind to one another

3. Communicate in a civil manner


Reporting

If you would like to bring an issue to the moderators' attention, please use the "Create Report" feature on the offending comment or post, and it will be reviewed as time allows.


founded 5 years ago