[-] IrateAnteater@sh.itjust.works 5 points 22 hours ago

Since VLC runs on just about everything, I'd imagine that the cloud service will be best for the many devices that just don't have the horsepower to run an LLM locally.

[-] GenderNeutralBro@lemmy.sdf.org 2 points 22 hours ago

True. I guess they'll require you to enter your own OpenAI/Anthropic/whatever API token, because there's no way they could afford to run that centrally. Hopefully you can point it at whatever server you like (such as a self-hosted Ollama or similar).
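For what it's worth, Ollama exposes an OpenAI-compatible endpoint (by default at `http://localhost:11434/v1`), so any client that lets you override the base URL can be pointed at a self-hosted server instead of a hosted provider. A minimal sketch of what that looks like, just building the request without sending it (the model name `llama3` and the token value are placeholders; Ollama ignores the token, hosted APIs require a real one):

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build an OpenAI-compatible /chat/completions request (not sent here)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Ollama doesn't check this; OpenAI/Anthropic-style APIs do.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Same client code, different base URL = self-hosted instead of cloud:
req = build_chat_request("http://localhost:11434/v1", "unused", "llama3",
                         "Summarize this subtitle track.")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

Swapping `base_url` for `https://api.openai.com/v1` (plus a real key and model name) is the only change needed, which is why "let the user set the endpoint" is cheap for VLC to support.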

[-] zurohki@aussie.zone 1 point 22 hours ago

It's not just computing power: you don't always want your device burning through its battery either.

this post was submitted on 09 Jan 2025
374 points (98.4% liked)

Opensource