
Is anybody aware of any self hosted alternatives to Parrot.ai or Otter.ai? I've tried these services and I'm finding them very useful, but the price tag is a little steep. It seems like something that the open source community could solve. Anybody know of any projects, either existing or upcoming? Thanks!

[-] chmclhpby@lemmy.world 2 points 1 year ago

You can run a transcription model and a language model (the AI you talk to) locally; however, you'll need a beefy GPU, especially if you want to run the larger models for better results.

OpenAI’s Whisper is open source and handles transcription, and you can run inference on language models like LLaMa (and its variants) or GPT4All locally. To store information long term (“AI memory”) you could look for an open-source vector database, but I don’t have experience with this.
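
If you want to experiment with the transcription half, here’s a minimal sketch using the open-source `openai-whisper` Python package (the model size and filename are placeholders; larger models like `medium` or `large` transcribe more accurately but need more VRAM):

```python
# pip install openai-whisper  (ffmpeg must also be installed on the system)
import whisper

# "base" runs on modest hardware; swap in "medium" or "large"
# for better accuracy if your GPU has the VRAM for it.
model = whisper.load_model("base")

# "meeting.mp3" is a placeholder for your recording.
result = model.transcribe("meeting.mp3")
print(result["text"])
```

And for the “AI memory” piece, a hypothetical sketch with the open-source Chroma vector database (the collection name and documents are made up; by default Chroma embeds the text for you):

```python
# pip install chromadb
import chromadb

client = chromadb.Client()  # in-memory; use a persistent client for real storage
collection = client.create_collection("meeting_memory")

# Store transcript snippets; the IDs and text are placeholders.
collection.add(
    documents=["Alice agreed to send the Q3 report by Friday."],
    ids=["note-1"],
)

# Later, retrieve the snippets most relevant to a question.
results = collection.query(query_texts=["Who owns the Q3 report?"], n_results=1)
print(results["documents"])
```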

[-] colebrodine@midwest.social 1 point 1 year ago

Thank you! I'll see if I can string together a few things to come up with my own homebrew version of these services. Honestly, for what they're charging, I think I can justify a new dedicated GPU. I've got a few other Docker containers/services that could take advantage of it anyway, so maybe this is the excuse I've been needing to pull the trigger on that purchase.

[-] chmclhpby@lemmy.world 2 points 1 year ago

LLaMa-2 was just released, and the fine-tunes people have made of it are currently topping the leaderboards for open-source language models. As for inference, don’t forget to look into quantization so you can run larger models on limited VRAM. I’ve heard about vLLM, and llama.cpp and its derivatives.
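
To illustrate the quantization point, here’s a rough sketch using the `llama-cpp-python` bindings for llama.cpp with a pre-quantized model file (the model path, layer count, and prompt are all assumptions, not recommendations):

```python
# pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-2-13b-chat.Q4_K_M.gguf",  # placeholder path to a 4-bit quantized model
    n_gpu_layers=35,  # offload as many layers as fit in your VRAM
    n_ctx=2048,       # context window size
)

# Simple completion call; the prompt is just an example.
output = llm("Summarize this meeting transcript: ...", max_tokens=256)
print(output["choices"][0]["text"])
```

Four-bit quantization roughly quarters the memory footprint of the weights versus fp16, which is what makes a 13B model feasible on a 12 GB card.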

If you’re looking for a GPU around $300, I heard a used 3060 is better value than a 4060 right now on performance and memory throughput, but not power efficiency (if you want an easy time with ML, unfortunately the only option is Nvidia).

Good luck! It would be nice to get an update if you find a good solution; it seems like others could share your use case.

[-] colebrodine@midwest.social 2 points 1 year ago

Thanks for the tip on the GPU! I live in an area where power is relatively cheap, so I'll probably go for the 3060. I really wish some of these tools worked better with AMD, since their drivers seem to be more Linux-friendly these days.

If I get something going, I'll share for sure!
