should've been the axe template
(lemmy.blahaj.zone)
Gonna have to shill some FOSS LLMs here
You need an inference engine. Just use a llama.cpp derivative (fuck ollama, for a few reasons) and download an open model from HuggingFace (I heavily recommend the Mistral series, which I think is Apache 2.0 licensed, but I don't really remember).
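As a minimal sketch, here's what that looks like with llama-cpp-python (Python bindings around llama.cpp, so it counts as a derivative). The model path is a placeholder for whatever GGUF file you end up downloading, not a specific recommendation:

```python
# Minimal sketch: load a local GGUF model with llama-cpp-python and run one chat turn.
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct-Q4_K_M.gguf",  # placeholder: path to your downloaded GGUF
    n_ctx=4096,        # context window size
    n_gpu_layers=-1,   # offload all layers to GPU if available; set 0 for CPU-only
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain GGUF in one sentence."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```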
You need to find a "quantization" of the model; you can find those in the model tree on the right side of the model page on HuggingFace. You want the GGUF format, to be exact.
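If you'd rather script the download than click through the site, something like this works via the huggingface_hub package. The repo id and filename below are hypothetical; copy the real ones from the "Quantizations" entries in the model tree of whatever model you picked:

```python
# Sketch: fetch one quantized GGUF file from a Hugging Face repo into the local cache.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="someuser/Mistral-7B-Instruct-GGUF",   # hypothetical repo id
    filename="mistral-7b-instruct-Q4_K_M.gguf",    # pick a quant size that fits your RAM/VRAM
)
print(path)  # local path you can point llama.cpp (or a binding) at
```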
Then all you need to do is tune a few inference parameters (sampling settings like temperature and top-p) and you're golden.
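For reference, these are the usual sampling knobs, again shown with llama-cpp-python. The values are just common starting points I'm assuming, not anything prescribed above:

```python
# Sketch: the common sampling parameters exposed by llama.cpp-style engines.
from llama_cpp import Llama

llm = Llama(model_path="./mistral-7b-instruct-Q4_K_M.gguf", n_ctx=4096)  # placeholder path

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a haiku about penguins."}],
    max_tokens=256,
    temperature=0.7,     # lower = more deterministic output
    top_p=0.9,           # nucleus sampling cutoff
    top_k=40,            # only consider the 40 most likely next tokens
    repeat_penalty=1.1,  # discourage verbatim repetition
)
print(out["choices"][0]["message"]["content"])
```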