But if that's how you're going to run it, why not also train it in that mode?
That is a thing, and it's called quantization-aware training (QAT). Some open-weight models, like Gemma, do it.
The problem is that you need to retrain the whole model for that, and if you also want a full-quality version, that's a lot more training.
It's still less precise, so quality will still be below full precision, but QAT does reduce the gap.
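For the curious, here's a minimal sketch of the idea in PyTorch. It's illustrative only: the FakeQuant module is made up for this comment, and real QAT tooling (e.g. torch.ao.quantization) also fake-quantizes weights and calibrates scales properly. The trick is to round in the forward pass while a straight-through estimator lets gradients flow as if no rounding happened, so the optimizer learns weights that tolerate quantization noise.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FakeQuant(nn.Module):
    """Hypothetical fake-quantization layer for illustration:
    rounds activations to a limited number of levels in the
    forward pass, while the straight-through estimator keeps
    gradients flowing as if no rounding happened."""
    def __init__(self, bits: int = 8):
        super().__init__()
        self.levels = 2 ** bits - 1

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Per-tensor scale derived from the current dynamic range.
        scale = x.detach().abs().max().clamp(min=1e-8) * 2 / self.levels
        q = torch.round(x / scale) * scale   # quantize-dequantize
        return x + (q - x).detach()          # straight-through estimator

# The model sees quantization noise during training, so the
# optimizer learns weights that survive it.
model = nn.Sequential(nn.Linear(16, 32), FakeQuant(), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x, y = torch.randn(64, 16), torch.randn(64, 1)

opt.zero_grad()
loss = F.mse_loss(model(x), y)
loss.backward()
opt.step()
```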
Your response reeks of AI slop
4/10 bait
Is it, or is it not, AI slop? Why are you using markdown formatting so heavily? That's a telltale sign of an LLM being involved.
They used one formatting mark, and it's the most common. What are you smoking, and may I have some?
I am not using an LLM, but holy bait
Hop off the Reddit voice
...You do know what platform you're on? It's a REDDIT alternative