[-] yogthos@lemmygrad.ml 3 points 21 hours ago

I've found Qwen is similar overall; their smaller model that you can run locally tends to produce somewhat better output in my experience. Another recent open-source model that's good at coding is GLM: https://z.ai/blog/glm-4.5

6 GB of VRAM is unfortunately somewhat low; you can run smaller models, but the quality of output is not amazing.
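For a rough sense of what fits in 6 GB, a back-of-envelope estimate is weights ≈ parameter count × bits-per-weight ÷ 8, plus some overhead for the KV cache and runtime buffers. This is a minimal sketch with illustrative numbers, not an exact memory model; the `overhead_gb` figure is an assumption:

```python
def approx_vram_gb(params_billion: float, bits_per_weight: int = 4,
                   overhead_gb: float = 1.0) -> float:
    """Rough VRAM estimate for running a quantized model locally.

    Assumes weights dominate memory, quantized to `bits_per_weight`,
    and that `overhead_gb` (an assumed figure) covers the KV cache
    and runtime buffers.
    """
    weight_gb = params_billion * bits_per_weight / 8  # GB per billion params
    return weight_gb + overhead_gb

# A 7B model at 4-bit quantization lands around 4.5 GB: tight but
# plausible on a 6 GB card; anything much larger won't fit.
print(round(approx_vram_gb(7), 1))  # → 4.5
```

By this estimate, 6 GB limits you to roughly 7B-parameter models at 4-bit quantization, which is why output quality suffers compared with larger hosted models.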

this post was submitted on 22 Aug 2025
77 points (100.0% liked)
