
Hiya,

Recently upgraded my server to an i5-12400 CPU, and have been wanting to push it a bit. I've been looking to host my own LLM tasks and workloads, such as building pipelines that scan open-source projects for vulnerabilities and insecure code, to mention one of the things I want to start doing. Inspiration for this came after reading about the recent security scans of the curl project.

Sidenote: I have no intention of swamping devs with AI bug reports. I simply want to scan the projects I personally use, so I'm aware of their current state and upcoming changes before I blindly update the apps I host.

What budget-friendly GPU should I be looking for? Afaik VRAM is quite important, the higher the better. What other features do I need to be on the lookout for?
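
To make it concrete, here's the rough shape of pipeline I'm picturing, as a Python sketch. It assumes a local Ollama instance on its default port, and the model name is just a placeholder for whatever ends up fitting in VRAM:

```python
# Rough sketch: ask a locally hosted model (via Ollama's REST API) to
# review one source file for insecure patterns. Model name, endpoint and
# prompt are placeholders/assumptions, not a finished pipeline.
import json
import sys
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint
MODEL = "qwen2.5-coder:14b"  # placeholder; pick whatever fits your VRAM

def review_file(path: str) -> str:
    code = open(path, encoding="utf-8", errors="replace").read()
    prompt = (
        "You are a security reviewer. Point out any potentially insecure "
        "code in the following file, with line references and a short "
        "justification. If nothing stands out, say so.\n\n" + code
    )
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps({"model": MODEL, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    print(review_file(sys.argv[1]))
```

The idea would be to loop that over a repo's files (or just over the diffs between releases) and skim the output myself.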

Diplomjodler3@lemmy.world 4 points 1 month ago

The budget-friendly AI GPUs are on the shelf right next to the unicorn pen.

breadsmasher@lemmy.world 3 points 1 month ago

Ooh, do they have any magic beans? I'm looking to trade a cow for some.

irmadlad@lemmy.world 0 points 1 month ago

I've self-hosted a few of the bite-sized LLMs. The thing that's keeping me from a full-blown, self-hosted AI platform is that my little GeForce GTX 1650 just doesn't have the ass to really do it up right. If I'm going to consult with an AI, I want the answers within 3 or 4 minutes, not hours. LOL

Diplomjodler3@lemmy.world 1 points 1 month ago

Quite so. The cheapest card that I'd put any kind of real AI workload on is the 16GB Radeon RX 9060 XT. That's not what I would call budget friendly, which is why I consider a budget-friendly AI GPU to be a mythical beast.

comrade_twisty@feddit.org 1 points 1 month ago* (last edited 1 month ago)

Afaik the budget-friendliest local AI solutions currently are Mac Minis! Thanks to their unified CPU/GPU/RAM memory architecture, they are powerhouses for AI and astonishingly well priced for what they can put out.

afk_strats@lemmy.world 1 points 1 month ago* (last edited 1 month ago)

3090 24GB ($800 USD)

3060 12GB x 2 if you have two PCIe slots (<$400 USD)

Radeon MI50 32GB with Vulkan (<$300) if you have more time, space, and the will to tinker

papertowels@mander.xyz 0 points 1 month ago

What does budget friendly mean to you?

melroy@kbin.melroy.org 0 points 1 month ago

I've heard of people using multiple used 3090s on a single motherboard for this. Apparently it delivers a lot of bang for the buck compared to a single card with loads of VRAM.

Lumisal@lemmy.world 0 points 1 month ago

Yes, but then ideally you have to track down a used NVLink bridge, which is kinda rare now.

MalReynolds@slrpnk.net 1 points 1 month ago

Nah, NVLink is irrelevant for inference workloads (inference happens almost entirely on the cards themselves; models are split up over multiple GPUs and tokens are piped over PCIe as necessary). It's mildly useful for training, but you'll get there without it.
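
For example, with the llama-cpp-python bindings the split is literally just a parameter; a minimal sketch, assuming two cards and a GGUF model (the path is a placeholder):

```python
# Sketch: one model's layers split across two GPUs, no NVLink required.
# Activations cross the PCIe bus between cards as tokens are generated.
from llama_cpp import Llama

llm = Llama(
    model_path="/models/model.gguf",  # placeholder path to any GGUF model
    n_gpu_layers=-1,                  # offload all layers to the GPUs
    tensor_split=[0.5, 0.5],          # fraction of the weights per card
)

out = llm("Explain NVLink in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```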

drkt@scribe.disroot.org 0 points 1 month ago

It's all about VRAM; that's the bottleneck even for the best GPUs. AMD support is spotty, so stay in Nvidia's claws unless you know what you're doing. Figure out what kind of money you're willing to part with, then get whatever Nvidia GPU gets you the most VRAM.
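
For sizing, a back-of-envelope sketch (weights only; KV cache and runtime overhead eat a few more GB on top, so leave headroom; ~4.5 bits per weight is my assumption for a typical Q4 quant):

```python
# Rough VRAM needed for the model weights alone at a given quantization.
def weight_vram_gb(params_billions: float, bits_per_weight: float = 4.5) -> float:
    return params_billions * 1e9 * (bits_per_weight / 8) / 1024**3

for label, params in [("7B", 7.0), ("13B", 13.0), ("70B", 70.0)]:
    print(f"{label} @ Q4: ~{weight_vram_gb(params):.1f} GB")
# 7B @ Q4: ~3.7 GB, 13B @ Q4: ~6.8 GB, 70B @ Q4: ~36.7 GB
```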

anamethatisnt@sopuli.xyz -1 points 1 month ago

Yeah, for a budget-friendly AI GPU I would look for a 5060 Ti 16GB.

marud@piefed.marud.fr 0 points 1 month ago

Don't forget that the cost of the "budget friendly" card does not include the "non budget friendly" power bill that goes with it.

frongt@lemmy.zip 1 points 1 month ago

Only if you're using it a lot. At idle or turned off it's negligible.
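
Back-of-envelope, with assumed wattages and an assumed $0.30/kWh rate (plug in your own numbers):

```python
# Rough monthly electricity cost; every figure here is an assumption.
RATE = 0.30               # $/kWh (assumption)
IDLE_W, LOAD_W = 20, 350  # ballpark idle vs. full-load draw for a big GPU

def monthly_cost(watts: float, hours_per_day: float) -> float:
    return watts / 1000 * hours_per_day * 30 * RATE

print(f"idle 24/7:        ${monthly_cost(IDLE_W, 24):.2f}/mo")  # ~$4.32
print(f"2 h load per day: ${monthly_cost(LOAD_W, 2):.2f}/mo")   # ~$6.30
print(f"load 24/7:        ${monthly_cost(LOAD_W, 24):.2f}/mo")  # ~$75.60
```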
