
If even half of Intel's claims are true, this could be a big shake-up in the midrange market, which has been all but abandoned by both Nvidia and AMD.

[-] Assman@sh.itjust.works 57 points 2 months ago

All these weird numeric names. I'm gonna build a GPU and name it Jonathan.

[-] jonathan@lemmy.zip 18 points 2 months ago
[-] Brickhead92@lemmy.world 11 points 2 months ago

Just don't name it Steve. You're in for a world of trouble with GPU Steve.

[-] lime@feddit.nu 8 points 2 months ago

sorry, apple already took that one. call it Jeff or something.

[-] chiliedogg@lemmy.world 3 points 2 months ago

The Arc cards actually have a really fun generational naming mechanic.

It's RPG classes. First gen was Alchemist. Second (what the article is about) is Battlemage. I'm guessing we're getting Cleric, Druid, etc.

[-] vzq@lemmy.world 51 points 2 months ago

Intel GPU claims are NEVER true.

[-] MudMan@fedia.io 32 points 2 months ago

Meh, I ended up with an A770 for a repurposed PC and it's been pretty solid, especially for the discounted price I got. I get that there were some driver growing pains, but I'm not in a hurry to replace that thing, it was a solid gamble.

[-] brucethemoose@lemmy.world 10 points 2 months ago* (last edited 2 months ago)

The A770 was definitely a "fine wine" card from the start. Its raw silicon specs were way stronger than the competition, it just needed to grow into it.

This one's a bit smaller, though...

[-] MudMan@fedia.io 3 points 2 months ago

Their promo benchmarks have it beating the 770, though, which is still a viable card at this price point. It'll be interesting to see if that pans out in independent reviews.

Not in the market for one of these, but very curious to see how the 780 fares later. Definitely good to have more midrange options.

The whole goal of Battlemage was to increase utilization and cut down on wasted silicon. The overall number of transistors is almost the same; if utilization of those transistors is much more efficient, then the claimed 25% uplift should easily be doable with all the other architectural improvements.

[-] unexposedhazard@discuss.tchncs.de 3 points 2 months ago

It hasn't been like that for a while now.

[-] brucethemoose@lemmy.world 39 points 2 months ago* (last edited 2 months ago)

If they double up the VRAM with a 24GB card, this would be great for a "self-hosted LLM" home server.

3060 and 3090 prices have been rising like crazy because Nvidia is gouging on VRAM and AMD inexplicably refuses to compete. Even ancient P40s (double-VRAM 1080 Tis with no display outputs) are getting expensive. 16GB on the A770 is kinda meager, but 24GB is the point where you can fit the Qwen 2.5 32B models that are starting to perform like the big corporate API ones.

And if they could fit 48GB with new ICs... Well, it would sell like mad.
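
For napkin math on which capacities fit which models, here's a rough sketch; the ~4.5 bits/weight for a typical 4-bit quant and the ~20% overhead for KV cache and buffers are ballpark assumptions, not measured numbers:

```python
# Rough VRAM estimate for a locally hosted, quantized LLM.
# Assumptions: ~4.5 bits per weight for a typical 4-bit quant
# (including scales), plus ~20% overhead for KV cache and buffers.

def vram_needed_gb(params_billion: float, bits_per_weight: float = 4.5,
                   overhead: float = 1.2) -> float:
    weights_gb = params_billion * bits_per_weight / 8  # GB for weights alone
    return weights_gb * overhead

for name, params in [("Llama 3.1 8B", 8), ("Qwen 2.5 14B", 14),
                     ("Qwen 2.5 32B", 32)]:
    print(f"{name}: ~{vram_needed_gb(params):.1f} GB")
# Llama 3.1 8B:  ~5.4 GB  -> fine on 12 GB
# Qwen 2.5 14B:  ~9.4 GB  -> tight on 12 GB with long context
# Qwen 2.5 32B: ~21.6 GB  -> needs the 24 GB card
```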

[-] Psythik@lemmy.world 20 points 2 months ago

I always wondered who they were making those mid- and low-end cards with a ridiculous amount of VRAM for... It was you.

All this time I thought they were scam cards to fool people who believe that bigger number always = better.

[-] sugar_in_your_tea@sh.itjust.works 11 points 2 months ago

Yeah, AMD and Intel should be offering high-VRAM SKUs for hobbyists. I doubt it would cost them that much to double the RAM, and they could mark them up a bit.

I'd buy the B580 if it had 24GB of RAM; at 12GB, I'll probably give it a pass because my 6650 XT is still fine.

[-] brucethemoose@lemmy.world 9 points 2 months ago

Also "ridiculously" is relative lol.

The LLM/workstation crowd would buy a 48GB 4060 without even blinking, if that were possible. These workloads are basically completely VRAM-constrained.

[-] Juice260@lemmy.world 19 points 2 months ago

I’m reserving judgement, of course, to see how things actually play out, but I do want to throw a cheap PC together for my nephew, and that card would make a fine centerpiece.

[-] Magister@lemmy.world 4 points 2 months ago

don't forget mini PCs; a Beelink with an 8745HS, for instance, can be pretty great for games

[-] Gerudo@lemm.ee 2 points 2 months ago

I have an AMD-based Beelink and I'm shocked at what it can run.

[-] Buffalox@lemmy.world 15 points 2 months ago* (last edited 2 months ago)

Funny, the Radeon RX 480 came out in 2016 at a similar price. Is that a coincidence?
Incidentally, that was the last great generation to offer a midrange GPU at a midrange price. The Nvidia 1060 was great too, and the 1080 is arguably one of the best-value cards of all time. Since then everything has been overpriced.

The RX 480 was later replaced by the 580, which was a slight upgrade at great value too. But then the crypto wave hit, and soon a measly 580 cost nearly $1000! Things have never quite returned to normal since; there's too little competition with only Nvidia and AMD.

The 30 series started to look like a return to well-priced cards. Then crypto hit and ruined that, and now AI keeps the gravy train going. Once the AI hype dies down, maybe we'll see cards return to sane pricing.

[-] DarkThoughts@fedia.io 9 points 2 months ago

As someone with a 6650 XT, which is a little slower than the 6700 or 4060, I doubt the increased VRAM (still nice, of course) is enough to push it to 1440p. I struggle even at 1080p in some games, but I guess if you're okay with ~40 FPS you could go that high.

Unfortunately, if the 4060 is roughly the target here, that's still far below what I'm interested in, which is more the upper-midrange stuff (and I'd love one with at least 16 GB of VRAM).

At least the price is much more attractive now.

[-] poVoq@slrpnk.net 8 points 2 months ago

Dunno, realistically speaking it's a slightly cheaper 7600, hardly a market shake-up.

[-] RxBrad@infosec.pub 5 points 2 months ago

It's a pretty decent value when stacked up against RTX 4000 and RX 7000 GPUs.

But we're only a month or two from the next generation of Nvidia & AMD cards.

Those companies could shit the bed on price-to-performance for a second generation in a row, and the B580 would probably still just end up in line with their offerings.

[-] chiliedogg@lemmy.world 5 points 2 months ago

Yeah, but by the time the 5060 is available, the tariffs will have it at $450+.

[-] RxBrad@infosec.pub 4 points 2 months ago

Yeah, I'll be curious to see how that all plays out.

Current GPU pricing still seems to have the 2019-2020 25% GPU tariff baked in; note how prices didn't drop 25% when those tariffs were rescinded.

Do Nvidia & AMD factor those into their pricing and give consumers a break? Or do they just jack up prices again and aim for mega-profits?

Hell, will the tariffs even happen? At one point they were supposedly contingent on U.S. federal income taxes being abolished, with tariff revenue replacing that tax income. The income tax part seems to have been dropped from the narrative ever since the election.

[-] sugar_in_your_tea@sh.itjust.works 4 points 2 months ago

Seems like a decent card, but here are my issues:

  • 12 GB RAM - not quite enough to play w/ LLMs
  • a little better than my 6650 XT, but not amazingly so
  • $250 - a little better than RX 7600 and RTX 4060 I guess? Probably?

If it offered more RAM (16GB or ideally 24GB) and stayed under $300, I'd be very interested because it opens up LLMs for me. Or if it had a bit better performance than my current GPU, and again stayed under $300 (any meaningful step-up is $350+ from AMD or Nvidia).

But this is just another low-to-midrange card, so I guess it would be interesting for new PC builds, but not really an interesting upgrade option. So, pretty big meh to me. I guess I'll check out independent benchmarks in case there's something there. I am considering building a PC for my kids using old parts, so I might get this instead of reusing my old GTX 960. Then again, the board I'd use only has PCIe 3.0, so I worry performance would suffer and the GTX 960 may be the better stop-gap.

[-] SharkAttak@kbin.melroy.org 15 points 2 months ago

12 GB RAM - not quite enough to play w/ LLMs

Good. Not every card has to be about AI, there's enough of those already; we need gaming cards.

[-] exu@feditown.com 4 points 2 months ago

If their claims are true, I'd say this is a decent upgrade from my RX 6600 XT and I'm very likely buying one.

[-] sugar_in_your_tea@sh.itjust.works 5 points 2 months ago

Sounds like a ~10% upgrade, but I'd definitely wait for independent reviews because that could be optimistic. It's certainly priced about even with the 6600 XT.

But honestly, if you can afford an extra $100 or so, you'd be better off getting a 6800 XT. It has more VRAM and quite a bit better performance, so it should last you a bit longer.

[-] brucethemoose@lemmy.world 3 points 2 months ago* (last edited 2 months ago)

It's weird that Intel/AMD seem so disinterested in the LLM self-hosting market. I get it's not massive, but it seems more than big enough for niche SKUs like the ones they were making for blockchain, and they're otherwise tripping over themselves to brand everything with AI.

[-] sugar_in_your_tea@sh.itjust.works 4 points 2 months ago

Exactly. Nvidia's thing is RTX, and Intel/AMD don't seem interested in chasing that. So their thing could be high mem for LLMs and whatnot. It wouldn't cost them that much (certainly not as much as chasing RTX), and it could grow their market share. Maybe make an extra high mem SKU with the same exact chip and increase the price a bit.

[-] brucethemoose@lemmy.world 6 points 2 months ago* (last edited 2 months ago)

Well, AMD won't do it, ostensibly because they have a high-mem workstation card market to protect, but the running joke is they only sell like a dozen of those a month, lol.

Intel literally had nothing to lose, though... I don't get it. And yes, this would be a very cheap thing for them to try: just a new PCB (and firmware?), which they can absolutely afford.

[-] sugar_in_your_tea@sh.itjust.works 3 points 2 months ago* (last edited 2 months ago)

They might not even need a new PCB; they might be able to just double the capacity of the memory chips. So yeah, I don't understand why they don't do it; it sounds like a really easy win. It probably wouldn't add up to a ton of revenue, but it would make for a good publicity stunt, which could help down the road.

AMD got a bunch of publicity with their 3D V-Cache chips, and that cost a lot more than adding a bit more memory to a GPU.
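
To put rough numbers on that "easy win": GDDR6 hangs one chip off each 32-bit channel, and the B580's 192-bit bus is public. A sketch of the arithmetic; the clamshell and denser-chip variants below are the hypotheticals we're discussing, not announced products (and whether 4GB GDDR6 ICs are even sourceable is an open question):

```python
# VRAM capacity from bus width and memory chip density.
# GDDR6 uses 32-bit channels with one chip per channel;
# today's common chips are 2 GB (16 Gbit).

def vram_gb(bus_width_bits: int, gb_per_chip: int = 2,
            clamshell: bool = False) -> int:
    chips = bus_width_bits // 32       # one chip per 32-bit channel
    if clamshell:
        chips *= 2                     # two chips share each channel (new PCB)
    return chips * gb_per_chip

print(vram_gb(192))                   # 12 GB -- the B580 as shipped
print(vram_gb(192, clamshell=True))   # 24 GB -- clamshell, needs a new PCB
print(vram_gb(192, gb_per_chip=4))    # 24 GB -- denser chips, same PCB
```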

[-] bluemellophone@lemmy.world 3 points 2 months ago

There are some smaller Ollama Llama 3.2 models that would fit in 12GB. I've run some of the smaller Llama 3.1 models in under 10GB on NVIDIA GPUs.
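
For anyone who wants to try, a minimal smoke test with the Ollama Python client; the model tag is an example from Ollama's library, and actual VRAM use depends on which quant it pulls:

```python
# Minimal local-LLM smoke test via the Ollama Python client (pip install ollama).
# Assumes the Ollama server is running and the model has been pulled,
# e.g. `ollama pull llama3.2` (the ~3B model, comfortably under 12 GB).
import ollama

response = ollama.chat(
    model="llama3.2",
    messages=[{"role": "user",
               "content": "In one sentence: why do local LLMs need so much VRAM?"}],
)
print(response["message"]["content"])
```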

[-] warm@kbin.earth 4 points 2 months ago

Using ML upscaling does not qualify it as a 1440p card... what a poor take.

[-] krimson@lemmy.world 4 points 2 months ago