Yup, it's the classic name-brand tax. That, and Nvidia also wins on features, like RTX and AI/compute.
But most people don't actually use those features, so they seem to be buying Nvidia mostly on brand recognition. AMD has dethroned Intel on performance and price, yet Intel somehow remains dominant in consumer PCs, though its lead is a lot smaller than before.
If AMD wants to overtake Nvidia, they'll need consistently faster GPUs and lower prices with no compromises on features. They'd have to invest a ton to get there, and even then Nvidia would probably still outsell them on name recognition alone. Screw that! It makes far more sense for AMD to stay competitive, soak up a big chunk of the mid-range market, and transition the low end to APUs. Intel can play in the low-to-mid range, and AMD can slot in as a bit better than Intel and a better value than Nvidia.
That said, I think AMD needs to go harder on datacenter compute, because that's where the real money is, and right now it's all going to Nvidia. If they can leverage their processors to provide a better overall datacenter compute solution, they could translate that into prosumer compute devices. High-end gaming is cool, but it's not nearly as lucrative as the datacenter. I'd hesitate to make AI-specific chips; instead, make high-quality general-purpose compute chips so they can take advantage of whatever comes after the current wave of AI.
I think AMD should also get back into ARM and low-power devices. The Snapdragon laptops have made a big splash, and that market could explode once the software is refined, so AMD should be poised to dominate it. They already have ARM products; they just need to make low-power, high-performance parts for the laptop market.
They don't need to go with ARM. There's nothing inherent in the x86 instruction set that prevents them from making low-power processors; it just hasn't made sense for them to build an architecture for that market, since server margins are much higher. Even so, the Z1 Extreme got pretty close to Apple's M2.
Lunar Lake has also shown that x86 can match or beat Qualcomm's ARM chips while maintaining full compatibility with all x86 applications.
Hence ARM. ARM already has designs for low-power, high-performance chips for smaller devices like laptops. Intel is chasing that market, and AMD could easily get a foot in the door by slapping their label on some of those designs, perhaps with a few tweaks (might be cool to integrate their graphics cores?). They already have ARM cores for datacenter workloads, so it probably wouldn't be too crazy to try it out on business laptops.