397
submitted 3 weeks ago* (last edited 3 weeks ago) by SnotFlickerman@lemmy.blahaj.zone to c/technology@lemmy.world

OK, maybe you wouldn't pay three grand for a Project DIGITS PC. But what about a $1,000 Blackwell PC from Acer, Asus, or Lenovo?


Besides, why not use native Linux as the primary operating system on this new chip family? Linux, after all, already runs on the Grace Blackwell Superchip. Windows doesn't. It's that simple.

Nowadays, Linux runs well with Nvidia chips. Recent benchmarks show that open-source Linux graphics drivers work with Nvidia GPUs as well as the company's proprietary drivers do.

Even Linus Torvalds thinks Nvidia has gotten its open-source and Linux act together. In August 2023, Torvalds said, "Nvidia got much more involved in the kernel. Nvidia went from being on my list of companies who are not good to my list of companies who are doing really good work."

[-] chemicalwonka@discuss.tchncs.de 252 points 3 weeks ago* (last edited 3 weeks ago)

Don't forget those who made it happen. Nvidia was "forced" to integrate Linux into its ecosystem

Nvidia has always been hostile to the Linux community, or negligent to say the least.

[image]

[-] Cris_Color@lemmy.world 88 points 3 weeks ago

Man, I completely forgot about that. That's honestly wild to think about in retrospect...

[-] CeeBee_Eh@lemmy.world 10 points 3 weeks ago

It's not. It had nothing to do with it. Nvidia was all in with Linux as soon as they realized their hardware could be used for data processing and AI. That realization was way more than a decade ago.

[-] projectmoon@lemm.ee 27 points 3 weeks ago

Don't know about "always." In recent years, like the past 10 years, definitely. But I remember a time when Nvidia was the only reasonable recommendation for a graphics card on Linux, because Radeon was so bad. This was before Wayland, and probably even before AMD bought ATI. And it was certainly long before the amdgpu drivers existed.

[-] Dark_Arc@social.packetloss.gg 14 points 3 weeks ago

Yeah it was before AMD did graphics.

ATI had an atrocious closed source driver. I used it ... but it was not good at much of anything.

[-] endeavor@sopuli.xyz 6 points 3 weeks ago* (last edited 3 weeks ago)

I had an ATI card in my PC back when the Pentium 4 was all the rage. I literally spent my teenage years learning English in order to get the dumb games I'd saved up ages for to work without crashing constantly. It's shocking that the same terrible card manufacturer is part of the company that now makes the only CPUs worth a damn, and great GPUs.

[-] CeeBee_Eh@lemmy.world 14 points 3 weeks ago

Nvidia was "forced" to integrate Linux into its ecosystem

100% bullcrap.

Nvidia's servers for data processing have always run Linux. And you know what they don't run? Windows, that's for sure. So why would they write multiple versions of a driver for the same hardware interface? Their servers use the same drivers that you would use for gaming on a Linux desktop system.

In fact, no version of Windows is supported on their DGX servers, and AFAIK you can't even install Windows on it (even if you managed, it wouldn't be usable).

Long story short: a vendor we were working with (about 6 or 7 years ago now) was working on the Linux version of their SDK. We wanted to do some preliminary testing on Nvidia's new T4s, which at that point were only available via Nvidia's testing datacenter (which we had access to).

During a call with some of the Nvidia engineers, I had to ask the awkward question: "any chance there's a Windows server we can test on?" I knew it was a cringe question, and I died a little during the 10-second silence until one of the Nvidia guys finally replied with "no one uses Windows for this stuff". And he said it slowly, like the reply to such a question needed to go slowly to be understood, because who would ask that question unless they're slow in the head?

Nvidia has always been hostile to the Linux community or negligent to say the least

People say "hostile", but I think a better word is arrogant. They wanted to force the industry to use their own implementations they owned or pioneered like egl-stream instead of open standards. But AMD and Intel have proven that open source graphics drivers not only work, but benefit from being open so that the community can scratch their own itches and fix issues faster.

[-] priapus@sh.itjust.works 2 points 3 weeks ago

Yep, Nvidia has never been hostile towards Linux; they benefit from supporting it. They just don't care to support the desktop that much, and frankly neither do AMD or Intel. They often take an extremely long time to fix simple bugs that only affect desktop usage. Fortunately, in their case, the drivers can be fixed by other open-source contributors.

[-] mac@lemm.ee 7 points 3 weeks ago

Am I missing something here? Nvidia never caved to their demands IIRC

[-] shortwavesurfer@lemmy.zip 77 points 3 weeks ago

Honestly, I've found that my compute needs were surpassed quite a while ago, so I could easily get away with buying a $300 computer.

[-] SnotFlickerman@lemmy.blahaj.zone 53 points 3 weeks ago

Honestly, for real, a lot of low-power PCs are really useful once they have crap like Windows off of them and a lightweight Linux distro on them.

[-] shortwavesurfer@lemmy.zip 34 points 3 weeks ago

Exactly. Get yourself a somewhat low-end PC, wipe Windows, install Linux Mint, and you're pretty much golden.

[-] Emi@ani.social 11 points 3 weeks ago

Did exactly this with an old laptop; we use it mainly for TV and occasional browsing when staying at our hut/cottage. Still a bit slow, but it works.

[-] ricecake@sh.itjust.works 13 points 3 weeks ago

I've found my preferences have been creeping up in price again, but only because I've found I want an actually physically lightweight laptop, and those have been getting more available, linux-able and capable.

I only need a few hundred dollars' worth of computer, and anything more can live on a rack somewhere. I'll pay more than that for my computer to be light enough that I don't need to think about it.

[-] tal@lemmy.today 19 points 3 weeks ago* (last edited 3 weeks ago)

Up until the early 2000s, serial computation speed doubled about every 18 months. That meant that virtually all software simply ran twice as quickly with every 18 months of CPU advances. And since taking advantage of that was trivial, new software releases did: they traded CPU cycles for shorter development time or more functionality, and demanded current hardware to run at a reasonable clip.

In that environment, it was quite important to upgrade the CPU.

But that hasn't been happening for about twenty years now. Serial computation speed still increases, but not nearly as quickly any more.

This is about ten years old now:

https://preshing.com/20120208/a-look-back-at-single-threaded-cpu-performance/

Throughout the 80’s and 90’s, CPUs were able to run virtually any kind of software twice as fast every 18-20 months. The rate of change was incredible. Your 486SX-16 was almost obsolete by the time you got it through the door. But eventually, at some point in the mid-2000’s, progress slowed down considerably for single-threaded software – which was most software.

Perhaps the turning point came in May 2004, when Intel canceled its latest single-core development effort to focus on multicore designs. Later that year, Herb Sutter wrote his now-famous article, The Free Lunch Is Over. Not all software will run remarkably faster year-over-year anymore, he warned us. Concurrent software would continue its meteoric rise, but single-threaded software was about to get left in the dust.

If you’re willing to trust this line, it seems that in the eight years since January 2004, mainstream performance has increased by a factor of about 4.6x, which works out to 21% per year. Compare that to the 28x increase between 1996 and 2004! Things have really slowed down.

We can also look at about the twelve years since then, which is even slower:

https://www.cpubenchmark.net/compare/2026vs6296/Intel-i7-4960X-vs-Intel-Ultra-9-285K

This is using a benchmark to compare the single-threaded performance of the i7-4960X (Intel's high-end processor back at the start of 2013) to that of the Intel Ultra 9 285K, the current one. In those ~12 years, the latest processor has managed single-threaded performance about (5068/2070) ≈ 2.448 times that of the 12-year-old processor. That's (5068/2070)^(1/12) ≈ 1.077, about a 7.7% performance improvement per year. The age of a processor doesn't matter nearly as much in that environment.
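
For anyone who wants to check the arithmetic, a quick sketch in plain Python (the two single-thread scores are the ones quoted above from the cpubenchmark.net comparison; everything else follows from them):

```python
# Annualized single-thread improvement implied by two benchmark scores.
old_score, new_score = 2070, 5068   # i7-4960X (2013) vs. Ultra 9 285K
years = 12

total = new_score / old_score       # overall speedup, ~2.448x
annual = total ** (1 / years)       # geometric average per year, ~1.077
print(f"total: {total:.3f}x, per year: {(annual - 1) * 100:.1f}%")
```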

We still have had significant parallel computation increases. GPUs in particular have gotten considerably more powerful. But unlike with serial compute, parallel compute isn't a "free" performance improvement -- software needs to be rewritten to take advantage of it, it's often hard to parallelize a problem, and some problems cannot be solved in parallel at all.
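
To make "needs to be rewritten" concrete, here's a minimal sketch (plain Python, with a hypothetical stand-in workload) of the trivially parallel case:

```python
# The serial loop gets nothing from extra cores; the parallel version
# has to be restructured by hand to fan the same work out across them.
import time
from concurrent.futures import ProcessPoolExecutor

def busy(n: int) -> int:
    return sum(i * i for i in range(n))  # stand-in for real CPU-bound work

if __name__ == "__main__":
    work = [2_000_000] * 8

    t0 = time.perf_counter()
    serial = [busy(n) for n in work]        # one core, no rewrite needed
    t1 = time.perf_counter()

    with ProcessPoolExecutor() as pool:     # rewritten to use all cores
        parallel = list(pool.map(busy, work))
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial: {t1 - t0:.2f}s, parallel: {t2 - t1:.2f}s")
```

And that's the easy case; plenty of problems don't decompose into independent chunks like this at all.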

Honestly, I'd say that the most-noticeable shift is away from rotational drives to SSDs -- there are tasks for which SSDs can greatly outperform rotational drives.

[-] shortwavesurfer@lemmy.zip 3 points 3 weeks ago

You know, that would explain a lot because I had no idea that there was an authentication pin and that's total bullshit.

[-] QuarterSwede@lemmy.world 10 points 3 weeks ago* (last edited 3 weeks ago)

I bought a former office HP EliteDesk 800 G2 16GB for $120 on eBay or Amazon (can't recall) 2 years ago with the intention of it just being my server. I ended up not unhooking the monitor and leaving it on my desk, since it's plenty fast for my needs. It's no massive PC gaming rig, but it handles Steam indie titles and even 3D modeling and slicing apps at full speed. I just haven't needed to get anything else.

[-] shortwavesurfer@lemmy.zip 7 points 3 weeks ago* (last edited 3 weeks ago)

Being blind, I don't play video games and don't do any kind of 3D graphics and stuff like that. So many, many computers would fit my specifications.

Edit: My laptop right now is a Dell Latitude E5400 from like 2014 with eight gigabytes of RAM and a 7200 RPM drive with an Intel Core i5 and it works well enough. Honestly, the only problem with it is that it does not charge the battery. So as soon as it is unplugged from the wall, it just dies. And it's not the battery itself because I've tried getting new batteries for it. It's something in the charging circuitry. It works fine when it's on wall power, but it just does not charge the battery. I figure with it being 10 years old already, at some point I will have to replace it.

[-] tal@lemmy.today 3 points 3 weeks ago* (last edited 3 weeks ago)

It’s something in the charging circuitry. It works fine when it’s on wall power, but it just does not charge the battery.

And it’s not the battery itself because I’ve tried getting new batteries for it. It’s something in the charging circuitry. It works fine when it’s on wall power, but it just does not charge the battery.

At least some Dell laptops authenticate to the charger so that only "authentic Dell chargers" can charge the battery, though they'll run off third-party chargers without charging the battery.

Unfortunately, it's a common problem -- and I've seen this myself -- for the authentication pin on an "authentic Dell charger" to become slightly bent or something, at which point it will no longer authenticate and the laptop will refuse to charge the battery.

I bet the charger on yours is a barrel charger with that pin down the middle.

*hits Amazon*

Yeah, looks like it.

https://www.amazon.com/dp/B086VYSZVL?psc=1

I don't have a great picture for the 65W one, but the 45W charger here has an image looking down the charger barrel showing that internal pin.

If you want to keep using that laptop and want to use the battery, I'd try swapping out the charger. If you don't have an official Dell charger, make sure that the one you get is one of those (unless some "universal charger" has managed to break their authentication scheme in the intervening years; I haven't been following things).

EDIT: Even one of the top reviews on that Amazon page mentions it:

I have a DELL, that has the straight barrel plug with the pin in it. THEY REALLY made a BAD DECISION when they made these DELL laptops with that type of plug instead of making it with a dog leg style plug. I have to replace my charger cord A LOT because the pin gets bent inside and it stops charging at that plug, but the rest of the charger is still good...

[-] cyborganism@lemmy.ca 3 points 3 weeks ago

Oh hey, I have question for you then. Are you using any braille system with your computer? Or is it a kind of voice reader thing you have going on? What do you use for reading posts and comments on Lemmy?

[-] shortwavesurfer@lemmy.zip 4 points 3 weeks ago* (last edited 3 weeks ago)

I use my phone a lot more frequently than I use my computer, and I primarily use the TalkBack screen reader on my phone. I can read and write Braille, of course, and have been able to since I was a little kid, but I don't do it very often, primarily because I've always found reading to be slow for me, so I prefer audio. I'm able to better absorb information through audio than through reading it directly, and always have been.

Edit: I'm not totally blind, so my primary navigation is through memorization of where things are, and then for reading longer posts and the like I use the screen reader. For example, on my home screen I know where I've placed my app icons, so I can easily navigate to them, and in settings I know roughly where the menus I'm looking for are, so I can get to them quickly. I also use the magnification gestures a lot. So, primarily, I navigate with memorization, magnification gestures, and the screen reader for longer stuff.


I was that way for the longest time. I was more than content with my 4-core/8-thread 4th Gen. i7 laptop. I only upgraded to an 11th Gen. i9 system because I wanted to play some games on the go.

But after I upgraded to that system, I started to do so much more, and all at once, mostly because I actually could; the old system would cry in pain long before then. But mid last year I finally broke and bought a 13th Gen. i9 system to replace it, and man do I flog the shit out of this computer. Just having the spare power lying around made me want to do more and more with it.

[-] SaharaMaleikuhm@feddit.org 3 points 3 weeks ago* (last edited 3 weeks ago)

For real, I'm happily using an APU for 90% of the time. I barely need a dedicated GPU at all any more. I use Mint btw.

[-] SynopsisTantilize@lemm.ee 2 points 3 weeks ago

Yep. Give me a 4c/8t 16gb ram and a med-low GPU and I have nothing to complain about.

[-] shortwavesurfer@lemmy.zip 2 points 3 weeks ago

My current laptop is a Dell Latitude E5400, and it has like 4 threads with 8 gigs of RAM and a 7200 RPM drive, and it works well enough even though it's 10 years old. Honestly, the only problem with it is that it does not charge the battery. It's something in the charging circuitry, since it works fine when it's on wall power but absolutely will not charge a battery anymore.

[-] j4k3@lemmy.world 31 points 3 weeks ago

NVCC is still proprietary and full of telemetry. You cannot build CUDA without it.

[-] ICastFist@programming.dev 23 points 3 weeks ago

Linux, after all, already runs on the Grace Blackwell Superchip. Windows doesn’t.

And why is that?

Project DIGITS features the new NVIDIA GB10 Grace Blackwell Superchip, offering a petaflop of AI computing performance for prototyping, fine-tuning and running large AI models.

With the Grace Blackwell architecture, enterprises and researchers can prototype, fine-tune and test models on local Project DIGITS systems running Linux-based NVIDIA DGX OS, and then deploy them seamlessly on NVIDIA DGX Cloud™, accelerated cloud instances or data center infrastructure.

Oh, because it's not a fucking consumer product. It's for enterprises that need a cheap supercomputer.

[-] Blackmist@feddit.uk 19 points 3 weeks ago

*monkey paw closes*

But it's just for AI bullshit.

[-] qaz@lemmy.world 4 points 3 weeks ago

I don't care why they got their shit together, I'm happy as long as they fix the open source drivers.

[-] vext01@lemmy.sdf.org 13 points 3 weeks ago

Where's the PC? Is it the brick on the desk? 🤣

[-] lig@lemmings.world 11 points 3 weeks ago

Well, it's still a modified custom distro, and other distros will need to invest extra effort to be able to run there. So, no actual freedom of choice for users again...

[-] winni@lemmy.world 9 points 3 weeks ago

I hope to see some nice RISC-V PCs soon

[-] AnUnusualRelic@lemmy.world 6 points 3 weeks ago* (last edited 3 weeks ago)

Or you can just buy any random potato computer (or assemble it yourself from stuff you found) and still run Linux on it.

[-] collapse_already@lemmy.ml 4 points 3 weeks ago

Haven't they been making things like the Jetson AGX for years? I guess this is an announcement of the next generation.
