top 38 comments
[-] tuff_wizard@aussie.zone 47 points 1 year ago
[-] cantstopthesignal@sh.itjust.works 27 points 1 year ago

Silicon dioxide, to be precise! Definitely none of that in microchips.

[-] jopepa@lemmy.world 9 points 1 year ago* (last edited 1 year ago)

But the RGB potential… they’ll pay hundreds more just so their computers can look like a ’90s TV stuck between channels.

[-] Lmaydev@programming.dev 2 points 1 year ago

Damn right we will!

[-] 8BitRoadTrip@lemm.ee 22 points 1 year ago

Can I get a notification when the first isolinears drop?

[-] RojoSanIchiban@lemmy.world 8 points 1 year ago

Why, you want to make Data stick them back in their proper slots in Engineering after they drop [on the floor]? Gosh!

[-] NoSpotOfGround@lemmy.world 7 points 1 year ago

First season gang, represent!

[-] NegativeLookBehind@kbin.social 9 points 1 year ago

Bigger? Aren’t they supposed to get smaller?

[-] frezik@midwest.social 14 points 1 year ago

No. If you're thinking of Moore's Law, all that says is that semiconductors double their transistor count every two years (or so) at a given price point. This is basically making a really big package while keeping the price the same.

BTW, that price constraint is rarely talked about, but it's in Moore's original paper (unlike the things people usually bring up, like clock rates or single-threaded speed), and by that measure Moore's Law is stone dead. Take the price of the old 8086, adjust for inflation, and double its transistor count every two years since: nothing on the market comes close to the number you get. IIRC, CPUs on offer at that price fall short by about an order of magnitude.
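
As a rough back-of-the-envelope check of that claim (the 8086 launch figures and the "modern CPU" transistor count below are my own approximations, not numbers from the article or the comment):

```python
# Rough Moore's-Law check: double the 8086's transistor count every two years
# and compare against a present-day CPU at a similar inflation-adjusted price.
# All figures are approximate; the modern count is a ballpark for a ~$400 part.

LAUNCH_YEAR = 1978
LAUNCH_TRANSISTORS = 29_000     # Intel 8086 (approximate)
CURRENT_YEAR = 2023
MODERN_TRANSISTORS = 15e9       # rough guess for a ~$400 desktop CPU today

doublings = (CURRENT_YEAR - LAUNCH_YEAR) / 2
predicted = LAUNCH_TRANSISTORS * 2 ** doublings

print(f"Moore's-Law prediction: {predicted:.2e} transistors")
print(f"Typical modern CPU:     {MODERN_TRANSISTORS:.2e} transistors")
print(f"Shortfall factor:       {predicted / MODERN_TRANSISTORS:.0f}x")
```

With those assumptions the prediction comes out around 1.7e11 transistors, roughly an order of magnitude more than what's actually on offer at that price, which is the gap described above.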

[-] Chailles@lemmy.world 2 points 1 year ago

I think what they were wondering was that a larger computer chip doesn't seem like progress. The overall density of transistors is the same, so how exactly does scaling that up do anything? And why does using glass make it better?

Granted, reading the article answers exactly that (though I'll admit, I don't entirely understand it). The current material limits how much of the computer chip somehow, this new material allows for more... something.

[-] Agent641@lemmy.world 12 points 1 year ago

I won't rest until my CPU is delivered to my house on a pallet, just as God intended.

[-] herrvogel@lemmy.world 1 points 1 year ago

Longing for the day when I'll finally get to ask my friend if I can borrow his 18-wheeler so I can haul home the CPU I bought on craigslist.

[-] space@lemmy.dbzer0.com 3 points 1 year ago

The issue is that chip sizes are limited, among other things (like the speed of light), by the ability to take away the heat they produce. For example, the i9 processor die itself is not that big, but it needs a massive cooler.
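
A minimal sketch of that heat constraint (the wattage and cooler figures are illustrative assumptions, not specs for any real CPU): at steady state the junction temperature is roughly ambient plus dissipated power times the cooler's thermal resistance, so a high-wattage die forces you toward big coolers with very low °C/W.

```python
# Steady-state estimate: T_junction ≈ T_ambient + P * R_theta,
# where R_theta is the combined thermal resistance in °C per watt.
# Numbers below are illustrative, not measurements of any particular part.

def junction_temp(ambient_c: float, power_w: float, r_theta_c_per_w: float) -> float:
    return ambient_c + power_w * r_theta_c_per_w

for r_theta in (0.5, 0.2, 0.1):  # small cooler vs. large tower vs. big AIO
    t = junction_temp(ambient_c=25, power_w=250, r_theta_c_per_w=r_theta)
    print(f"R_theta = {r_theta:.1f} °C/W -> ~{t:.0f} °C at 250 W")
```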

[-] TonyTonyChopper@mander.xyz 3 points 1 year ago

The article is talking about how they make the substrates (the "boards") that CPU chiplets are placed on. The chiplets themselves use technology that keeps getting smaller. The point of upgrading the substrate is to fit more of these chiplets in one package, so you can process more information in parallel. That's why they're focusing on AI computing, which relies heavily on parallel processing.
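
To illustrate why packing more chiplets into one package mainly pays off for highly parallel workloads like AI, here's a quick Amdahl's-law sketch (my own example, not anything from the article):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = fraction of the work that parallelizes, n = number of parallel units (chiplets).

def speedup(parallel_fraction: float, units: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / units)

for n in (2, 4, 8, 16):
    print(f"{n:2d} chiplets: "
          f"p=0.95 -> {speedup(0.95, n):4.1f}x speedup, "
          f"p=0.50 -> {speedup(0.50, n):4.1f}x speedup")
```

With a mostly serial workload (p=0.50), sixteen chiplets barely double the throughput, while a highly parallel one (p=0.95) keeps scaling, which is why the focus is on AI-style computing.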

[-] Lemmylaugh@lemmy.ml 8 points 1 year ago

Whatever happened to quantum computing, which was supposed to replace integrated circuits for CPUs?

[-] Chobbes@lemmy.world 35 points 1 year ago

That’s not really something that’s on the horizon at all. There’s some experimental quantum computing hardware, but it’s not really practical for anything yet (and certainly not in a personal computer!). It’s also likely not going to be better at the stuff we use normal CPUs for. Eventually quantum computers might be useful for certain classes of problems, but probably in more of a coprocessor-like capacity (a side unit, like a GPU, that’s good at certain tasks). Obviously it’s unknown what the future holds, but I don’t think quantum computing is going to replace silicon any time soon.
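
One way to make the "certain classes of problems" point concrete (an illustrative sketch using Grover's quadratic speedup for unstructured search; it ignores every real-world hardware overhead): the query count only drops from roughly N to roughly √N, which is dramatic for enormous search spaces but irrelevant to most everyday workloads.

```python
# Compare classical brute-force search (~N queries) with Grover's algorithm
# (~(pi/4) * sqrt(N) queries) for an unstructured search over N items.
# Purely illustrative; real quantum hardware overheads are ignored.

import math

for bits in (20, 40, 60):
    n = 2 ** bits
    grover = (math.pi / 4) * math.sqrt(n)
    print(f"{bits}-bit search space: classical ~{n:.2e} queries, Grover ~{grover:.2e}")
```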

[-] GigglyBobble@kbin.social 16 points 1 year ago* (last edited 1 year ago)

Quantum computers were never supposed to replace conventional computers. Their theoretical performance is only superior for a specific set of problems that usually aren't relevant to everyday computing.

[-] Chobbes@lemmy.world 1 points 1 year ago

For sure, though I would not be entirely surprised if the class of problems we care about on a daily basis changes if quantum computing ever becomes commercially viable. But currently people mostly care about breaking cryptography, which… boo.

The primary use case for quantum computing is for governments to break encryption.

[-] Chobbes@lemmy.world 3 points 1 year ago

I just want to factor some large prime numbers as a treat 🥺

[-] foobaz@lemmy.world 2 points 1 year ago

It's just a hobby bro 🤷‍♂️

[-] DokPsy@infosec.pub 2 points 1 year ago

I think it'll take a new component/circuit design for quantum to be viable for home computing, similar to the transformation that happened to computers after the introduction of the transistor.

[-] frezik@midwest.social 3 points 1 year ago

As of yet, quantum computers need exotic cooling. Perhaps there will be some clever way around that, but it may not be solvable. That would keep it forever out of reach of common home or office use.

[-] DokPsy@infosec.pub -1 points 1 year ago

And digital computers needed vacuum tubes, relays, and entire buildings to work. With innovation and time, it'll become more easily handled.

[-] TimeSquirrel@kbin.social 1 points 1 year ago* (last edited 1 year ago)

With innovation and time, it’ll become more easily handled

Not if you're literally bumping against the laws of physics of the universe. There may be some things that will never come to pass, technologically. FTL travel might be one of them, for example.

[-] DokPsy@infosec.pub 1 points 1 year ago

Honestly, our understanding of the laws of physics is constantly in flux, and there's no telling what we could create to circumvent the limits we're currently pushing.

As I mentioned in my example: before the innovations with transistors, there was no way to make a portable computer. It was physically impossible.

[-] frezik@midwest.social 1 points 1 year ago

You can't just assume any one thing will work out. There are plenty of dead ends in technology.

[-] DokPsy@infosec.pub 1 points 1 year ago

While true, that doesn't mean we should stop. At worst, we find techniques that improve other areas of technology.

[-] bobman@unilem.org -2 points 1 year ago* (last edited 1 year ago)

I'd love to see us figure out a way to cool quantum computers for the same price it costs to power conventional ones.

Imagine what such efficiency gains would mean for food preservation in poor nations.

[-] DokPsy@infosec.pub 1 points 1 year ago

I'm more expecting innovations that reduce the need for the supercooling, but same.

[-] herrvogel@lemmy.world 1 points 1 year ago

I doubt quantum computing is ever gonna be viable for home computing. The benefits it offers over conventional computing are largely irrelevant to almost anything you might be doing at home, and better materials or manufacturing methods won't change that.

[-] DokPsy@infosec.pub 1 points 1 year ago

Depends on how we approach viability, imo

Can we currently see a reason for it with its current abilities/functions? No

But

We can look right at the history of conventional computing to predict a possible timeline for it:

- Single-purpose computational machines that took a lot of power, a lot of room, and were fairly rare. Used for military or research purposes.
- Multi-purpose machines that could run user-created calculations and were slightly smaller and more efficient. Begin to be used in more academic settings.
- Multipurpose machines capable of being used to aid general office staff; continue to become more compact and efficient.
- Portability becomes possible for the select few with a need.
- And so on, until we arrive at now, where nearly everything and everyone has a computer.

[-] Chocrates@lemmy.world 25 points 1 year ago* (last edited 1 year ago)

They are not general-purpose computers, and they currently need superconductors that require cryogenic cooling.

[-] jopepa@lemmy.world 8 points 1 year ago

Link to the YouTube video of Primitive Technology building one by their shingle drying rack.

[-] metaStatic@kbin.social 2 points 1 year ago

It will be a cloud subscription

[-] Chobbes@lemmy.world 1 points 1 year ago

Technically you can already rent time on a quantum computer from IBM… so it kind of already is!

[-] Kit@lemmy.blahaj.zone 3 points 1 year ago

I first read about this on the News app on the Nintendo Wii. 15 years ago. Wake me up when I can walk to Best Buy and buy one.

[-] bobman@unilem.org 3 points 1 year ago

Last I checked, glass has a very specific meaning in a scientific context.

this post was submitted on 20 Sep 2023
201 points (98.6% liked)
