submitted 1 year ago by ijeff@lemdro.id to c/technology@beehaw.org
[-] upstream@beehaw.org 7 points 1 year ago

Apple has shown that the market could be willing to adapt.

But then again, they’ve always had more leverage than the Wintel-crowd.

But what people seem to ignore is that there is another option as well: hardware emulation.

IIRC, old AMD CPUs, notably the K6, actually used a RISC core with a translation layer turning x86 instructions into the necessary chain of RISC instructions.

That could also be a potential approach, rather than swapping architectures outright. If 80% of your code runs natively and the remaining 20% goes through this hardware translation layer, where the energy loss is bigger than the performance loss, you might have a compelling product.
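The K6-style front end described above can be sketched very roughly: a complex x86 instruction is "cracked" into a short sequence of simple RISC-like micro-ops that the core actually executes. Everything here (the micro-op names, the lookup-table approach) is purely illustrative, not how any real decoder is implemented.

```python
# Hypothetical sketch of x86-to-RISC-op cracking. A real decoder works on
# binary encodings and generates micro-ops in hardware; this table just
# illustrates the 1:1 vs. 1:many mapping the comment describes.
def crack(x86_insn: str) -> list[str]:
    table = {
        # register-register add maps 1:1 to a single ALU micro-op
        "add eax, ebx":     ["alu_add  eax, eax, ebx"],
        # a memory operand needs a separate load micro-op first
        "add eax, [ebp+8]": ["load     tmp0, ebp, 8",
                             "alu_add  eax, eax, tmp0"],
        # push is a stack-pointer update plus a store
        "push eax":         ["alu_sub  esp, esp, 4",
                             "store    eax, esp, 0"],
    }
    return table[x86_insn]
```

The point of the split is that the simple micro-ops can be scheduled and executed by a uniform RISC-style back end, while only the decoder has to know x86.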

[-] DaPorkchop_@lemmy.ml 6 points 1 year ago

Virtually all modern x86 chips work that way

[-] barsoap@lemm.ee 2 points 1 year ago* (last edited 1 year ago)

Microcoding has been a thing since the 1950s; it's the default. Early RISCs tried to do away with it, and for a brief time RISCs weren't microcoded kinda by definition, but it snuck back in because it's just too useful compared to hard-wiring everything. You can maybe get away with it on MIPS, but Arm? Tough luck. RISC-V can be done, and it can make microcontroller-scale chips simpler, but you can also implement the RV32I (full) instruction set in terms of RVC (the compressed subset) and be faster. Not to mention that when you get to things like the vector extensions, you definitely want to use microcode. The Cray-1 was hardwired, but Cray, too, dropped that approach for a reason.

I guess in modern days RISC more or less means "a decent chunk of the instruction set will not be microcoded but can instead be used as microcode", whereas with modern CISC processors the instruction set and the microcode may have no direct correspondence at all.
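The RVC-to-RV32I relationship mentioned above is concrete enough to show: each 16-bit compressed instruction expands deterministically into one full 32-bit instruction. A minimal sketch for one opcode, C.ADDI, using the encoding from the RISC-V spec (the decoded-dict representation is my own shorthand):

```python
def expand_c_addi(insn16: int) -> dict:
    """Expand a 16-bit RVC C.ADDI into its RV32I equivalent, ADDI rd, rd, imm.

    C.ADDI encoding: opcode=01 (bits 1:0), funct3=000 (bits 15:13),
    rd/rs1 in bits 11:7, imm[5] in bit 12, imm[4:0] in bits 6:2.
    """
    assert insn16 & 0b11 == 0b01 and (insn16 >> 13) == 0b000
    rd = (insn16 >> 7) & 0x1F
    imm = ((insn16 >> 12) & 1) << 5 | (insn16 >> 2) & 0x1F
    if imm & 0x20:        # sign-extend the 6-bit immediate
        imm -= 64
    return {"op": "addi", "rd": rd, "rs1": rd, "imm": imm}
```

Because every RVC instruction has exactly one such expansion, a core can treat the compressed set as the "real" instructions and the full set as sugar on top, which is the inversion the comment is getting at.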

[-] DJDarren@thelemmy.club 5 points 1 year ago

Apple has shown that the market could be willing to adapt.

It's less that they'll adapt, and more that they don't really care. And particularly in the case of Apple users: their apps are (mostly) available on their Macs already. The vast majority of people couldn't tell you what architecture their computer runs on and will just happily use whatever works and doesn't cost them the earth.

[-] upstream@beehaw.org 1 points 1 year ago

I didn’t mean the customers, but sure.

this post was submitted on 28 Oct 2023
124 points (100.0% liked)
