97 points · submitted 11 months ago by misk@sopuli.xyz to c/technology@lemmy.world

Here comes the push.

[-] just_another_person@lemmy.world 7 points 11 months ago

"Quick...we need to get everyone to buy our GPU-based bullshit before people figure out FPGA is wildly better"

[-] drdabbles@lemmy.world -1 points 11 months ago

Dedicated ASICs are where all the hotness lies. The flexibility of FPGAs doesn't seem to overcome their overhead for most users. Not sure if that will change when custom ASICs become too expensive again and all the magic money furnaces run out of bills to burn.

[-] just_another_person@lemmy.world -1 points 11 months ago

ASICs are single-purpose, with the benefit of potential power-efficiency improvements. Not at all useful for something like running neural networks, especially not when they are being retrained and updated.

FPGAs are fully (re)programmable. There's a reason why datacenters don't lease ASIC instances.

[-] drdabbles@lemmy.world -2 points 11 months ago

Not at all useful for something like running neural networks

Um. lol What? You may want to do your research here, because you're so far off base I don't think you're even playing the right game.

There’s a reason why datacenters don’t lease ASIC instances.

Ok, so you should just go ahead and tell all the ASIC companies then.

https://www.allaboutcircuits.com/news/intel-and-google-collaborate-on-computing-asic-data-centers/

https://www.datacenterfrontier.com/servers/article/33005340/closer-look-metas-custom-asic-for-ai-computing

https://ieeexplore.ieee.org/document/7551392

Seriously. You realize that the most successful TPUs in the industry are ASICs, right? And that all the "AI" components in your phone are too? What are you even talking about here?

[-] just_another_person@lemmy.world 0 points 11 months ago

TPUs are tied to specific model frameworks, and engineers avoid them for that reason. The most successful adoptions so far are vendor-locked-in NN models à la Amazon (Trainium) and Google (Coral), and neither has wide adoption because their scope is so limited. The GPU's flexibility in this arena is exactly why companies like OpenAI are struggling to justify the cost of using them over TPUs: GPUs are easy to get running up front, but the cost is insane, and TPUs are even more expensive in most cases. TPUs are also inflexible should you need to do something like multi-model inference (detection+evaluation+result...etc).
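
To make the multi-model point concrete, here's a rough Python sketch of a chained detection-then-evaluation pipeline. The stage functions, box shapes, and sizes are purely hypothetical stand-ins; the point is that on flexible hardware either model can be retrained or swapped without touching the other stage, whereas an accelerator baked for one graph makes this kind of chaining awkward.

```python
# Hypothetical multi-model pipeline: detect regions, then score each crop.
# Both stages are stand-ins for real models.
import numpy as np

def detect(frame: np.ndarray) -> list[tuple[int, int, int, int]]:
    """Stand-in detector: returns bounding boxes as (x, y, w, h)."""
    return [(0, 0, 32, 32), (40, 40, 32, 32)]

def evaluate(crop: np.ndarray) -> float:
    """Stand-in second-stage model: returns a score for one crop."""
    return float(crop.mean())

def pipeline(frame: np.ndarray) -> list[float]:
    boxes = detect(frame)                                    # model 1
    crops = [frame[y:y + h, x:x + w] for x, y, w, h in boxes]
    return [evaluate(c) for c in crops]                      # model 2

if __name__ == "__main__":
    print(pipeline(np.random.rand(128, 128)))
```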

As I said, ASICs are single-purpose, so you're stuck running a limited model engine (TensorFlow) and instruction set. They also take a lot of engineering effort to design, so unless you're going all-in on a specific engine and expecting to be good for years, it's short-sighted. If you read up, you'll see the most commonly deployed edge boards in the world are...Jetsons.

Enter FPGAs.

FPGAs show speedups in the 2x-5x range for specific workloads like transcoding and inference, and much higher for ML purposes and in-memory datasets (think Apache Ignite+Arrow workloads), all at a massive reduction in power and cooling, which is obviously very attractive for datacenters to put into production. The newest slew of chips is even reprogrammable "on the fly": a simple context switch and flash takes milliseconds, so multi-purpose workloads can coexist in a single application, where this was problematic before.
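
As a back-of-the-envelope illustration of why the power reduction matters to a datacenter, here's a tiny sketch; every throughput and wattage figure below is a placeholder, not a measurement of any real device.

```python
# Perf-per-watt comparison with made-up numbers, for illustration only.
gpu  = {"throughput": 1.0, "watts": 300.0}   # baseline accelerator
fpga = {"throughput": 3.0, "watts": 75.0}    # hypothetical 3x speedup at 1/4 the power

def perf_per_watt(dev: dict) -> float:
    return dev["throughput"] / dev["watts"]

ratio = perf_per_watt(fpga) / perf_per_watt(gpu)
print(f"perf/W advantage: {ratio:.1f}x")     # 12.0x with these placeholder numbers
```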

So unless you've got some articles about the most prescient AI companies currently using GPUs and moving to ASICs, the field is wide open for FPGAs, and datacenter adoption says it's the path forward unless Nvidia starts kicking out more efficient devices.

[-] drdabbles@lemmy.world -1 points 11 months ago

Now ask OpenAI to type out for you what the drawbacks of FPGAs are. Also, the newest slew of chips is using partially charged NAND gates instead of FPGAs.

Almost all the ASIC silicon being used right now implements the basic math functions, activations, etc., and the higher-level work happens in more generalized silicon. You cannot get the transistor densities necessary for modern accelerator work in an FPGA.
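
For a concrete picture of what "basic math functions, activations" means here, this is a minimal NumPy sketch of the kind of integer matrix-multiply-plus-activation kernel that fixed-function blocks typically bake into silicon, with layer sequencing left to general-purpose code. The shapes, dtypes, and scale factor are arbitrary examples, not any vendor's actual kernel.

```python
import numpy as np

# The part fixed-function accelerator blocks typically implement:
# an integer matrix multiply followed by an activation, at fixed precision.
def int8_matmul_relu(x: np.ndarray, w: np.ndarray, scale: float) -> np.ndarray:
    acc = x.astype(np.int32) @ w.astype(np.int32)   # int8 inputs, int32 accumulate
    out = np.maximum(acc, 0)                        # ReLU activation
    return (out * scale).astype(np.float32)         # dequantize for the next stage

# The "higher-level work" (layer ordering, control flow) stays in general code.
def tiny_mlp(x, w1, w2, scale=0.01):
    h = int8_matmul_relu(x, w1, scale)
    return h @ w2                                   # e.g. a float head on general silicon

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x  = rng.integers(-128, 127, size=(1, 8), dtype=np.int8)
    w1 = rng.integers(-128, 127, size=(8, 16), dtype=np.int8)
    w2 = rng.standard_normal((16, 4)).astype(np.float32)
    print(tiny_mlp(x, w1, w2).shape)                # (1, 4)
```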

[-] just_another_person@lemmy.world 0 points 11 months ago

Friend, I do this for a living, and I have no idea why you're even bringing gating into the equation, because it doesn't even matter.

I'm assuming you're a big crypto fan, because that's about the only thing I could say ASICs are good for in an HPC-type environment. Companies paying the insane amounts of money for "AI" right now want a CHEAP solution, and ASIC is the most short-term, e-wastey, inflexible answer to that problem. When you get a job in the industry and understand the different vectors, let's talk. Otherwise, you're just spouting junk.

[-] drdabbles@lemmy.world 0 points 11 months ago

I’m assuming you’re a big crypto fan

Swing and a miss.

because that's about the only thing I could say ASICs are good for in an HPC-type environment

Really? Gee, I think switching fabrics might have a thing to tell you. For someone that does this for a living, to not know the extremely common places that ASICs are used is a bit of a shock.

want a CHEAP solution

Yeah, I already covered that in my initial comment, thanks for repeating my idea back to me.

and ASIC is the most short-term

Literally being attached to the Intel tiles in Sapphire Rapids and beyond. Used in every switch, network card, and millions of other devices. Every accelerator you can list is an ASIC. Shit, I've got a Xilinx Alveo 30 in my basement at home. But yeah, because you can get an FPGA instance in AWS, you think ASICs aren't used. lmao

e-wastey

I've got bad news for you about ML as a whole.

inflexible

Sometimes the flexibility of a device's application isn't the device itself, but how it's used. Again, if I can do thousands, tens of thousands, or hundreds of thousands of integer operations in a tenth of the power, and a tenth of the clock cycles, then load those results into a segment of activation functions that can do the same, and all I have to do is move this data with HBM and perhaps add some cheap ARM cores, bridge all of this into a single SoC product, and sell them on the open market, well then I've created every single modern ARM product that has ML acceleration. And also nvidia's latest products.

Whoops.

When you get a job in the industry

I've been a hardware engineer for longer than you've been alive, most likely. I built my first FPGA product in the 90s. I strongly suspect you just found this hammer and don't actually know what the market as a whole entails, let alone the long LONG history of all of these things.

Do look up ASICs in switching, BTW. You might learn something.

[-] just_another_person@lemmy.world 0 points 11 months ago

Let's just shut this down right now. If you ever built FPGAs, it was in college in the 90s, in an awful program at a US university that trained you in SQL on the side and had zero idea of how hardware works. I'm sorry for that.

The world has changed since 30 years ago, and the future of integer operations is in reprogrammable chips. All the benefit of a fab chip, and none of the downside in a cloud environment.

The very idea that you think all these companies are looking to design and build their own single purpose chips for things like inference shows you have zero idea of where the industry is headed.

You're only describing how ASIC is used in switches, cool. That's what it's meant for. That's not how general use computing works in the world anymore, buddy. It's never going to be a co-proc in a laptop that can load models and do general inference, or be a useful function for localized NN. It's simply for single-purpose uses, as you said.

[-] drdabbles@lemmy.world -1 points 11 months ago

I mean, you're such an absolute know-nothing that it's hilarious. Nice xenophobic bullshit sprinkled in too. Sorry, no university for me, let alone FPGA in university in the 90s. When my friends were in university they were still spending their time learning Java.

The world has changed since 30 years ago

Indeed. And people like me have been there every step of the way. Your ageism is showing.

and the future of integer operations is in reprogrammable chips

Yes, I remember hearing this exact sentiment 30 years ago. Right around the time we were hearing (again) how neural networks were going to take over the world. People like you are a dime a dozen and end up learning their lessons in a painfully humbling experience. Good luck with that, I hope you take it for the lesson it is.

All the benefit of a fab chip

Except the amount of wasted energy, and the extreme amount of logic necessary to make it actually work. You know. The very fucking problem everybody's working hard to address.

The very idea that you think all these companies are looking to design and build their own single purpose chips

The very idea that you haven't kept up with the industry and how many companies have developed their own silicon is laugh out loud comedy to me. Hahahaha. TSMC has some news for you.

You’re only describing how ASIC is used in switches

Nope, I actually described how they are used in SoCs, not in switching fabrics.

That’s not how general use computing works in the world anymore, buddy

Except all those Intel processors I mentioned, those ARM chips in your iPhones and Pixels, the ARM processors in your macbooks. You know. Real nobodies in the industry.

It’s never going to be a co-proc in a laptop that can load models and do general inference, or be a useful function for localized NN.

Intel has news for you. It's impressive how in touch you pretend to be in "the industry" but how little you seem to know about actual products being actually sold today.

Hey, quick question. Does nvidia have FPGAs in their GPUs? No? Hmm. Is the H100 just a huge set of FPGAs? No? Oh, weird. I wonder why, since you in all your genius have said that's the way everybody's going. Strange that their entire product roadmap shows zero FPGAs on their DPUs, GPUs, or on their soon-to-arrive SoCs. You should call Jensen, I bet he has so much to learn from a know-it-all like you who has some amazing ideas about US universities. Hey, where is it that all these tech startup CEOs went to university?

Tell you what. Don't bother responding, nothing you've said holds any water or value.

[-] just_another_person@lemmy.world 0 points 11 months ago

Because literally everyone else saw the writing on the wall and is preparing FPGA chips EXCEPT for NVIDIA. 🤦

NVidia is just now trying to make their own ARM chips ffs. 5 years late. You're dated and outmoded. Get with the future.

[-] victorz@lemmy.world 0 points 11 months ago

Aaand we're back on Reddit again...

[-] drdabbles@lemmy.world 0 points 11 months ago

They can be a xenophobic, ageist jagoff all they want. I'm not engaging with them anymore. They're the carpenter that thinks a hammer solves all problems, if we pretend they actually did anything with FPGA as their day job.

[-] ZahzenEclipse@kbin.social 3 points 11 months ago* (last edited 11 months ago)

They didn't say anything xenophobic. They may have played into the ageist stuff, but it was after you tried to play the "I've been doing this since before you were born" card, and that makes it fair game imo. You were being unnecessarily aggressive from the start of this exchange and I think they were matching your energy. This is my outside perspective, but you may lash out at me too. I don't have a dog in this fight; I'd have to do research before figuring out whether either of you knows what you're talking about.

[-] drdabbles@lemmy.world -3 points 11 months ago
[-] victorz@lemmy.world 1 points 11 months ago

Kinda skipping over everything else he said but 👍

[-] drdabbles@lemmy.world -1 points 11 months ago

Not really. Not worth responding to the rest.

[-] victorz@lemmy.world 0 points 11 months ago

Uh huh... Bet it's not. Because that would make things difficult.

And the classic downvote for calling it out. 😆👍 Hard to shake the Reddit ways.

Have a good one!

[-] drdabbles@lemmy.world -1 points 11 months ago

Let's review. They said there was nothing xenophobic. But the original weirdo had some BS to say about American universities. Textbook definition. They said there was no ageism until I said something, completely ignoring the fact that the original person initially claimed I was younger than them so I had no experience. I responded noting that I've been in the industry quite some time, not as an argument from experience but as a retort to the claim that I was new to all of this.

The fact is, the person you're now defending clearly didn't read the thread, and you're just here concern trolling. I provided links to rebut the frankly idiotic claim that ASICs aren't a far more popular choice than FPGAs, and it's hysterical to see you people coming through worried about the discourse rather than the facts of the matter.

Bye now.

[-] victorz@lemmy.world 1 points 11 months ago* (last edited 11 months ago)

Discourse was the fun part. I don't give a shit about what you two are actually talking about lol. Sounds too complicated for me.

I'm not defending anyone. You both were bickering equally toward one another. I'm not on your or their side.

Nice downvote. 👍 Have a good one!

[-] victorz@lemmy.world 1 points 11 months ago* (last edited 11 months ago)

Granted, it was a very controlled Reddit argument. It had all the elements, but with a bit of class. 😆

[-] ZahzenEclipse@kbin.social 1 points 11 months ago

Social media will always devolve into that. I like seeing arguments personally, but when it devolves into name-calling and ego-stroking it gets annoying real quick.

[-] victorz@lemmy.world 1 points 11 months ago

Sure does 😮‍💨 Take care!

[-] ZahzenEclipse@kbin.social -1 points 11 months ago* (last edited 11 months ago)

You're such a dick for no reason. It definitely bolsters your claims that you're an old-school tech guy lol

[-] drdabbles@lemmy.world 0 points 11 months ago

Not for no reason. They made claims, I provided links, they whined about it. They provided zero links backing up their 40-year-old claim that FPGAs would replace anything that didn't run away fast enough.
