73
submitted 1 year ago by jannis@feddit.de to c/greentext@lemmy.ml
[-] AlmightySnoo@lemmy.world 5 points 1 year ago* (last edited 1 year ago)

> ROCm wasn't a thing when I bought. You need(ed) NVIDIA for machine learning and other GPGPU stuff

Same for me: I had to buy an Alienware laptop with an NVIDIA GPU during my PhD for some GPGPU coding, since CUDA was pretty much the only choice back then and OpenCL was a joke in terms of performance and wasn't getting much love from GPU manufacturers. But right now I know for sure I won't ever buy an NVIDIA GPU again. ROCm works wonderfully well even on an APU (in my case, a Radeon 680M integrated GPU), and it's also future-proof: you're almost writing CUDA code anyway, so if you ever switch back to an NVIDIA GPU you mostly just have to replace "hip" with "cuda" in your code, plus some magic constants (the warp length in particular).
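To illustrate the point about portability, here is a minimal, hypothetical HIP sketch (not from the comment). The API calls map to CUDA almost one-to-one by renaming the `hip` prefix to `cuda`; the main "magic constant" to watch is the warp size, which is 64 on many AMD GPUs but 32 on NVIDIA hardware.

```cpp
// Minimal HIP sketch: scale an array on the GPU.
// Porting to CUDA is mostly s/hip/cuda/ on the runtime calls.
#include <hip/hip_runtime.h>

__global__ void scale(float* data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // identical in CUDA
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;
    float* d;
    hipMalloc(&d, n * sizeof(float));      // CUDA: cudaMalloc
    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
    hipDeviceSynchronize();                // CUDA: cudaDeviceSynchronize
    hipFree(d);                            // CUDA: cudaFree
}
```

Warp-dependent code (e.g. shuffle-based reductions) should use the built-in `warpSize` variable rather than a hard-coded 32 or 64, so the same source stays correct on both vendors.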

this post was submitted on 29 Sep 2023