ROCm wasn’t a thing when I bought. You need(ed) NVIDIA for machine learning and other GPGPU stuff.
Same for me. I had to buy an Alienware laptop with an NVIDIA GPU during my PhD for some GPGPU coding, since CUDA was pretty much the only choice back then and OpenCL was a joke in terms of performance and wasn't getting much love from GPU manufacturers. But now I know for sure I won't ever buy an NVIDIA GPU again: ROCm works wonderfully well even on an APU (in my case, a Radeon 680M integrated GPU), and it's also future-proof since you're almost writing CUDA code. If you ever switch back to an NVIDIA GPU, you'll mostly just have to replace "hip" with "cuda" in your code, plus a few magic constants (the warp length in particular).
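For anyone curious what that looks like in practice, here's a minimal sketch (assuming hipcc and a recent ROCm install, so details may differ on your setup). Every call has a one-to-one cuda* counterpart, and the main thing to audit when porting is anything hard-coded to the 64-wide AMD wavefront vs NVIDIA's 32-wide warp:

    // Minimal HIP example: the API mirrors CUDA almost symbol for symbol,
    // so a port back to NVIDIA is mostly s/hip/cuda/ on these calls.
    #include <hip/hip_runtime.h>

    __global__ void scale(float* x, float a, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // same built-ins as CUDA
        if (i < n) x[i] *= a;
    }

    int main() {
        const int n = 1 << 20;
        float* d_x;
        hipMalloc(&d_x, n * sizeof(float));             // cudaMalloc
        hipMemset(d_x, 0, n * sizeof(float));           // cudaMemset
        scale<<<(n + 255) / 256, 256>>>(d_x, 2.0f, n);  // identical launch syntax
        hipDeviceSynchronize();                         // cudaDeviceSynchronize
        hipFree(d_x);                                   // cudaFree
        return 0;
    }

The "magic constants" caveat shows up once you use warp-level intrinsics (shuffles, ballots): kernels tuned around a warp size of 64 on AMD hardware need rechecking at 32 on NVIDIA, and vice versa.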