submitted 9 months ago by OttoVonNoob@lemmy.ca to c/memes@lemmy.ml
[-] RealFknNito@lemmy.world 57 points 9 months ago

Nvidia? Ew. Put it back in the chest.

Good format though ty.

[-] DmMacniel@feddit.de 13 points 9 months ago
[-] Rootiest@lemmy.world 4 points 9 months ago

Afaik AMD still doesn't have the same kind of support for AI software.

TBH I haven't had any issues gaming with my Nvidia card either

[-] Gabu@lemmy.world 8 points 9 months ago

AMD's support for AI is just fine; you just have to choose a path. If you're on Linux, use their CUDA translation stack (ROCm); if you're on Windows, use DirectML.
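
Rough sketch of what that looks like from PyTorch, just to show the two paths (ROCm builds reuse the torch.cuda API on Linux, and the torch-directml package exposes the GPU on Windows; exact package versions are something to check yourself):

```python
# Rough sketch, not a drop-in recipe: pick whichever AMD backend is present.
import torch

def pick_device():
    # ROCm builds of PyTorch reuse the torch.cuda API on Linux
    # (torch.version.hip is set there), so this also covers AMD GPUs.
    if torch.cuda.is_available():
        return torch.device("cuda")
    try:
        # Windows path: the torch-directml package exposes the GPU via DirectML.
        import torch_directml
        return torch_directml.device()
    except ImportError:
        return torch.device("cpu")

x = torch.randn(4, 4).to(pick_device())
print(x.device)
```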

[-] ylai@lemmy.ml 5 points 9 months ago* (last edited 9 months ago)

> AMD’s support for AI is just fine

This is quite untrue, especially if you do actual research and not just run other people's models. For example, ROCm support is missing from many sparse autograd frameworks, e.g. pytorch_sparse, and there is no viable ROCm alternative to Nvidia's MinkowskiEngine. You need these if you do any state-of-the-art convnets with attention-like sparsity.
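
To make it concrete, this is roughly the kind of sparse convolution I mean (toy shapes, purely illustrative; check the MinkowskiEngine docs for the exact constructor arguments). As far as I know it ships CUDA kernels only, with nothing equivalent on ROCm:

```python
# Toy example of a MinkowskiEngine sparse convolution; values are made up.
import torch
import MinkowskiEngine as ME

# Three points of a 3D point cloud: each row is (batch_index, x, y, z).
coords = torch.IntTensor([[0, 0, 0, 0],
                          [0, 1, 0, 0],
                          [0, 0, 1, 1]])
feats = torch.rand(3, 1)  # one feature channel per point

x = ME.SparseTensor(features=feats, coordinates=coords)
conv = ME.MinkowskiConvolution(in_channels=1, out_channels=8,
                               kernel_size=3, dimension=3)
y = conv(x)
print(y.F.shape)  # features of the sparse output tensor
```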
