submitted 7 months ago by alessandro@lemmy.ca to c/pcgaming@lemmy.ca
[-] henfredemars@infosec.pub 29 points 7 months ago* (last edited 7 months ago)

If you replace my GPU with AI hardware, are you implying that I'm supposed to use this AI hardware to drive my games instead of a GPU? It seems much more plausible that GPU will gain AI capabilities rather than be replaced by AI hardware. Why would I not need a GPU anymore?

This smells more like a big dream to me than an observation of where the industry is going.

[-] GlitterInfection@lemmy.world 13 points 7 months ago* (last edited 7 months ago)

They're saying that dedicated AI hardware could replace your GPU's functionality, so yes, it would be the thing game developers use to output pixels to your screen.

It's a bold statement in a way, but there's solid reason behind wanting to make this happen.

Generative AI is currently capable of creating images that contain extremely complex rendering techniques. Global illumination, with an infinite number of area light sources, is just kind of free. Subsurface Scattering is just how light works in this model of the world, and doesn't require pre-processing, multiple render passes, a g-buffer, or costly ray casts to get there. Reflections don't require screen-space calculations, irradiance probes, or any other weird tricks. Transparency is order independent because it doesn't make sense for it not to be, and light diffuses or diffracts because of course it does.
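To make the transparency point concrete: classic "over" alpha compositing in a rasterizer is order-dependent, which is why engines have to sort transparent surfaces back-to-front or reach for order-independent-transparency tricks. A toy single-channel sketch with made-up values:

```python
# Toy sketch of why rasterized transparency needs ordering: Porter-Duff
# "over" compositing gives a different pixel depending on draw order.
# Single-channel values, invented purely for illustration.
def over(src, src_alpha, dst):
    # src composited "over" dst
    return src_alpha * src + (1 - src_alpha) * dst

# Two 50%-transparent surfaces (values 1.0 and 0.0) over a black background:
a = over(0.0, 0.5, over(1.0, 0.5, 0.0))  # bright surface drawn first
b = over(1.0, 0.5, over(0.0, 0.5, 0.0))  # dark surface drawn first
print(a, b)  # 0.25 0.5 -- same two surfaces, different result
```

An image-generation model sidesteps this entirely because it never composites surfaces one draw call at a time.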

Modern high-end GPUs are hacks we have settled on for pushing information into pixels on the screen. They use a combination of a typical raster-based graphics pipeline, hardware-accelerated ray tracing, and AI upscaling and denoising techniques to approximate solutions to a lot of these problems.

There are definitely things developers would need before AI hardware could replace GPUs, such as some kind of temporal consistency and significantly more control over the resulting pixels.
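A tiny numpy sketch of what that temporal-consistency requirement looks like. Nothing here is a real renderer; `generate_frame` is a made-up stand-in for a generative model. The point is just that each frame has to be conditioned on the previous one, or a static scene flickers because every frame is sampled independently:

```python
import numpy as np

# Hypothetical per-frame generation loop: blend each new "inference" with
# the previous frame's latent so a static scene stays stable over time.
rng = np.random.default_rng(0)

def generate_frame(state, prev_latent, blend=0.9):
    # Stand-in for a generative model: noisy function of game state,
    # anchored to the prior frame by the blend term.
    fresh = np.tanh(state + rng.normal(0, 0.1, size=prev_latent.shape))
    return blend * prev_latent + (1 - blend) * fresh

latent = np.zeros(4)
frames = []
for t in range(60):
    state = np.full(4, 0.5)  # static scene: camera and lights unchanged
    latent = generate_frame(state, latent)
    frames.append(latent.copy())

# With the feedback term, consecutive frames differ only slightly:
deltas = [np.abs(frames[t] - frames[t - 1]).max() for t in range(1, 60)]
print(max(deltas))
```

Drop the `prev_latent` feedback (set `blend=0`) and each frame jumps independently with the noise, which is exactly the shimmering you'd see on screen.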

And to get both game developers and consumers to transition, you'd first need it to be part of even low-end GPUs for quite a while.

It's not a terrible idea at its core. It's definitely not 5-10 years out, though.

[-] not_that_guy05@lemmy.world 3 points 7 months ago

I don't know; it seems Nvidia knows what they're doing. They're still ahead of the game in AI because they got into it before everyone else. From what I heard on NPR, they're something like 5-10 years ahead of everybody else in the AI game. It might be a big dream, but given what they've already done with AI, it could be possible.

this post was submitted on 22 Mar 2024
12 points (70.0% liked)

PC Gaming


For PC gaming news and discussion. PCGamingWiki

Rules:

  1. Be Respectful.
  2. No Spam or Porn.
  3. No Advertising.
  4. No Memes.
  5. No Tech Support.
  6. No questions about buying/building computers.
  7. No game suggestions, friend requests, surveys, or begging.
  8. No Let's Plays, streams, highlight reels/montages, random videos or shorts.
  9. No off-topic posts/comments.
  10. Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-English sources. If the title is clickbait or lacks context you may lightly edit the title.)
