[-] SkyezOpen@lemmy.world 2 points 5 months ago

90 Hz is generally enough for most people not to get motion sick. Some headsets do 120 Hz, which works out to roughly an 8 ms frame time. Humans can barely detect a flash of light that lasts that long.
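(For reference, the frame-time figure is just the reciprocal of the refresh rate; a minimal Python sketch of that arithmetic, added here as an illustration rather than anything from the original comment:)

```python
# Frame time in milliseconds is the reciprocal of the refresh rate in Hz.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (90, 120, 1000):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per frame")
# 90 Hz   -> 11.11 ms per frame
# 120 Hz  ->  8.33 ms per frame  (the "like 8 ms" figure above)
# 1000 Hz ->  1.00 ms per frame
```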

[-] chonglibloodsport@lemmy.world 2 points 5 months ago

You absolutely can tell the difference between 90 Hz and 1 kHz. Just draw a squiggly line! See this video for a rather dramatic demonstration:

Microsoft Research: High Performance Touch

[-] SoleInvictus@lemmy.blahaj.zone 3 points 5 months ago* (last edited 5 months ago)

This is a demonstration of latency, not frame rate. Did you intend to link something different?

[-] chonglibloodsport@lemmy.world 3 points 5 months ago

A 1000 Hz display necessarily has only 1 ms between frames, so the display itself imposes a baseline latency of 1 ms. For 100 Hz, that's 10 ms.

But that's only the lower bound: you also have to include every other source of latency, such as input hardware, drivers, software, and the graphics card.
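(A minimal Python sketch of that lower-bound reasoning; the non-display latency figures are purely hypothetical placeholders, not measurements from the thread:)

```python
# The display's refresh interval sets the floor; everything else in the
# pipeline only adds on top of it. The per-stage figures below are made-up
# illustration values.
def refresh_interval_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

other_sources_ms = {
    "input hardware": 2.0,    # hypothetical
    "drivers/software": 3.0,  # hypothetical
    "graphics card": 5.0,     # hypothetical
}

for hz in (100, 1000):
    floor = refresh_interval_ms(hz)
    total = floor + sum(other_sources_ms.values())
    print(f"{hz} Hz: >= {floor:.0f} ms from refresh alone, "
          f"~{total:.0f} ms with the example pipeline stages")
```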

[-] SoleInvictus@lemmy.blahaj.zone 2 points 5 months ago* (last edited 5 months ago)

Ahhh, now I see the connection! It's the update interval. I had to chew on it for a minute but the math checks out.

[-] biber@feddit.de 1 points 5 months ago

The last sentence is simply incorrect: humans can detect single photons under controlled conditions. https://www.nature.com/articles/ncomms12172

In real-world environments it depends very much on the brightness of the flash of light.
