this post was submitted on 21 Aug 2024
329 points (98.8% liked)
Technology
The problem was solved by Nvidia first, but it was AMD that made it cheap and accessible, without requiring a dedicated hardware module.
For years, Nvidia artificially inflated the price of many G-Sync screens by up to 150 euros, for no legitimate reason. Initially there was NO compatibility with FreeSync at all.
Nvidia wasn't kindly solving a gamers' problem, at least not beyond the first year after the tech's release. They were forcibly selling expensive hardware modules nobody needed or wanted, long after FreeSync showed you could do it just as well without that expensive requirement.
The hardware module they insisted on selling wasn't solving a technical problem, it was solving a money one.
I don't think anyone was ever actually able to tell the different "sync techs" apart.
There absolutely was a legitimate reason: the hardware of the time was not capable of processing the signals. They didn't use FPGAs on a whim; they used them because they were necessary to handle the signals properly.
And you just haven't followed the tech if you think they were indistinguishable. G-Sync has supported a much wider range of frame times over its entire lifespan.
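For anyone wondering what "wider range of frame times" means in practice: an adaptive-sync panel can only match frame rates inside its VRR window. Below the window, the driver has to repeat frames to keep the panel in range (AMD calls this Low Framerate Compensation). Here's a minimal sketch of that idea — the function name and the example numbers are illustrative, not any vendor's actual implementation:

```python
import math

def effective_refresh(frame_rate: float, vrr_min: float, vrr_max: float) -> float:
    """Illustrative: what refresh rate a VRR panel would run at for a given frame rate.

    Frame-doubling below the window only works cleanly when vrr_max >= 2 * vrr_min,
    which is why a wide VRR window matters.
    """
    if frame_rate > vrr_max:
        # Above the window: the panel caps at its maximum refresh rate.
        return vrr_max
    if frame_rate >= vrr_min:
        # Inside the window: the panel simply matches the frame rate.
        return frame_rate
    # Below the window: show each frame several times so the panel's
    # refresh rate lands back inside the supported range (the LFC idea).
    multiplier = math.ceil(vrr_min / frame_rate)
    return frame_rate * multiplier

# Hypothetical 48-144 Hz panel:
# 60 fps sits inside the window, 200 fps gets capped at 144 Hz,
# and 20 fps is tripled so the panel runs at 60 Hz.
```

A panel with a narrow window (say 48-75 Hz) can't frame-double at all, which is one concrete way the "sync techs" differed even when the marketing sounded identical.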