The only reason I can think of that disabling the monitor in Device Manager would help is that it disables FreeSync or HDR, or stops Windows from trying to send CEC power-saving commands or whatever.
I have no idea why it's a fix, but it's clearly stopping Windows from trying to do something that causes the screen to flash.
I would be surprised if switching between limited and full range fixed it. That's a hangover from old broadcast standards; it doesn't change the data rate.
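Quick sketch of why that setting shouldn't matter for bandwidth (my own illustration, not something from the original post): limited range just remaps the same 8-bit values into 16–235 instead of 0–255, so the same number of bits goes over the cable either way.

```python
# Rough sketch (my numbers, not from the thread): full-range RGB uses the
# whole 0-255 scale, limited range squeezes it into 16-235. Each pixel is
# still 8 bits per channel, so the link data rate is identical.
def full_to_limited(value: int) -> int:
    """Map a full-range 8-bit value (0-255) to limited range (16-235)."""
    return round(16 + (value / 255) * (235 - 16))

for v in (0, 128, 255):
    print(v, "->", full_to_limited(v))
# 0 -> 16, 128 -> 126, 255 -> 235 -- same bits on the wire either way.
```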
It seems like it's working at lower data rates?
So 1440p60 is working but 1440p144 isn't?
That points to an HDMI cable issue, a GPU issue, or a GPU driver issue (I mention the GPU/driver because of FreeSync).
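Back-of-envelope numbers on why the refresh rate matters (the blanking overhead here is an assumed rough figure, not an exact CVT-RB timing): 1440p at 144 Hz needs well over twice the cable bandwidth of 1440p at 60 Hz, so a marginal cable or port can be fine at 60 Hz and flaky at 144 Hz.

```python
# Back-of-envelope sketch: approximate link bandwidth for 2560x1440 at
# different refresh rates. The ~10% blanking overhead is an assumption.
def hdmi_gbps(h_active=2560, v_active=1440, refresh=60, bpp=24,
              blanking=1.10):
    pixels_per_second = h_active * v_active * refresh * blanking
    return pixels_per_second * bpp / 1e9  # gigabits per second

print(f"1440p60  ~ {hdmi_gbps(refresh=60):.1f} Gbit/s")   # ~5.8 Gbit/s
print(f"1440p144 ~ {hdmi_gbps(refresh=144):.1f} Gbit/s")  # ~14.0 Gbit/s
```

For comparison, HDMI 1.4 tops out around 10.2 Gbit/s and HDMI 2.0 around 18 Gbit/s, so 1440p144 generally wants a decent HDMI 2.0 (or better) cable and port, which fits the "works at 60 Hz, flashes at 144 Hz" pattern.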