[-] Zamundaaa@discuss.tchncs.de 1 points 3 days ago* (last edited 3 days ago)

Again, the reference luminance mapping is all about how applications should use the Wayland protocol.

How to map SDR to HDR can indeed be made much more complicated, from simple gamma adjustments to full-blown inverse tone mapping (ITM) meant for images or videos, like what BT.2446 suggests, but as far as applications are concerned, those are edge cases that they don't really need to be prepared for.

It's not like they have a different choice either - unless the compositor supports custom reference luminance levels (which KWin does, but not all others do), and the application supports custom reference luminance itself along with some logic to calculate the resulting peak luminance levels, it has to rely on the common expectations. If the compositor steps outside of those common expectations for reference luminance mapping, the result may not be ideal, but there is no way for the application to do better.
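
As a concrete illustration, here's a minimal sketch of that common mapping in Python - decode sRGB to linear light, then scale so that SDR white lands on the reference luminance. The function names and the 203 cd/m² default are mine, not from any API:

```python
def srgb_display_eotf(v: float) -> float:
    """Pure power-2.2 decoding, as the sRGB spec defines for the display."""
    return v ** 2.2

def sdr_to_absolute_nits(v: float, reference_nits: float = 203.0) -> float:
    """Map a non-linear sRGB code value to absolute luminance in cd/m².

    The 203 cd/m² default is an assumption (BT.2408's reference white).
    """
    return srgb_display_eotf(v) * reference_nits

print(sdr_to_absolute_nits(1.0))  # 203.0 - SDR white hits the reference luminance
print(sdr_to_absolute_nits(0.5))  # ~44.2 - everything below stays proportional
```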

[-] Zamundaaa@discuss.tchncs.de 1 points 3 days ago* (last edited 3 days ago)

For one, it assumes all sRGB monitors utilize gamma 2.2 for decoding images

Assuming that all monitors do any specific thing at all would indeed be folly - but no, there are no assumptions there: the sRGB spec has no ambiguity when it comes to the transfer function of the display.

That a certain percentage of displays don't behave as expected is annoying, but doesn't really change anything (beyond allowing the user to change the assumed transfer function in SDR mode).

This is why Windows HDR uses the inverse OETF. Decoding content graded on a pure 2.2 display with the inverse OETF is way better than decoding content graded on an inverse-OETF display with a pure 2.2 curve. Windows took the safe route of making sure most content looks at least OK. I would not say that Windows HDR is wrong; it's not right, but it's not wrong either. This is just the mess that sRGB gave us.

The most likely actual reason Windows uses the piece-wise transfer function for HDR is that it already did so in SDR mode - where, however, the default ICC profile was also piece-wise sRGB, so the two canceled out on 99% of PCs and had no negative effects.
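
To make the difference concrete, here's a minimal sketch of the two decoding curves in question (the standard formulas; the sample values are just illustration):

```python
def gamma22_eotf(v: float) -> float:
    """What the sRGB spec defines for the display: a pure 2.2 power curve."""
    return v ** 2.2

def srgb_inverse_oetf(v: float) -> float:
    """The piece-wise sRGB encoding function, inverted (what Windows uses)."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# The curves nearly match for bright values and diverge near black,
# which is exactly where content decoded with the "wrong" one shifts:
for code in (0.02, 0.1, 0.5, 1.0):
    print(f"{code}: gamma 2.2 -> {gamma22_eotf(code):.5f}, "
          f"inverse OETF -> {srgb_inverse_oetf(code):.5f}")
# at code value 0.02, gamma 2.2 decodes ~8x darker than the inverse OETF
```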

Another time you should be using the inverse sRGB OETF to linearize content is when the original content was encoded using the sRGB OETF and you want to get back to that working data, but this applies less to compositors and more to authoring workflows.

Makes sense.

People have been adjusting monitor brightness for ages. Sometimes manually, sometimes with DDC etc.

That's a very different thing. Pushing viewing environment adjustments to the display side makes some amount of sense with SDR monitors - when you get an SDR display with increased luminance capabilities vs. the old one, you adjust the monitor so it displays the content comfortably in your environment.

With HDR though, if the operating system considers PQ content to be absolute in luminance, you can't properly adjust that on the monitor side anymore: a lot of monitors completely lock you out of brightness controls in HDR mode, and the vast majority of those that do allow adjustments only let you reduce luminance, not increase it above "PQ absolute".
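
For reference, "absolute in luminance" is baked into the PQ transfer function itself. A minimal sketch of the ST 2084 EOTF, with the constants straight from the spec:

```python
# The ST 2084 (PQ) EOTF maps a code value directly to an absolute
# luminance in cd/m², with no reference to the display's capabilities -
# which is why a monitor treating PQ as absolute has no headroom left
# to adjust brightness upwards.

M1 = 2610 / 16384          # 0.1593...
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Decode a normalized PQ signal in [0, 1] to luminance in cd/m²."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(1.0))    # 10000.0 - the fixed PQ signal peak
print(pq_eotf(0.58))   # ~202 - roughly BT.2408's HDR reference white
```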

Another issue that is brought up is "graphics white": BT.2408 is a suggestion, not a hard-coded spec, and many different specs or suggestions use a different "graphics white" value.

I didn't claim that PQ has only one specification that uses it; I split up SMPTE ST 2084, rec.2100 and BT.2408 for a reason. I didn't dive into it further because a hundred pages covering every detail that's irrelevant in practice would be counterproductive to people actually learning useful things.

A good example of this is JXL.

Can you expand on what you mean by that?

2408 also very explicitly says ‘The signal level of “HDR Reference White” is not directly related to the signal level of SDR “peak white”.’

That "directly" is very important, because the spec does still very much make both of these signal levels the same. As I wrote in the blog post, the spec is all about broadcasts and video.

Other systems do sometimes split these two things up, but that nearly always just results in a bad user experience. I won't rant any more about the crapshow that is HDR on Windows, but my LG TV cranks the brightness of its UI up to the absolute maximum while an HDR video is playing. If it adhered to the recommendations of BT.2408, it would work much better.
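
In numbers: BT.2408 puts HDR reference white / graphics white at 203 cd/m², which sits at a PQ signal level of roughly 58%. A minimal sketch of the inverse EOTF (ST 2084 constants repeated so the snippet stands alone):

```python
# Encoding absolute luminance back into a PQ signal shows where
# "graphics white" sits: 203 cd/m² is the signal level that SDR white
# maps to when both are treated as the same thing.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_inverse_eotf(nits: float) -> float:
    """Encode an absolute luminance in cd/m² as a normalized PQ signal."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

print(pq_inverse_eotf(203.0))  # ~0.58 - HDR reference white / graphics white
print(pq_inverse_eotf(100.0))  # ~0.51 - the old "SDR mastering is 100 nits" level
```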

This is important to note, because it directly contradicts some of the seemingly core assumptions made in the article, and even some of the bullet points, like “a reference luminance, also known as HDR reference white, graphics white or SDR white” and “SDR things, like user interfaces in games, should use the reference luminance too”

No contradictions at all. The Wayland protocol defines these things to be the same, so for application developers they just are the same, end of story.

This needs to be expanded upon: this does NOT correlate to what the general user understands HDR and SDR to be. HDR and SDR in terms of video content are no more than marketing terms, and without context it can be hard to define what they are. However, it is abundantly clear from this quote that how they are interpreting HDR and SDR (which is a very valid, technically inclined way of interpreting it) does NOT fall in line with general user expectations.

That's just absolute nonsense. The very, very vast majority of users do not have any clue whatsoever what transfer function content is using, or even what a transfer function, a buffer encoding or even a buffer is; the only difference they can see is that HDR gets brighter than SDR.

And again, this too is about how applications should use the Wayland protocol. This is the only way to define it that makes any sense.

[-] Zamundaaa@discuss.tchncs.de 37 points 2 months ago

That's not right. Most monitors use 8 bits per color / 24 bits per pixel, though some are still using 6 bpc / 18 bpp.

HDR doesn't mean or really require more than 8 bpc; it's more complicated than that. To skip all the complicated details: it means more brightness, more contrast and better colors, and it makes a big difference, especially on OLED displays.
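
The arithmetic behind those numbers, as a quick sketch:

```python
# Bits per color (bpc) vs. bits per pixel (bpp) for an RGB pixel,
# and how many distinct values each depth yields.
for bpc in (6, 8, 10):
    levels = 2 ** bpc
    print(f"{bpc} bpc = {bpc * 3} bpp: {levels} levels per channel, "
          f"{levels ** 3:,} colors")

# 6 bpc  = 18 bpp: 64 levels per channel, 262,144 colors
# 8 bpc  = 24 bpp: 256 levels per channel, 16,777,216 colors
# 10 bpc = 30 bpp: 1024 levels per channel, 1,073,741,824 colors
```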

[-] Zamundaaa@discuss.tchncs.de 28 points 5 months ago

Some day it will make its way to banking apps/sites being unusable on OSes other than approved "secure" ones.

That day was years ago. Many banking apps refuse to start if you even just have your bootloader unlocked, and some banking websites only support Chrome - some really crappy ones even only Chrome or Edge on Windows specifically.

[-] Zamundaaa@discuss.tchncs.de 20 points 9 months ago* (last edited 9 months ago)

KDE did bother. This neither happens with KScreenlocker, nor do non-screenlocker windows show up in some other way, because the screen locker is integrated with the compositor.

Of course, if the compositor crashes or gets disabled somehow, that integration doesn't help either, and you have to rely on a mountain of bad hacks as well as the hope that the screen locker doesn't also crash for nothing bad to happen in that case. But that's as close to secure screen locking as you get on Xorg... in the end, the solution for secure screen locking is still Wayland.

[-] Zamundaaa@discuss.tchncs.de 23 points 1 year ago

You'll need to specify what DE you're using. This comes built in with KDE Plasma: Meta+Left and then quickly also Up for the top-left corner, Meta+Right and then quickly also Down for the bottom-right corner, etc.

I don't know what exact shortcuts other DEs use, but I think most that aren't Gnome support quarter tiling too.

[-] Zamundaaa@discuss.tchncs.de 60 points 2 years ago* (last edited 2 years ago)

I'd recommend making backups either way. I've had an SSD with SMART status "good" very suddenly die on me before, so don't take any chances!

[-] Zamundaaa@discuss.tchncs.de 25 points 2 years ago

All it ever was intended for was to make us feel like something was being done while doing absolutely nothing.

It certainly does help a little bit. But it's of course still not a coincidence that companies are pushing for it instead of more effective measures... it's not just cheap, it also pushes people to believe that measures to save the environment are all useless and annoying, and makes them less likely to want more to happen.

[-] Zamundaaa@discuss.tchncs.de 21 points 2 years ago

The very next words are "but it was my responsibility"... what exactly is bad about that statement, if you don't intentionally cherry-pick a bad quote?

[-] Zamundaaa@discuss.tchncs.de 30 points 2 years ago

Why would they do that? They're intentionally not supporting OpenGL, so that people use their proprietary API.

[-] Zamundaaa@discuss.tchncs.de 20 points 2 years ago* (last edited 2 years ago)

Telemetry wasn't a factor iirc. The biggest reasons for this change were that

  • defaults like this (that only apply to new installations) should make life easy for newcomers, not for the existing users. Those users come from Windows, MacOS or other Linux DEs, which all use double click
  • it already is the default in pretty much all popular distros: Kubuntu, Fedora, Manjaro, SteamOS ~~and I think also OpenSuse~~ are double click by default

[-] Zamundaaa@discuss.tchncs.de 32 points 2 years ago

... or targeting Microsoft again too
