
Windows only.


To be clear, Krita supports both Windows and Linux. HDR support in Krita is currently Windows-only, and they are waiting on Intel's work to bring HDR support to Linux.

> Of course, at this moment, only Windows 10 supports HDR monitors, and only with some very specific hardware. Your CPU and GPU need to be new enough, and you need to have a monitor that supports HDR. We know that the brave folks at Intel are working on HDR support for Linux, though!


> Of course, at this moment, only Windows 10 supports HDR monitors

Don't they consider P3 to be HDR, or is there something else that disqualifies the Macs that have Display P3?


Display P3 is a color gamut: it defines what color each number means. This is orthogonal to bit depth, which is how many bits are used to represent each number. This means that with the same color gamut but with HDR (more bits) you can have smaller steps between colors, reducing banding in gradients and the like. You can have any combination of (sRGB, P3) x (8 bits per channel, 10 bits per channel), with 10 bits per channel being a common implementation of HDR these days.
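For intuition, a tiny sketch (plain Python, nothing Krita-specific) of why more bits per channel means finer steps between adjacent levels, regardless of gamut:

    # Step between adjacent code values at 8 vs 10 bits per
    # channel, independent of which gamut the values map into.
    def step_size(bits: int) -> float:
        """Normalized distance between adjacent levels in [0, 1]."""
        return 1.0 / (2 ** bits - 1)

    for bits in (8, 10):
        print(f"{bits}-bit: {2 ** bits} levels/channel, "
              f"step = {step_size(bits):.6f}")
    # 8-bit:  256 levels/channel, step = 0.003922
    # 10-bit: 1024 levels/channel, step = 0.000978 (~4x finer)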

I don’t know if Apple makes HDR displays.


I don't think they do. The 27" 5K on the latest models looks like their most advanced display, and it's 527 cd/m2 with a contrast ratio of 960:1, while the Ultra HD Alliance certifies HDR displays at a minimum of 540 cd/m2 and 1,080,000:1 (that's the OLED tier; the LCD tier is 1,000 cd/m2 at 20,000:1).
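That contrast figure is just peak luminance divided by black level; a quick check (the 0.0005 cd/m2 black level is the UHD Alliance's number for that tier):

    # Contrast ratio = peak luminance / black level.
    peak_nits = 540.0
    black_nits = 0.0005   # UHD Alliance black level for this tier
    print(f"{peak_nits / black_nits:,.0f}:1")  # 1,080,000:1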

So they're super nice, I'm very happy with my 2017 model and it's the best display I've ever used, but not HDR.


They claim to support "one billion colors", which is roughly 10 bits per channel.

There was a lawsuit about dithered displays back in the 90s that made them write "millions of colors" instead of "16 million colors", so I'd say it's unlikely that they're not actually providing those 10 bits.

So their fanciest displays are both Display P3 and 10 bits per channel.
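The "one billion" arithmetic checks out: three channels at 10 bits each gives 2^30, about 1.07 billion distinct values:

    # "One billion colors": 10 bits per channel, three channels.
    bits = 10
    colors = (2 ** bits) ** 3              # 2**30
    print(colors)                          # 1073741824
    print(f"~{colors / 1e9:.2f} billion")  # ~1.07 billion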


10 bits per channel does not necessarily mean you have HDR.

In the past, it was used to get finer gradations, which is in fact fairly important if you are working in black and white, to avoid banding.

HDR involves mapping the 10 bits over a different tone response curve that covers a wider output dynamic range.
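For a concrete comparison, here's a minimal sketch of the sRGB decode next to the PQ (SMPTE ST 2084) curve used by HDR10. The PQ constants come from the ST 2084 spec; the 80 cd/m2 sRGB reference white is my assumption for the comparison:

    # Same 10-bit code values, very different luminance under the
    # two transfer curves.
    M1 = 2610 / 16384            # ST 2084 constants
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def pq_eotf(signal: float) -> float:
        """Map a normalized PQ signal [0, 1] to luminance in cd/m2."""
        p = signal ** (1 / M2)
        return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

    def srgb_eotf(signal: float, peak: float = 80.0) -> float:
        """Piecewise sRGB decode, scaled to an assumed 80 cd/m2 white."""
        lin = signal / 12.92 if signal <= 0.04045 else ((signal + 0.055) / 1.055) ** 2.4
        return peak * lin

    for code in (128, 512, 1023):   # 10-bit code values
        s = code / 1023
        print(f"code {code:4d}: sRGB {srgb_eotf(s):8.2f} cd/m2, "
              f"PQ {pq_eotf(s):8.2f} cd/m2")
    # At full code (1023): sRGB tops out at 80, PQ at 10000 cd/m2.

Same bits, but PQ spreads them over a far wider brightness range, which is the point being made here.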


What disqualifies them is the actual dynamic range of the displays, and the tone response curves mapping input numbers to output brightness.


Go didn't even run on Windows at first, and Atom was Mac-only to begin with. It's not a big deal if it's on one platform currently: you need to get it working on one platform first, and then work on porting it to the others.


Which the post clearly states, and which is, as far as I can tell, because Windows with DirectX is the only platform that supports it.


That's not true; OpenGL, Vulkan and Metal also support HDR.


With currently shipping hardware and drivers? I know OpenGL can render HDR internally, but can you send that to an HDR-capable device today?


On Linux, it just doesn't exist yet. There is a possibility that AMD might start supporting HDR this year. Intel is working on it. Nvidia hasn't gotten much further than https://www.x.org/wiki/Events/XDC2016/Program/xdc-2016-hdr.p... . It'll come, but we're not there yet.

Mind, even on Windows, it's pretty messy and you have to do a lot of figuring out and hacking platform layers to make it work.


Not yet on Linux, sadly. Note that Krita itself works fine on Linux - it comes from the KDE world anyway.


And it is shipped as an AppImage, so you don't need to wait for it to get into your package repo or deal with a lot of dependency nonsense.


So what? It doesn't need to be cross-platform at the beginning.



