I've been using DisplayCAL to create monitor profiles for Windows 11 in HDR mode on my OLED laptop. The issue I'm having is that the video card LUT that's created, while targeting sRGB at D6500, comes out at 7000+ CCT when verifying the profile and when reporting on the calibrated display.

I've worked around this by reducing the difference between each default LUT value and the calibrated LUT value by a factor of 4. So if the default LUT value for red at index 255 is 65535 and the calibrated value is 65531, the "corrected" value would be 65534, and so on for each LUT table index (see the sketch at the end of this post). I then incorporate this corrected LUT into a new profile. While this works reasonably well, it's a clunky workaround and results in calibrations of lower quality.

My theory goes something like "the LUT table is interpreted in HDR space, not in SDR space", but I'm not really sure. I don't know whether this is a bug in W11 HDR mode or an artifact of how HDR works. During refinement passes, does Argyll actually measure the results of applying the LUT values so it knows how much to increase or decrease them by? I was wondering if you guys have any thoughts on what's happening.
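For clarity, here's a minimal sketch of the scaling workaround I described, assuming the LUT is a per-channel list of 16-bit values and the default is a linear ramp; the function and names are just illustrative, not DisplayCAL or Argyll code:

```python
def soften_lut(calibrated, factor=4, max_val=65535):
    """Blend a calibrated 1D LUT back toward the default linear ramp,
    keeping only 1/factor of the original correction."""
    n = len(calibrated)
    softened = []
    for i, cal in enumerate(calibrated):
        default = round(i * max_val / (n - 1))             # linear-ramp value at this index
        delta = cal - default                              # full calibration correction
        softened.append(default + round(delta / factor))   # keep only 1/factor of it
    return softened

# Example from above: default 65535, calibrated 65531 -> "corrected" 65534
print(soften_lut([0, 32768, 65531])[-1])  # 65534
```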