
Full RGB range GTX 970 and U2414H

Status
Not open for further replies.
Joined: Mar 10, 2013
Messages: 31
Motherboard: Asus Sabertooth Z97 M2
CPU: i5-4690k
Graphics: Asus GTX 960
Mac: iMac
Hello people,

I want to figure out what my real RGB range is. My monitor is connected with an mDP-to-DP cable, with DP 1.1 selected in the monitor settings. In Windows 8.1 it shows up as an HDTV, and in OS X as a television. In Windows I can switch the NVIDIA Control Panel from Limited to Full RGB, and the picture becomes noticeably sharper. In OS X that option doesn't exist. Is there any app or tool to switch to Full RGB?
I used the well-known EDID-override script for OS X, and the monitor now shows up as a Display with forced RGB mode (EDID override). But there is no information on whether it's Full (0-255) or Limited (16-235). The picture still seems sharper in Windows, and the black and white levels look better there.
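For reference, the widely shared "force RGB" script works by installing a patched EDID as a per-display override plist. The exact EDID byte it flips isn't reproduced here, but the override location it writes to can be sketched like this (the vendor/product IDs below are illustrative, not read from real hardware):

```python
# Sketch of where the "force RGB" EDID-override plist gets installed
# (pre-10.11 layout, before System Integrity Protection complications).
# Vendor/product IDs are hexadecimal values from the display's EDID.

def override_path(vendor_id: int, product_id: int) -> str:
    """Per-display override location used by macOS display overrides."""
    return ("/System/Library/Displays/Overrides/"
            f"DisplayVendorID-{vendor_id:x}/DisplayProductID-{product_id:x}")

# Dell's EDID vendor ID is 0x10ac; the product ID here is made up.
print(override_path(0x10AC, 0xA0B1))
```

The script reads the connected display's real IDs from the I/O registry and writes a plist with the modified EDID into that directory, which is why the monitor's reported name changes after running it.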
 
From my experience with my TV, Mac OS X seems to default to full range (just make a gradient in a drawing program and check whether you can see all of it). Your TV should also be set accordingly...
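A quick way to run that gradient test without trusting a drawing program's own color handling: generate a 256-step ramp yourself. A minimal sketch in plain Python (binary PGM, no third-party libraries); if the pipeline is limited range, the steps below 16 and above 235 collapse into solid black and solid white:

```python
# Full-range test image: a 256-step horizontal grayscale ramp,
# one pixel column per gray level 0..255, saved as a binary PGM.

WIDTH, HEIGHT = 256, 64

def gradient_pgm() -> bytes:
    header = f"P5 {WIDTH} {HEIGHT} 255\n".encode()
    row = bytes(range(256))          # gray values 0, 1, ..., 255
    return header + row * HEIGHT

with open("ramp.pgm", "wb") as f:
    f.write(gradient_pgm())
```

Open the file in a viewer that doesn't apply its own tone mapping: on a full-range pipeline you should be able to distinguish the darkest and brightest steps from their neighbors.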
 
I'm looking for a way to enable full range too. Limited range is driving me nuts.
 
It should default to full range if using DisplayPort.
 
It doesn't appear to be defaulting to full range over DisplayPort. I also have no way of knowing whether my GPU is outputting 8-bit or 10-bit signals to my monitor. In Windows, there is the NVIDIA Control Panel, and I get these options:
[Screenshot: NVIDIA Control Panel output color depth/range options — 0OEBgef.png]

It would be really nice to have these options in macOS too.
 

You can see whether it is outputting 10-bit in System Profiler.
 
This is what I'm seeing in System Profiler....
[Screenshot: System Profiler display information — QwfCJxl.png]

Also, I'm starting to wonder if the issue I'm having is actually just in Chrome. Maybe my hackintosh is outputting full range to my monitor. But for some reason Chrome displays YouTube videos incorrectly.

Here's a screenshot of a YouTube video. Left is Chrome (the image looks too washed out). Right is Safari (the color looks right).
[Screenshot: YouTube video, Chrome on the left (washed out) vs Safari on the right (correct) — YLvsg4h.jpg]

I've found that if I disable hardware acceleration in Chrome, the color becomes normal. But I'd like to keep hardware acceleration for 4K and 8K videos.
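For what it's worth, that washed-out look is exactly what you get when video decoded as limited range (16-235) is displayed without being expanded to full range. The standard 8-bit expansion a correct pipeline applies can be sketched like this:

```python
# Standard 8-bit "video" (limited, 16-235) to "PC" (full, 0-255)
# range expansion: black 16 -> 0, white 235 -> 255. When a pipeline
# skips this step, blacks look gray and whites look dim.

def limited_to_full(y: int) -> int:
    full = round((y - 16) * 255 / 219)
    return max(0, min(255, full))    # clamp footroom/headroom codes

print(limited_to_full(16), limited_to_full(235))  # -> 0 255
```

If Chrome's hardware-accelerated path hands limited-range decoded frames to a full-range display without this expansion, you'd see precisely the gray-black, dim-white comparison in the screenshot.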
 

If it were running in 10-bit, the Pixel Depth would say 30-Bit Color (ARGB2101010).
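If you'd rather check this programmatically than eyeball System Profiler, the Pixel Depth string can be matched directly. A small sketch covering the two common framebuffer formats:

```python
# System Profiler reports the framebuffer format as a "Pixel Depth"
# string. Tiny helper to read bits-per-channel out of it:
# ARGB2101010 = 10 bits per color channel, ARGB8888 = 8 bits.

def bits_per_channel(pixel_depth: str) -> int:
    return 10 if "2101010" in pixel_depth else 8

print(bits_per_channel("30-Bit Color (ARGB2101010)"))  # -> 10
print(bits_per_channel("32-Bit Color (ARGB8888)"))     # -> 8
```

On a real system you'd feed it the relevant line from `system_profiler SPDisplaysDataType` output.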
 
See, that's annoying... My GPU supports 10-bit output, and my monitor is a 10-bit panel, but there's nowhere I can toggle 10-bit on. :(
 