Adding/Using HiDPI custom resolutions

What is the difference between the timing in the EDID and the correct timing? Show the two timings so we can compare.

The timing that works is the same as in the EDID. That would have saved me some time had I paid more attention. The standard 3840x2160/60Hz in 'Current Resolutions' and CVT-RB in 'Custom Resolution' did not work.

Not sure what you mean by "sharper".

Better than non-HiDPI, but text is fuzzier than in Windows, which is razor sharp.

You are using a MacBook Pro 15" Mid-2015? 2.2 GHz or 2.5 GHz? How are you connecting the TV? It has an HDMI 1.4 port, so it cannot do 4K 60Hz from that port. Use a DisplayPort to HDMI 2.0 adapter for that.

2.2 GHz. I'm using the iVANKY 4K/60Hz Mini DisplayPort to HDMI adapter.
 
The timing that works is the same as in the EDID. That would have saved me some time had I paid more attention. The standard 3840x2160/60Hz in 'Current Resolutions' and CVT-RB in 'Custom Resolution' did not work.
Now the question is, what is the difference between the timing that works and the timing that doesn't work?
594 MHz is for HDMI 2.0. 533.31 MHz is for DisplayPort.
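The pixel clock falls straight out of the total (active + blanking) timing. A quick sanity check - the 4400x2250 totals are the standard CTA-861 timing for 3840x2160/60Hz; the CVT-RB totals are approximate:
Code:
# CTA-861 timing (what the TV's EDID advertises): 4400 x 2250 total at 60 Hz
echo $((4400 * 2250 * 60))   # 594000000 -> 594 MHz (HDMI 2.0)
# CVT-RB uses much smaller blanking, roughly 4000 x 2222 total
echo $((4000 * 2222 * 60))   # 533280000 -> ~533.3 MHz (DisplayPort)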

Better than non-HiDPI, but text is fuzzier than in Windows, which is razor sharp.

2.2 GHz. I'm using the iVANKY 4K/60Hz Mini DisplayPort to HDMI adapter.
Is this the item?
https://www.amazon.com/dp/B07BBFR1FJ/?tag=tonymacx86com-20

The picture shows a PTN3361BBS chip, which is a level shifter rather than a DisplayPort to HDMI 2.0 converter, so it cannot do 4K 60Hz (unless it is somehow using YCbCr 4:2:0 8 bpc). Maybe they used the wrong picture?
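For what it's worth, 4:2:0 is the one escape hatch: it halves the TMDS clock, which would bring 4K 60Hz under the 340 MHz HDMI 1.4 limit that a level shifter can pass. A rough check:
Code:
# YCbCr 4:2:0 halves the TMDS clock for a given timing
echo $((594000000 / 2))   # 297000000 -> 297 MHz, under the 340 MHz HDMI 1.4 ceiling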

Please attach the output from the AGDCDiagnose command so we can see the connection type to the adapter.
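If you haven't run it before, this is the usual invocation (the binary lives inside AppleGraphicsControl.kext on recent macOS versions):
Code:
/System/Library/Extensions/AppleGraphicsControl.kext/Contents/MacOS/AGDCDiagnose -a > ~/Desktop/AGDCDiagnose_a.txt 2>&1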
 
Now the question is, what is the difference between the timing that works and the timing that doesn't work?
594 MHz is for HDMI 2.0. 533.31 MHz is for DisplayPort.
Uploading screenshots.

Yes.

The picture shows a PTN3361BBS chip, which is a level shifter rather than a DisplayPort to HDMI 2.0 converter, so it cannot do 4K 60Hz (unless it is somehow using YCbCr 4:2:0 8 bpc). Maybe they used the wrong picture?
My TV says YCbCr 4:2:0 8-bit in both macOS and Windows. Probably a limitation of my TV? 'Data Block #8' in the EDID seems to state that as well.

What adapter would you recommend I try?

Please attach the output from the AGDCDiagnose command so we can see the connection type to the adapter.
Attached! Thanks for taking the time to help me :)
 

Attachments

  • AGDCDiagnose_a.txt (39.3 KB)
  • NotWorking.png (1.1 MB)
  • NotWorkingCVT-RB.png (949.9 KB)
  • Philips FTV.txt (10.5 KB)
  • Working.png (1.1 MB)
My TV says YCbCr 4:2:0 8-bit in both macOS and Windows. Probably a limitation of my TV? 'Data Block #8' in the EDID seems to state that as well.
4:2:0 shouldn't be bad for greyscale text. It's only bad for colored text.

You could try forcing RGB by editing the EDID in the override file (edit the one created by SwitchResX).
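If you do hand-edit the EDID bytes, remember to fix the checksum afterwards - the last byte of each 128-byte block must make that block sum to 0 mod 256. A minimal sketch for the base block, assuming the edited EDID is saved as edid.bin (a hypothetical filename):
Code:
# sum the first 127 bytes, then compute the byte that zeroes the total
sum=0
for b in $(xxd -p -c1 edid.bin | head -n 127); do
  sum=$(( (sum + 0x$b) % 256 ))
done
printf 'byte 127 (checksum) should be: %02X\n' $(( (256 - sum) % 256 ))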

But if it says the same thing in macOS and Windows, then this is not the problem.

What adapter would you recommend I try?
AGDCDiagnose seems to indicate that you are connecting with 4 lanes of HBR2 (4 x HBR2) to a Parade PS176.
That adapter should have no problem supporting HDMI 2.0.
Code:
translateoui 000-028-248
oui: 0x001CF8 = Parade Technologies, Ltd.
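For reference, the dashed triplet is the same three OUI bytes written in decimal (0, 28, 248):
Code:
printf '%02X:%02X:%02X\n' 0 28 248   # 00:1C:F8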
 
4:2:0 shouldn't be bad for greyscale text. It's only bad for colored text.

Hard to take pictures, but I can't tell a difference between 4:2:0 (60Hz) and 4:4:4 (30Hz). Also, here's an example of how macOS is blurry compared to Windows.

EDIT: I messed up and didn't take the chroma pics at 100% scaling, so I deleted them. I also found a way to turn on 4:4:4 in my TV, and it works in Windows, but not in macOS with my old settings. Will mess around and see what happens.
 

Attachments

  • MacOS.jpeg (7.1 MB)
  • Windows.jpeg (6.2 MB)
Hard to take pictures, but I can't tell a difference between 4:2:0 (60Hz) and 4:4:4 (30Hz). Also, here's an example of how macOS is blurry compared to Windows.

EDIT: I messed up and didn't take the chroma pics at 100% scaling, so I deleted them. I also found a way to turn on 4:4:4 in my TV, and it works in Windows, but not in macOS with my old settings. Will mess around and see what happens.
You haven't shown what settings you are using in Windows (or macOS).

Are you using the 1280x720 HiDPI mode in macOS? You know that's only using 2560x1440 scaled up to 3840x2160.
If Windows is using 3840x2160 then of course it's going to look better.

What you want is a 1280x720 HiDPI mode in macOS with an x3 or x4 scale, but macOS only lets you use x2 for HiDPI modes.
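The arithmetic makes the problem clear:
Code:
# a 1280x720 HiDPI mode renders at x2 into a 2560x1440 backing store,
# which is then scaled (not 1:1) onto the 3840x2160 panel
echo "$((1280 * 2))x$((720 * 2))"   # 2560x1440
# an x3 scale would map 1:1 onto a 4K panel - exactly what macOS won't offer
echo "$((1280 * 3))x$((720 * 3))"   # 3840x2160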
 
You haven't shown what settings you are using in Windows (or macOS).

Are you using the 1280x720 HiDPI mode in macOS? You know that's only using 2560x1440 scaled up to 3840x2160.
If Windows is using 3840x2160 then of course it's going to look better.

What you want is a 1280x720 HiDPI mode in macOS with an x3 or x4 scale, but macOS only lets you use x2 for HiDPI modes.

Running 1280x720 HiDPI, but x2 scaling isn't going to work for me, so I will give up on macOS. Windows works great now and HDR is finally enabled. YouTube HDR is lagging terribly, though (but that's a separate issue). Thanks for your help :)
 
Running 1280x720 HiDPI, but x2 scaling isn't going to work for me, so I will give up on macOS. Windows works great now and HDR is finally enabled. YouTube HDR is lagging terribly, though (but that's a separate issue).
Text was too small for you in 1920x1080 HiDPI mode?

I think, with iOS, Apple must be pretty good at arbitrary scaling.
macOS did have a scale that developers could change with an older version of Quartz Debug.app. Then Apple settled on x2 scale.
The ability to use other scaling factors probably still exists in macOS, but developers have tested their software only with x2, so it might not work well even if you can figure out how to enable it.
 
macOS did have a scale that developers could change with an older version of Quartz Debug.app. Then Apple settled on x2 scale.

The ability to use other scaling factors probably still exists in macOS, but developers have tested their software only with x2, so it might not work well even if you can figure out how to enable it.
Quartz Debug.app v3.0 from the Leopard (10.5.8) developer tools (Performance Tools) works on my Power Mac G5 (Quad). It supports an arbitrary scale factor from x1 to x3. Here's a screenshot of x3 on a 1920x1200 display:
HiDPI x3.png


My Mac Pro (Early 2008) (MacPro3,1) has 64 GB of RAM and an EVGA Nvidia GeForce GTX 680 Mac Edition.

To boot Leopard (10.5.8), I have to use maxmem=32768 because the kernel is only 32-bit. I also have to remove NVDAResman.kext, otherwise a kernel panic occurs with the GTX 680. Leopard doesn't have support for the GTX 680, so there's only the one boot resolution, but I can enable x3 scaling. I can't take screenshots though.

Snow Leopard (10.6.8) is similar: I either need to set maxmem=32768 for the 32-bit kernel or boot the 64-bit kernel with maxmem=63488 arch=x86_64 (maxmem is set to 62 GB because the CPU doesn't like 64 GB - it slows things down for some reason). NVDAResman.kext does not need to be removed. Quartz Debug.app v4.0 from the Snow Leopard developer tools works, but I prefer v3.0 because it honours the disabling of the "Restore scale factor to default (1.0) on quit" option better. With it disabled, you can log out and then log in to make the Finder use the x3 scaling (arbitrary scaling is per app in Leopard and Snow Leopard).
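For reference, one way to set those boot args persistently (a sketch; on these machines they can also go in /Library/Preferences/SystemConfiguration/com.apple.Boot.plist):
Code:
sudo nvram boot-args="maxmem=63488 arch=x86_64"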

Lion (10.7.5) supports multiple resolutions with the GTX 680 even though real support for the GTX 680 is supposed to start with Mountain Lion (10.8.5). It supports HiDPI x2 modes. Lion does not support the arbitrary scale factor. Quartz Debug.app v4.2 has a new option "Enable HiDPI display modes" that replaces the arbitrary scale factor.

Anyway, the point is that the ability to use an arbitrary scale factor existed. I believe macOS today continues to have much of that code - an app is told to use a scale factor number; it's not a boolean that is on or off. It might be possible to change that number. macOS has created HiDPI display modes since Lion (10.7.5). The display mode includes a number for the scale factor. Perhaps that number can be changed. For example, when macOS adds a HiDPI x2 mode for each resolution, maybe we can patch it to add an x3 mode also. Maybe this can be done with a Lilu + WhateverGreen patch... With high-resolution 8K displays, you might want to use scale factors up to x4 and beyond.
 
Text was too small for you in 1920x1080 HiDPI mode?

You know what... I could probably live with it. Maybe zoom in some applications if my eyes get strained.

YCbCr 4:4:4 at 3840x2160/60Hz seems to work from my Mac now, but 1920x1080 HiDPI does not. The timing is the same as in the resolution I just mentioned - and the same as in 'Descriptor #0' in the EDID. I also tried making a custom scaled resolution of 3840x2162. Not sure if that should affect the old HiDPI setting or create a new one, but the old one still didn't work, and no new one showed up even though it says active.

Attaching the new EDID, as it changed a little bit (same timing) since enabling the UHD HDMI feature on my TV.
 

Attachments

  • Philips FTV_new.txt (11.6 KB)