
Native 4K with Radeon 5500 XT

Motherboard: Gigabyte Z490I Aorus Ultra
CPU: i9-10900K
Graphics: RX 5500 XT
I'm having an issue getting my display to output higher than 1080p. I can adjust the scaling on the display through System Preferences to achieve a higher resolution, but I feel like I shouldn't have to do that... maybe? I've got a 5K iMac that, by default, outputs 2560x1440, so I would expect similar behavior from the 4K screen. Maybe it wouldn't be cranked all the way up to 4K, but I would expect better than 1080p for sure.

For what it's worth, if I do adjust the screen size to a "Scaled" option, I can get to something that looks more like what you would expect "out of the box," but the login screen still renders at 1080p, and presumably so would any other user accounts that get created, since I'm only adjusting my personal preference by changing the display option in System Preferences.
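
On the login-screen point: one setting that comes up in generic HiDPI guides (not something verified in this thread, so treat it as an experiment) is enabling scaled/HiDPI resolutions system-wide via the window server preferences, which apply before any user logs in:

```
# From generic HiDPI guides, not verified here: enable HiDPI/scaled modes
# system-wide, which also affects the login screen. Reboot afterwards.
sudo defaults write /Library/Preferences/com.apple.windowserver DisplayResolutionEnabled -bool true
```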

I'm running OpenCore (whatever the newest is; I updated it last night), verified my config.plist using the sanity checker, and verified that NVRAM -> Add -> 4D1EDE05-38C7-4A6A-9CC6-4BCCA8B38C14 is `02`, in addition to the other things listed here, with the exception of setting AppleDebug to false (I'm still debugging, after all).
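
For anyone else checking the same thing, both the config.plist entry and live NVRAM can be inspected from Terminal. I'm assuming here that the `02` value is OpenCore's UIScale variable (the usual HiDPI entry under that GUID) and that the EFI partition is mounted at /Volumes/EFI; adjust to match your setup:

```
# Assumption: the 02 entry is UIScale, and the EFI partition is mounted
# at /Volumes/EFI. Print the value stored in config.plist:
/usr/libexec/PlistBuddy -c 'Print :NVRAM:Add:4D1EDE05-38C7-4A6A-9CC6-4BCCA8B38C14:UIScale' /Volumes/EFI/EFI/OC/config.plist

# Print what actually landed in NVRAM on the running system:
nvram 4D1EDE05-38C7-4A6A-9CC6-4BCCA8B38C14:UIScale
```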

I know this has got to be something dumb that I'm overlooking because I'm new to this, so I'm looking forward to someone pointing out the stupid thing I forgot to do.

Video card: https://www.newegg.com/asrock-radeo...gd-8go/p/N82E16814930027?Item=N82E16814930027
 
I'm not 100% sure what I'm looking at here, but it seems to show that the monitor is capable of displaying 4K, but that the "UI Looks like" 1920x1080. Which... what? I just checked the same info on my real iMac (2017 27" 5K) and it doesn't have the "UI Looks like" bit -- just plain old "Resolution: 5120x2880".

Adjusting the display resolution to one of the "Scaled" options changes the "UI Looks like" value, but again, I'm trying to avoid having to do that because I just don't think it should be required. "I don't think" -- I have no friggin' clue.
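
For reference, the same information as the screenshot can be pulled from Terminal, which is easier to paste into a thread:

```
# Report what macOS believes about attached displays, including the native
# "Resolution" line and the scaled "UI Looks like" line shown above.
system_profiler SPDisplaysDataType
```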
 

Attachments

  • Screen Shot 2020-11-23 at 1.36.32 PM.png
No, it shouldn't be required, but this is a Hack, after all. Sometimes components just don't work the way we expect, and your RX 5500 XT seems to be one of those components.

Change the resolution to what you require using the Scaled list; the default obviously isn't going to match what the screen is capable of providing.
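
If re-selecting the Scaled option per user gets tedious, a third-party tool such as displayplacer (installable via Homebrew, not part of macOS) can script the same choice. A sketch, with the screen ID as a placeholder:

```
# Sketch using the third-party displayplacer tool (brew install displayplacer).
# First list the attached screens and their IDs:
displayplacer list

# Then apply a scaled (HiDPI) mode; "id:XXXXXXXX" is a placeholder for the
# screen ID printed by the list command.
displayplacer "id:XXXXXXXX res:1920x1080 hz:60 scaling:on"
```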

You may find this changes in the future. It may be something that the developers of WhateverGreen.kext can resolve, given the correct information and time to create a fix.
 
Yeah, I just scaled everything up and called it a day.

It is indeed a hack :) Last night I pulled the Intel Wi-Fi card off the board because it wouldn't let the Broadcom card do Bluetooth or AirDrop, and today I routed antennas for the Broadcom card through the case and out of the holes left by the Wi-Fi card, and taped them down to the case. Hacks on hacks on hacks!
 
After some more research and testing, I don't think this is possible. It only just occurred to me to hook my iMac up to this monitor (https://www.newegg.com/p/N82E16824025512?Item=9SIA4P087Z7022), and what do you know: even a real Mac thinks 1080p is the best resolution for this display. Other than getting a 5K monitor, I'm not sure there is a fix for this, unfortunately.
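
One way to sanity-check that conclusion is to look at the EDID the monitor reports, since the preferred timing it advertises is what macOS picks as the default. Whether the IODisplayEDID key shows up depends on the macOS version and GPU, so this is just a quick probe:

```
# Look for the raw EDID the display advertises; its preferred timing block
# is what macOS uses when choosing the default resolution.
ioreg -lw0 | grep -i IODisplayEDID
```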
 
Sometimes that is the case; there isn't a fix available. That doesn't mean there won't be, but at present you need to use the scaled display feature to get the correct resolution.
 
According to the AMD specification, your RX 5500 XT is capable of running a 4K screen @ 60Hz using the HDMI connector. This assumes you have a 4K-capable HDMI cable.

The DisplayPort connections on the RX 5500 XT are version 1.4, which are also capable of driving a 4K screen -- I'm not sure it can run a pair of them, but it can run a 4K screen @ 144Hz. Using a correct DP 1.4-compatible cable is essential for this type of resolution.

AMD Specification for RX 5500 XT - https://www.amd.com/en/products/graphics/amd-radeon-rx-5500-xt

I would suggest you look at the cables you are using to see if they are compatible, though you have probably done this already. Ensure the connectors are fully seated in the display and the dGPU; again, I expect you have done this many times already.
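
As a back-of-the-envelope check on why the cable spec matters (my numbers, not from AMD's page): 4K @ 60Hz at 8 bits per channel needs roughly 12 Gbit/s before blanking overhead -- beyond HDMI 1.4's ~8.16 Gbit/s of usable data rate, but comfortably within HDMI 2.0's ~14.4 Gbit/s:

```
# Raw pixel bandwidth for 3840x2160 @ 60Hz, 24 bits per pixel (ignores blanking)
echo '3840*2160*60*24' | bc    # = 11943936000, i.e. ~11.9 Gbit/s
```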

macOS should detect and set the default display resolution based on the component capabilities. If one component doesn't support 4K, you would expect it to automatically drop to a supported resolution. The fact that you can select the 4K resolution from the Scaled list is bewildering, as you would expect the highest Scaled resolution to be the default.

I have a number of old AMD fanless graphics cards that don't support 2560x1440, so when I plug a hack containing one of these dGPUs into one of my Dell UltraSharp monitors, macOS automatically sets the display resolution to the maximum the dGPU can support. That is the normal, expected behavior. Your system is being told that the maximum resolution is lower than what the hardware can actually support.

Sorry, I'm rambling along with no destination in view.
 
It was worth a punt; sorry we couldn't resolve this in the proper manner.
 