
AMD Radeon Performance Enhanced SSDT

It's purely cosmetic, but the card needs AMD in the name. Change the model in the SSDT to "AMD Radeon RX 5600 XT".

Ah hah! Well, now the graphics card shows up, but I still don't have HEVC encoding. How do I enable that?

Thanks for your help!!
 

Attachments

  • Screen Shot 2020-05-07 at 10.17.33 AM.png (132.3 KB)
  • SSDT-RX-5600-XT-V1.aml (694 bytes)
  • config.plist (23.4 KB)
It's purely cosmetic, but the card needs AMD in the name. Change the model in the SSDT to "AMD Radeon RX 5600 XT".

I saw that if you don't add the correct name for your GPU, System Information shows Metal as "Supported" instead of the version number.
Maybe it will give a lower Metal score? I haven't tried yet.
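For reference, if you would rather inject the name from config.plist instead of editing the SSDT, a minimal DeviceProperties sketch could look like the one below. The PCI path is an assumption (it depends on which slot the card sits in; gfxutil can report the real one), and the <data> value is just the string "AMD Radeon RX 5600 XT" base64-encoded:
Code:
            <key>PciRoot(0x0)/Pci(0x1,0x0)/Pci(0x0,0x0)</key>
            <dict>
                <!-- "AMD Radeon RX 5600 XT" as base64 (assumed path, verify with gfxutil) -->
                <key>model</key>
                <data>QU1EIFJhZGVvbiBSWCA1NjAwIFhU</data>
            </dict>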
 
I saw that if you don't add the correct name for your GPU, System Information shows Metal as "Supported" instead of the version number.
Maybe it will give a lower Metal score? I haven't tried yet.

Metal score was unaffected for me, though I can confirm that Metal shows as Supported in System Report.
Geekbench 5 now shows the wrong GPU name ("AMD Radeon HD GTX10 Family Unknown" vs "AMD Radeon RX 5600 XT") but c'est la vie.

Any idea how to get VideoProc to show HEVC and H.264 encoding as Enabled?

Of note: I played an 8K VP9 video just now and it was breathtakingly smooth! Spectacular image, clarity, and smoothness!

EDIT: VP9 is a Google-created competitor to the HEVC codec. I just tested 8-bit HEVC, 10-bit HEVC, and a VP9 video -- all work flawlessly. Still wondering if I can get VideoProc to show those codecs as supported!
Useful sample video source for HEVC here.
 

Attachments

  • Screen Shot 2020-05-07 at 10.33.48 AM.png (29.9 KB)
  • Screen Shot 2020-05-07 at 10.35.02 AM.png (164.5 KB)
  • Screen Shot 2020-05-07 at 10.37.30 AM.png (75.1 KB)
  • Screen Shot 2020-05-07 at 10.46.26 AM.png (42 KB)
  • Screen Shot 2020-05-07 at 10.46.40 AM.png (66.9 KB)
Hi all.
I made some small changes to my DeviceProperties (on the VGA controller entry) and was able to recover, on iMacPro1,1, the benchmark scores I had previously obtained with the iMac15,1 SMBIOS.

First, my results:

Tool                      | Before | After  | After (OpenCore)      | Vendor/Device (OpenCore)
LuxMark 3.1               | 12479  | 12583  | 12657 (14471 CPU+GPU) | 12657
Geekbench 4.4.1 (Metal)   | 109637 | 113231 | 117119                | 115265
Geekbench 4.4.1 (OpenCL)  | 123953 | 129062 | 129323                | 135855


The Metal number is lower than the previous ones, but when I rerun the test the results are sometimes better and sometimes worse, hahaha.

The device entry (for RX 580 8GB):
Code:
            <key>PciRoot(0x0)/Pci(0x1,0x0)/Pci(0x0,0x0)</key>
            <dict>
                <key>AAPL,slot-name</key>
                <string>Slot-1</string>
                <key>@0,name</key>
                <string>ATY,Orinoco</string>
                <key>ATY,EFIVersion</key>
                <data>MDEuMDEuMTkw</data>
                <key>ATY,Part#</key>
                <data>MTEzLTRFMzUzQlUtTzRF</data>
                <key>ATY,Card#</key>
                <data>MTEzLTRFMzUzQlUtTzRF</data>
                <key>ATY,Copyright</key>
                <string>(C) 1988-2017, AMD Technologies Inc.</string>
                <key>ATY,Rom#</key>
                <string>GV-RX580GAMING-8GD/F10/058AE</string>
                <key>ATY,VendorID</key>
                <data>AhA=</data>
                <key>ATY,DeviceID</key>
                <data>32c=</data>
                <key>device_type</key>
                <data>QVRZLE9yaW5vY29QYXJlbnQ=</data>
                <key>model</key>
                <data>UmFkZW9uIFJYIDU4MA==</data>
                <key>hda-gfx</key>
                <string>onboard-1</string>
                <key>PP_EnableLoadFalconSmcFirmware</key>
                <integer>1</integer>
                <key>PP_Falcon_QuickTransition_Enable</key>
                <integer>1</integer>
                <key>CFG_NVV</key>
                <integer>2</integer>
                <key>CFG_PTPL2_TBL</key>
                <data>ggAAAHwAAAB2AAAAcAAAAGoAAABkAAAAXgAAAFgAAABSAAAATAAAAEYAAABAAAAAOgAAADQAAAAuAAAAKAAAAA==</data>
                <key>CFG_TPS1S</key>
                <integer>1</integer>
                <key>CFG_USE_AGDC</key>
                <integer>1</integer>
                <key>CFG_USE_CP2</key>
                <integer>1</integer>
                <key>CFG_USE_SCANOUT</key>
                <integer>1</integer>
                <key>CFG_USE_TCON</key>
                <integer>1</integer>
                <key>PP_WorkLoadPolicyMask</key>
                <integer>1</integer>
            </dict>

I obtained some of the entries (CFG_* and PP_*) from the latest version of RadeonBoost.kext: just open it, look up the key RX480580590Boost, and copy the values.
Other entries, like VendorID, DeviceID, Rom#, and Part#, I took from an old post about flashing the graphics card BIOS to change the Part# and Card# and make the card 'fully macOS compatible', in other words, visible to your Mac.
The rest of the properties and values are from the DeviceProperties shared by @mattystonnie.

The important part is the VendorID and DeviceID values. Without them, the benchmark scores are lower.
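For anyone wondering what those <data> blobs contain: they are just base64-encoded bytes. Decoded, the key ones above read as follows (an annotated copy for reference only, nothing new to paste):
Code:
                <key>ATY,VendorID</key>
                <data>AhA=</data>                     <!-- bytes 02 10: PCI vendor 0x1002 (AMD), little-endian -->
                <key>ATY,DeviceID</key>
                <data>32c=</data>                     <!-- bytes DF 67: device 0x67DF (Ellesmere, RX 470/480/570/580) -->
                <key>ATY,EFIVersion</key>
                <data>MDEuMDEuMTkw</data>             <!-- the string "01.01.190" -->
                <key>model</key>
                <data>UmFkZW9uIFJYIDU4MA==</data>     <!-- the string "Radeon RX 580" -->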

Hope this helps!
Cheers!
 

Some of those entries are useless.

Did you get this line from RadeonBoost?
<key>ATY,EFIVersion</key>
<data>MDEuMDEuMTkw</data>
 
Any idea how to get VideoProc to show HEVC and H.264 encoding as Enabled?

Two options: SMBIOS iMacPro1,1, or iMac18,3 with dGPU+IGPU.
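For anyone following along with OpenCore, switching the SMBIOS is a PlatformInfo change in config.plist; a minimal sketch, assuming the Generic section is in use (serials omitted, regenerate them for the new model):
Code:
            <key>PlatformInfo</key>
            <dict>
                <key>Generic</key>
                <dict>
                    <key>SystemProductName</key>
                    <string>iMacPro1,1</string>
                    <!-- regenerate SystemSerialNumber, MLB, SystemUUID and ROM to match -->
                </dict>
            </dict>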
 
Two options: SMBIOS iMacPro1,1, or iMac18,3 with dGPU+IGPU.

Ah! So it's simply an SMBIOS issue (cosmetic) and doesn't affect performance?
Any idea why iMac19,1 doesn't work?

Thank you so much man for all your help!!
Mega appreciated.
 
Ah! So it's simply an SMBIOS issue (cosmetic) and doesn't affect performance?
Any idea why iMac19,1 doesn't work?

Thank you so much man for all your help!!
Mega appreciated.

Edit: You might try the device property AAPL,ig-platform-id = 0300983E for iMac19,1, with the IGPU enabled in BIOS and the dGPU as primary.
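For reference, injected through config.plist that suggestion might look like the sketch below. The Pci(0x2,0x0) path is the usual address of the Intel IGPU (an assumption here; confirm with gfxutil), and the <data> is just the bytes 03 00 98 3E, i.e. platform-id 0x3E980003, which as I understand it is a connector-less framebuffer meant for the case where the dGPU drives the displays (0300923E decodes to 0x3E920003, a different UHD 630 variant):
Code:
            <key>PciRoot(0x0)/Pci(0x2,0x0)</key>
            <dict>
                <!-- bytes 03 00 98 3E = platform-id 0x3E980003 (headless IGPU, dGPU as primary) -->
                <key>AAPL,ig-platform-id</key>
                <data>AwCYPg==</data>
            </dict>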
 
Some of those entries are useless.

Did you get this line from RadeonBoost?
<key>ATY,EFIVersion</key>
<data>MDEuMDEuMTkw</data>

Hahaha, no. That's from what you shared before (I've fixed my post).
 
Edit: You might try the device property AAPL,ig-platform-id = 0300983E for iMac19,1, with the IGPU enabled in BIOS and the dGPU as primary.

Interesting. Per the recommended BIOS settings in the Dortania OpenCore guide, I had 4G Decoding set to Enabled. I had to disable that in order to boot after enabling IGPU Multi-Monitor. Keep in mind this is on an ASRock Z390 Phantom Gaming ITX/ac running BIOS v4.30.

In config.plist, setting the DeviceProperties entry AAPL,ig-platform-id to either 0300983E or 0300923E (per the Dortania OpenCore guide) resulted in VideoProc correctly showing HEVC and H.264 support through the IGPU. What is the difference between the two settings?

Geekbench 5 also now shows both GPUs instead of only the dGPU. Metal and OpenCL scores in Geekbench 5 remain the same, around 62k and 53k respectively.

Is seeing only the IGPU in VideoProc the expected readout?

-----------------

TLDR: All I had to do was set 4G Decoding to Disabled, and set IGPU Multi-Monitor to Enabled in my BIOS!
 

Attachments

  • Screen Shot 2020-05-07 at 12.50.55 PM.png (138.8 KB)
  • Screen Shot 2020-05-07 at 12.54.53 PM.png (129.7 KB)
  • Photo May 07, 12 44 23 PM.jpg (2.2 MB)