
[Success] Bortoni's GA-Z270-HD3 | i7-7700k | Radeon RX560 | NVMe

Update 5: Graphics tweaks

1) I added an injector kext to enable HEVC hardware decoding with my RX560 card. It goes in EFI/CLOVER/kexts/Other (see the InjectKexts note after the code blocks below).
2) I stopped using RadeonDeInit in my config file. Instead I'm now using an SSDT to do the de-init work, select the Acre framebuffer, and rename my card. I then patched the framebuffer's connectors to match my particular card; see this thread for instructions. The connector patch and the matching Graphics settings are in the code blocks below.

Code:
        <key>KextsToPatch</key>
        <array>
            <dict>
                <key>Comment</key>
                <string>Asus RX560 2G OC connector Patch</string>
                <key>Disabled</key>
                <false/>
                <key>Find</key>
                <data>
                AAQAAAQDAAAAAQEBAAAAABECAgEAAAAAAAgAAAQCAAAA
                AQIAAAAAACEDBQQAAAAABAAAAAQCAAAAAQMAAAAAAAAA
                AwUAAAAA
                </data>
                <key>Name</key>
                <string>AMD9500Controller</string>
                <key>Replace</key>
                <data>
                AAQAAAQDAAAAAQEAAAAAABECBQEAAAAAAAgAAAQCAAAA
                AQMAAAAAACEDAwQAAAAABAAAABQCAAAAAQIAAAAAABAA
                BAUAAAAA
                </data>
            </dict>
        </array>
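
For context, the KextsToPatch entry above lives under KernelAndKextPatches in the Clover config.plist. A bare skeleton of that wrapper is below (the Find/Replace payloads are the ones from the block above; this just shows where the entry sits, it is not a drop-in config):

Code:
        <key>KernelAndKextPatches</key>
        <dict>
            <key>KextsToPatch</key>
            <array>
                <dict>
                    <key>Comment</key>
                    <string>Asus RX560 2G OC connector Patch</string>
                    <key>Name</key>
                    <string>AMD9500Controller</string>
                    <key>Disabled</key>
                    <false/>
                    <key>Find</key>
                    <data></data>
                    <key>Replace</key>
                    <data></data>
                </dict>
            </array>
        </dict>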

Code:
    <key>Graphics</key>
    <dict>
        <key>Inject</key>
        <dict>
            <key>ATI</key>
            <true/>
            <key>Intel</key>
            <true/>
        </dict>
        <key>RadeonDeInit</key>
        <false/>
        <key>ig-platform-id</key>
        <string>0x59120003</string>
    </dict>
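
One reminder on step 1: the injector kext in EFI/CLOVER/kexts/Other is only picked up if Clover is injecting kexts from that folder. If it isn't already, the switch lives under SystemParameters; a minimal sketch (Detect injects only when FakeSMC isn't installed on the system volume):

Code:
    <key>SystemParameters</key>
    <dict>
        <key>InjectKexts</key>
        <string>Detect</string>
    </dict>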

Hello bortoni,
My respect for the gigantic amount of work done.
I saw SSDT-AMD.dsl in your post. What is this file, how do I use it, and do I need to create it myself? If so, how? My framebuffer is the same as yours. The new drivers and Lilu give me a black screen.
 

I need to update this post. I recently converted back to using the latest Lilu and WhateverGreen; the method I was using resulted in periodic kernel crashes related to the graphics drivers. Also, given that you have a different CPU and board class (i.e. EP35), the system definition is critical. I suggest you search the boards for folks with 460/560 cards using your same system definition.
 
AH. Maybe this is what's causing the kernel crashes in Lightroom! So this also means you removed the Acre and RX560 kexts and just installed Lilu and WhateverGreen instead?
 
Update 8: Mojave Prep

I'm currently running 10.13.6. But in preparation for moving to Mojave I've done the following:

1) Update Clover to r4630 (from MultiBeast 10.4) or r4674 (from UniBeast 9.0: right-click it, then Show Package Contents -> Contents -> Resources -> Clover_v2.4k_r4674-UEFI-UB.pkg)
2) Update FakeSMC.kext to the latest version from the tonymacx86.com downloads section
3) Update to Lilu 1.2.7
4) Update WhateverGreen to 1.2.3

Made some changes to my config.plist to go along with the new WhateverGreen; most of the info came from here. (The AAPL,ig-platform-id data below, AwASWQ==, is just 0x59120003 written as the little-endian bytes 03 00 12 59 and Base64-encoded.)


Code:
    <key>Devices</key>
    <dict>
        <key>Properties</key>
        <dict>
            <key>PciRoot(0x0)/Pci(0x2,0x0)</key>
            <dict>
                <key>AAPL,ig-platform-id</key>
                <data>
                AwASWQ==
                </data>
            </dict>
        </dict>
    </dict>
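
One related note: the WhateverGreen guide generally recommends turning off Clover's own graphics injection (and leaving RadeonDeInit off) once Properties injection like the above is in place. I won't swear this is byte-for-byte my current section, but a Graphics block along those lines looks like:

Code:
    <key>Graphics</key>
    <dict>
        <key>Inject</key>
        <dict>
            <key>ATI</key>
            <false/>
            <key>Intel</key>
            <false/>
        </dict>
        <key>RadeonDeInit</key>
        <false/>
    </dict>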
 
Hi ... I am currently using an RX580 in my home machine with a GA-Z270-HD3 and have 4 monitors running. Throughout this thread you end up with:

Chipset → Integrated Graphics : Enabled

Why is this done? Can we actually use the onboard graphics to drive a monitor? The reason I ask is that I am building a pseudo-duplicate machine for the office: same motherboard, but rather than an RX580 I was hoping to use a less expensive 560 card, which has a maximum of 3 ports. My current office machine runs 4 monitors on an HD6870, but I have not had much luck getting High Sierra to run on that card. So can I use the built-in graphics plus the 560 to get a 4-monitor system?

When I enable the integrated graphics on my home machine (10.13.6) it automatically reboots before the login prompt.
 
The reason I enable the integrated graphics is to use it in headless mode (no display connected to it). Some apps (such as Final Cut Pro) use Intel's QuickSync H/W in the integrated graphics to speed up H264 encoding and H265 decoding.

I have not tried to use it in addition to the 560. To use it in headless mode, the proper settings have to be added to the Clover config; a sketch is below.
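
To give you an idea, a minimal sketch of the headless iGPU settings looks something like the block below. It reuses the same connector-less Kaby Lake ig-platform-id (0x59120003) from earlier in this thread (with the newer WhateverGreen setup from Update 8, the same ID goes into Devices/Properties instead); treat it as a starting point and check it against your own config:

Code:
    <key>Graphics</key>
    <dict>
        <key>Inject</key>
        <dict>
            <key>Intel</key>
            <true/>
        </dict>
        <key>ig-platform-id</key>
        <string>0x59120003</string>
    </dict>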
 
Thanks ... makes sense. I was able to get my old HD 6870 to work in the office machine. I was getting a black screen and flickering when I tried the initial 10.13 install, but by using an image installed with the RX580, then upgrading to 10.13.6 and swapping cards, it came up on the first try. The only problem was color flashing; using SwitchResX to set colors to millions works around that issue.
 