
AMD Radeon RX 580 & RX 570 Released!!

Status
Not open for further replies.
Thank you for chiming in.
I was just wondering whether the dual-card setup works and the cards are fully recognized by the system in a hackintosh build.
As for applications, the Adobe CC ones normally do not take great advantage of a second (or any additional) card, but DaVinci Resolve, if the full paid version is used, recognizes and exploits all the installed cards.
I do not know about Final Cut, though.

Yes, they are recognized in System Profiler. My dual RX 480s work in the LuxMark Ball scene and get roughly double the score of a single card, but for OpenGL or gaming-type apps the scores are slightly lower because the PCIe lanes drop to x8 per card versus x16 for a single card. For rendering it should work, as long as the apps take advantage of both cards.
 
How can I get DisplayPort audio working on a Radeon RX 580 with a Philips 4K monitor?
 
How can I get DisplayPort audio working on a Radeon RX 580 with a Philips 4K monitor?
Change your System Definition to iMac14,2. Use MultiBeast, selecting only the following:
Customize > System Definitions > iMac > iMac14,2
Build > Install
Reboot
The above will also assign other parameters appropriate for the iMac14,2 SysDef.
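For reference, a minimal sketch of what the SMBIOS section of a Clover config.plist might look like after MultiBeast applies the iMac14,2 definition; this is an illustrative fragment only, and the serial number below is a placeholder, not a real one (MultiBeast generates real values for your build):

```xml
<key>SMBIOS</key>
<dict>
    <key>ProductName</key>
    <string>iMac14,2</string>
    <!-- Placeholder serial; MultiBeast generates a valid one for your build -->
    <key>SerialNumber</key>
    <string>C02XXXXXXXXX</string>
</dict>
```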
 
Could someone tell me why I am getting only 300 MHz for my RX 580? I am using WhateverGreen and Lilu, the latest Clover, on Mojave.
[Attachment: Screenshot 2018-11-05 at 09.41.48.png]
 
Hi,

I built a GA-Z68XP-UD3P hack running Mojave and installed an MSI Radeon RX 570, listed as compatible in the Buyer's Guide. It POSTs and boots through Clover and the Apple progress bar, then: black screen, with occasional horizontal glitch lines. I can VNC in and look at the hardware profiler, and it recognizes the card perfectly. I have the WhateverGreen and Lilu kexts in the Other folder, and I tried modifying the EFI config with no option selected under ACPI and only RadeonDeInit selected under Graphics. I also tried turning BIOS integrated graphics off and on, with no luck. Anyone have any ideas? I posted the System Info and attached my config.plist. The rig worked perfectly with an Nvidia card and modified Nvidia web drivers for Mojave, but when I updated to 10.14.1 it broke for good. The Nvidia and CUDA drivers are still installed, BTW. Any help appreciated!
 

Attachments

  • config.plist (6.8 KB)
  • Untitled.png (144.4 KB)
(Quoting the GA-Z68XP-UD3P / RX 570 black-screen post above.)

If you are using Lilu + WhateverGreen, remove RadeonDeInit.
You need to uninstall the Nvidia web drivers as well.
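As a sketch, with Lilu + WhateverGreen handling the Radeon initialization, the Graphics section of the Clover config.plist would simply leave RadeonDeInit disabled; this is an illustrative fragment, not a complete config:

```xml
<key>Graphics</key>
<dict>
    <!-- Lilu + WhateverGreen handle AMD card init, so RadeonDeInit stays off -->
    <key>RadeonDeInit</key>
    <false/>
</dict>
```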
 
(Quoting the GA-Z68XP-UD3P / RX 570 black-screen post above.)

I had the same issue with almost the exact same hardware. The fix for me was moving to a UEFI installation, which required a full reinstall in my case.
 
(Quoting the GA-Z68XP-UD3P / RX 570 black-screen post above.)
I'm surprised you managed to boot and to VNC into your system.
You have a 2nd-generation Sandy Bridge i7 with HD 3000 graphics. Strictly speaking, I believe that CPU is not supported by Mojave.

For Sandy Bridge HD 3000, you should NOT set ig-platform-id to 0x0166000a (that value is for 3rd-generation Ivy Bridge). Just remove it.
Set Graphics->Inject->Intel=YES and Graphics->RadeonDeInit=NO.
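A sketch of the corresponding Graphics section in the Clover config.plist under those assumptions, with the ig-platform-id key removed entirely; an illustrative fragment only:

```xml
<key>Graphics</key>
<dict>
    <key>Inject</key>
    <dict>
        <!-- Inject framebuffer properties for the Sandy Bridge HD 3000 iGPU -->
        <key>Intel</key>
        <true/>
    </dict>
    <!-- RadeonDeInit off; no ig-platform-id key for Sandy Bridge HD 3000 -->
    <key>RadeonDeInit</key>
    <false/>
</dict>
```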
 
OK, thanks for the feedback! I tried many variations, still the same problem. To add some clarification:

1. Running UEFI BIOS with the graphics setting on Auto (tried Built-in first as well)
2. Removed ig-platform-id 0x0166000a (tried with and without)
3. Tried Graphics->Inject->Intel=YES and Graphics->RadeonDeInit=NO (with Lilu & WhateverGreen)
4. Tried Graphics->Inject->Intel=YES and Graphics->RadeonDeInit=YES (without Lilu & WhateverGreen)
5. Removed the CUDA and Nvidia web drivers first, before doing any of the above.

I did a clean install of Mojave 10.14.0 to a 240 GB SSD with modded Nvidia web drivers, and it ran perfectly with a STRIX-GTX-960-DC2OC-4GD5 video card until the 10.14.1 update broke the web drivers for good. I thought about trying a platform change to a newer iMac SysDef, but I am so close... The fact that the RX 570 shows up perfectly in System Profiler, has Clover boot options and the Apple boot logo, and works at the proper resolution until the progress bar completes is strange. I can still VNC in with no problem under any variation of the above attempts. I am currently using the HDMI port but have tried DVI as well, since my monitor has both. I have no DisplayPort cable or adapter, so I didn't try the three DP connections.
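With this many flag combinations in play, a small sanity-check script can help confirm which Graphics settings a given config.plist actually contains. This is a minimal sketch assuming Python 3 and a plist laid out the way Clover expects; the function name and file path are just examples:

```python
import plistlib

def graphics_flags(path):
    """Read a Clover config.plist and report the Graphics flags this
    thread keeps toggling: Inject Intel, RadeonDeInit, ig-platform-id."""
    with open(path, "rb") as f:
        cfg = plistlib.load(f)
    graphics = cfg.get("Graphics", {})
    return {
        "inject_intel": bool(graphics.get("Inject", {}).get("Intel", False)),
        "radeon_deinit": bool(graphics.get("RadeonDeInit", False)),
        "ig_platform_id": graphics.get("ig-platform-id"),  # None if removed
    }
```

With Lilu + WhateverGreen installed, you would expect `radeon_deinit` to come back False and, on Sandy Bridge HD 3000, `ig_platform_id` to be None.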
 