Replacing GTX 970 with RX 580

Just got a Sapphire RX 580 on 10.13.6, and it still requires RadeonDeInit and Inject ATI. If I don't use them, I get a black screen (both HDMI and DVI). The only kexts I run are FakeSMC and IONetworking. I run a VERY vanilla install and want to keep it that way; I do not want to run any graphics injectors on my system. Previously I ran a GT 740 for the sole purpose of native support with nothing else required.

I was told that I did not need any additional kexts (like WhateverGreen) for the RX 580 to run on 10.13.6. What am I doing wrong? I updated Clover to the latest version, but no luck.

Please help!
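(For anyone comparing notes, the two switches mentioned above live under Graphics in Clover's config.plist, and with Lilu + WhateverGreen in EFI/CLOVER/kexts/Other they are normally both left off. This is only a rough Terminal sketch: the disk identifier and EFI path are assumptions, and Clover Configurator does the same thing graphically.)

Code:
# Mount the EFI partition Clover boots from (disk identifier is an assumption; check `diskutil list`).
sudo diskutil mount disk0s1

CONFIG=/Volumes/EFI/EFI/CLOVER/config.plist

# Inspect the current graphics settings.
/usr/libexec/PlistBuddy -c "Print :Graphics" "$CONFIG"

# With Lilu + WhateverGreen handling the card, both injection switches are normally off.
# Use "Add ... bool false" instead of "Set" if a key does not exist yet.
sudo /usr/libexec/PlistBuddy -c "Set :Graphics:Inject:ATI false" "$CONFIG"
sudo /usr/libexec/PlistBuddy -c "Set :Graphics:RadeonDeInit false" "$CONFIG"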

I am having the same issue. Did you find any solution?
 
Hello CrashMidnick,

Thanks for the awesome guide. I followed your step-by-step list and presto! I'm set for Mojave. Very easy. I think this is the first time I have had an update/upgrade proceed with zero issues. Even the Graphics entry is updated correctly.


  • Update Lilu with the latest release
  • Remove NvidiaGraphicsFixup and replace it with the latest WhateverGreen release
  • Uninstall your Nvidia web drivers
  • Remove the web driver argument from your config.plist
  • Uninstall CUDA if installed
Swap the cards and it should be good.
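(For what it's worth, the EFI-side part of that list looks roughly like this in Terminal. It's only a sketch: the EFI path, the download location, and the example boot-argument value are assumptions, and the web drivers and CUDA are easier to remove with their own uninstallers.)

Code:
EFI_KEXTS=/Volumes/EFI/EFI/CLOVER/kexts/Other
CONFIG=/Volumes/EFI/EFI/CLOVER/config.plist

# Swap NvidiaGraphicsFixup for the latest WhateverGreen and refresh Lilu
# (assumes the new releases were unzipped into ~/Downloads).
sudo rm -rf "$EFI_KEXTS/NvidiaGraphicsFixup.kext"
sudo cp -R ~/Downloads/Lilu.kext ~/Downloads/WhateverGreen.kext "$EFI_KEXTS/"

# Check the boot arguments, then re-save the string without the web driver flag (nvda_drv=1).
/usr/libexec/PlistBuddy -c "Print :Boot:Arguments" "$CONFIG"
sudo /usr/libexec/PlistBuddy -c "Set :Boot:Arguments dart=0" "$CONFIG"   # example value only; keep your other arguments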
 

Attachments

  • Screen Shot 2019-02-03 at 17.07.48.png
As an added bonus, the sleep function works as it should.

Hello, I have a problem with my Sapphire NITRO+ RX 580 and FCPX: I cannot export an H.264 video. Any ideas?
I am running macOS 10.13.5 and FCPX 10.4.2.

BTW, should I even be seeing this symbol, or is it perhaps a leftover from IGFX (I had that enabled with my former GT 740)?
 

Attachments

  • Screen Shot 2019-02-05 at 01.28.41.png
  • Update Lilu with the latest release
  • Remove NvidiaGraphicsFixup and replace it with the latest WhateverGreen release
  • Uninstall your Nvidia web drivers
  • Remove the web driver argument from your config.plist
  • Uninstall CUDA if installed

Swap the cards and it should be good.

This is great advice. I recently purchased a second-hand Sapphire RX 580 Nitro+ for a pretty good price and swapped it in for an EVGA 1070 that had been working great. I made the change because I started to lose faith that Nvidia would release web drivers for Mojave, and I also liked the idea of better support generally.

It was a very straightforward process: I installed/updated the kexts, took the old card out, and replaced it. Once the new card (RX 580) was in and working, I removed the old Nvidia drivers using the GUI menu.

One slight issue I had: I have a very old Dell monitor that work was throwing out, so I took it home as a general secondary monitor (it uses a DVI port). I also have a newer monitor that uses an HDMI port. The HDMI monitor worked fine and even came up as the primary screen during boot, which the 1070 didn't do previously (I'm assuming because its drivers needed to be initialised before it could be used).

The monitor plugged into the HDMI port worked fine, but the monitor plugged into the DVI port wouldn't display. I read around and it may be an issue with mixing analogue and digital display outputs. Either way, my working solution was to use a DVI to DisplayPort adapter and reboot the machine, after which both monitors worked fine.

This is probably trivial for most, but I'm hoping it might save someone the 30-45 minutes I spent scratching my head wondering why both monitors wouldn't work!

And before anyone says it, yeah, I should probably upgrade my monitors! :lol:
 
Alright, I've been running through this thread over and over and even went off on several side tracks trying anything that seemed possible... Here's my story.

I've been using a 1050 Ti, but I wanted something my system would recognize and run with full acceleration. Last but not least, I dual-boot into Windows 10 to play on my Oculus Rift, and the XFX Radeon RX 580 8GB GTR-S Black Edition really helped my VR framerate.

The RX 580 works great in Windows 10, but when booting Mojave, at about two-thirds of the progress bar under the Apple logo, the screen turns a lighter shade of black and just hangs there.

Yesterday I formatted my macOS drive and started with a clean install (10.14.3), and I have just been working on the video card issue.

In my kexts/Other folder I only have FakeSMC, Lilu and WhateverGreen (the most current versions, downloaded today).

I also read that they would work better in /Library/Extensions and/or /System/Library/Extensions, so I tried installing them there using KextBeast (they are still in both of those folders).

In Clover Configurator I have (Boot) dart=0, nv_disable=1 and kext-dev-mode=1 checked,
(Graphics) Inject ATI,
and iMac14,2 for the model.

I'm at the point where I really don't know what to try next. Thank you for any suggestions you've got!
 
I also read that they would work better in /Library/Extensions and/or /System/Library/Extensions, so I tried installing them there using KextBeast (they are still in both of those folders).

iMac14,2 for the model.

I'm at the point where I really don't know what to try next. Thank you for any suggestions you've got!

Remove the kexts from S/L/E; they should only be in L/E, not in S/L/E.
Try iMac18,3.
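(If it helps, here is a rough Terminal sketch of that clean-up, assuming the three kexts named above and the usual Clover EFI path. Clover Configurator can make the SMBIOS change as well, and can generate matching serials for the new model.)

Code:
# Remove the copies KextBeast put in /S/L/E (keep the ones in /L/E).
sudo rm -rf /System/Library/Extensions/FakeSMC.kext
sudo rm -rf /System/Library/Extensions/Lilu.kext
sudo rm -rf /System/Library/Extensions/WhateverGreen.kext

# Rebuild the kext caches so the change takes effect before rebooting.
sudo touch /Library/Extensions /System/Library/Extensions
sudo kextcache -i /

# Switch the SMBIOS to iMac18,3 in Clover's config.plist (path is the usual one; yours may differ).
sudo /usr/libexec/PlistBuddy -c "Set :SMBIOS:ProductName iMac18,3" /Volumes/EFI/EFI/CLOVER/config.plist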
 
I tried your suggestions, with the same result.

I found a thread that said to roll back the card's BIOS. Tried that, but it didn't help.

After following Ploddles' suggestion, it goes black at the halfway mark of the Apple boot progress bar.

I read that XFX cards are harder to get running. Wish I'd looked into that further before the purchase.
 
Hello friends, I have a similar problem getting my Sapphire Pulse RX 580 fully working at 4K. I am running Mojave 10.14.3 on a Gigabyte G77 MVP and I am trying to get 4K output from the RX 580's HDMI ports. The ports seem to work fine on smaller screens, but during the boot process, when it switches to the login screen at 4K on my LG screen, I get black.

I have tried WhateverGreen with Lilu multiple times with no luck. I have removed all Nvidia extensions and CUDA, turned off all injection in Clover, and deactivated internal graphics in the BIOS. I have also tried multiple SMBIOS definitions, including the 4K and 5K iMac, with no success. I also tried a DVI to HDMI adapter and got the screen at 1080p with a red tint. I have ordered a 4K DP to HDMI adapter and hope that will work, but I am not counting on it.

I have read tons of threads on the RX 580 but still have not resolved this problem. Also, I am running Clover v2.4k rev 4769. Any thoughts would be appreciated.
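(For anyone double-checking the same thing, one quick sanity check from Terminal is to confirm that no Nvidia or CUDA kexts are still loading and that Lilu and WhateverGreen actually are. Just a sketch using standard kextstat output; nothing here is specific to this card.)

Code:
# Should print nothing once the web drivers and CUDA are fully removed.
kextstat | grep -i -E "nvda|nvidia|cuda"

# Should list both Lilu and WhateverGreen after a reboot with them in EFI/CLOVER/kexts/Other (or /L/E).
kextstat | grep -i -E "lilu|whatevergreen"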
 