Radeon Compatibility Guide - ATI/AMD Graphics Cards

I can’t, for the life of me, get a Gigabyte Radeon RX 570 Gaming 4G (https://www.newegg.com/Product/Product.aspx?Item=N82E16814125966) to work properly in my computer. I've been at it all week. With no Lilu or WEG, or anything graphics-related enabled in config.plist, I have no problem booting other than the usual flash at second stage. If I check Inject ATI=YES, then I can see in preboot.log that the Dayman framebuffer is being used.
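For reference, the usual Polaris (RX 5xx) setup leaves Clover's injection off entirely and lets Lilu + WhateverGreen sort out the framebuffer; a minimal Graphics section along those lines might look like the sketch below (assuming Lilu.kext and WhateverGreen.kext sit in CLOVER/kexts/Other; adjust to your own install):

Code:
<key>Graphics</key>
<dict>
	<key>Inject</key>
	<dict>
		<key>ATI</key>
		<false/>
		<key>Intel</key>
		<false/>
		<key>NVidia</key>
		<false/>
	</dict>
</dict>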

The problem is that the on-screen graphics aren't great and contain odd artifacts, especially at small text sizes. (See the brownish artifacts in the image below.)

IMG_3152.JPG

Also, in Displays preferences, it would seem that my Dell U2715H monitor is being recognized as a TV. I cannot switch it to a normal 60Hz setting. The monitor is also set to YPbPr rather than RGB.

Screen Shot 2018-12-03 at 11.49.08 PM.jpg

I am pretty sure this is the root of the problem, but I have no idea how to fix it. I am using a brand-new DP-to-DP cable, BTW.

I am crazy frustrated by it, and if I cannot get it to work, I will return this card and go back to my trusty Nvidia GTX 750 Ti, which has been working flawlessly. The only reason I decided to go AMD is that there are no Nvidia drivers for Mojave. I am still on High Sierra, but at some point I will have to update.

I really hope someone can help me. Attached are my IOREG, preboot.log, and the card's ROM.
 

Attachments

  • Mac Pro.ioreg (8.2 MB)
  • preboot.log-12-05.txt (28.7 KB)
  • 1002_67df_22f71458.rom.zip (44.1 KB)
Hello, I would like to buy a Gigabyte Windforce R9 280 (non-X) for a MacPro 1,1.

Could someone please provide me with a working Mac EFI, as I would like to learn how to flash the card to get the boot screens and maximum port availability. I would be running Lion 10.7.5 and Mavericks 10.9.1 or 10.9.3 (plus, sporadically, Snow Leopard).


Thank you very much.
 
Thank you, thank you, thank you!!! That finally worked! I spent 8 hours yesterday trying to edit a framebuffer when all it took was this.

Shocking that this (or similar) issue has been around since before 2013 and affects real Macs as well!

Anyone know why this is an issue with this card (or others)?

I'm sure I could have done this with Clover injection somehow?
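For what it's worth, Clover does have an equivalent knob: the Graphics section's EDID dictionary can inject a custom EDID from config.plist instead of relying on a display override file. Roughly like the sketch below, where the Custom value would be your own monitor's EDID (base64) with the RGB flag patched in; the value is left as a comment here because it is display-specific:

Code:
<key>Graphics</key>
<dict>
	<key>EDID</key>
	<dict>
		<key>Inject</key>
		<true/>
		<key>Custom</key>
		<data><!-- base64 of your patched EDID goes here --></data>
	</dict>
</dict>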
 
Anyone know why this is an issue with this card (or others)?
It's actually a problem with the Dell monitor. I've got an RX 570 by Gigabyte and use an LG IPS monitor, and I've not encountered this problem. Apple uses LG panels in their iMac line.

I’ve been using a 27″ Dell at work for a couple years, connected to my Mac via DisplayPort. It’s a pretty nice display with a bright IPS panel, good colors, and wide viewing angles. When I plugged it into my new Mac, the text looked terrible.

Eventually, I noticed something in the monitor’s settings: My new Mac was connecting with an Input Color Format of YPbPr.
John Ruble
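The fix that write-up points toward, and the one most people use for this YPbPr problem, is a display override: a small plist dropped into macOS's display Overrides folder (under DisplayVendorID-xxxx/DisplayProductID-yyyy; the exact location varies by macOS version and SIP status) that re-serves the monitor's EDID with RGB forced. It is normally generated by a script such as patch-edid.rb rather than written by hand; the skeleton below is only illustrative, with the vendor/product IDs as placeholders and the patched EDID bytes omitted:

Code:
<?xml version="1.0" encoding="UTF-8"?>
<plist version="1.0">
<dict>
	<key>DisplayProductName</key>
	<string>DELL U2715H - forced RGB</string>
	<!-- vendor/product IDs below are placeholders; use the ones read from your own display -->
	<key>DisplayVendorID</key>
	<integer>4268</integer>
	<key>DisplayProductID</key>
	<integer>53398</integer>
	<key>IODisplayEDID</key>
	<data><!-- base64 of the patched EDID goes here --></data>
</dict>
</plist>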
 
A few weeks ago, I bought an Nvidia GTX 1060, only to learn here that it will likely never work with Mojave, because Apple has stopped Nvidia from releasing the drivers.

Swell.

So, off to Amazon again, and I'm getting a Sapphire 11265-05-20G Radeon Pulse RX 580 8GB GDDR5 Dual HDMI / DVI-D / Dual DP OC delivered tomorrow.

But reading through this forum (and others) has me totally confused. Some folks say to enable the internal GPU and others say to disable it. Some say to just leave the old 1060 drivers alone; others say to remove them; some say I need "Lilu and WhateverGreen" (I did find the iDiot's guide here)... and so on.

Of course this is because of different mobos, and different model Radeon cards.

I'm more of a user than a modder, but I'm not afraid of mucking about either. My primary software is Final Cut Pro and Photoshop.

What I'm hoping is that someone here who has made the transition from Nvidia to Radeon, and has the same mobo (GA-Z77x-UD5H) can offer me the steps I need to take to successfully use the new card. I'm also hoping that my old, simple mobo will make life a bit more simple, but whatever it takes...

Thanks... and happy holidays to all.
 
I'm more of a user than a modder, but I'm not afraid of mucking about either. My primary software is Final Cut Pro and Photoshop.

What I'm hoping is that someone here who has made the transition from Nvidia to Radeon, and has the same mobo (GA-Z77x-UD5H) can offer me the steps I need to take to successfully use the new card. I'm also hoping that my old, simple mobo will make life a bit more simple, but whatever it takes...

Thanks... and happy holidays to all.
With that CPU and RX 580, I'm afraid you will not have much joy.
I have an i7-3770K + Z77X-UP5 + RX 580, and FCPX is not very stable.
With the latest Mojave release, Apple has removed the hardware codec feature of the RX 580.
You will be able to use your system, but you will rely on the CPU and IGPU, as long as H.264 output is all you need.

See the issues here.
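If you do lean on the IGPU for encoding, the HD 4000 on an Ivy Bridge board is usually enabled connector-less (headless) from Clover while the RX 580 keeps driving the displays. A rough sketch is below; 0x01620007 is the commonly quoted empty-framebuffer platform-id for HD 4000, but verify it against your own setup, and the IGPU also has to be enabled in the BIOS with the PCIe card set as primary:

Code:
<key>Graphics</key>
<dict>
	<key>Inject</key>
	<dict>
		<key>Intel</key>
		<true/>
	</dict>
	<key>ig-platform-id</key>
	<string>0x01620007</string>
</dict>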
 
With that CPU and RX 580, I'm afraid you will not have much joy.
I have an i7-3770K + Z77X-UP5 + RX 580, and FCPX is not very stable.
With the latest Mojave release, Apple has removed the hardware codec feature of the RX 580.

ARRRGGGGHHHH!!

OK: thank you for saving me. I've got a return/refund approved on the card.

So: if I want to continue to move forward, then I take it I need to build a new hackintosh (or pray that Nvidia releases drivers?)

Any suggestions for what mobo/CPU/card -will- work with, and accelerate, FCP? I'm more a creative than a gamer...

Thanks again. Not the news I wanted, but I appreciate your response.
 