
[Success] AMD RX6000 Series working in macOS

Same boat as you. I'm a pure Mac user, so no need for Windows. I built a hack in 2018, my first and probably only one. The reason: for around £3k I could risk building a setup that would outperform anything Apple could sell me at the time.

Previously I'd buy a new iMac every 3 years, but my work got more demanding in terms of graphics power. The HS 1080 build I use is rock solid, but over the last few months my software updates have been losing compatibility: DaVinci, Blender, most Adobe apps. I pay for Adobe's CC, which is annoying, but at the end of the day it was my choice to risk building a hack for work purposes. It feels a little dirty, but the horsepower and hardware options are great.

Given the current incompatibility of these new AMD cards (my hunch is they *may* be supported much further down the line, more on that below), and the fact that graphics cards are ludicrously expensive and hard to come by, and probably will be for the rest of the year if not beyond, my options are to wait for an M* iMac with enough graphics grunt for my needs, or hold out for an M* Mac Pro. Considering the work I do, switching to Windows would be the sensible thing to do, but f that.

A few months ago I was certain these cards would gain driver support pretty quickly. I've since changed my view, not least for the obvious reason that we still don't see support. The top-whack 4x GPU config for the Intel Mac Pro is extremely powerful, albeit stupidly expensive. How many higher-end Mac Pro customers are desperate to go beyond the current upgrade options, or are already at the top spec? Most of them are likely perfectly content for now. That still leaves a hole in the market for pro creatives like myself, and I think it will be addressed when Apple unleash more powerful graphics systems. I strongly feel they're working on something that will be sufficient, but realistically that's a 2022 thing. Happy to be wrong.

I impulse-bought a 6900 XT in Jan but sent it back within 10 days, as I don't use Windows and reconsidered the insanity of a £1,200 paperweight. Supply seems to have got worse since; no doubt the recent crypto boom is partly to blame.

All said, I agree with some of the other comments on here: just be hyper-aware that if you're buying a new AMD card specifically for macOS, you're probably in for a long wait.

It is frustrating. I don't necessarily blame anyone, as I'm well aware these are not officially supported computers. That said, as a creative professional like yourself, the options for my needs seem to be dwindling quickly.

Sure, I can wait... but software is starting to become incompatible. Thankfully most of it still runs, but a few apps will stop in April. That will force me to find something else, or switch back and forth with Windows, which I really don't want to do.

If Apple offered something reasonable (I was hoping so badly that the Mac mini would work), I would happily go back to a genuine Mac, but with the costs and limitations, I can't do it.

A perfect example is my friend. He bought a fully upgraded iMac about 5 years ago, and it's now at the point where the internals are struggling with his OS and software, yet the 5K monitor is in great condition. He resents that he can see himself throwing it into recycling due to the lack of upgradability. I don't want to spend $4,000 and end up in the same boat.
 
Totally feel your pain, man. The software I use is stable enough for now, so I can keep going, but I'm missing out on things like DaVinci Resolve Studio 17. Let's just hope the upcoming 27" iMac rolls out in Q3; I think that will be a good proposition for us. Fingers crossed. Looking at the performance of the new Mac mini, it's certainly intriguing to wonder what they're planning to offer for graphics processing.

I had a similar 5K iMac to your friend's; in fact I flogged it when I built this hack. I still miss that beautiful display (I now use a 34" LG Ultrawide). In terms of graphics power, though, that iMac would be pretty useless for my work these days.
 

I understand the desire to be frugal, and I try to be as frugal as possible myself. However, we have to consider how much longer macOS itself will continue to be updated for Intel-based Macs.

If you believe, as I do, that macOS support for Intel-based systems is coming to an end far sooner than Apple is letting on, then it doesn't make much sense to invest money in a dying platform. Yes, you might be future-proofing your PC/Windows side, but you're also throwing money away as far as macOS is concerned.

The days of a single machine that runs both macOS and Windows are coming to an end. Those who use macOS for work and Windows to game, and who have a limited budget, will have to decide whether to spend on a video card to improve their gaming experience or put the money toward an Apple Silicon Mac to keep working.
 
...or move work to a PC.
 
If you believe, as I do, that macOS support for Intel-based systems is coming to an end far sooner than Apple is letting on...

That's an interesting idea... It just made me think: if that were their plan, it might make sense to add driver support for a few hip and happening GPUs to give the Mac Pro a bit more longevity for those who invested in one. That way their abandoning of the hardware might not sting so much.
 
That way their abandoning of the hardware might not sting so much.
That 2019 MP tower will start looking like a dinosaur of the computer world rather quickly. Apple is heading toward making everything smaller, quieter, and more energy efficient. The MP is certainly quiet, but it's got a 1.4 kW PSU in it! It's probably the heaviest Mac ever built, too. Those huge, power-hungry high-end graphics cards are also on the way out now that the M1 SoC has shown us how capable Apple's chip division is. Apple's direction has always been forward: they leave the past behind to follow their vision for the future. That's why they've left Intel for good and will leave AMD behind as well. Those two companies have served them well, but from now on Apple will be making their own CPUs and GPUs for Macs. I don't think the Mac line would even exist today if they hadn't made the switch to Intel some 15 years ago.

They will build their own dedicated GPU, and it will be a fraction of the size and use a fraction of the electricity. That's one of the reasons they'll eventually drop all AMD dGPUs. I'd compare it to the ICE engines we've used in cars for 100+ years versus electric motors: the electric motor is far quieter and more energy efficient, and it doesn't need the regular maintenance a combustion engine does. In ten years or possibly more, once they cost less to build than ICE cars, almost every car will be powered by an electric motor. Building their own SoCs and desktop/laptop GPUs also saves Apple a lot of money and helps keep profit margins high without raising prices. We all know that's a big part of their DNA; 30% margins are what built the $2-trillion-plus market cap they have today.
 
Compatibility of 3rd-party devices with Apple hardware has usually been driven by the fact that Apple themselves use the device; I'm not aware they have ever supplied driver support for devices they don't. Except, perhaps, for Final Cut and Logic Pro, which are not part of the OS and are specifically designed to do a given job. However, we have to modify this view a little...

Apple does adhere to standards. For example, that cheapo webcam works with the Mac mini not because Apple explicitly likes the brand or uses it, but because the webcam is UVC compliant, which is a broad standard. That audio mixer works when connected because it is USB Audio Class (UAC) compliant, another industry standard, not because Apple supports that particular model. Adhering to standards is a good thing, and it makes Apple more of a team player than a lot of people give them credit for.
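As a quick illustration of the class-driver point (these are stock macOS commands, though SPCameraDataType only exists on recent releases and the output obviously varies per machine), you can watch a no-name webcam show up by class rather than by brand:

Code:
# Cameras macOS recognises; a UVC-compliant webcam appears here
# without any vendor driver installed
system_profiler SPCameraDataType

# The raw USB device tree, with vendor/product and class details
system_profiler SPUSBDataType

If the camera shows up in the first command with no extra software installed, that's the UVC standard doing the work, not any Apple blessing of the brand.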

With graphics card support the reality is more complex but basically the same. For example, Apple never offered an RX 560 AMD GPU, but they did offer a Radeon Pro 560X. The specs of the two are close enough that you can use an RX 560 with only the need to rename it, perhaps using WhateverGreen.kext. Other GPUs, whether AMD or Nvidia, inherently use the same driver model whatever the OS, and so are coincidentally compatible when we build a Hack with one, not because Apple supports them directly. We have lists here and here on the Site of the GPUs supported directly and indirectly, and their chipset brethren.
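For anyone curious what that rename looks like in practice, here's a minimal sketch using OpenCore's DeviceProperties section (the PCI path below is an example only; you'd confirm your card's real path with a tool like gfxutil, and on many cards WhateverGreen does the renaming for you automatically):

Code:
<key>DeviceProperties</key>
<dict>
    <key>Add</key>
    <dict>
        <!-- Example PCI path only; verify yours with gfxutil -->
        <key>PciRoot(0x0)/Pci(0x1,0x0)/Pci(0x0,0x0)</key>
        <dict>
            <!-- Cosmetic rename: System Report then shows the Mac model name -->
            <key>model</key>
            <string>Radeon Pro 560X</string>
        </dict>
    </dict>
</dict>

The rename is purely cosmetic; the actual acceleration still comes from the stock AMD kexts matching the card's device ID.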

Of course it would be nice if Apple decided to support newer GPUs, and if they continue with the Intel Mac Pro line they may upgrade the GPUs they themselves offer and include the RX 6*** series. But if they decide to throw all their development muscle into creating their own new GPUs, then we are left at the 5700/Vega II point in history.

None of this is "bad"; it's just how it is. If you go way back in Apple's history you'll find all kinds of weird and wonderful standards, sockets, and plugs, totally incompatible with mainstream PC parts, probably to make the distinction between the two approaches to desktop computing clear, and of course to restrict what gets connected so Apple can control reliability.
Explained perfectly! We've been through a lot during this thread, and historically. I stand by my view that we won't see support, given how good the M1 is. I also see no reason for Apple to update the Intel side (especially given the campaigns we're seeing); that tells me relations have broken down.
 
Yes. The Mac options aren't good for many reasons. One, of course, is cost: I just priced out a lower-end iMac, which after taxes came to above $4,000. That's beyond my budget, especially considering the lack of upgradability and power.

Secondly, like many others I'm on a budget, so I have to make my money go as far as it can. As someone who likes to game in Windows (and needs access to a few programs there), I can't afford to put thousands into a gaming PC and then turn around and purchase another Mac on top of that. We'd be looking at $7,000+, and I simply don't have the budget for that.

Finally, my current setup and requirements make Macs a limiting option. They're simply too restrictive in their capabilities for me to consider, which is why I was so happy to have found the Hackintosh solution. I was able to put a reasonable amount of money toward a PC that gave me the flexibility to handle most of my needs. It worked perfectly until Nvidia driver support ended.

I could of course purchase an older-generation AMD card and get macOS upgraded, but I'm not interested in putting a large amount of cash into an older card when I'd prefer the newest generation for work and gaming. I want to future-proof my PC as much as I can, so starting off by purchasing older GPUs seems like a bad idea.
Really sorry to say this, but have you considered getting an M1 Mac? They are amazing.

I game in Parallels for the few titles that don't have a Mac equivalent.
 
But this graphics card isn't just for gamers; for me it would be perfect for 3D, to speed up rendering in Octane or Redshift, which now work well on Mac via the betas. Many of you will ask: why not just go back to PC?
I tried again, but Windows sucks: we waste more time getting apps up and running than actually working, and everything slows down about 10% compared to the Mac. What we win with the GPU we lose in workflow. It's getting complicated to have the best of both worlds on one machine (Hackintosh).
 