[Success] AMD RX6000 Series working in macOS

During SWIM's time working at Apple, it was explained to SWIM that the fault was actually on Apple's side.
They definitely had the ability to make Nvidia cards work, but they were holding off, because ultimately their goal was to undermine the Nvidia CUDA monopoly and boost their own hardware and the Metal standard.

As upsetting as this was as a consumer, internally SWIM was actually using special drivers on Catalina with Nvidia GTX 1080 Ti's and RTX 2080 Ti's without any issues.
SWIM might have snagged some of those driver files from their machine, but was never able to get them to work on his own, sadly :(

Who or what is SWIM?
 
SWIM = "someone who isn't me". A way of saying "obviously I mean me, but legally I can't actually say it's me."

So I'd read that previous post as implying that "I worked at Apple but due to NDAs I can't admit that fact, so I'm pretending this story is hearsay." Of course it could also actually be hearsay.

Personally I always assumed the Nvidia split was instigated from Apple's side. I can think of various reasons they'd want to lock Nvidia out, such as the possibility of negotiating better terms with AMD by guaranteeing them exclusive access to the market for new GPUs in Macs. And the push to promote Metal that the previous poster claimed certainly sounds plausible; I can see Apple not wanting Nvidia to be able to continue with CUDA on Macs and give customers a competing option. Especially as by that point Apple was already secretly developing Apple Silicon, and knew the time would come when Metal would mean Apple hardware exclusively.

Looking at it from the other side, it seems hard to believe that Nvidia would just throw away the chance to sell new hardware to Mac users. Especially as it came at the same time as eGPU support was added, so that market was growing. The cost of developing and supporting drivers - drivers that were already written - seems like it would be significantly lower than the potential revenue from Mac users, so why not support them if they could? Someone said it was because they didn't want to support Metal, but I don't buy that. In fact, that could have given them an opportunity to 'prove' CUDA was better than Metal, by optimising the former and not the latter, which again speaks to Apple wanting to prevent that possibility.

I'm also not sure it's true to say it was as simple as getting a developer account. Even if that was all that was technically required, that still allows Apple to block notarization; they always have the final say, and have blocked developers before for various reasons. On top of that, I do wonder if there's more technically required of GPU drivers than just following published APIs and then notarizing them. I had thought that developing GPU drivers for macOS - along with the necessary support for Apple Secure Boot and the like - would require some help from or cooperation with Apple, but I don't know enough about device driver development on macOS (or in general) to know for sure.

I have no problem believing that both Apple and NVidia are quite capable of blocking support for entirely selfish and anti-consumer reasons. But in this case it seems to me to be more likely to be Apple than NVidia. And there appears to be some evidence for this beyond the public statement from NVidia, such as journalists that spoke (off-the-record) to Apple developers.

All that said, I don't see many parallels between that situation and that of the AMD 6000-series drivers, besides the obvious one of "we're all waiting for news and have no idea if or when it will come." Apple and AMD have an ongoing relationship, with AMD GPUs being sold every day in Apple hardware and reports of at least some new Intel-based Macs still to be released. The day will come when Apple is 100% using Apple hardware, but we're not quite there yet. I would be pretty surprised if the 6000-series GPUs were not supported at some point fairly soon, though I could definitely see them being the last third-party GPUs to receive support.

Personally I'm choosing to look on that recent post by the Octane developer as a good sign. He'd be better placed than the average person to know about macOS GPU support, and he claimed that 6800/6900 support was imminent. Actually he claimed it was there in 11.3 beta 3, which it isn't (at least not full support), but at least that seems like a promising sign.
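For anyone who wants to check a beta for themselves rather than rely on second-hand claims, a quick way is to ask Metal which GPUs the OS actually exposes. A minimal sketch in Swift (run on the machine with the card installed); a card without a working driver simply won't appear in the list:

```swift
import Metal

// List every GPU that this macOS install exposes to Metal.
// A 6800/6900 with working drivers should show up here by name;
// without a driver it won't appear at all.
for device in MTLCopyAllDevices() {
    print(device.name,
          "| removable (eGPU):", device.isRemovable,
          "| low power:", device.isLowPower)
}
```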
 
SWIM = "someone who isn't me". A way of saying "obviously I mean me, but legally I can't actually say it's me."

So I'd read that previous post as implying that "I worked at Apple but due to NDAs I can't admit that fact, so I'm pretending this story is hearsay." Of course it could also actually be hearsay.

Personally I always assumed the NVidia split was instigated from Apple's side. I can think of various reasons they'd want to lock NVidia out - such as the possibility of negotiating better terms with AMD by guaranteeing they have exclusive access to market for new GPUs in Macs. And the push to promote Metal that the previous poster claimed certainly sounds plausible; I can see them not wanting NVidia to be able to continue with CUDA on Macs and give customers a competing option. Especially as by that point Apple was already secretly developing Silicon, and knew the time would come when Metal would mean Apple hardware exclusively.

Looking at it from the other side, it seems hard to believe that NVidia would just throw away the chance to sell new hardware to Mac users. Especially as it came at the same time as eGPU support was added, so that market was growing. The cost of developing and supporting drivers - drivers that were already written - seems like it would be significantly lower than potential revenue from Mac users, so why not support them if they could? Someone said it was because they didn't want to support Metal, but I don't buy that. In fact that could give them an opportunity to 'prove' CUDA was better than Metal, by optimising the former and not the latter. Which again speaks to Apple wanting to prevent that possibility.

I'm also not sure that it's true to say it was as simple as getting a developer account. Even if that was all that was technically required, that still allows Apple to block notarizing - they always have the final say, and have blocked developers before for various reasons. On top of that I do wonder if there's more technically required in GPU drivers than just following published APIs and then notarizing them. I had thought that developing GPU drivers for macOS - along with necessary support for Apple Secure Boot and the like - would require some help from or cooperation with Apple, but I don't know enough about device driver development on macOS (or in general) to know for sure.

In general I have no problem believing that both Apple and NVidia are quite capable of blocking support for entirely selfish and anti-consumer reasons But in this case it seems to me to be more likely to be Apple than NVidia. And there appears to be some evidence for this beyond the public statement from NVidia, such as journalists that spoke (off-the-record) to Apple developers.

All that said, I don't see many parallels in that situation and that of AMD 6000-series drivers, besides the obvious one of "we're all waiting for news and have no idea if or when it will come." Apple and AMD have an ongoing relationship, with AMD GPUs being sold every day in Apple hardware and reports of at least some new Intel-based Macs still to be released. The day will come when Apple is 100% using Apple hardware, but we're not quite there yet. I would be pretty surprised if the 6000-series GPUs were not supported at some point fairly soon - though I could definitely see them being the last third-party GPUs to receive support.

Personally I'm choosing to look on that recent post by the Octane developer as a good sign. He'd be better placed than the average person to know about macOS GPU support, and he claimed that 6800/6900 support was imminent. Actually he claimed it was there in 11.3 beta 3, which it isn't (at least not full support), but at least that seems like a promising sign.
This thread is for the lack of AMD 6800 support... not a thread to complain about the lack of Nvidia drivers.
 
If Nvidia had shipped Metal drivers and deliberately not optimised them, so that things ran like crap compared to CUDA, it could have had the unintended effect of proving that AMD GPUs are better.
 
True, but since there is nothing happening on the AMD drivers front, we are just keeping this thread lively :lol:
 
Yes, Metal v CUDA. Nvidia chose the proprietary route with CUDA, made it better than the competition, and customers got locked into various apps that only worked with, or worked much faster with, CUDA.

Apple wanted Metal (2) to replace all the old outdated tech, doing what OpenGL/OpenCL/CUDA did. You can imagine Apple wanting to ship whatever GPUs suit the OS best at a given price point, and not have its customers locked in or out of Nvidia's proprietary strategy depending on what year Mac they have, or be forced to keep OS updates compatible with Nvidia into the future if, e.g., AMD gave them the better deal.

On the other hand, you can also imagine Nvidia chose its strategy and needs to stick to it for it to pan out how they planned, or it's all for naught.

You can see both sides have a point; time will tell who regrets not budging. At the end of the day, gamers pay the bills for GPU manufacturers, and that means Windows.
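For anyone who hasn't used either API, here's roughly what the Metal side of that fight looks like. This is a minimal sketch of a Metal compute dispatch in Swift (the `add_arrays` kernel is just an illustrative example, not anything from a real app), doing the same basic job a CUDA kernel launch does on the Nvidia side:

```swift
import Metal

// Kernel source compiled at runtime from a string, for brevity.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;
kernel void add_arrays(device const float* a   [[buffer(0)]],
                       device const float* b   [[buffer(1)]],
                       device float*       out [[buffer(2)]],
                       uint i [[thread_position_in_grid]]) {
    out[i] = a[i] + b[i];
}
"""

let device = MTLCreateSystemDefaultDevice()!   // first GPU macOS exposes
let queue = device.makeCommandQueue()!
let library = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "add_arrays")!)

// Two input arrays and one output buffer, all visible to the GPU.
let n = 1024
let size = n * MemoryLayout<Float>.stride
let a = [Float](repeating: 1, count: n)
let b = [Float](repeating: 2, count: n)
let bufA = device.makeBuffer(bytes: a, length: size, options: [])!
let bufB = device.makeBuffer(bytes: b, length: size, options: [])!
let bufOut = device.makeBuffer(length: size, options: [])!

// Encode the dispatch and wait for the GPU to finish.
let cmd = queue.makeCommandBuffer()!
let enc = cmd.makeComputeCommandEncoder()!
enc.setComputePipelineState(pipeline)
enc.setBuffer(bufA, offset: 0, index: 0)
enc.setBuffer(bufB, offset: 0, index: 1)
enc.setBuffer(bufOut, offset: 0, index: 2)
enc.dispatchThreads(MTLSize(width: n, height: 1, depth: 1),
                    threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()

print(bufOut.contents().load(as: Float.self))  // 3.0 (1 + 2)
```

Conceptually, CUDA code does the same allocate/launch/synchronise dance; the lock-in comes from all the CUDA-only libraries layered on top, not from anything fundamental to the hardware.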
 
I think the following article gives a hint as to whether there will be AMD GPUs in future Apple Silicon Macs.


Didn't I read somewhere that Nvidia was going to actively prevent their high-end GPUs from being used for mining?

Two completely different reactions and attitudes to a similar problem.
 
Yes. Nvidia is supposedly making their gaming cards less capable for mining. However, they also released cards for the specific purpose of mining.
 
Here we are, my friends; a turning point, I believe. AMD and Nvidia have both discovered miners as a clientele, and they are now building individual products for them based on existing GPU designs, but with the hint that those would be usable only for mining and nothing else. The good point here might be that this eases the situation for creatives and gamers, as there is currently not enough supply and GPU prices, even for second-hand cards, are skyrocketing.

But with the current tendency of the market, one could wonder what earns you more for a living: doing creative work on the GPU, or just letting it mine crypto 24/7. To me that is the real question, and frankly I am not sure anymore whether creative usage of a GPU has enough advocates. I am not talking about the gaming industry here, that's another big factor, but the creative industry. Where are all those creative users? Maybe there are fewer than a decade or two ago? Or maybe they just became a less important clientele than consumers and miners.

That's the point when I look at Apple, AMD, and Nvidia. They don't care about the prosumers anymore, as prosumers tend to buy only once every 5 to 10 years and not yearly. The creative software industry already made the move to subscription-based models, and yes, those are more interesting for them, because they bring a more stable cash flow, and market behaviour is more predictable if you cut consumers' freedom. So while we were wondering where our choices had gone, the whole industry changed, and it turns out that the prosumers are not interesting anymore. Only consumers are needed for now, and if there is something you could possibly use that GPU hardware for, then you could just damn well put it to mining. Who in the end needs a human brain behind that machine at all?

It's a long, long time ago since Apple took that photograph of Pablo Picasso and made an advertisement campaign with it. I think they called it "Think Different". Today they'd just take an image of any generic person, and if there was a claim it would probably say "don't think at all"... because there is an app for that.
 