
OS X Mountain Lion 10.8.3 Beta has AMD Radeon 7XXX drivers

Status
Not open for further replies.
I would think the real reason behind this move is compute power. Look at hashing benchmarks or any general GPGPU benchmarks and AMD is far and away the better architecture per watt and per dollar. That includes the recently released GTX Titan, based on the GK110. For the new Mac Pro, I think GPGPU has to be a far bigger consideration for Apple than it is for its other product lines. OpenCL acceleration, as well as graphical performance, will be a consideration, and given AMD's stock situation I think a sweetheart "please pick me" deal is also likely.

http://www.tomshardware.com/reviews/radeon-hd-7970-ghz-edition-review-benchmark,3232-14.html

I really hope this comes to fruition, though. I just purchased a 7870 LE (think of it as a 7930; it's still a Tahiti chip) and would love not to lose my hackintosh over the removal of my GTX 480. In related news, if anyone wants to purchase a GTX 480, let me know...
 

Agreed. Waiting patiently here with my 7950 (which cost almost half the price of the comparable-but-slower GTX 670; no thanks, Nvidia, you can keep it). ATI drivers on OS X have historically been better as well, so hopefully that continues.
 
I would think the real reason behind this move is compute power. Look at hashing benchmarks or any general GPGPU benchmarks and AMD is far and away the better architecture per watt and per dollar. That includes the recently released GTX Titan, based on the GK110.

This is simply not true. In OpenCL, AMD performs much better because Nvidia's OpenCL drivers are terrible, not because the hardware is bad at it. Nvidia's OpenCL drivers are terrible because they'd like people to keep investing in CUDA code. As long as people, and applications, stick with CUDA, Nvidia has a guaranteed customer base. You could see this with Adobe's products: initially, if you wanted GPU acceleration, your only option was an Nvidia card. Only later did Adobe add OpenCL code allowing AMD products to accelerate that work as well.

While more and more consumer applications are incorporating GPU acceleration, Nvidia's big market is still the enterprise space, so Nvidia has every reason to keep hobbling OpenCL on its products. The only reason Nvidia would have to work on OpenCL would be if its compute market share were dropping precipitously, in which case I'm sure it could have a very functional OpenCL driver available quickly (I'd guess one already exists). But their market share is not dropping; they still hold a huge advantage (essentially the entire market). And the longer this goes on, the more expensive (in time, training, and hardware) it becomes for a person, company, or lab to switch to OpenCL.
 

You are correct that CUDA, and the marketing muscle behind it, gives Nvidia a larger market share, especially in the enterprise space. But in terms of peak GPGPU performance, you can't claim Nvidia's current-generation cards are where AMD's are, especially in double precision.

http://www.theinquirer.net/inquirer...hed-amds-mid-range-radeon-hd-7870-gpu-compute

Nvidia will continue to dominate the enterprise space thanks to CUDA's good library support and a developer base well entrenched in those modified-C libraries. But the content-creation space the Mac Pro targets will continue to embrace OpenCL quickly and aggressively, as it has over the last year. And there the AMD option is more powerful and more cost-effective: when a $375 card like the 7970 can compute on par with the $900+ GTX Titan, you see the benefit.
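To put the price-for-compute claim in rough numbers: here is a back-of-the-envelope comparison of double-precision throughput per dollar. The GFLOPS figures are approximate theoretical peaks from published specs (roughly 947 DP GFLOPS for the stock HD 7970, roughly 1.3 DP TFLOPS for the GTX Titan with full-rate DP enabled), and the prices are the street prices mentioned in the thread, so treat the exact ratios as illustrative rather than authoritative.

```python
# Rough DP-compute-per-dollar comparison. Spec numbers are
# approximate theoretical peaks; prices are early-2013 street
# prices quoted in this thread, not official MSRPs.
cards = {
    "Radeon HD 7970":   {"dp_gflops": 947,  "price_usd": 375},
    "GeForce GTX Titan": {"dp_gflops": 1310, "price_usd": 999},
}

for name, c in cards.items():
    ratio = c["dp_gflops"] / c["price_usd"]
    print(f"{name}: {ratio:.2f} DP GFLOPS per dollar")
# Radeon HD 7970: 2.53 DP GFLOPS per dollar
# GeForce GTX Titan: 1.31 DP GFLOPS per dollar
```

Even granting the Titan its higher absolute peak, the 7970 delivers nearly twice the double-precision throughput per dollar under these assumptions, which is the point being made about the content-creation market.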
 

Wow, I would really want to know where I can get a 7950 for half the price of a 670. I envy you.

Here where I live, the 670 is more expensive than the 7950, but only by about US$40-50 equivalent.
 
I'm a bit confused as to which new Macs will have 7xxx graphics. Is there a Thunderbolt add-on graphics enclosure in the works? Or is this finally the new Mac Pro? But why not Kepler for that?

It's not either/or. If Apple adds support, it will simply open up more options; they won't remove Nvidia support.
 
Has anyone confirmed this working, other than the 7770 in the YouTube video? I am in the process of building a new hackintosh and would love to be able to use a 7970 for its dual Mini DisplayPort outputs.
 
Until 10.8.3 is released, this is all hearsay.

Any word on when that may be? I may just run my computer on the HD 4000 until it's clear whether the HD 7000 series is supported or not.
 