
Graphics on Hackintosh vs Windows - GTX980

Motherboard: Z97-D3H Rev 2.2
CPU: i7-4790K @ 4.6 GHz
Graphics: MSI Gaming GTX 980 4GB (OC)
Hi,

I am posting to start a discussion to help me decide what is right for me here. If this is the wrong section for this topic, please move my post. So, I recently got my first big, bad GPU: a GTX 980 4GB. I ran benchmarks in Unigine, saw around 60 FPS, and thought, "I guess that's good." I'm not a gamer; I use OS X for 2D and 3D CAD/CAM, heavy photo editing, photo design, and occasionally video editing.

So, today I decided to overclock my card in Windows. I installed Windows 7 and ran the same benchmark. Wow! After about ten minutes of overclocking I was maxing out at 310 FPS in Unigine Heaven and averaging over 150 FPS.

I understand this is due to DirectX 11 vs. OpenGL, but really? So I ask you: am I really just pouring my money down the drain with this 980 in OS X? I don't game, and I realize FPS isn't everything. But the Unigine score is TWICE as high in Windows as in OS X. Honestly, I've been die-hard OS X for 5 years and have jumped through ridiculous hurdles to get a stable hackintosh, but after seeing these numbers I'm wondering whether it's worth it.

So, again, my question is: was buying a 980 a waste of money for use in a hackintosh (for CAD, CAM, photo editing, and gaming)?

Thanks
 
I guess it's up to the end user. Most people here have a dual-boot machine and use it for gaming too. I personally like OS X for the simplicity of the interface and, to some extent, the challenge of getting it to run in the first place. I play few games these days; the one I play most often is World of Warcraft. While I get 100 FPS on my Mac and double that on my Windows boot, I still choose OS X more often than not: partly because I can't see the difference between 100 and 180 FPS, but mostly because Windows installs degrade so fast, whereas Macs stay consistent.
Every Windows install I've used for everyday computing (mail, web, gaming, video, etc.) needs a clean install every 6 months, whereas OS X is just as good a year later. I have a 980 Ti in one rig and a GTX Titan in another. Waste of money? No. But keep in mind that the extra horsepower DX11 or DX12 offers for gaming doesn't translate into the same gains in photo editing or rendering.
 
OS X drivers are not that bad; the problem is "just" the high CPU overhead, which bottlenecks many applications and games, especially on systems with older processors.

A few weeks ago I had a GTX 980 in my Skylake rig for testing, and its performance was almost on par with Windows 10. The difference shrinks at higher graphics settings, since they shift more of the load onto the GPU. OS X will look very poor in a comparison of e.g. Unigine Heaven runs with the "Basic" preset, but it should be fine when going up to 1440p/4K with "Ultra" quality, tessellation, and all that.
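If it helps to see that reasoning in numbers, here's a toy sketch (all figures below are invented purely for illustration, not measurements): the frame rate is limited by whichever of the CPU or the GPU needs more time per frame, so extra driver overhead only shows up when the GPU work is small.

```python
# Toy model: a frame can't complete faster than the slower of the CPU work
# (game logic + driver overhead) and the GPU work. All numbers are made up.
def fps(cpu_ms, gpu_ms):
    """Approximate FPS when per-frame CPU and GPU work overlap."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# "Basic" preset: GPU work is tiny, so driver/CPU overhead dominates.
print(fps(cpu_ms=6.0, gpu_ms=2.0))   # ~167 FPS, capped by CPU overhead
print(fps(cpu_ms=3.0, gpu_ms=2.0))   # ~333 FPS with a leaner driver

# "Ultra" at 1440p/4K: GPU work dwarfs the overhead, so the gap shrinks.
print(fps(cpu_ms=6.0, gpu_ms=20.0))  # ~50 FPS
print(fps(cpu_ms=3.0, gpu_ms=20.0))  # ~50 FPS, same despite less overhead
```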

Have a look here (yes, DirectX would add another few % on top of that, but it's not much):

[Attachments: GTX 980 Unigine Heaven results, Ultra 1440p — OS X vs. Windows 10]
 
Fl0r!an, thanks for your reply. You raised an interesting theory, so I tested it in Windows and OS X.

No overclock, stock speeds. Here is a benchmark in OS X; I had similar results in Windows 7, so your theory is correct. P.S. Look at my signature (updated): my CPU is definitely not bottlenecking my card in gaming or benchmarks.
[Attachment: OS X benchmark screenshot]


So, what's the verdict? DirectX is always going to be better for gaming, hence OS X is not for gaming. It also seems that Unigine is a poor benchmark for comparing the two operating systems.
 
I'm quite sure virtually any available CPU is bottlenecking the GPUs in OS X, especially at low resolutions, thanks to the CPU overhead in the drivers.

You could verify that by overclocking your CPU and observing GPU benchmark results. In heavily CPU-bound benchmarks like Cinebench, the score should increase linearly with CPU clock, and you should see similar behavior in Unigine with the "Basic" preset.
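As a rough sketch of that check (hypothetical clocks and scores below, just to show the arithmetic): compare how much the score rises against how much the clock rises.

```python
# Hypothetical scaling check: in a CPU-bound benchmark the score should rise
# roughly in proportion to the CPU clock. All numbers below are invented.
def scaling_efficiency(clock_before, score_before, clock_after, score_after):
    """~1.0 means the score tracked the clock (CPU-bound);
    near 0 means the clock bump changed nothing (GPU-bound)."""
    clock_gain = clock_after / clock_before - 1.0
    score_gain = score_after / score_before - 1.0
    return score_gain / clock_gain

# Example: overclocking from 4.0 GHz to 4.6 GHz (+15%):
print(scaling_efficiency(4.0, 1000, 4.6, 1140))  # ~0.93 -> clearly CPU-bound
print(scaling_efficiency(4.0, 1000, 4.6, 1020))  # ~0.13 -> mostly GPU-bound
```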
 
Fl0r!an,

I'm still a bit boggled by Windows vs. OS X performance, mainly with respect to FPS. I downloaded ARK: Survival Evolved, and so far it performs a lot better in Windows than in OS X. My hardware monitor is also reporting a low GPU core clock (-500 MHz), which is really frustrating (separate thread for that).
 
Windows will always have more "direct access" to the GPU; it's just the way it's always been.

Metal was supposed to take care of that, but it's not popular yet, partly because the GPUs macOS officially supports are so limited (now mostly sub-par AMD cards).

Adobe CC 2017 apps (Media Encoder and Premiere Pro) have a new "Metal" renderer. In my tests it's slightly faster than OpenCL, but when I switch to CUDA, it trumps them both.

I just hope Metal isn't abandoned... it was supposed to make it easier for devs to hit the GPU directly rather than have layers blocking access to it.

But as Fl0r!an mentioned, it's not that big of a difference.

Also, macOS is not really for gaming, and for production work CUDA is far superior at the moment to Metal or OpenCL.

If you really want to game, I suggest Windows for that. You will always get better drivers and performance.
 
This is precisely why I dual boot macOS and Windows 10. Metal in Adobe CC also tends to be quite buggy. I can't even use it in Premiere CC 2017 because none of the video previews show up.
 
Absolutely, you are 100% correct.

I just stick to Windows for dumb things like gaming and VR on the weekends.

Everything I do that's productive (that makes me money and a living) is under macOS (El Cap for now, since it's stable for me).

I just finally set up the 2x 980 Ti Hybrids, and with a proper fan setup it's a really quiet, small machine that's super speedy for Adobe work.

Thankfully there are zero glitches now that I've disabled the iGPU. I also removed the Intel injector from my Clover config, so the NVIDIA glitches are gone too.

I use CUDA in the Adobe apps and OpenCL if there are any glitches (haven't seen any lately after disabling Intel inject completely).

I also switched to the 17,1 profile with the AGDPfix, and there have been no glitches so far, even after a hardcore session with all the Adobe apps running at the same time.

A happy puppy indeed. For now. :)
 
To bump this thread and pose a question about my current problem: can the CPU bottleneck be overcome with a nice/renice tweak via Terminal?

I am also trying to run ARK: Survival Evolved on a GTX 960 4GB. I get crazy FPS spikes, especially in PvP, and lowering or tweaking settings doesn't help.

I have tried running sudo renice -18 -p PID
And yes, it greatly improves my load times and all that, but why the huge FPS spikes and overall poor gameplay? Is that just OpenGL vs. DirectX?


*Currently downloading Windows 10 to test that, but I would love some informed feedback.*
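For reference, here's roughly what that renice experiment looks like as a small script (a sketch only: the ARK process name below is a guess, so check Activity Monitor for the real one). Keep in mind that raising priority only helps when other processes are competing for CPU time; it doesn't reduce the per-frame driver overhead that earlier posts point to as the more likely culprit.

```python
# Rough Python equivalent of `sudo renice -18 -p <pid>` (run with sudo;
# negative nice values require root). The process name is a guess.
import os
import subprocess

name = "ShooterGame"  # hypothetical name for the ARK process; adjust as needed
pids = subprocess.run(["pgrep", "-f", name],
                      capture_output=True, text=True).stdout.split()

for pid in pids:
    os.setpriority(os.PRIO_PROCESS, int(pid), -18)  # raise scheduling priority
    print(f"reniced {pid} to -18")
```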
 