
Nvidia Web (alternate) Drivers vs standard GeForce Drivers

Joined
Jan 1, 2017
Messages
5
Motherboard
ASRock Z170 Pro4S
CPU
i7 6700K
Graphics
EVGA GTX 1060
Mac
MacBook Pro
Mobile Phone
iOS
All the new Pascal support excitement got me thinking about something that's bothered me since I went hackintosh.

What is the difference between the standard GeForce drivers available for other OSes and the Web/Alternate drivers we use on macOS? There seems to be a huge performance difference in 3D games between the two.

If this is already explained elsewhere, I'd appreciate it if anybody could point me in the right direction.
 
The gaming performance difference between macOS and Windows comes more from the fact that macOS uses OpenGL while Windows uses DirectX.

That, and the drivers need to be developed with Apple's implementations of OpenGL and other such features in mind (thanks, Metal!). Sadly, it's not as simple as porting the Linux drivers, at least from what I can tell.

I never tested it myself, but I wonder what the performance differences are like between macOS (preferably Sierra) and Linux. I don't have a spare HDD to install Arch on, so I can't check for myself.
 

I came to macOS Sierra (hackintosh) from Ubuntu 16.04. The Ubuntu drivers delivered high frame rates (at least in the old, crappy games I play) but not consistent, smooth play; it just didn't seem stable. macOS feels a lot more stable and even, but the frame rate is much lower. Windows is the control in all this: high, consistent frame rates.

So, what I'm hearing is that the drivers are essentially the "same", and any gaming difference comes not from the drivers themselves but from other factors. I guess compute is in a similar boat?
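On the compute side, here's a quick way to see what API level macOS actually exposes (a rough sketch, assuming the OpenCL framework that ships with the OS; build with cc cl_version.c -framework OpenCL, and note the file name is just an example):

    #include <stdio.h>
    #include <OpenCL/opencl.h>  /* Apple ships OpenCL as a system framework */

    int main(void) {
        cl_platform_id platform;
        cl_device_id device;
        char version[256];

        /* Grab the first platform and the first GPU device on it. */
        if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) return 1;
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS) return 1;

        /* Apple's framework reports "OpenCL 1.2" here, whatever the card supports elsewhere. */
        clGetDeviceInfo(device, CL_DEVICE_VERSION, sizeof version, version, NULL);
        printf("CL_DEVICE_VERSION: %s\n", version);
        return 0;
    }

The cap lives in Apple's framework layer rather than in the Nvidia driver, which would fit the "same drivers, different platform plumbing" theory.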
 

Probably a difference in how things are displayed. From what little I actually know of Linux, there are two graphical display implementations (X and Wayland) with many different variants in use, and they're imperfect. OS X's is more refined, probably thanks to more consistent, polished APIs instead of riding the generic FOSS/OpenGL banner for all it's worth (even if that's all OS X uses at its core...).

Again, I'm taking a bird's-eye view here, so I'm probably wrong or oversimplifying, but it makes sense: the core drivers are 'similar'; it's how they're integrated into each OS that differs.
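If you ever want to check which of the two a given Linux session is running, here's a trivial sketch; it just reads the environment variables most desktop sessions set, so treat it as a heuristic:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        /* Most login managers set XDG_SESSION_TYPE to "x11" or "wayland";
           WAYLAND_DISPLAY is only set inside a Wayland session. */
        const char *type = getenv("XDG_SESSION_TYPE");
        const char *wl   = getenv("WAYLAND_DISPLAY");
        printf("XDG_SESSION_TYPE: %s\n", type ? type : "(unset)");
        printf("WAYLAND_DISPLAY:  %s\n", wl ? wl : "(unset)");
        return 0;
    }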
 
Let's just say even the OpenGL on the Mac is too old (4.1, I believe?).
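You can check the cap for yourself with something like this (a minimal sketch, assuming the macOS OpenGL framework; build with cc gl_version.c -framework OpenGL, and again the file name is just an example):

    #include <stdio.h>
    #include <OpenGL/OpenGL.h>  /* CGL: context creation without a window */
    #include <OpenGL/gl3.h>     /* core-profile entry points */

    int main(void) {
        /* Ask for the newest core profile macOS offers; it tops out at 4.1. */
        CGLPixelFormatAttribute attrs[] = {
            kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_GL4_Core,
            kCGLPFAAccelerated,
            (CGLPixelFormatAttribute)0
        };
        CGLPixelFormatObj pix;
        GLint npix;
        CGLContextObj ctx;

        if (CGLChoosePixelFormat(attrs, &pix, &npix) != kCGLNoError) return 1;
        if (CGLCreateContext(pix, NULL, &ctx) != kCGLNoError) return 1;
        CGLSetCurrentContext(ctx);

        /* On a GTX 1060 with the Web Drivers this prints a 4.1 version string. */
        printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

        CGLDestroyContext(ctx);
        CGLDestroyPixelFormat(pix);
        return 0;
    }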

Sometimes gaming performance can be half or even a third of what DirectX or Metal delivers.

That may seem ridiculous, but it's what I've experienced in StarCraft 2.

I mean, seriously, how does it make any sense for the same card to get fewer FPS at 1080p + Low + OpenGL than at 4K + Ultra + DirectX/Metal?

The Mac has been a weak gaming platform for a reason.
 