
4K UI performance Nvidia/AMD

Is your 4K UI experience smooth, and if so, what GPU do you use? (please specify in a post)

  • Yes! I'm an Nvidia user.

    Votes: 7 46.7%
  • Yes! I'm an AMD user.

    Votes: 2 13.3%
  • No! I'm an Nvidia user.

    Votes: 6 40.0%
  • No! I'm an AMD user.

    Votes: 0 0.0%

  • Total voters
    15
Joined: Oct 31, 2010
Messages: 37
Motherboard: Gigabyte X79-UD3
CPU: Intel Core i7-4930K
Graphics: MSI GTX 770 2GB
Mobile Phone: Android
Hi!

I'd like to start a discussion about 4K UI performance in El Capitan.
I've been using a 4K monitor (60 Hz) paired with a 1080p monitor for a while now, and I'm very disappointed, to say the least. Of course, one can't really be disappointed when using a hackintosh, since hiccups are to be expected, but this seems to be a driver-related problem (maybe even within OS X itself), and it has forced me to switch to Windows, something I'm definitely not comfortable with and which has greatly impacted my productivity.

Here's the thing: I'm using a GTX 770 (2GB) with the Nvidia web drivers to drive both displays. I'm using the first display scaling option, the one that triggers the "may affect performance" warning. Using no scaling at all makes everything way too small, and the next option negates my move to 4K since things get way too big. I have full graphics acceleration, and everything seems to work as it should.

That is, until I open any heavy program. Logic Pro X, Photoshop, Premiere, Final Cut Pro and the like are all unusable: the UI lags terribly, and the framerate within these programs sometimes drops to a feeble 5 fps.
I've of course been Googling around, and the Logic Pro issues seem to be widespread. Even on Apple's own products, like the 5K iMac, the issues are very much present. Apparently this is not a problem isolated to hackintoshes, but Apple screwing things up.

I read about the Metal framework yesterday. In short, this framework developed by Apple promises to resolve sluggish UI performance and increase general GPU performance. I have already read some reports from people on real Macs saying it did resolve the problems to some extent, even within heavy applications.
Unfortunately, there's no clear sign yet of whether, or when, Nvidia will implement this framework in their web drivers, so for now all we can do is wait.
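
Side note: if you want to check whether your driver exposes Metal at all, a minimal Swift sketch like the one below works. It's just a generic availability check, nothing specific to the web drivers.

```swift
import Metal

// Minimal check (OS X 10.11+): ask the system for its default Metal
// device. If this returns nil, the current GPU/driver combination
// exposes no Metal support and apps stay on the OpenGL path.
if let device = MTLCreateSystemDefaultDevice() {
    print("Metal device available: \(device.name)")
} else {
    print("No Metal device - the driver does not expose Metal")
}
```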

Finally, I'd like to hear your opinions and experiences with 4K displays. How is it going for you? Is the UI smooth? Do you use display scaling, and if so, which option? Are the heavy applications smooth? And most importantly: what GPU do you use?

I'm especially wondering how AMD GPUs are doing in this area, since most of the forum posts concerning these issues seem to come from Nvidia users. Then again, Nvidia is the most popular option amongst hackintoshers, if I'm not mistaken.
If these problems are less apparent in camp AMD, I'm definitely willing to swap my GTX 770 for a red one.

So, let me know your thoughts, and maybe you can help me out. I'm on the verge of buying the new MacBook Pro the moment it comes out next month, because I simply can't do any serious work in Windows (its audio environment just doesn't work for me) and I'm dying to get some work done.
 
I wanted to chime in here. I had a 980 Ti that gave me problems with Adobe apps: Illustrator used to crash randomly, and Photoshop had graphical glitches until I turned down GPU acceleration. I switched to the R9 380 and all of that went away. I ran Cinebench and there was a 3-point difference between the two cards, so not much performance loss. Plus, I get 10-bit color output. It was a win-win.
 

There's a MASSIVE performance difference between the 980 Ti and the R9 380-- massive. Like double. (See http://www.videocardbenchmark.net/high_end_gpus.html, or check out some gaming/compute benchmarks.)

It's just that Cinebench is virtually useless, particularly in OS X, for judging relative GPU performance. Switch to something like Heaven, gaming benchmarks, or BruceX (for an FCPX render check) and you'll see what I'm talking about. In OS X, Cinebench ends up weighting your single-core CPU performance instead of GPU performance-- even in the graphics test!
 
Thank you for your reply. The important question, however: are you running that R9 380 with a 4K display?
 
Bad HiDPI UI performance in OS X is a known fact; however, it shouldn't be that bad. Due to the way Apple handles scaling, the internal rendering resolution becomes insanely high when you use non-@2x scaled resolutions.
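
To put rough numbers on that, here's a back-of-the-envelope sketch; the 3840x2160 panel and the "looks like 3008x1692" scaled mode are just assumed example values:

```swift
import Foundation

// Assumed example: a 3840x2160 panel running a "looks like 3008x1692"
// scaled HiDPI mode. OS X renders such modes into a @2x backing store
// and then downsamples that image to the panel's native resolution.
let panel     = (w: 3840, h: 2160)
let looksLike = (w: 3008, h: 1692)

let backing = (w: looksLike.w * 2, h: looksLike.h * 2)   // 6016 x 3384

let renderedPixels = backing.w * backing.h   // ~20.4 million
let nativePixels   = panel.w * panel.h       // ~8.3 million

// The GPU pushes roughly 2.45x the panel's pixel count every frame,
// which is why the scaled modes carry a performance warning.
print(String(format: "%.2fx the native pixel count",
             Double(renderedPixels) / Double(nativePixels)))
```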

Maybe your GPU is somehow stuck in a low power state? Might be worth observing that...
 
I do indeed know about the way Apple handles scaling; that's why they show the warning themselves, after all. But even when I'm not using these scaled resolutions, the heavier applications still lag.
Because I've read about this all over the net, concerning real Macs too, I'm simply afraid there's no solution and that Apple is forgetting about the pro scene, like the rumors say.

However, this power state idea is something I was thinking about too, but I'm not sure how to find out whether that's the case. Do you have any ideas?
 
I think adding the GPUSensors plugin to FakeSMC should give you what you're looking for (GPU core/memory clocks). The number of readings accessible this way varies with the GPU model, but Kepler should be quite well supported.

Alright, thanks. However, I'm kind of wary when it comes to FakeSMC, since the last time I tried to get it to work I killed my setup simply by installing the FakeSMC kext. I'll give it another try, though, since I'm not dependent on my Mac setup for now anyway.
 
OK, so this time the whole HWSensors thing went without a hassle. I've been trying some things, and the GPU clock speed seems to go up and down as it should. For example, in Logic Pro it bumps up to full clock speed and then quickly drops back to about four fifths of that (around 800 MHz), so this looks like normal behavior. Still no explanation for why it lags so much, apart from Apple simply screwing up their applications.
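
In case anyone wants to eyeball this more systematically, here's a quick throwaway Swift sketch. The file name and the one-sample-per-line format are just assumptions, with the idea that you'd copy clock readings out of HWSensors by hand:

```swift
import Foundation

// Hypothetical input file: one GPU core clock sample (MHz) per line,
// collected from HWSensors while a heavy app is running.
let path = "gpu_clocks.txt"
guard let text = try? String(contentsOfFile: path, encoding: .utf8) else {
    fatalError("could not read \(path)")
}
let samples = text.split(separator: "\n").compactMap { Double(String($0)) }

if let lo = samples.min(), let hi = samples.max() {
    print("min \(lo) MHz, max \(hi) MHz over \(samples.count) samples")
    // If the maximum never rises under load, the card is probably
    // stuck in its lowest power state.
    if hi < 500 { print("GPU looks stuck in a low power state") }
}
```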
 
I wasn't aware that Cinebench weights your CPU when you run the GPU graphics test. But I do agree that the 980 Ti crushes the R9 380 when it comes to gaming/compute benchmarks. I ran Heaven and my 980 Ti smashed it.

The one reason I keep the R9 380 (and this is a personal reason) is that I do a lot of graphic design. The 980 Ti does not have 10-bit color output; none of the GeForce cards do, so you need to pick up the Quadro line. So even though I have a nice 10-bit monitor that I spent tons of $$ on, I can't use it to the max with the 980 Ti. :( I would pick up a Quadro, but those are ridiculously priced. So I found a "cheap" alternative.
 