
GTX 660 glitch issue with MacOS High Sierra 10.13

Status
Not open for further replies.
I have both a 780 and a 660 in my computer.
The 780 alone works great with High Sierra and the NVIDIA drivers! :thumbup:

When both cards are installed and the 780 is plugged into my monitor, the computer slows down a lot and I lose transparency on some UI elements. When I connect the monitor to the 660 instead (both cards still installed), I get the glitches but no slowdowns... :crazy:

(I use both cards for CUDA 3D rendering; it would be so great to be able to use them both with macOS :cry:)

That's great to hear, Peter. Thank you!

I use Premiere Pro and After Effects mostly, and the 650 has been a workhorse for me.
I NEED at least 3 monitors (2 DVI and 1 HDMI) and occasionally a 4th with HDMI.
There are FEW cards available with 2 DVI and 2 HDMI ports that work simultaneously.
Slim choices, and they're expensive!

I'm considering an EVGA GeForce GTX 780 Dual FTW (used at $325): 2304 CUDA cores, 3 GB GDDR5 on a 384-bit bus, 1033 MHz GPU core. It supports a max of 4 monitors: 2 DVI, HDMI, and DisplayPort.
That's kick-ass even by today's standards.
 
Sounds like you're kinda in the same boat as I am. I use my Mac only for Final Cut Pro.

My 650 Ti Boost only has glitches when using Final Cut. I am beginning to notice artifacts in the transcoded outputs as well.

It's a shame really. The 650 is a great card which has really been amazing.

But I'd rather go back to Sierra than buy another card.

YES, I agree! I just think this is going to become a bigger problem as time goes on.
I bought a used EVGA GeForce GTX 970 Superclocked off eBay this morning :-/ $270! That's a good deal, unfortunately!

EVGA is one of the FEW brands that provide 2 DVIs and an HDMI. Most cards have just 1 DVI and 1 HDMI, then a few DisplayPorts, which makes things a pain in the ass!

EVGA GeForce GTX 980 SC
GPU Clock: 1380 MHz, Memory Clock: 7010 MHz Effective, CUDA Cores: 2048
Memory Bit Width: 256 Bit, 0.28ns, Memory Bandwidth: 224.3 GB/s
Max Monitors Supported: 4, 4096x2160
500 Watt or greater power supply.

EVGA GeForce GTX 970 SC
GPU Clock: 1317 MHz, Memory Clock: 7010 MHz Effective, CUDA Cores: 1664
Memory Bit Width: 256 Bit, 0.28ns, Memory Bandwidth: 224.3 GB/s
Max Monitors Supported: 4, 4096x2160
500 Watt or greater power supply.

EVGA GeForce GTX 780 SC
GPU Clock: 1020 MHz, Memory Clock: 6008 MHz Effective, CUDA Cores: 2304
Memory Bit Width: 384 Bit, 0.33ns, Memory Bandwidth: 288.38 GB/s
Max Monitors Supported: 4, 4096x2160
600 Watt or greater power supply with a minimum of 42 Amp on the +12 volt rail.
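The bandwidth figures in these spec sheets follow directly from the effective memory clock and bus width (bandwidth = effective clock × bus width ÷ 8). A small sanity-check sketch, using only the numbers quoted above:

```python
# Sanity-check the memory-bandwidth figures from the spec sheets above:
# bandwidth (GB/s) = effective memory clock (MHz) * bus width (bits) / 8 / 1000
def memory_bandwidth_gbs(effective_clock_mhz, bus_width_bits):
    """Peak theoretical memory bandwidth in GB/s."""
    return effective_clock_mhz * (bus_width_bits / 8) / 1000

# GTX 980 SC / 970 SC: 7010 MHz effective on a 256-bit bus
print(memory_bandwidth_gbs(7010, 256))  # → 224.32 (matches the 224.3 GB/s above)

# GTX 780 SC: 6008 MHz effective on a 384-bit bus
print(memory_bandwidth_gbs(6008, 384))  # → 288.384 (matches the 288.38 GB/s above)
```

Both results line up with the quoted 224.3 GB/s and 288.38 GB/s, so the listed specs are internally consistent.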
 
Intelgraphicsfix?
Is that a kext? 'Cause I can't see anything in Clover to check off
 
I just wanted to say that I read the majority of this thread and wasn't very hopeful this would work... But I found the 'Intel multi display' setting in the BIOS, enabled it, and BOOM, I booted up to the same glitched-out windows... BUT WAIT, I clicked a window and it seemed to re-render without glitches. I clicked all of the different apps that auto-loaded from the previous reboot and the glitches disappeared for each. Ten minutes later, still not a single artifact. I used the latest version of Clover and the essential kexts, but no other changes to the config. Dell XPS 8700 with a GTX 660.

Edit: Scratch that, they're back. Will try other stuff and report back if successful.
 
I held my breath
 
I try to check in here every month or so, and I suppose I am stuck with Sierra and my GTX 660 until graphics card prices come down. It is insane to think that the inflated Bitcoin market is keeping GPU prices so high.
 
I'm really not sure why this problem became so big in this version of macOS, but I have to point out that it has existed in every version I've tried, from 10.10 until today. On previous versions the problem only appeared inside apps (people doing rendering and live rendering, like Twinmotion, probably know what I'm talking about), but now the whole interface renders incorrectly? Crazy. Anyway, when I realized that back on 10.10, I had a friend with a real Mac, a stock GTX 660, and the same program installed, so I sent him the same file. Everything was perfect, no glitches, nothing.

Result: all of you having this problem have factory-overclocked (maybe failing?) cards (mine is an MSI Twin Frozr III).
Solution: underclock your cards to the clock speeds of a stock NVIDIA reference GTX 660.

I've been running for more than three hours without a glitch now. Default macOS driver, Z77X-UD5H, Ozmosis BIOS, no Clover or extra kexts installed. I will switch to the NVIDIA driver later and report back, but I think the case is closed.
(Btw, this can happen on Windows too.)

If anyone needs help doing this, ask and I may take the time to write the process up too :p

I'm just happy to have a glitch-free screen again.
Btw, if anyone knows why Logic scans only half of my AUs, please report back :p High Sierra messed me up.
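For readers wondering what "underclock to stock" actually means in numbers: NVIDIA's reference GTX 660 runs at 980 MHz base / 1033 MHz boost with 6008 MHz effective memory, while factory-OC cards like the Twin Frozr ship above that. The macOS web driver offers no clock control, so the change is typically made under Windows (e.g. with a vendor tuning tool) or by flashing a modified vBIOS. A minimal sketch for spotting how far a card sits above reference (the example card's clocks are illustrative values, not measured from the poster's card):

```python
# Reference clocks for a stock NVIDIA GTX 660 (NVIDIA's published specs).
GTX660_REFERENCE = {"base_mhz": 980, "boost_mhz": 1033, "mem_effective_mhz": 6008}

def factory_overclock_delta(card_clocks, reference=GTX660_REFERENCE):
    """Return how far each clock sits above reference (0 if at or below stock).

    card_clocks: dict with the same keys as `reference`, with values read
    from a tool like GPU-Z under Windows, or from the card's vBIOS.
    """
    return {key: max(0, card_clocks[key] - ref_mhz)
            for key, ref_mhz in reference.items()}

# Example: a factory-OC 660 with illustrative (hypothetical) clocks
oc_card = {"base_mhz": 1033, "boost_mhz": 1098, "mem_effective_mhz": 6008}
print(factory_overclock_delta(oc_card))
# → {'base_mhz': 53, 'boost_mhz': 65, 'mem_effective_mhz': 0}
```

Any non-zero delta is the amount to dial back to match the stock card that rendered glitch-free in the poster's test.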
 