
GTX 660 glitch issue with macOS High Sierra 10.13

All right... BUT High Sierra 10.13.5/6 and Mojave (10.14.0/1) from Apple introduced something that broke the native internal macOS driver for NVIDIA 6xx and 7xx cards. I can't say whether the cause is Metal or a GPU bug, but you can try 1000 times and see it with your own eyes: those graphics cards still work natively, but you have to live with a big graphics glitch. Graphics software, and above all NLE (video editing) software, is unusable!
Try Final Cut Pro X (any release): put any video clip you like on the timeline, go over to the effects window, and start playing with any FX you like. In a short time you will see the glitch appear, first in your preview, and a little later in the menus and your whole FCPX window. You must quit the program and restart the system if you want to clear this bug from memory.

Solutions? None... only Apple devs can fix this, but NVIDIA cards are no longer important to them.
And believe me, this also happens on real Macs with NVIDIA cards! Yes, the last two updates of High Sierra and Mojave broke the NVIDIA internal driver (kexts).

Of course, with High Sierra the alternative is to use the Web Driver, which works fine on 6xx and 7xx too. But with Mojave? For now the only solution is to BUY an RX 5xx video card.
I got the RX 580 8 GB, and no more graphics bugs.
I could wait for NVIDIA to write a web driver and CUDA for Mojave... maybe (or surely) never.
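For anyone who wants to check which NVIDIA driver is actually loaded (native vs. Web Driver), here is a rough Python sketch. The kext bundle names in the comments are assumptions from memory, so verify them against your own system:

    import subprocess

    def loaded_nvidia_kexts():
        # kextstat lists every loaded kernel extension; keep the NVIDIA ones.
        out = subprocess.run(["kextstat"], capture_output=True, text=True).stdout
        return [line for line in out.splitlines()
                if "NVDA" in line or "GeForce" in line]

    for line in loaded_nvidia_kexts():
        # Native driver bundles look like com.apple.nvidia.* / com.apple.GeForce,
        # Web Driver bundles like com.nvidia.web.* (assumption).
        print(line)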
 

Attachments

  • Schermata 2018-10-20 alle 13.21.44.png
    1.1 MB · Views: 205
Do the Web Drivers really work in High Sierra? Because here they never worked... always glitches... and I am stuck on Sierra.
 
In native mode, up to High Sierra 10.13.4 it works well... the bug seems to start with 10.13.5, and of course it's there with the latest 10.13.6.
I never tried it, but I think it may work with the WEB DRIVER...
The alternative is to go back to 10.13.4 and work in native NVIDIA mode!
 
macOS Mojave 10.14.2. Everything is the same.
 
So, I am convinced that there is no solution to this. Is there any trick that would make it disappear for a while, so that I could work, even if it will appear again?

I tried these things:

  • Set the Intel card as the default in the BIOS
  • Attached monitors to both the Intel and NVIDIA cards: this actually makes the situation worse; even the monitors attached to the Intel card have glitches
  • Set the Intel graphics card's memory size in the BIOS
  • Enabled Inject Nvidia in Clover (rough sketch of the config edit below)
  • Set a Vendor ID in Clover (making things even worse for Intel)

So, if there is a trick to make the glitches disappear once they show up, even for a short time, I'd appreciate it if you shared it with me.
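For reference, here is roughly how I made the Clover edits from the list above, as a Python sketch using plistlib. The config path and the commented-out FakeID value are assumptions (the ID shown is a hypothetical example), so check them against your own setup before writing anything back:

    import plistlib

    CONFIG = "/Volumes/EFI/EFI/CLOVER/config.plist"  # assumed EFI mount point

    with open(CONFIG, "rb") as f:
        config = plistlib.load(f)

    # "Inject Nvidia" lives under Graphics -> Inject in Clover's config.
    graphics = config.setdefault("Graphics", {})
    graphics.setdefault("Inject", {})["Nvidia"] = True
    # The vendor/device ID override goes under Graphics -> FakeID;
    # the value below is a hypothetical example, not a recommendation.
    # graphics["FakeID"] = {"NVidia": "0x11C010DE"}

    with open(CONFIG, "wb") as f:
        plistlib.dump(config, f)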
 

Did you not read through all 65 pages of this thread? There were some "tricks" listed, like messing with the settings in the Displays preference pane, but it only "fixes" it for a very short time. I think it was about a minute when I tried it last. The only real solution is not using the card and using something else. It will likely never be fixed and there will likely never be another solution. Continuing to use the affected cards will only result in torturing yourself.
 
Did you not read through all 65 pages of this thread?

Of course I did; that's why I tried everything I mentioned above. But I'm asking something different: whether there's a trick to fix it for a short time, where "short" >= half an hour.

And exactly because there are 66 pages, I might have missed something.

(fun fact: this page is in my front-page bookmarks, just in case...)
 
But I'm asking something different: whether there's a trick to fix it for a short time, where "short" >= half an hour.

The only thing that temporarily fixed it for me was rotating the screen or changing the resolution. Even then, I don't think I got 30 minutes out of it before it started acting up again. I can't remember if it even fully got rid of the artifacts, or if it just reduced them for a few minutes.
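If you want to script that trick instead of clicking through System Preferences, a minimal Python sketch with the pyobjc Quartz bindings (an assumption: pip install pyobjc-framework-Quartz) would look like this; it just switches to another resolution and back, the same thing the Displays pane does by hand:

    import time
    import Quartz

    display = Quartz.CGMainDisplayID()
    current = Quartz.CGDisplayCopyDisplayMode(display)
    modes = Quartz.CGDisplayCopyAllDisplayModes(display, None)

    # Pick any mode with a different width, switch to it briefly,
    # then restore the original mode.
    other = next((m for m in modes
                  if Quartz.CGDisplayModeGetWidth(m)
                  != Quartz.CGDisplayModeGetWidth(current)), None)
    if other is not None:
        Quartz.CGDisplaySetDisplayMode(display, other, None)
        time.sleep(2)  # give the driver a moment to reset its state
        Quartz.CGDisplaySetDisplayMode(display, current, None)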
 
... The only real solution is not using the card and using something else. It will likely never be fixed and there will likely never be another solution. Continuing to use the affected cards will only result in torturing yourself.
That was my idea as well, to change the card, even if it sounds a bit stupid. :) But if there is no other solution, then we have no choice left, do we?
The question is, what kind of used graphics card could (or should) work, with the least hacking? My 660 Ti always worked without any hacking before... through the years...
And I accept the fact that things are not forever. However, I have no idea what card I should buy to make it work well. I am not a gamer, so I do not need a better card than this one, or maybe only a little better.
Anyway, if someone has any suggestions, please do not hesitate to share them :D
Thank you.
 

I use a GTX 760 in my X99 build and it works great from Yosemite through High Sierra. Before I changed to an RX 580 in my main system, I also used a GTX 760 there. It worked great on Sierra and High Sierra.

So if you are looking for a used card to replace the GTX 660 Ti, I recommend a GTX 760. It is natively supported from Yosemite through High Sierra, at least.
 