
GTX 660 glitch issue with macOS High Sierra 10.13

Solution: Underclock your card to the stock clock speeds of the Nvidia-branded (reference) 660.

Facing the same issues, also running a factory-overclocked 660. How would you underclock it? Not too keen on BIOS-editing my card, considering the horrendous prices nowadays... (Thank you, crypto miners :banghead:).

I can also confirm this problem existed in earlier versions of macOS, although it did show up VERY rarely compared to High Sierra.
 
and the answer is........

Bios editing :p
(I'm not aware of any way to underclock via software on the Mac, sorry.)

(Do not fear vBIOS editing. The whole thing takes about 10 minutes from start to finish; it probably took me longer to type this:
1. If you have Windows installed on the same PC, just download Kepler BIOS Tweaker and nvflash. If you don't, use a VM or a real Windows PC and create a FreeDOS USB stick, then download Kepler BIOS Tweaker and the DOS version of nvflash.
2. Save your original BIOS with GPU-Z (or any similar tool) and open it in Kepler BIOS Tweaker.
3. Open a second instance of Kepler BIOS Tweaker, go to the TechPowerUp database and download the original (reference) 660 vBIOS, then open it in that second instance.
4. Copy every single number you see there, from every single page, into the first instance that holds your own BIOS, then save the result as something like vbios.rom.
5. If you have Windows installed you can just run nvflash from there, but I always suggest copying all of your files (nvflash and the ROM) to the FreeDOS USB, booting from it and running nvflash vbios.rom.
If you still have problems doing it, ASK. Don't mess up your graphics card, it's not easy.)
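In case it helps while following the steps above: a small Python sketch (the file names are just examples, adjust them to wherever GPU-Z and Kepler BIOS Tweaker saved your files) that records the size and SHA-256 of both the original backup and the edited ROM before you flash, so you can always tell later which file is which.

```python
# Pre-flash sanity check: fingerprint the backup and the edited ROM.
# File names below are hypothetical examples, not fixed paths.
import hashlib
from pathlib import Path

def describe(path: str) -> None:
    data = Path(path).read_bytes()
    print(f"{path}: {len(data)} bytes, sha256={hashlib.sha256(data).hexdigest()}")

describe("original_backup.rom")  # dump saved with GPU-Z
describe("vbios.rom")            # edited ROM you intend to flash with nvflash
```

Keep that output together with the backup itself; if a flash ever goes wrong, you at least know exactly which file to flash back.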
 
Oh well. No risk, no fun.

What tools did you use and how did you clock it (to what frequencies)? Trying to minimize risks. :lol:
 
Alright, everything went smoothly. Thanks for the detailed instructions!

Testing on macOS right now.

I chose to only correct memory speed first, since I noticed overclocking it by even a small bit would cause crashes and glitches in games. My hope is to keep the increased GPU clocks while eliminating the glitches.

Fingers crossed.:beachball:

EDIT: That didn't work out so well. Glitching again - looks less pronounced though. Now to try lowering the GPU clocks.

BTW:
if someone knows why Logic scans only half of my AUs, please report back
I've read that some people seem to have fixed this by copying auvaltool from an older (working) macOS version. I've never tried it, but I recently looked into this because some of my AUs wouldn't load - that turned out to be a user error in my case, though (I fiddled with Waves plugins but didn't rerun the Waves AU Reg Utility).
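For anyone chasing the same AU problem: the system-level view of what's installed (independent of Logic's own scan) comes from the command-line auval tool. A minimal Python sketch, assuming the stock macOS auval is on the PATH (auval -a lists every registered Audio Unit); the output also contains a few header lines, so the count is only rough.

```python
# List every Audio Unit the system reports, independent of Logic's scan.
# Assumes the stock macOS command-line tool "auval" is available.
import subprocess

result = subprocess.run(["auval", "-a"], capture_output=True, text=True)
lines = [ln for ln in result.stdout.splitlines() if ln.strip()]
for ln in lines:
    print(ln)
print(f"\nauval -a printed {len(lines)} non-empty lines (includes a few header lines)")
```

Comparing that list against what Logic's plugin manager shows should at least tell you whether the missing half is a scanning problem or a registration problem.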
 
Yeah, I tried that too, looking for the golden frequency that is above stock but not overclocked enough to glitch everything. It's not easy, though, and the performance gain is so tiny that I quickly got bored and just clocked back to Nvidia's stock.

Update: After being totally glitch-free for 3-4 hours, it started glitching again. I'm now running with the Nvidia web driver enabled from Ozmosis and getting slightly better performance. No glitches or artifacts just yet. I really have to test this out more, but the difference is tremendous; before the BIOS update I really could not use my Mac, web driver or not.
If anyone knows anything about Mac vBIOSes, please help us out here. I know there is a GTX 680 Mac vBIOS, so I would like to create one for the 660. If anyone has any info, please share.
Update 2: Glitched again. I stayed on the Nvidia web driver and restarted, and found out that it starts to glitch when I use OpenGL (mostly VSTs or AUs in Logic).
Update 3: Finally found it, I think. You have to go all out and use the reference clock numbers. Earlier I had used Nvidia's 660 stock clocks; now I just flashed a BIOS with the numbers from the specs on Nvidia's website: Base 980, Boost 1033, Memory 3004. These are the only numbers I found working. Also check your tables. OpenGL works great right now; I checked VSTs, AUs, plus Cinebench, and saw no artifacts. Completely vanilla install with no kexts, Lilu or anything else. iGPU completely disabled, Ozmosis, Nvidia web driver installed and activated, and CUDA installed.
Update 4: Tried ten times with Cinebench, glitched again on the tenth run. Flashed a new BIOS; TDP, base and boost are all entry #0, 980. I'm starting to think the glitches come from the boost table, which I think is now disabled(?). Again, if someone knows more on the subject, I stand to be corrected.
 
Have been running Base 915, Boost 980, Mem 3004 (values from the reference model BIOS) - glitching after nearly 2 hours of Safari usage, although very lightly, and it didn't affect other programs the way it previously did. On the factory-overclocked numbers it would glitch after a couple of minutes, everywhere, so there seems to be an improvement. Could be luck, though. I haven't tried the reference clocks from the website yet.

It makes sense to me that the clock speeds play a role here, since these glitches appear mostly while running apps that rely on the graphics card. Safari and the macOS UI itself do, since the Metal API is used heavily there. That's most likely the reason why older versions didn't have these severe issues.

Maybe it's only when boost clocks come into play that these glitches show up, as they mostly happened to me when, for example, there was an animated transition to fullscreen or similar, resulting in a very brief period of boosted clocks. That would also explain why lowering the clocks, including the boost speed, makes these glitches less severe and less frequent. I could try to reduce the boost speed to the standard clocks to investigate this further.

On the other hand, if your clock speeds from the Nvidia website really work out better in the long run, this "boost theory" wouldn't make any sense, as they're much higher than mine currently. :crazy: I also don't understand why an OS would expect a specific clock speed. Someone on the Apple forums complained about a similar issue; however, he is using an MBP with an onboard GTX 650 - a totally different card, a standard setup and certainly with the right clocks. Confusing.

EDIT:
Just noticed your update 4. I see you suspect likewise. Maybe we're on to something... Time will tell.

EDIT2:
Did some research.

Setting everything to Entry #1, #2 or #3 with equal values will not disable boost; it won't even limit it. Disabling the Boost Entry just leads to the Base Entry value being used as the Boost Entry value. The Boost Entry seems to be the base boost clock: from there the card looks at the Boost Table and always chooses the highest possible entry that won't exceed the thermal design limits (some of which can be adjusted in the Power Table). Boost itself cannot be disabled completely, according to an Nvidia employee.

Maybe setting the Boost Limit to the Base Clock, or slightly higher, would do the trick. However, I'm still in the process of getting an overview of how all these settings correlate and whether we can change them freely without bricking the card. This tool is made for over-, not underclocking, after all; chances are no one ever tested this before. For example, what would happen if the Boost Limit is set lower than the Base Clock? Is that even a valid configuration, even though Kepler BIOS Tweaker allows it? Which leads me to the question of whether there is any validation of this at all, apart from the Boost Table, which has a right-click option for that.
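To make the "highest allowed entry" idea above concrete, here is a toy Python model of how I understand the boost selection (an assumption based on the description above, not the actual firmware logic): the card starts at the base boost clock and walks the Boost Table, picking the highest entry whose estimated power still fits the Power Table limit. It also shows why setting the clock entries equal doesn't cap anything.

```python
# Toy model of Kepler boost selection as described above (an assumption,
# not the real firmware): pick the highest Boost Table clock whose
# estimated power draw stays within the power/thermal limit.

def effective_clock(base_boost_mhz: float, boost_table: list[float],
                    power_limit_w: float, watts_per_mhz: float) -> float:
    candidates = [c for c in boost_table if c >= base_boost_mhz]
    allowed = [c for c in candidates if c * watts_per_mhz <= power_limit_w]
    # If nothing in the table fits the limit, fall back to the base boost clock.
    return max(allowed, default=base_boost_mhz)

boost_table = [980, 1006, 1033, 1059, 1085, 1111]  # made-up example entries
print(effective_clock(980, boost_table, power_limit_w=140, watts_per_mhz=0.13))
# -> 1059 here: equal base/boost entries alone don't stop the table climb.
```

If this model is roughly right, lowering the Boost Limit (or the Power Table targets) should matter far more than equalizing the clock entries.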
 
OK, just disappointing. Although the glitches are now much less frequent and manageable, they are still there... I have definite ways to trigger them: you can work two hours straight without a single glitch, but just by opening a certain app everything goes bananas. I managed to disable boost just by choosing Disable in the dropdown, then wrote my own table so the voltage and the clock stay stable all the time. Don't bother, it does no more than just lowering the base clock. I'm now thinking about hardware decoding and OpenGL; having the iGPU disabled probably renders these unusable, so I'm trying to set up again with the iGPU enabled and recognized. I'm not sure whether AppleGraphicsPowerManagement.kext should be loading or not, but it certainly does not for me, not even after injecting the 660's device ID into the kext. I'm lost after this; I'll have to do more research for sure.
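Before editing anything inside AppleGraphicsPowerManagement.kext, it may be worth just looking at what Nvidia entries it already carries for the SMBIOS you inject. A read-only Python sketch using the standard plistlib module; the kext path is the usual macOS location, and rather than assuming the exact plist layout it simply searches for keys containing Nvidia's vendor ID.

```python
# Read-only: list keys in AGPM's Info.plist that mention Nvidia's vendor ID
# (10de), to see which machine/GPU combos ship with power-management profiles.
import plistlib

PLIST = ("/System/Library/Extensions/AppleGraphicsPowerManagement.kext"
         "/Contents/Info.plist")

def find_nvidia_keys(node, path=""):
    if isinstance(node, dict):
        for key, value in node.items():
            here = f"{path}/{key}"
            if "10de" in key.lower():   # Nvidia PCI vendor ID
                print(here)
            find_nvidia_keys(value, here)
    elif isinstance(node, list):
        for i, item in enumerate(node):
            find_nvidia_keys(item, f"{path}[{i}]")

with open(PLIST, "rb") as f:
    find_nvidia_keys(plistlib.load(f))
```

If the board-id you inject has no Nvidia entry at all, that alone could explain why the kext never attaches, regardless of device-ID injection.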

Update:
Scratch everything.

I restored my original vBIOS just to reduce the variables (I will actually keep the underclocked one, it seems far more stable). I then finally managed to have both the iGPU and the 660 recognized by macOS. At that point I had Lilu, NvdaFixup, Shiki and I don't know what else modified or installed. With the iGPU recognized, but my screens still connected to the GTX, I have yet to see an artifact. After this, to make things even clearer, I began fresh: reinstalled 10.13.3, reset the BIOS, made a new DSDT, installed the Nvidia web driver and booted. The result is the same. I'm not using Lilu or any other kexts right now. I tried every single way to make it glitch again, but it will not. It seems good this time. I'm hoping for the best; only time will tell ;)

Update 2:
Everything good for now. I haven't changed anything yet, just ran a lot of apps and the well-known ways to trigger the glitching, but still nothing. Seems really good; not the smallest glitch yet.

Update 3:
It's been a day and no glitches at all. I'm really happy about this, finally. If you want to test, just try with your iGPU enabled, without any custom kexts, Lilu or anything else. If you want detailed instructions, just ask.

Final Update:
I settled on the setup that works best for me. No artifacts or strange glitches whatsoever. I'm running a vBIOS with boost enabled, clocked at Nvidia stock speeds, but that should not matter for the artifacts after all; it just seems more stable, both in benchmarks and in general OS stability. The trick is to have macOS recognize the iGPU; this is the only setting that made an actual difference. The moment macOS recognizes the iGPU, the artifacts are gone ;).
For this to work you have to enable the iGPU in the BIOS and set it as your first (primary) graphics card. This means no screen output while booting; if you have set your Macintosh HD as the default boot device, the image will come up once the Mac boots.
You are, however, able to run your two displays connected to different graphics devices, and this is what I'm actually doing right now.
The iGPU is connected to the first screen and the GTX 660 to the second. I can see the BIOS, the Ozmosis bootloader, the -v output and the macOS boot on the screen connected to the iGPU, and when macOS is up both displays are active and working. :) This is a perfect solution for me and works great.
As for the rest, I'm not using any kexts at all except for FakeSMC, the Nvidia web driver and Ethernet; WiFi is native. No Lilu, NvdaFixup or anything else. I'm using Ozmosis with a custom-made DSDT and injecting the iMac 27" Late 2013 SMBIOS. i7-3770K, iGPU recognized as HD 4000 with 1536 MB.
Results are great here. :) It really was worth the time for a perfect Mac. Now I'm going to investigate why Logic still isn't scanning half of my plugins :p.
Have fun
 
Nice games everyone.........
So, if I understand lameturtle correctly... it's just enabling the iGPU?

I'm running an older X58 board with no internal graphics, so I'm still in no-options land?

It's interesting that you have noticed glitching on previous systems... if I have seen it, I guess I haven't attributed it to the graphics card...

FCPX is the one that always kills it for me... and iTunes... so I'm guessing CUDA and OpenGL.
I can't even remember why I got this specific card; it's never been supported in After Effects, for instance, which is one of my main apps...

Just for fun I contacted Nvidia support... they said not supported, wait for an update hahahaha

Ever onwards, eh?
 
X58 is interesting... Yes, just the iGPU does the trick.
What I'm wondering now, though, is whether this whole situation is SMBIOS-dependent. Try to find a Mac Pro SMBIOS without internal graphics and use that; I'm not sure it would do it, but it's a good start. I think this whole situation exists because macOS expects to see the iGPU... What if it did not expect to see it :p
 