OK, just disappointing. Although the glitches are now far less frequent and more manageable, they are still there... I have definitive ways to trigger them: you could work two hours straight without a single glitch, but just opening a certain app makes everything go bananas. I managed to disable boost just by choosing Disable in the dropdown, then wrote my own table so the voltage and clock stay stable all the time. Don't bother; it does no more than just lowering the base clock. I'm now thinking about decoding and OpenGL: having the iGPU disabled probably renders these unusable. I'm trying to set up again with the iGPU enabled and recognized. I'm not sure whether AppleGraphicsPowerManagement.kext should be loading or not, but it certainly does not for me, not even after injecting the 660's device ID into the kext. I'm lost after this; I'll have to do more research for sure.
Update:
Scratch everything.
I restored my original vBIOS just to reduce variables (I will actually keep the underclocked one, it seems far more stable). I then finally managed to have both the iGPU and the 660 recognized by OS X. At that point I had Lilu, nvdafixup, Shiki, and I don't know what else modified or installed. With the iGPU recognized, but my screens still connected to the GTX, I have yet to see an artifact. After this, to make things even clearer, I began fresh: reinstalled 13.3, reset the BIOS, made a new DSDT, installed the NVIDIA web driver, and booted. The result is the same. I'm not using Lilu or any other kexts right now. So I tried every single way to make it glitch again, but it will not. It seems good this time. I'm hoping for the best; only time will tell.
Update2:
Everything is good for now. I haven't changed anything yet. I just ran a lot of apps and tried the well-known ways to trigger glitching, but still nothing. Seems really good, not the smallest glitch yet.
Update3:
It's been a day, no glitches at all. I'm really happy about this, finally. If you want to test, just try with your iGPU enabled and without any custom kexts, Lilu, or anything else. If you want detailed instructions, just ask.
Final Update:
I settled on the setup that works best for me. No artifacts or strange glitches whatsoever. I'm running a vBIOS with boost enabled, at stock NVIDIA clocks. That shouldn't matter for the artifacts, though; it just seems more stable, both in benchmarking and in general OS stability. The trick is to have OS X recognize the iGPU; this is the only setting that made an actual difference. The moment OS X recognizes the iGPU, the artifacts are gone.
For this to work you have to enable the iGPU in the BIOS and set it as your first graphics card. This means no screen output while booting; if you have set your Macintosh HD as the default boot device, the image will come up once the Mac boots.
You are, however, able to run your two displays connected to different graphics cards, and this is what I'm actually doing right now: first screen connected to the iGPU, second to the GTX 660. I can see the BIOS, the Ozmosis bootloader, and the -v output during OS X boot on the screen connected to the iGPU, and when OS X boots up, both screens are active and working.
This is a perfect solution for me and works great.
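If you want to confirm that OS X actually sees both GPUs after these BIOS changes, you can check the graphics report from the terminal. The snippet below is a minimal sketch: the sample report text is hypothetical, and on a real machine you would capture the output of `system_profiler SPDisplaysDataType` instead.

```shell
# On a real machine, capture the live report instead of this sample:
#   gpus=$(system_profiler SPDisplaysDataType)
gpus='Intel HD Graphics 4000:
  VRAM (Dynamic, Max): 1536 MB
NVIDIA GeForce GTX 660:
  VRAM (Total): 2048 MB'

# Both adapters must show up for the iGPU fix described above to apply.
if echo "$gpus" | grep -q "Intel HD Graphics"; then
  echo "iGPU recognized"
else
  echo "iGPU missing - recheck BIOS settings"
fi
```

If the Intel adapter is missing from the report, the DSDT/BIOS side still needs work.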
As for the rest, I'm not using any kexts at all except for FakeSMC, the NVIDIA web driver, and Ethernet; WiFi is native. No Lilu, nvdafixup, or anything else. I'm using Ozmosis with a custom-made DSDT and injecting the iMac 27-inch Late 2013 profile. 3770K, with the iGPU recognized as HD 4000 with 1536 MB.
Results are great here.
It was really worth the time for a perfect Mac. Now I'm going to investigate why Logic still isn't scanning half of my plugins.
Have fun