
Is it possible to disable RTX 2080 and use integrated graphics?

Status
Not open for further replies.
Joined
Jul 29, 2016
Messages
96
Motherboard
Jingsha X79 Dual S8
CPU
2x E5-2690 v2
Graphics
RX 480
Mac
0
Classic Mac
0
Mobile Phone
Android
Hi.

I'm well aware there are no web drivers for the RTX 2080 at the moment. I was wondering if I can disable it somehow and run directly off my integrated HD 4600 graphics instead? I don't want to unplug the graphics card every time I need to use Mojave on my hackintosh.
 
This might be possible -- if someone has done this or is doing this now, perhaps they will add their comments to this thread.

Here are some general comments based on my prior experience. This may or may not apply to your case, so please keep that in mind.
  • On my 2008 MacPro 3,1 with multiple PCIe slots and High Sierra, when I installed my Nvidia GTX 970 (without any Nvidia web drivers), macOS simply attached a default, non-accelerated graphics driver. macOS did not crash, it did not complain, it seemed to be okay. I could run a display through the GTX 970. Performance was very laggy, but it fundamentally worked.
  • So I suspect that if you plug an unsupported RTX 2080 into a PCIe slot, but use your HD 4600 internal graphics and connect your display to a video port on the motherboard, Mojave might just report two graphics devices in System Report and attach a default driver to the RTX 2080. You could simply ignore the RTX 2080 in this case and run off the HD 4600.
Again, this is speculation based on a different Mac setup, but it is a data point for you to consider.
 
Yes, it's possible and it works perfectly. I did it for my 1080 Ti, but each time I point people to RehabMan's SSDT solution, they say it's too difficult... So it depends on you, but if you can follow his very well made step-by-step guide, it's doable, and after that your RTX will be invisible to Mojave.
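For anyone finding this later: the gist of RehabMan's approach is an SSDT that powers the discrete card off during ACPI init, before macOS can match a driver to it. A minimal sketch of the idea only, assuming a hypothetical ACPI path `_SB.PCI0.PEG0.PEGP` and an existing `_OFF` method; verify the real path and methods for your board in IORegistryExplorer or your extracted DSDT first, and note that desktop boards without an `_OFF` method generally need the device-id spoofing route from the next reply instead:

```
// SSDT sketch (ASL): call the dGPU's power-off method at ACPI init.
// _SB.PCI0.PEG0.PEGP and the presence of _OFF are assumptions --
// check your own DSDT before using anything like this.
DefinitionBlock ("", "SSDT", 2, "hack", "DGPUOFF", 0)
{
    External (\_SB.PCI0.PEG0.PEGP._OFF, MethodObj)

    Scope (\_SB.PCI0.PEG0.PEGP)
    {
        Method (_INI, 0, NotSerialized)
        {
            // Power the card down before macOS attaches any driver
            \_SB.PCI0.PEG0.PEGP._OFF ()
        }
    }
}
```

Follow the actual guide for the full steps; this is just to show the shape of the SSDT involved.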
 
Renaming the external GPU's DSDT device and/or spoofing an invalid vendor/device-id could do the trick. If it doesn't get matched by any kexts, it'll just be another device attached to the system, right?
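In Clover terms, that spoofing idea maps to the `Devices/FakeID/NVidia` key in config.plist. A sketch only; the value below is an arbitrary placeholder I've made up, not a tested id, so verify against your own setup:

```
<!-- Clover config.plist sketch: spoof the Nvidia card's id so no
     graphics kext matches it. 0xFFFF10DE is an assumed/placeholder
     value, not a known-good one. -->
<key>Devices</key>
<dict>
    <key>FakeID</key>
    <dict>
        <key>NVidia</key>
        <string>0xFFFF10DE</string>
    </dict>
</dict>
```

If no kext matches the spoofed id, the card should indeed just sit there as an unclaimed PCI device.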
 
I think I'm going to buy a cheap GT 710 and use that for Mojave; apparently it works on macOS straight out of the box.
 
I have a workable solution, though I'm not sure if it will work with your BIOS.

I have an ASRock Phantom Gaming ITX. In the BIOS there is an option to select the primary display output; I selected the onboard graphics, which goes out over HDMI to my monitor, while my RTX 2080 Ti connects to the same monitor (an X34P) via DisplayPort. Now Mojave works correctly via the IGPU.

For Windows, you can't use the DP output (at least I couldn't figure it out) when HDMI is also plugged in, so I added a cheap HDMI switcher and I "disable" the HDMI output (by changing the HDMI input source to my Raspberry Pi instead) when I want to use my RTX in Windows. Of course you need to change the input setting on the monitor every time you switch, but it's better than unplugging/replugging cables every time.
 
Hey @rayceyiii, did purchasing that cheap card work? I have the RTX 2080 and I'm in the same boat. I would much rather have it default to the GT. Can you share any steps you used to make it default to the GT instead of the RTX?
 
Hey @rayceyiii, did purchasing that cheap card work? I have the RTX 2080 and I'm in the same boat. I would much rather have it default to the GT. Can you share any steps you used to make it default to the GT instead of the RTX?
Sorry for the late response, mate... lol, I bought an RX 580 instead.
 