
[SOLVED] Nvidia GeForce GTX 1080/1070

Yep, I'm doing that right now.
I have a Palit GTX 1070.

OS X uses the integrated graphics (Intel HD 4600); Windows and Linux use the GTX 1070.
How exactly would you do that? Disable the GPU through BIOS or what?
 
So would it be possible to somehow make the OS X partition run on integrated graphics while other partitions like Windows or Linux use the Pascal cards?

Yes, it is possible.

I'm running an i7-6700K Skylake + Nvidia GeForce 1080 on an Asus Maximus VIII Hero motherboard. In the BIOS, I set the IGPU (integrated GPU) as the default GPU and set CPU Graphics Multi-Monitor to Enabled, all under Advanced tab > System Agent Configuration > Graphics Configuration. My 1440p monitor has one DisplayPort input and one HDMI input. DisplayPort is used by the GeForce 1080 in Windows, while I use the HDMI port on the motherboard for OS X (El Capitan). I use Inject Intel to make OS X use the integrated GPU. About This Mac > System Report > Graphics/Displays shows both GPUs, though the Nvidia entry doesn't have all the information.

I boot into Clover over HDMI; when I want Windows 10, I switch my monitor to DP (via its on-screen menu) and launch Windows at the same time. Windows detects both GPUs, but I've set it to use only the Nvidia card. It's not optimal, but it'll work until Nvidia releases new drivers.

To make OS X output 1440p to my monitor over HDMI, I applied the mac-pixel-clock-patch-V2.
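
For anyone unsure where "Inject Intel" lives: it's the Graphics > Inject > Intel boolean in Clover's config.plist. A minimal sketch of setting it from Terminal, assuming your EFI partition mounts at /Volumes/EFI and the config sits at the usual EFI/CLOVER/config.plist path (adjust both for your system):

Code:
# Mount the EFI partition (the disk identifier here is an assumption; check yours with: diskutil list)
sudo diskutil mount disk0s1
# Set Graphics > Inject > Intel to true in the Clover config
sudo /usr/libexec/PlistBuddy -c "Set :Graphics:Inject:Intel true" /Volumes/EFI/EFI/CLOVER/config.plist
# If the key doesn't exist yet, Set fails; create it instead with:
# sudo /usr/libexec/PlistBuddy -c "Add :Graphics:Inject:Intel bool true" /Volumes/EFI/EFI/CLOVER/config.plist
# Confirm the value
/usr/libexec/PlistBuddy -c "Print :Graphics:Inject:Intel" /Volumes/EFI/EFI/CLOVER/config.plist

You can of course make the same change by opening config.plist in any plist editor; this is just the command-line route.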
 
So if I were going to do this, I would possibly need two HDMI cables. Would one HDMI cable be attached to the motherboard (for OS X) and one to the GPU (for Windows and Linux)?
 
I'm not sure, sorry. I don't know whether it would work if both GPUs sent their signals via HDMI to the same monitor. Maybe you'd need to actively disable one in the BIOS for the other to work (and switch back when you want the other OS). I'm a bit surprised it works for me using DisplayPort and HDMI :)
 
Yes, as I understand it. Or you can switch the cable (from the iGPU to the dedicated GPU, and vice versa) every time you switch the OS.

Consider this: eGPUs are finally coming via Thunderbolt 3.
Razer released their "Core" external graphics dock.
That's a new target group besides Hackintosh users and old Mac Pro users.
Hope you feel better now.
If you build this adapter yourself, you'll need about €200-300 (without the GPU), while ready-to-use ones cost at least double. I don't think spending €200 or more on just an adapter is a good choice!
 
Where could I get this type of adapter, and what would it be called?

Thanks,
 
You should look for a "Thunderbolt to PCIe adapter". The cheapest one I know of is the Akitio Thunder2 PCIe box, but you need an external PSU and a 16x PCIe riser (because most GPUs don't fit in the Akitio case), and some people suggest a PC case to put everything in!
Online you'll find a lot of information about these eGPUs (videos, etc.), but remember that not all GPUs are supported in OS X.
 
I just wanted to piggyback off the setup above and mention that it is also possible to have essentially the same arrangement, but instead of using the IGPU, you can use a second dedicated GPU for OS X.

I had been using a GTX 770 for my Hackintosh/Windows build, and I decided to upgrade to a 1080 for the Windows side. The only difference from the setup above is that I set the BIOS to default to the PCIe slot that holds the 770 so that OS X picks it up. I also applied the mac-pixel-clock-patch-V2 so that I could output 1440p via HDMI. On the Windows side, I basically just disabled the 770's outputs and had Windows output only via the 1080 over DisplayPort.

My monitor also auto-detects the active input when waking from sleep, so I don't even need to adjust anything when I boot; each OS outputs one signal via one of the cables, and my monitor switches to the active input on wake.
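
If you want to double-check which GPU OS X is actually driving the display with in a setup like this, system_profiler (built into OS X) will list every GPU it sees; the section that contains a Displays entry is the one with an active output:

Code:
# Full graphics/display report, same data as System Report > Graphics/Displays
system_profiler SPDisplaysDataType
# Or just the GPU names
system_profiler SPDisplaysDataType | grep "Chipset Model"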
 
How did you make it work? I have the IGPU set to default and my GTX 1070 set to secondary. When I run Windows 10 with that config, it never uses the 1070; the output stays black.
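
One thing worth ruling out before blaming drivers: whether Windows even enumerates the 1070 when the IGPU is primary. A quick check from a Windows command prompt (wmic ships with Windows 10; the exact output format varies by system):

Code:
rem List every video adapter Windows has enumerated, with its status
wmic path win32_videocontroller get name,status
rem If the 1070 doesn't show up at all, it's a BIOS/enumeration problem, not a driver one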
 