
Solved > 2 different PCI-E GPUs (web driver installed): no 3D with two video cards, works fine with one. Is there a solution?

Joined
Oct 10, 2015
Messages
91
Motherboard
Z390 Ultra Gigabyte SLI
CPU
i9-9900K
Graphics
GTX 1070
Mac
  1. Mac mini
Classic Mac
  1. iMac
  2. Power Mac
Mobile Phone
  1. Android
Hello,
I want to use 2 different Nvidia cards in my PC, for multibooting, Linux KVM video card passthrough, etc.

I'm on 10.13.6 with a GeForce 1070 as primary (top slot) and a GeForce 730 (bottom slot). I installed everything with only the GeForce 1070 and added the GeForce 730 later, without changing any system settings. Just plugging in the card and rebooting is enough to break it.

The Nvidia driver is working and active in both cases. The only clue I have is that HWMonitor shows GPU die 2 instead of GPU die 1 when the second card is installed. In the Nvidia panel I can see both cards.

I just need to get 3D working for gaming. Is there any way to do that? I don't mind disabling the second card completely in macOS.

Would an Nvidia primary plus an old ATI card for the other OS work better?
 
OK, I tried adding a Radeon 7750, but that's not working either; I get a freeze at the logo screen.
 
I fixed it. I found a nice GitHub how-to covering different situations.

I disabled the whole PCI-E slot (there is a bit of a mess with the filenames; device.md = option 4):

Some fail-proven notes on it, as simple as possible:
Tested on macOS 10.13.6

The tool in the screenshot is Clover Configurator.

My path (the magic string) was shorter than in the example. Also, don't be confused: to check it in Clover Configurator you need to click the Properties button, otherwise you won't see any fields in the Devices panel.

You need 2 kexts, but they are already part of the Multibeast / Unibeast install.

gfxutil is just a binary to run; you don't need to install it or drag it into Terminal as in the tutorial, you just run it. And if you want only the GPUs in the output, run it with parameters.

Another confusing thing could be whether the values in the sample XML are just placeholders or real values you should use. Use it as-is and replace only one thing:
<key>PciRoot(0x0)/Pci(0x1B,0x2)/Pci(0x0,0x0)</key>
with your own value; it is most likely unique to your machine.

Another confusing part could be how to identify which card to disable. In the example it is:
/Users/(YourUsername)/Downloads/(gfxdownload folder)/gfxutil -f GFX0
but if you need to disable the 2nd or 3rd card, you will need GFX1 or GFX2 instead of GFX0.

Another good way to identify a card is by its hardware ID; gfxutil reports it at the start of the line, and it is unique per card model:
01:00.0 10de:1b81 /PCI0@0/PEG0@1/GFX0@0 = PciRoot(0x0)/Pci(0x1,0x0)/Pci(0x0,0x0)
You can also see it in System Report; in Windows under Device Manager - Details - Hardware ID (or HWiNFO will print it); in Linux you can discover it with:
lspci -nn
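To make that concrete, the vendor:device hardware ID and the PciRoot string can be pulled out of a gfxutil-style line with a small script. This is just a sketch: the line format is taken from the example above and may differ between gfxutil versions.

```python
import re

# One line of gfxutil output, as in the example above:
# "<bus> <vendor:device> <device tree path> = <PciRoot path>"
LINE = "01:00.0 10de:1b81 /PCI0@0/PEG0@1/GFX0@0 = PciRoot(0x0)/Pci(0x1,0x0)/Pci(0x0,0x0)"

def parse_gfxutil_line(line):
    """Return (vendor_id, device_id, pciroot_path) from one gfxutil line."""
    m = re.match(r"\S+\s+([0-9a-f]{4}):([0-9a-f]{4})\s+\S+\s*=\s*(PciRoot\S+)", line)
    if not m:
        raise ValueError("unrecognized line: " + line)
    return m.group(1), m.group(2), m.group(3)

vendor, device, path = parse_gfxutil_line(LINE)
# 10de is Nvidia's PCI vendor ID; 1b81 is the GTX 1070 device ID
print(vendor, device, path)
```

The path it prints is exactly the string you paste as the key in the Properties block.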

Another problematic thing for new users could be where exactly in config.plist to put the magic XML block. I placed the device block after the FakeID block, then saved the file. You can check that it is right with Clover Configurator: navigate to the same place (switch to the Properties tab) and you should see the values; if you don't, the config is not right.
You can also mount the EFI partition holding the config file with Clover Configurator; that works for NVMe disks too.

So it looks like this:
<key>Devices</key>
<dict>
    <key>Audio</key>
    <dict>
        <key>Inject</key>
        <string>3</string>
    </dict>
    <key>FakeID</key>
    <dict>
        <key>ATI</key>
        <string>0x0</string>
        <key>IMEI</key>
        <string>0x0</string>
        <key>IntelGFX</key>
        <string>0x0</string>
        <key>LAN</key>
        <string>0x0</string>
        <key>NVidia</key>
        <string>0x0</string>
        <key>SATA</key>
        <string>0x0</string>
        <key>WIFI</key>
        <string>0x0</string>
        <key>XHCI</key>
        <string>0x0</string>
    </dict>
    <key>Properties</key>
    <dict>
        <key>PciRoot(0x0)/Pci(0x1B,0x2)/Pci(0x0,0x0)</key>
        <dict>
            <key>IOName</key>
            <string>#display</string>
            <key>class-code</key>
            <data>/////w==</data>
            <key>name</key>
            <data>I2Rpc3BsYXk=</data>
        </dict>
    </dict>

    <key>USB</key>
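Those <data> values are nothing magic, just base64-encoded bytes. As a sketch (assuming Python 3 is at hand), you can verify them, or generate the whole Properties fragment with the standard plistlib module after substituting your own PciRoot path:

```python
import base64
import plistlib

# PciRoot path from my machine; replace it with the path gfxutil reports
# for the card you want to disable (it is unique per machine).
PCI_PATH = "PciRoot(0x0)/Pci(0x1B,0x2)/Pci(0x0,0x0)"

# class-code FF FF FF FF plus name "#display" is what hides the device
properties = {
    PCI_PATH: {
        "IOName": "#display",
        "class-code": b"\xff\xff\xff\xff",
        "name": b"#display",
    }
}

# The <data> strings in the config.plist above are just base64 of these bytes:
assert base64.b64encode(b"\xff\xff\xff\xff").decode() == "/////w=="
assert base64.b64encode(b"#display").decode() == "I2Rpc3BsYXk="

# plistlib emits the same <key>/<dict>/<data> XML you would paste into config.plist
print(plistlib.dumps({"Properties": properties}).decode())
```

Pasting hand-typed base64 is error-prone, so generating it like this is a cheap sanity check.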

How to check that all is fine? You can run some 3D app, like the Unigine Heaven benchmark or a game, but the Nvidia panel (GSync section) shows it too: I used to see the GeForce 1070 and the second GeForce 730 there, and now the second one is gone.
Another check is About This Mac - System Report - Graphics/Displays: there is now only one card listed.

So multibooting with multiple GPUs, one set per OS, is a thing even with macOS. The only real problem seems to be getting macOS to boot at all so you can discover the magic PCI string. But there are people here who will help you get it booting (needed for the ATI card only; otherwise you have to find a way to discover the magic PCI string without a working macOS and gfxutil). There are some inject flags etc. for that; I was never good with those.

Now I can start messing with a 3rd GPU for Windows 98 under Linux KVM video card passthrough. The second one was for XP, which even runs natively on Z390 (yeah, we have nice Win-Raid community patches for USB3 / NVMe boot etc.) :) They run in KVM too, so you can choose.
 