Hello,
I want to use two different Nvidia cards in my PC, for multibooting, Linux KVM video card passthrough, etc.
I'm on macOS 10.13.6 with a GeForce 1070 in the primary (top) slot and a GeForce 730 in the bottom slot. I installed everything with only the GeForce 1070 and added the GeForce 730 later. I didn't change any system settings; just plugging in the second card and rebooting is enough to break it.
The Nvidia driver is working and active in both cases. The only clue I have is that HWMonitor shows GPU die 2 instead of GPU die 1 when the second card is installed. In the Nvidia panel I can see both cards.
I just need 3D working for gaming. Is there any way to do it? I don't mind disabling the second card completely in macOS.
Would an Nvidia card as primary plus an old ATI card for the other OS work better?
Guide I followed: khronokernel/How-to-disable-your-unsupported-GPU-for-MacOS on GitHub.
Some notes proven through trial and error, kept as simple as possible:
Tested on macOS 10.13.6.
The tool in the screenshot is Clover Configurator.
My path (the magic string) was shorter than the one in the example. Also, don't be confused: to check it in Clover Configurator you need to click the Properties button, otherwise you won't see any fields in the Devices panel.
You need two kexts, but they are already part of a Multibeast / Unibeast install.
gfxutil is just a binary to run; you don't need to install it or move it around as in the tutorial, you just need to run it. And if you want only the GPUs in the output, run it with parameters.
Another confusing thing is whether the values in the sample XML are just placeholders or real values you should use. Use the sample as-is, and replace only one thing: <key>PciRoot(0x0)/Pci(0x1B,0x2)/Pci(0x0,0x0)</key>
That is your value; it is probably unique to your machine.
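For reference, here is roughly the shape of the Properties block from the guide, as a sketch only (the PciRoot key is the example value from above, not yours, and as I recall the guide injects an all-FF class-code so macOS drivers skip the device; double-check against the guide itself):

```xml
<key>Devices</key>
<dict>
    <key>Properties</key>
    <dict>
        <!-- Replace this key with the PciRoot path of YOUR card (from gfxutil) -->
        <key>PciRoot(0x0)/Pci(0x1B,0x2)/Pci(0x0,0x0)</key>
        <dict>
            <!-- Bogus class-code (FF FF FF FF, base64-encoded) so macOS ignores the card -->
            <key>class-code</key>
            <data>/////w==</data>
        </dict>
    </dict>
</dict>
```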
Another confusing part can be identifying which card to disable. The example is: /Users/(YourUsername)/Downloads/(gfxdownload folder)/gfxutil -f GFX0
But if you need to disable the 2nd or 3rd card, you need not GFX0 but GFX1 or GFX2.
Another good way to identify a card is by its hardware ID. gfxutil reports it near the start of the line, and it is the same for all cards of a given model: 01:00.0 10de:1b81 /PCI0@0/PEG0@1/GFX0@0 = PciRoot(0x0)/Pci(0x1,0x0)/Pci(0x0,0x0)
You can see it in System Report too; in Windows in Device Manager - Details - Hardware IDs (or HWiNFO will print it); in Linux you can discover it with: lspci -nn
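To make the line above less cryptic, here is how the two useful pieces can be picked out of a gfxutil-style line with standard tools (the sample line is the one from this post, not from your machine):

```shell
# A gfxutil-style output line (example values from this thread):
line='01:00.0 10de:1b81 /PCI0@0/PEG0@1/GFX0@0 = PciRoot(0x0)/Pci(0x1,0x0)/Pci(0x0,0x0)'

# Field 2 is the vendor:device hardware ID (10de = Nvidia):
echo "$line" | awk '{print $2}'    # prints 10de:1b81

# Everything after "= " is the device path to paste into config.plist:
echo "$line" | sed 's/.*= //'      # prints PciRoot(0x0)/Pci(0x1,0x0)/Pci(0x0,0x0)
```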
Another problematic thing for new users could be where exactly in config.plist to put the magic XML block. I placed it in the Devices block, right after the FakeID block, then saved the file. You can check it is right by opening the file in Clover Configurator and navigating to the same place (switch to the Properties tab); you should see the values there. If not, the config is not right.
You can mount the EFI partition containing the config file with Clover Configurator too; it works for NVMe disks as well.
How to check that all is fine? You can run some 3D app, like the Unigine Heaven benchmark or a game, but you can also look in the Nvidia panel (the G-Sync section): before, I saw the GeForce 1070 and the second GeForce 730; now the second one is gone.
Another check is About This Mac - System Report - Graphics/Displays; there is now only one card listed.
So multibooting with a different GPU per OS is a thing even with macOS. The only real problem seems to be getting macOS to discover the device path.. but there are people who will help if you can't boot at all (that should only be needed for ATI, or you'd have to find a way to discover the magic PCI string without a working macOS and gfxutil). There are some inject flags etc.; I was never good with those..
Now I can start messing with a 3rd GPU for Windows 98 and Linux KVM video card passthrough. The second card was for XP, which runs on Z390 even natively (yeah, we have nice Win-Raid community patches for USB3 / NVMe boot etc.). They run in KVM too; you can select either.