
Disable Intel integrated graphics to use NVIDIA discrete graphics instead?

Status
Not open for further replies.
Joined: Nov 24, 2013
Messages: 6
Motherboard: HP Envy 15 TS
CPU: Intel Core i7-4700MQ
Graphics: Intel HD 4600 + NVIDIA GeForce GT 740M
Edit: TL;DR: It's probably possible, but it would require so much work and knowledge of mystical things that it's not gonna happen, not from me at least.

I have Intel HD4600 graphics working.
Nvidia Geforce card is detected.
I understand that Optimus graphics switching will not work in OS X, so while the nvidia card is detected it will not be used and is just a power drain.
I have seen several suggestions in various forums that disabling the integrated graphics may allow the discrete card to be used as the main graphics device. I have not been able to find anyone who could confirm or rule this out.
As for disabling the integrated graphics, my BIOS has no option for it, though I have seen suggestions that this could be done with DSDT edits. Again, no posts confirm or rule this out.
I do a lot of photo and video editing, and needing the more powerful discrete card is the only thing still taking me back to Windows; I would love to be able to use OS X for this work. My laptop is almost always plugged in while I'm using it, so power consumption is of little concern compared to full graphics acceleration.
So before I look into DSDT edits to disable the nvidia card completely: can anyone say for certain whether this will ever work, and does anyone have suggestions on how to edit the DSDT to disable the integrated graphics?
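For what it's worth, the usual shape of such an edit is an SSDT that overrides the integrated GPU's _STA method so the device reports "not present". This is only a hedged sketch: the device path (\_SB.PCI0.IGPU here) is an assumption and varies per DSDT, and if the DSDT already defines _STA for the device, the original would have to be renamed first (e.g. a bootloader patch renaming _STA to XSTA) to avoid a duplicate-method conflict. Whether OS X actually leaves the hardware alone after this is exactly the open question.

```asl
// Sketch: force the integrated GPU to report "not present" via _STA.
// The path \_SB.PCI0.IGPU is an assumption -- check your own DSDT.
DefinitionBlock ("", "SSDT", 2, "hack", "NoIGPU", 0)
{
    External (\_SB.PCI0.IGPU, DeviceObj)

    Scope (\_SB.PCI0.IGPU)
    {
        Method (_STA, 0, NotSerialized)
        {
            Return (Zero)  // 0 = device disabled / not present
        }
    }
}
```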
 
I've not heard of anyone doing it. That does not mean it is not possible. But figuring it out (if possible) will require some programming skills.

For your use, you might consider building a desktop instead.
 
I am a developer so I have programming skills, just not at the OS level or with OS X, but this is what hacking is all about, right? :s

Will the difficulty be in working out the correct DSDT/SSDT edits, or would it also likely involve patching kexts/the kernel?

I'm guessing it's not gonna be a case of just switching off a block of code in the DSDT to disable the integrated graphics? I've seen guides on disabling discrete graphics, maybe the process will be similar?

Is turning off the integrated graphics gonna be enough for the nvidia card to get picked up as the display device? I guess I'd also have Clover inject the nvidia? Then the nvidia would be supported by the vanilla nvidia kexts?

Sorry for all the questions. If I'm gonna take this on it will be the biggest task I've attempted yet, so if it's gonna be a dead end I wanna know what I'm getting myself into.
 
I am a developer so I have programming skills, just not at the OS level or with OS X, but this is what hacking is all about, right? :s

Will the difficulty be in working out the correct DSDT/SSDT edits, or would it also likely involve patching kexts/the kernel?

Hopefully just DSDT/SSDT patches...

I'm guessing it's not gonna be a case of just switching off a block of code in the DSDT to disable the integrated graphics? I've seen guides on disabling discrete graphics, maybe the process will be similar?

It is not that simple. It will involve calling the right methods at initialization to switch the outputs (e.g. HDMI/LVDS ports) to the discrete card (think about it... you have two graphics cards that share the same physical outputs, unlike a desktop with two graphics cards where the outputs are independent). You'll also need to have the nvidia drivers in OS X enabled correctly (probably via DSDT injection).

I think the tricky part will be the fact that both devices must stay active, because with Optimus the nvidia card is used only to construct the graphics frames. The framebuffer exists on the Intel chip and the nvidia output is routed through it. In other words, getting it working correctly will involve both drivers. And this is where the whole thing probably falls apart. There would need to be support code in the Intel driver to allow for this configuration (connecting the nvidia's output to the Intel driver's framebuffer). So here you get into patching the Intel driver (and probably the nvidia drivers), assuming you're familiar, in detail, with how both drivers work at the x86 code level. Without internal knowledge, it's probably not possible.

Macs use a different/incompatible switching mechanism than PCs and the interfaces are not documented by Apple.

I think it would be easier for you to buy a laptop or build a desktop that already has the hardware configuration you'd like to use.
 
It will involve calling the right methods at initialization to switch the outputs (e.g. HDMI/LVDS ports) to the discrete card (think about it... you have two graphics cards that share the same physical outputs, unlike a desktop with two graphics cards where the outputs are independent). You'll also need to have the nvidia drivers in OS X enabled correctly (probably via DSDT injection).

So I don't disable the intel graphics. Instead I use the DSDT to switch the outputs to the nvidia, inject the nvidia, and somehow set the nvidia as the source for the display.

I think the tricky part will be the fact that both devices must stay active, because with Optimus the nvidia card is used only to construct the graphics frames. The framebuffer exists on the Intel chip and the nvidia output is routed through it.

This is what I was afraid of.

So if I understand correctly, even if I could patch the DSDT to switch everything to the nvidia, the output from the nvidia would go nowhere since it needs the intel framebuffer?

So do these nvidia cards have a framebuffer of their own, or are they built to only ever be used with intel graphics chips via Optimus switching?

Does the MBP use a different hardware setup for framebuffer and switching? I ask because if it's not a difference in hardware, then I'm struggling to understand why the OS X switching won't just work with our PC components? :crazy:

Again, sorry for all the questions, but I've been struggling to get a clear understanding of this stuff.
 
So I don't disable the intel graphics. Instead I use DSDT to switch the outputs to nvidia, inject the nvidia and somehow set the nvidia to be the source for the display.

Yes.

This is what I was afraid of.

So if I understand correctly, even if I could patch the DSDT to switch everything to the nvidia, the output from the nvidia would go nowhere since it needs the intel framebuffer?

Yes.

So do these nvidia cards have a framebuffer of their own, or are they built to only ever be used with intel graphics chips via Optimus switching?

I don't think they really do... I think it is set up to share the framebuffer with the Intel chip. Unlike a normal nvidia graphics setup, they do not have connections to the actual display ports (HDMI/LVDS/DP/etc.). They go through the Intel chip for that part. So managing the connections and displays is done on the Intel side, but the image is generated by the nvidia driver. BTW, the same thing is going on with a switched Radeon/Intel setup. So the two drivers have to be aware of this and written for it specifically.

Keep in mind I've never been involved in driver development for such devices, so I may have some details off. But what I write is my (limited) understanding.

Does the MBP use a different hardware setup for framebuffer and switching?

Yes. As I understand it, the switching is handled by some sort of special device called the GMUX, present only on Macs...
 
Ok, so even without any dynamic switching, only ever wanting to use the discrete graphics for rendering is going to take some serious work. I guess if you did enough to get that far, you wouldn't be far off creating a dynamic switching setup similar to what Linux has with Bumblebee etc.

At least I can go ahead and disable my nvidia card to save power and know I haven't let it go without a fight :wave:
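For anyone following along, the disable-the-nvidia part is the well-trodden direction: an SSDT that calls the discrete card's existing ACPI power-off method when the device initializes. A hedged sketch, assuming the card sits at \_SB.PCI0.PEG0.PEGP and its DSDT provides an _OFF method — both are assumptions that vary per machine, so check your own DSDT:

```asl
// Sketch: power down the unused Optimus card at ACPI init by calling
// its own _OFF method. Path and method name are assumptions.
DefinitionBlock ("", "SSDT", 2, "hack", "dGPUOff", 0)
{
    External (\_SB.PCI0.PEG0.PEGP, DeviceObj)
    External (\_SB.PCI0.PEG0.PEGP._OFF, MethodObj)

    Scope (\_SB.PCI0.PEG0.PEGP)
    {
        Method (_INI, 0, NotSerialized)  // runs when the device is initialized
        {
            \_SB.PCI0.PEG0.PEGP._OFF ()
        }
    }
}
```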

Given that this isn't gonna be resolved by Apple using Optimus or by PCs shipping Nvidia cards with a GMUX, do you think there will ever be a hackintosh version of the Linux dynamic switching drivers? I did see one post on another forum, from 2011 I think, where someone was trying to get people involved in a project like that, but it seemed to die out pretty quickly. Would Apple ever make their internal specs available, or would someone ever reverse engineer them? It seems sad to accept that the vast majority of laptops going forward are gonna miss such a big feature.
 
...
Given that this isn't gonna be resolved by Apple using Optimus or by PCs shipping Nvidia cards with a GMUX, do you think there will ever be a hackintosh version of the Linux dynamic switching drivers? I did see one post on another forum, from 2011 I think, where someone was trying to get people involved in a project like that, but it seemed to die out pretty quickly. Would Apple ever make their internal specs available, or would someone ever reverse engineer them? It seems sad to accept that the vast majority of laptops going forward are gonna miss such a big feature.

It is probably so much work that, for anyone qualified to do it, the time it would take is worth far more than a MacBook Pro.

Most people simply do not need powerful graphics cards, especially given how far Intel has come with integrated graphics.
 
It is probably so much work that, for anyone qualified to do it, the time it would take is worth far more than a MacBook Pro.

Yeah but that's no fun :cool:

Most people simply do not need powerful graphics cards, especially given how far Intel has come with integrated graphics.

This is true; to be fair, it's handled all the Photoshop work I've needed over the past couple of months. I haven't tried any video editing or 3D yet.

Thanks for all the help man! :thumbup:
 
...
Now I can focus on my next task: disable the nvidia, then move on to trying the new method to patch mobile HD 4600 for Yosemite from gygabyte666's Envy page.

Please read the rules: http://www.tonymacx86.com/faq.php

"Prerelease Software or Developer Builds are covered by NDA and therefore there cannot be any discussion of features included or removed within said software. The NDA also prohibits the distribution of any component contained within these releases. There will be no discussion of installing, help with installing or documenting the use of this type of software. Repeated violations of this rule will result in a permanent ban."
 