
<< Solved >> AMD WX4170 dGPU on ZBook G5 17 Laptop

Okay, will test!


Your brightness control working?
Yes.
And do you have H264,
Yes.
HEVC hardware acceleration?
Yes, but via the IGPU, since you can't force it onto the dGPU unless you use an iMac Pro SMBIOS and/or shikigva. (I find Shiki slows down performance across the board significantly.)

I've done many tests and have concluded that the Hybrid setup is the best as far as performance, battery, flexibility, etc...
There's a lot of info about my tests and conclusions in this thread https://www.tonymacx86.com/threads/hp-zbook-g5-17.266012/post-2158182

For example, watching Netflix or light web use on battery:
Discrete mode: 32-35 W
Hybrid mode: 20-26 W

Final Cut render:
Discrete: 43 sec
Hybrid: 18 sec

Cinebench R15:
Discrete: 110 fps
Hybrid: 96 fps

So having both GPUs like a real MacBook Pro has its advantages, and a cost. But I prefer the extra hour-plus of battery offered by having the IGPU handle all the small tasks and leaving the heavy lifting to the dGPU only for certain things.
 
The latest config for the WX4150, ZBook G3 15.
Discrete GPU only.

What is not working:
-Brightness of the LED display without AMD inject.
-Brightness range issue: at about half of the brightness slider, the panel jumps back to an earlier brightness level.
-Hardware acceleration (no H264 and no H265).
-GPU power control: after A/C is unplugged, the GPU drops its frequency to a power-save mode, but when A/C is reconnected the frequencies stay low (214-280 MHz). I tried different framebuffer power states plus polariszbook_wx4170.kext.zip, but most of the time, once the power supply is disconnected, the laptop freezes or the GPU powers off hard and simply disconnects itself. Salado and Baladi are great framebuffers with really good power control, but neither of them has any power-state info in its plist. (A property-injection sketch follows this list.)
-Hotplug pins sometimes create mirroring issues (the same image on other screens) when quickly connecting/disconnecting all 4 ports.
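One direction sometimes tried for this kind of power issue is injecting PowerPlay override properties on the dGPU through Clover's Devices → Properties. A minimal sketch, assuming the dGPU sits at PciRoot(0x0)/Pci(0x1,0x0)/Pci(0x0,0x0) (replace with your own path from IOReg/gfxutil) and assuming the PP_DisablePowerContainment key, which comes from community guides rather than from this post, is actually honoured by the framebuffer in use:

<key>Devices</key>
<dict>
    <key>Properties</key>
    <dict>
        <!-- assumed dGPU path; take the real one from IOReg / gfxutil -->
        <key>PciRoot(0x0)/Pci(0x1,0x0)/Pci(0x0,0x0)</key>
        <dict>
            <!-- community-documented PowerPlay override; 32-bit value 1 -->
            <key>PP_DisablePowerContainment</key>
            <data>AQAAAA==</data>
        </dict>
    </dict>
</dict>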
 

Attachments

  • config+newhotplug.plist
    16.8 KB
-Brightness of the LED display without AMD inject.
WhateverGreen fixes that.
-Brightness range issue: at about half of the brightness slider, the panel jumps back to an earlier brightness level.
Needs a patch: add a PNLF device in ACPI, then look in IOReg under your GPU (IGPU in my case, but GFX0 in yours)
[screenshot]

and match that name in HEX

[screenshots]

In my IGPU config it's F14Txxxx and in my Discrete config it's F19Txxxx (and the patch changes PciRoot to match GFX0).

[screenshot]
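For reference, in a Clover config.plist that kind of hex name match is just a Find/Replace pair: it can live under ACPI → DSDT → Patches (if the name is in a DSDT/SSDT) or under KernelAndKextPatches → KextsToPatch (if it sits inside a kext), and both use the same shape. A minimal sketch of the ACPI form; the data values below are only the ASCII strings "F14T" and "F19T" base64-encoded, so the real bytes must come from your own IOReg as shown above:

<key>ACPI</key>
<dict>
    <key>DSDT</key>
    <dict>
        <key>Patches</key>
        <array>
            <dict>
                <key>Comment</key>
                <string>Example only: rename panel id F14T to F19T (use the hex from your own IOReg)</string>
                <key>Find</key>
                <data>RjE0VA==</data>
                <key>Replace</key>
                <data>RjE5VA==</data>
            </dict>
        </array>
    </dict>
</dict>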


-Hardware acceleration (no H264 and no H265).
Needs shikigva, or an edit of S/L/PrivateFrameworks/AppleGVA, as explained in the ZBook 17 G5 thread.
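For the shikigva route: shikigva is a WhateverGreen boot argument, set in Clover under Boot → Arguments (merge it into whatever arguments you already have). shikigva=80 is the value most often quoted for AMD decoding/DRM setups, but the right flags depend on your SMBIOS and macOS version, so treat the value below as a starting point rather than a recommendation:

<key>Boot</key>
<dict>
    <key>Arguments</key>
    <string>shikigva=80</string>
</dict>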
-GPU power control: after A/C is unplugged, the GPU drops its frequency to a power-save mode, but when A/C is reconnected the frequencies stay low (214-280 MHz). I tried different framebuffer power states plus polariszbook_wx4170.kext.zip, but most of the time, once the power supply is disconnected, the laptop freezes or the GPU powers off hard and simply disconnects itself. Salado and Baladi are great framebuffers with really good power control, but neither of them has any power-state info in its plist.
I'm not sure I understand this, but when I unplug my laptop everything is pretty much the same except that the GPU slows down to about half performance, and when I reconnect AC everything resumes as before.
I tested with an external screen and another USB-C device that drew a lot of power, and they both disconnected on battery only, so maybe that's what you mean?
Does the main screen stop working?
-Hotplug pins sometimes create mirroring issues (the same image on other screens) when quickly connecting/disconnecting all 4 ports.
No idea, doesn't happen here.
 
Can you share your DSDT patches and config?


The key difference is that I'm using the discrete GPU only, and my GPU is not supported by the laptop. I had a lot of hardware problems, which I solved by designing some PCBs. I even tried to mod the BIOS ROM, but didn't succeed in getting past the RSA protection...
 
Can you share your DSDT patches and config?

There's a (slightly older, but it should work) copy of my EFI in one of the last posts on the other thread.
I'm making a GitHub guide but haven't had time to finish it.

A while back, I used your vROM flashing trick to get my ZBook G2 (K2000) going, and it worked perfectly; then I upgraded the MXM card to a GTX860 and everything was good.
Then I got a new laptop (the G5), and I tried the WX-4170 in my ZBook G2, and I seem to remember it worked OK.
The idea was to use the GTX860 from my G2 in my G5 and the WX-4170 in the G2.
But the problem was the G5 didn't like the GTX860 at all: the fans kept going nuts, outputs were wrong, no sleep, overheating, etc... I suspect the architectures were just too different.

I still say the G2 was just so "universal" and easy to hack (it took two days to get it perfect). But HP keeps making it harder. The G5 has been months in the making, but I believe it's finally as good as it can get.
 
But the problem was the G5 didn't like the GTX860 at all, the fans kept going nuts
There is a trick to control the fan speed, but you will always get POST lights with a "GPU failure" message and a warning to replace the fans.

The thermal probe (with thermal paste) sits next to the die on the GPU PCB. The thermal readings are exactly the same as in HWSensors.
Overheating
You should use an NVIDIA Kepler heatsink.
 

Attachments

  • 0E8D59E2-5FB7-4504-B1F5-A69B8DC42473.png (1.5 MB)
  • 59DCEA30-C35A-4399-8D31-D1A68FC73A19.png (1.6 MB)
Definitely more than what I wanted to do to make it work (also, the outputs were wrong).

And then there was the whole problem of Coffee Lake not really working well with High Sierra, while the web drivers only work in High Sierra.

So in the end, the 4170 works.
 
You pushed me toward an interesting idea.

What if the chipset cannot find the right GPU SSDT tables because of the unknown GPU, and injects the wrong SSDT?

So here is what I have done:
Using F4 in Clover, I dumped the OEM tables under different configurations: Intel+AMD, AMD only, Intel+AMD+NVIDIA eGPU, AMD+NVIDIA eGPU.

Note: the laptop has two SSDT configs. ATIGFX / NVIDIAGF are for whichever GPU is primary, and AMDSGTBL / NVSGTBL are for the secondary (internal) GPU.

"SGTBL" is probably a suffix/abbreviation for something like "secondary graphics table".

The next step was to extract the GPU SSDTs from the G4 ROM, since all the SSDTs and the DSDT are AML files inside the ROM. Ten years ago I modded the Asus P5QL-PRO BIOS tables with hackintosh patches and it worked fully like a real Mac, but after ten years I have forgotten everything about ACPI patching. So, back on topic.

The G4's GPU SSDT tables are really different. I will find out later what the difference is.

Back to the beginning:
In the image I show the different hardware configs and which tables the chipset injects for each (I hope that Clover is not the one selecting the SSDT tables, otherwise it will be really hard to tell the bootloader how to select the proper SSDTs).

[image: hardware configs and the SSDT tables injected for each]


So when the eGPU is connected, it won't work, because:
- With Intel in Hybrid mode, AMD is secondary (AMDSGTBL) and the eGPU is also secondary (NVSGTBL), but it should be NVIDIAGF.
- With Discrete only + eGPU, both GPUs are primary, but the screens are black because both of them are GFX0@0, so the OS recognizes them as one GPU. The SSDT configs are ATIGFX and NVIDIAGF.


So to make the eGPU work, I need to force-inject NVIDIAGF and patch (I don't know how or where) the PCI or ACPI path to make it GFX0@1 or GFX1@0.
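One way to experiment with that in Clover would be to drop the OEM SSDT that carries the unwanted table by its OEM Table ID, then supply a patched copy (for example one edited to use NVIDIAGF and a GFX0@1-style path) in EFI/CLOVER/ACPI/patched, which Clover loads automatically. A minimal sketch, using the Table IDs named above; the patched SSDT itself still has to be written by hand, and whether dropping the table is enough to bring the eGPU up is exactly what is being tested here:

<key>ACPI</key>
<dict>
    <key>DropTables</key>
    <array>
        <dict>
            <key>Signature</key>
            <string>SSDT</string>
            <key>TableId</key>
            <string>NVSGTBL</string>
        </dict>
    </array>
</dict>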



Update 1: after various steps, the eGPU lets the OS load, but the last problem is that the NVIDIA card doesn't get any image. Intel is driving the NVIDIA ports, because the switchable-graphics DSDT is applied and I can't force re-inject it or replace it with a standalone DSDT.
 

I've wondered myself if I could inject some properties from different SSDT configs (discrete/hybrid/udma) to try to get Catalina to load.

Is your Thunderbolt tree working now? Do you get hotplug?
I believe there are two modes of Thunderbolt operation: as a PCI "extension" or as a proper Thunderbolt device.

On my G2, the Thunderbolt 2 devices had to be plugged in at startup and were recognized as PCI cards (ejecting them caused a crash), but that was a different controller.

Another option for you would be to flash your TB3 controller chip with Apple's firmware; then your eGPU would bypass all the DSDT stuff and just be controlled by Apple's TB3 drivers.
 