
FCPX: dGPU-only running better than iGPU+dGPU?

Motherboard
Gigabyte Z490 Vision G
CPU
i9-10900K
Graphics
HD 630/RX 5700 XT
Can someone help me understand what is going on here?

I ran the BruceX test in Final Cut Pro X.

When I ran the test with my GTX 1060 6GB, with the primary display set to PCIE and Internal Graphics set to Auto (hardware encoding showing NO, and the Intel HD 530 not appearing in Hardware Information), I got a score of 31 seconds.

When I ran it with the primary display set to PCIE and Internal Graphics Enabled (hardware encoding YES, and the Intel HD 530 appearing in Hardware Information), I got a score of 90 seconds.

When I ran it with the primary display set to IGFX and Internal Graphics set to Auto (hardware encoding YES, and the Intel HD 530 appearing in Hardware Information), I got a score of 65 seconds.

Can someone help me understand why my score is worse when iGPU and dGPU run together (with hardware encoding working)? Shouldn't I be experiencing the opposite?

I would really like my iGPU to show up in Hardware Information, because I want native power management to work.
 
Please post your SMBIOS.

FCPX uses the iGPU when it's running on an iMac SMBIOS.

On the iMac Pro/Mac Pro SMBIOSes there is no iGPU, so FCPX does something under the hood to use the dGPU hardware to decode/encode formats like H264.

If you are using an iMac SMBIOS like 18,3, go into the BIOS and enable the iGPU (increase iGPU memory to 64MB or so), but set your dGPU as primary. When you boot into macOS, look at System Information and you should see both the Intel GPU and your GTX 1060.
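
If you'd rather verify from the terminal than dig through System Information, a quick check like this should list both GPUs (just a sketch; the exact chipset strings depend on your hardware):

Code:
# List every GPU macOS currently sees; the Intel HD 530 and the
# GTX 1060 should both appear if the iGPU is injected properly.
system_profiler SPDisplaysDataType | grep "Chipset Model"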

When you do this, FCPX will use the iGPU to encode formats like H264 and to accelerate the portions of FCPX that are optimized for Intel HD graphics.

Also, on another note: FCPX is optimized for AMD, not NVIDIA. I think the latest version of FCPX is moving completely to Metal and dumping OpenCL support, and NVIDIA is NOT refining its drivers for Metal.

On another Hackintosh I have FCPX running with a 6700K and an RX 560, and it works well. The iGPU is seen and it's pretty quick in the BruceX test.
 
Please post your SMBIOS.

FCPX uses the iGPU when it's running on an iMac SMBIOS.

I have iMac17,1.

On the iMac Pro/Mac Pro SMBIOSes there is no iGPU, so FCPX does something under the hood to use the dGPU hardware to decode/encode formats like H264.

If you are using an iMac SMBIOS like 18,3, go into the BIOS and enable the iGPU (increase iGPU memory to 64MB or so), but set your dGPU as primary. When you boot into macOS, look at System Information and you should see both the Intel GPU and your GTX 1060.

I was doing exactly this in the BIOS. I had the primary display set to PCIE and Internal Graphics Enabled. My platform-id was 0x19120000, which gave me the Intel 530 and the GTX 1060 in Graphics/Displays under Hardware Info. I also had hardware encoding working.

But my FCPX video output was all garbled and distorted, along with the long render times.

When you do this, FCPX will use the iGPU to encode formats like H264 and to accelerate the portions of FCPX that are optimized for Intel HD graphics.

I changed my platform-id to 0x19120001 with the same BIOS settings. I no longer had the Intel 530 showing up in Hardware Info, but the FCPX render times and video output were fixed. I still had hardware encoding working.
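
In case it helps anyone reproduce this on a Clover-style setup (where ig-platform-id typically lives under Graphics > ig-platform-id in config.plist), PlistBuddy can make the change from the terminal. The EFI mount point below is an assumption; adjust it for your own system:

Code:
# Switch to the connector-less 0x19120001 framebuffer
# (Clover layout and mount point are assumptions here)
sudo /usr/libexec/PlistBuddy -c "Set :Graphics:ig-platform-id 0x19120001" /Volumes/EFI/EFI/CLOVER/config.plist

# Confirm the value took
/usr/libexec/PlistBuddy -c "Print :Graphics:ig-platform-id" /Volumes/EFI/EFI/CLOVER/config.plist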

However, I lost IGPU@2 > AGPM, one of the signs that native power management is working correctly. Any idea if there is a workaround? I don't want to lose native power management if I can help it.
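
For reference, the check I'm doing is just grepping IORegistry for an AGPM entry (rough sketch; node names can vary between setups):

Code:
# Search IORegistry for AppleGraphicsPowerManagement; with 0x19120000
# an AGPM entry shows up under IGPU@2, with 0x19120001 it's gone.
ioreg -l -w0 | grep -i agpm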

Also, on another note: FCPX is optimized for AMD, not NVIDIA. I think the latest version of FCPX is moving completely to Metal and dumping OpenCL support, and NVIDIA is NOT refining its drivers for Metal.

On another Hackintosh I have FCPX running with a 6700K and an RX 560, and it works well. The iGPU is seen and it's pretty quick in the BruceX test.

Can you elaborate on FCPX dumping OpenCL? Does this mean a future Final Cut Pro (11) won't work with NVIDIA anymore while X will? Or will FCPX updates lose OpenCL support? Or is it more that I'll be stuck on High Sierra instead of upgrading to Mojave (if NVIDIA drivers are ever released)?

I will definitely look into moving to AMD if I can sell my GTX, but I also don't want to lose the better gaming GPU (the GTX 1060 is better for gaming than the RX 580).
 