
Radeon RX 4XX/5XX standalone system, AMDRadeonX4250.kext (GVA support H264) does not support HEVC HW

Status
Not open for further replies.
Hi guys,

I tried to go through the thread but I'm still confused.
I successfully installed Catalina 10.15.4 with Clover on my Dell XPS 8500 with an i7-3770 and an MSI Radeon RX 570 8GB.
As I understand it, the i7-3770 has Intel HD 4000 graphics. In the BIOS I set Enable Multi-Display Support to Enabled. That's the only graphics-related option it has. No memory for the iGPU, nothing else.

The system does not recognize the iGPU, and when I render in DaVinci Resolve 16 it does not utilize the Radeon GPU (on Windows it does).
Is there a way to enable hardware acceleration?

Thanks in advance.
 
Hi there @macnb

On my Z390 i9 9900KS iMP build, would it be better to spoof the CPU and/or the mobo?

FWIW, I had the paid version of VideoProc. It doesn't actually use the GPU, it just pretends to. I had a back-and-forth with the company and they admitted that AMD on macOS isn't supported. They gave me a refund after a month-long flurry of emails.
Not sure about spoofing CPU or the need to do that.
There are plenty of folks here with a 9900KS cpu.
You have set your model id to iMacPro1,1.
That system does NOT have an IGPU.

My aging 3770K does have an IGPU, BUT I turn it off in the BIOS and use iMacPro1,1.
That is the only way I can get FCPX (10.4.6) to use the RX580 for encode and decode of H.264 & HEVC.

VideoProc is a strange one. I get it to work even though it says "Graphics N/A":
Screenshot 2020-04-27 at 19.33.20.png

It actually works pretty well, using the RX580 to transcode H.264 to HEVC and HEVC to H.264.

See these:
H264-to-HEVC-VPv36.png
HEVC-toH264-VPv36.png


Regarding your config, it looks like you are NOT using Whatevergreen.kext.
I now use it. It takes care of a lot of issues - the black screen patch, the correct number of video ports, ACPI renaming (GFX0, IGPU, etc.), and GPU power management, just to name a few. If you had an IGPU, it would also take care of connector-less framebuffers, etc.
I can't tell from your config if you have turned off your IGPU, but you may want to try that with the iMacPro1,1 model-id you are using.

You have a great CPU, so you may wish to consider using iMac19,1, which closely matches your system; the IGPU of that CPU is pretty decent for H.264 & HEVC encode/decode too.
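For what it's worth, switching the model identifier in Clover comes down to the ProductName key under SMBIOS in config.plist. A minimal sketch (serial-number and board fields omitted here; generate those with your usual SMBIOS tooling rather than copying anyone else's):

```xml
<key>SMBIOS</key>
<dict>
    <key>ProductName</key>
    <string>iMac19,1</string>
</dict>
```

Swap the string for iMacPro1,1 to try the IGPU-less model instead.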
 
I tried the recommendations in the video and ran some tests. So far, the older version of Final Cut 10.4.6 still renders considerably faster than the new one.

The changes made in 10.4.8 seem to break hardware acceleration, causing H.265 to render painfully slowly and also doubling the render time for H.264.

Here's a basic test:

A 1:30 1080p video was rendered using the 'Apple Devices 1080p' and the 'Apple Device 4K 8bit HEVC' standard presets. Here are the results:

FCPX 10.4.6
1080p render time = 30 seconds
4K 8bit HEVC render time = 28 seconds

FCPX 10.4.8
1080p render time = 1:10
4K 8bit HEVC render time = 4:15
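Those timings work out to a sizeable regression. A quick sketch (plain Python, using only the numbers reported above) to quantify the slowdown:

```python
def to_seconds(t: str) -> int:
    """Convert 'M:SS' or a bare seconds string to seconds."""
    if ":" in t:
        m, s = t.split(":")
        return int(m) * 60 + int(s)
    return int(t)

# Render times reported above: (FCPX 10.4.6, FCPX 10.4.8)
times = {
    "1080p H.264": ("30", "1:10"),
    "4K 8-bit HEVC": ("28", "4:15"),
}

for preset, (old, new) in times.items():
    ratio = to_seconds(new) / to_seconds(old)
    print(f"{preset}: {ratio:.1f}x slower in 10.4.8")
```

That is roughly a 2.3x slowdown for H.264 and over 9x for HEVC, which is consistent with hardware encode falling back to software.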
Great tests.
I too did similar tests and agree with your findings.

FCPX 10.4.8 and Compressor 4.4.6 are a REGRESSION as far as AMD GPUs are concerned (I think the Radeon VII fares better).
I have gone back to 10.4.6.
 
Looks like your first post.

I don't know much about the XPS 8500, but you should be able to enable/disable the IGPU.
Ideally you should be able to make the PCIe / external graphics card (i.e. your AMD GPU) primary.

I disable the IGPU.
I use an SMBIOS model ID of iMacPro1,1.
I use Whatevergreen.kext.

I have not tried DaVinci, but I know it works well with AMD GPUs on macOS. If your GPU is not set up properly, then it may not be able to use it.
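One way to confirm the IGPU is really hidden from macOS is to look at what `system_profiler SPDisplaysDataType -json` reports. A minimal sketch in Python, using a hypothetical sample of that output (the embedded JSON here is illustrative of a setup like the one described, not something captured from your machine):

```python
import json

# Hypothetical `system_profiler SPDisplaysDataType -json` output from a
# system where the IGPU is disabled in the BIOS, so only the AMD card
# is visible to macOS.
sample = json.loads("""
{
  "SPDisplaysDataType": [
    {
      "sppci_model": "Radeon RX 570",
      "spdisplays_vendor": "sppci_vendor_amd"
    }
  ]
}
""")

gpus = [gpu["sppci_model"] for gpu in sample["SPDisplaysDataType"]]
igpu_visible = any("Intel" in name or "HD Graphics" in name for name in gpus)

print("GPUs seen by macOS:", gpus)
print("IGPU visible:", igpu_visible)  # should be False if the BIOS disable took effect
```

If an Intel entry still shows up with the BIOS option off, the ACPI/WEG side of the config is worth a second look.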
 
Thank you. I used your method and everything is fine.
What are those widgets you have on your desktop?
 

You're welcome.
They are apps.
One is Intel Power Gadget (free from Intel).
The other is iStat Pro (paid for). I had to use the FakeSMC v6 kext and its sensor kexts for this, as VirtualSMC does not quite work.
 
I also get "Graphics N/A" when using WEG + Lilu only (AMDFrameBufferVIB is used). But when injecting the Orinoco framebuffer + WEG, VideoProc does display the correct GPU name. I don't see much of a difference in Geekbench 4 scores between these two methods, which are around 14.5k-14.6k. But there is a difference in the GPU power-consumption reading in HWMonitorSMC2, which decreased from 90W to 30W when the Orinoco FB is used.
 
Thx for the info.
I do not see any real difference between the WEG-only and WEG + Inject Orinoco methods. The inject method has a couple of negative side effects: not all the ports work in the order I expect (DP vs HDMI), and you get an extra port that's not functional; I want DP as the first port.
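For reference, the "Inject Orinoco" method being compared here is, in Clover terms, roughly this fragment of config.plist (a sketch; exact keys depend on your Clover version):

```xml
<key>Graphics</key>
<dict>
    <key>Inject</key>
    <dict>
        <key>ATI</key>
        <true/>
    </dict>
    <key>FBName</key>
    <string>Orinoco</string>
</dict>
```

The WEG-only method drops this injection entirely and lets WhateverGreen pick the framebuffer on its own.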
 

Okay, thanks. I'm already using iMP and WEG, etc. I am very pleased with how well FCPX, Compressor, Motion, Resolve, LR6, and CS6 all employ the RX580. The issue was with VideoProc itself. Unless they've updated their software to support macOS and AMD, the problem was/is this:

[email protected] wrote:
Dear customer,

Thanks for contacting us.
I contacted our develop team and the Mac does not support AMD GPU for hardware acceleration at present. And we will add this feature in the near future.
Will you accept a half refund and keep the full version for future use?
Please let us know if we could be of any further assistance.

o_O
[1] I paid for two licenses.
[2] It didn't work as advertised.
[3] They only wanted to refund to me half of what I paid.

Conclusion: VideoProc is fine for showing whether hardware acceleration can work in supported software. It just doesn't use the GPU for its own tasks. Run an export from FCPX to Compressor and one will see that the right setup works well. But not VideoProc.
 

Attachments

  • HW Info.png
  • VideoProc_during decode.png
  • AMD GPU is not employed in VideoProc decoding process.png
I know that the screenshot of mine you have shown does indicate that INTEL is being employed and not the GPU.
BUT in reality it is NOT. There's NO way my 3rd-gen CPU can transcode an HEVC or H264 source that quickly. Besides, look at the IPG tool, iStat, and that AMD monitor app, which show that the CPU is almost IDLE and that the AMD GPU is busy decoding AND encoding. Also, note that it shows "INTEL" in green and NOT the CPU, indicating that it thinks it is using the IGPU.

I think it's the bug they refer to when they say "does not support AMD GPU", by which they mean it does not display that the AMD GPU is actually being utilised. That is, they cannot distinguish which GPU is being utilised.

Have you tried turning OFF your IGPU and testing?
 