[Solved] Asus Radeon HD 5450 Silent and Mountain Lion w/ Dual Monitors?

After quite a long time I finally figured out how to get my HD5450 to work using the kexts from the thread below.

http://www.osx86.net/view/3043-ati_radeon_hd5450_with_full_qe-ci_support.html

I was able to get this working by adding 0x68E11002 to the IOPCIMatch sections of both AMDRadeonAccelerator.kext and ATI5000Controller.kext. I added it to every section of ATI5000Controller.kext, even though I'm pretty sure it only needed to go under the Cedar section. This was a big piece of info that seemed to be lacking in many of the threads I read: the IOPCIMatch values show up multiple times in the file, which I was unaware of.
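
To be specific, the edit goes in Contents/Info.plist inside each kext: find the IOPCIMatch key under the relevant personality and append your device/vendor pair to the space-separated string. Roughly like this (the stock ID shown here is from memory, so treat it as a sketch; your file may list different ones):

Code:
<key>IOPCIMatch</key>
<string>0x68F91002 0x68E11002</string>

The value is just a space-separated list, so the new ID gets tacked onto whatever is already there.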

Things are also a bit hit and miss as far as everything working goes. The animations seem smooth and the menu bar is translucent, but Chess still crashes when I load it.

Now the problem I have is that only one monitor is functioning. The DVI-connected monitor works fine, but the VGA one doesn't. I've read a couple of different sources and wasn't able to find anything definitive on the topic, especially since usable ML install instructions still seem pretty sparse. After reading a few different articles about Lion troubleshooting, I tried changing the AtiConfig to Shrike, Hoolock, Orangutan, Baboon and Vervet, all with no result.

With my Lion setup I was able to get QE/CI using this same card, but with GraphicsEnabler=No and no AtiConfig set. Is there something I'm missing here?
 
Usually, VGA doesn't work on Mac systems. That's all I can say on the matter.

That's so odd, knowing that it works with Lion. I read somewhere that dual monitors won't work with GraphicsEnabler=Yes, which does seem to be true, as my Lion install will only work with it set to No.
 
Hey, Modsuperstar. Any luck with this yet?

I have two of these Asus Radeon HD 5450 Silent cards (a 1GB and a 512MB); both have a Device ID of 0x68F91002.
I see that your Dev ID (0x68E11002) is different from mine; is yours possibly a 2GB version?

Dare I say it, mine are currently installed in old AMD based systems running 10.6.8 (but with Lion Kexts).
If I remember correctly, I had to cross over to the Dark Side and manually edit the 'Eulemur' framebuffer personality for each card to get full functionality on all three ports (i.e. HDMI, DVI and VGA/D-Sub). More importantly, the edits were different for each card, albeit only slightly, even though they share the same Device ID. But that slight difference made a BIG difference between everything working properly or not.

As it stands right now, it sounds like your system is defaulting to the basic 'RadeonFramebuffer' personality, which will give you most functionality but will probably crash on Chess and when you run DVD Player. Strangely enough, I remember RadeonFramebuffer being much more forgiving with multiple-display support than your reports indicate.

As I mentioned before, mine are currently running in Snow Leopard systems, and I've yet to try them in Mountain Lion, but I'll see if I can get one of them into my ML build for some testing if you like?

For now, you're definitely on the right track with the Dev ID edits, and 'GraphicsEnabler' should definitely be set to 'Yes'. You just need to make sure that Chimera is picking up your specific model/Dev ID and assigning the correct framebuffer personality at boot (possibly 'Eulemur') and not defaulting to something like 'Null', which would invoke the 'RadeonFramebuffer'. If there is no entry in Chimera for your card, or just a generic entry, it will probably default to 'Null' no matter what AtiConfig string you specify at boot (if I'm not mistaken; sorry, my memory is a bit rusty).
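
To be clear, by "AtiConfig string you specify at boot" I mean something along these lines at the Chimera prompt, with whatever personality you're testing:

Code:
GraphicsEnabler=Yes AtiConfig=Eulemur -v

or the equivalent GraphicsEnabler/AtiConfig keys in your org.chameleon.Boot.plist.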

Either way, it may take getting your hands a little dirty to get this sucker up and running.
I was certainly no genius when I started down the path, so if I could figure it out, I'm pretty sure we can get you up to speed and running smoothly :D

Cheers!
 

I'm running the 1GB version of the card.

How do I add an entry for Chimera? I haven't seen anything, process-wise, on how to do this, so if you could point me in the right direction, that would be awesome.
 
I myself have the XFX Radeon 5450 2GB version. I had trouble setting it up for ML trying to follow http://www.osx86.net/view/3043-ati_radeon_hd5450_with_full_qe-ci_support.html

I got it working by simply adding my device ID (same as the OP's, 68E1) to the default ML kexts, AMDRadeonAccelerator.kext and ATI5000Controller.kext.
I'm using GraphicsEnabler=No; I would not reach the desktop with GE=Yes.
Full QE/CI, and performance is good.
 

Fisle is right. First try adding your Dev ID to your Vanilla Mountain Lion kexts.

I generally don't like to rely too heavily on other people's kext files; besides, it makes it more difficult to isolate and solve problems when you're working from a different data set on similar systems/hardware. It can also cause trouble later, when you run a system update and don't know how to execute the necessary patches to get things back up and working.
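
For what it's worth, once the vanilla kexts are edited, the patch routine itself is the standard one. A rough sketch from Terminal, assuming your edited copies sit in the current directory (a kext-installer utility will do the same job if you prefer):

Code:
sudo cp -R ATI5000Controller.kext AMDRadeonAccelerator.kext /System/Library/Extensions/
sudo chown -R root:wheel /System/Library/Extensions/ATI5000Controller.kext /System/Library/Extensions/AMDRadeonAccelerator.kext
sudo chmod -R 755 /System/Library/Extensions/ATI5000Controller.kext /System/Library/Extensions/AMDRadeonAccelerator.kext
sudo touch /System/Library/Extensions

Then reboot so the kext cache gets rebuilt.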

I've looked at the Chimera 1.8.0 ati.c source code and see entries assigning the 'Eulemur' framebuffer for your card and Dev ID, so you shouldn't need to delve into that sea of editing and compiling confusion just yet. As long as you're using Chimera v1.8.0 or above, we can rule that issue out... for now.
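
For the curious, the entries in ati.c are rows in a table that pin a Device ID and Subsys ID to a chip family, model name and framebuffer personality. From memory they look roughly like this (the field layout and the Subsys ID below are illustrative, not copied from the source):

Code:
// sketch of one row in Chimera's radeon card table
{ 0x68E1, 0x04831043, CHIP_FAMILY_CEDAR, "ASUS Radeon HD 5450", kEulemur },

A 'generic' entry, as I understand it, is one with a zeroed Subsys ID, and that's where you can end up falling through to kNull and the 'RadeonFramebuffer'.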

Next is to check whether Chimera is successfully assigning 'Eulemur' once you've booted, and not hitting a snag and assigning 'RadeonFramebuffer' instead...

Assuming you're not using any third-party kexts or injectors, one quick way to check is to open up System Profiler, go to the Graphics/Displays section and see how it identifies your card:
If it shows as ATI Radeon HD 5000 series – it's recognising your Cedar card but probably defaulting to 'RadeonFramebuffer'.
If it shows as ATI Radeon HD 5400 series – it's recognising your Cedar card and your Dev ID (0x68E1), and it's probably loading 'Eulemur'.
If it shows as ATI Radeon HD 5450 series – you've hit the jackpot: it's recognising your Cedar card, your Dev ID (0x68E1) AND your card's specific Subsys IDs, and it's almost definitely loading 'Eulemur'.
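
If you'd rather check from Terminal, the same info comes from System Profiler's command-line front end:

Code:
system_profiler SPDisplaysDataType

Look at the 'Chipset Model' line in the output.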

Another way to check is to download and run 'IORegistryExplorer.app' and search for "framebuffer".
If you see 'RadeonFramebuffer' entries in the left-hand panel, 'Eulemur' isn't being assigned after booting.
If you see 'Eulemur' entries in the left panel, it certainly is being assigned and you probably only have a slight framebuffer personality port mismatch issue – i.e. the OS is trying to push the HDMI signal out via your DVI port, or trying to send a digital signal via the analog VGA port, or something like that. This is generally why most people using only their DVI or HDMI ports hardly notice any real problems most of the time...
If you see some other personality in the side panel (e.g. Shrike, Vervet or Hoolock), then you could have one of those specified under the 'AtiConfig' string in your 'org.chameleon.Boot.plist' file, which would prevent 'Eulemur' from being assigned and again result in framebuffer personality / port mismatches.
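
A Terminal shortcut for the same check, if you don't feel like clicking around the app:

Code:
ioreg -l | grep -i eulemur
ioreg -l | grep -i framebuffer

If the first command returns hits, 'Eulemur' is being attached; if only the second does, you're most likely sitting on 'RadeonFramebuffer'.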

Port mismatches are easy enough to spot from here, but let's make sure you're getting the correct framebuffer loaded first!

Try these tests out and let us know your results. Hopefully we'll inch a bit closer to a solution for ya :)

Cheers!
 
Thanks for all the help AppleMacIdiot and Fisle.

I have made some headway, but I'm not quite there yet. I did a fresh install so I didn't have any kexts that were mucked with. I added 0x68E11002 to AMDRadeonAccelerator.kext & ATI5000Controller.kext and was successfully able to boot into ML with 2 monitors. So yay!

In my previous setup the OS was able to detect that I was using a 5450 1024MB, but now it's picking it up as 5000 1024MB. Chess still doesn't work, and I've noticed that TextEdit doesn't work either, which is kind of annoying. I installed IORegistryExplorer and took a screenshot of what I'm seeing as far as framebuffer setup.

I have tried specifying Eulemur, Vervet and Hoolock in my org.chameleon.Boot.plist without any difference.
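
For reference, by 'specifying' I mean this sort of entry in /Extra/org.chameleon.Boot.plist, swapping the string between attempts:

Code:
<key>AtiConfig</key>
<string>Eulemur</string>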

[Screenshot: IORegistryExplorer framebuffer entries]

So any idea what the next step is here?
 

Hi, Mod.

Are you using 'GraphicsEnabler=Yes'?

If so, it looks like it might be worth looking into your bootloader; if you're not able to force 'Eulemur' or any other personality at boot, then you might not get very far with the rest of your options.

For now, what I'm going to need is the following info:
1.) The version of Chimera you're currently using.
2.) A copy of your org.chameleon.Boot.plist, to check that we aren't missing something minor or silly. Happens to me all the time.
3.) A screenshot or text dump of your Graphics/Displays values from Apple System Profiler (the one displaying Radeon HD 5000 etc.).
4.) If you have a Windows volume or partition, a ROM/BIN dump of your graphics card using TechPowerUp's GPU-Z. This will help get more accurate SystemID, VendorID and RevisionID values for your card, and maybe get Chimera recognising it a little better.

That's all for now.

I'm currently installing my 5450 into my Mountain Lion build. It's taking a bit longer than usual, because I had a boot sector error that needed sorting out. Oh, the joys of Hackintoshing ;)

Chat soon.
 