
NVIDIA Launches "Maxwell" GeForce GTX 750 and 750 ti Graphics Cards

That would be great if you share your settings for clover.

Just a heads up: Yosemite auto-updated to 10.10.1, so I have now disabled the auto-update setting.

It evidently broke the current NVIDIA driver, so when the machine rebooted OS X wasn't recognising my video card. I could still use my displays via the card, but everything was low-res and chuggy. When I tried to reinstall the NVIDIA driver, the installer would stop and say it couldn't detect my video card. I got a little worried the NVIDIA driver wouldn't install, but fortunately the user cccip has posted a revision of the driver with the hardware check disabled. After installing that and rebooting, everything came back to normal.

Link
http://www.tonymacx86.com/graphics/...drivers-10-10-1-343-01-02-a-2.html#post926209

Driver:
http://www.mediafire.com/download/ylsp1one9imbw2x/WebDriver-343.01.02f01_nohw.pkg.zip

Thanks for that, just downloading it now. So the first link is the original, untouched drivers, and the second one from MediaFire is the modified version? Let me know, as I only downloaded the drivers from the MediaFire link; I already have the 10.10.1 NVIDIA web drivers.

I have made some progress: I found that just by installing the NVIDIA web drivers while booted with Chimera, I managed to boot with Clover EFI again on the very next boot. On any subsequent boot after that, it refuses to boot.

What do you mean by not detecting the card? You mean with the NVIDIA web drivers, right? There is still no native support in OS X for the Maxwell cards.

I sold my GTX 650, which was my fail-safe option in case I got into trouble, as the 650 works natively with no need for the web drivers at all. I can't wait for the next Macs with Maxwell so we can finally have proper support and not have to mess about and be on edge like this.

I am not too worried, I have a full, operational, mirrored drive in case something goes horribly wrong with my install. I still have not had to use it, but I just tested it, and it is working fine.

I am currently in the process of getting the hex string from the card but gfxutil is giving me some issues.

I will update you once I have some news.
 
You guys have the Palit card... that is why DVI is working. It doesn't work on any other card.

Yep, it has fewer ports and no G-Sync because it doesn't have a DisplayPort, but all the ports it does have are working.

AC Unity looks amazing with this card. This was the main reason why I bought it.
 

Mine is working great (Gigabyte version here). Did you try both DVI ports?
 

Boom! Got it working mate!

Right, after searching for a solution for what seemed like an eternity I have finally managed to get it working with clover and it boots every time!

The solution is the simplest thing ever. All you need to do when installing Clover is, instead of installing OsxAptioFixDrv-64, install the option just above it, OsxAptioFix2Drv-64. That's it: it boots with no problems at all.

In my case I have Chimera as a non-EFI bootloader and Clover as EFI. I just ran the latest Clover (v2k r3021), and when it got to choosing the install location I clicked Customise.
All the options I had selected when I first installed were still there, marked as an upgrade.
I then just unticked OsxAptioFixDrv-64 and ticked OsxAptioFix2Drv-64.

Rebooted and bingo, it's working again.

Just to make sure I rebooted 5 times and it's working fine.
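Since having the wrong AptioFix driver (or both at once) installed is exactly the kind of thing that causes these boot failures, here is a toy sanity check of the idea. Nothing here is provided by Clover itself; the file names just follow the usual Clover drivers64UEFI layout, and the folder listing is an illustrative example:

```python
# Toy check of a Clover drivers64UEFI folder listing: after the switch,
# OsxAptioFix2Drv-64 should be present and the original OsxAptioFixDrv-64
# gone -- the two are alternatives, not meant to be installed together.

def aptiofix_state(driver_files):
    """Report which of the two mutually exclusive AptioFix drivers are installed."""
    names = {f.removesuffix(".efi") for f in driver_files}
    return {
        "old": "OsxAptioFixDrv-64" in names,
        "new": "OsxAptioFix2Drv-64" in names,
    }

# Example listing of EFI/CLOVER/drivers64UEFI/ after making the change:
after_switch = ["FSInject-64.efi", "OsxAptioFix2Drv-64.efi"]
assert aptiofix_state(after_switch) == {"old": False, "new": True}
```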
 

Attachments

  • Screen Shot 2014-11-21 at 19.35.57.jpg (208.4 KB)

I also have a Gigabyte GTX 750 Ti (GV-N75TWF2OC-2GI) and bought it for the dual-DVI capability to drive two monitors. Just took delivery of it. Works like a champ! See more information here, including benchmarks.
 
My Gigabyte GTX 750 is working perfectly, except for one problem - it isn't giving me the full 4K resolution. I bought a Samsung U28D590 4K monitor and the Gigabyte GV-N750OC-2GL. The highest resolution that the Display preferences will show me is 3200x1800 (rather than 3840x2160). I'm using the latest Nvidia Web drivers, and I'm connected with DisplayPort (which is what is required for full resolution at full refresh with this monitor). Any ideas?

UPDATED: OK, now I'm very confused. The Display Preferences says I'm at 2560x1440, and only gives me choices up to 3200x1800 (see attachment #1), but the monitor itself and About This Mac both say I'm at 3840x2160 (see attachment #2). I'll have to try to figure out which is correct...but why are they different?

SECOND UPDATE: Display Preferences is correct, I measured it with a little app called Free Ruler. It's at 2560x1440. So the question remains, why won't it let me choose the monitor's full resolution? And why does the monitor itself say it's at 4K resolution?

Attachments: DisplayPrefs.png (#1), AboutMacDisplay.png (#2)

(and I apologize for posting the same thing in two threads, but I thought it would be useful to do so).
 

I found that there are multiple versions of the Gigabyte 750 Ti! Which one in particular do you have??

I haven't tried DVI-D, because I don't have a DVI-D monitor, and I don't have a cable to convert it to HDMI...

BTW, I have the GV-N75TOC-2GI, but Stork has the GV-N75TWF2OC-2GI.

Which one do you have???? (Sorry for the dodgy text... copied off Gigabyte's website...)

PS. Stork's card has a higher clock... (but in Windows I got mine to 1400 MHz anyway...)
 

I'm pretty sure it's a driver issue. From my understanding, Apple recommends the Sharp 4K display as it's one of the few (only?) that they can fully support. I'm betting that if you try it in a Windows environment it will work fine. Given that 4K is still relatively new, I would guess it may be a bit of a waiting game before support really grows.
 

Actually, I found out that's just how Mac OS X handles Retina and Retina-type displays. If I hold down Cmd and click the "Scaled" option button in the Displays preference pane, it shows me all available resolutions, and I can then select 3840x2160. However, it turns out I don't really want the full 3840x2160; it's too small for my 47-year-old eyes to comfortably read. What I want is the 2560x1440 Retina resolution, which is really a higher-resolution image that OS X scales to look like 2560x1440. It's crisp, clear, gorgeous, and has more than enough screen real estate for my needs.
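As best I understand it, this is the HiDPI backing-store model: OS X renders the desktop at twice the "looks like" resolution and then maps or downsamples that image to the panel's physical pixels, which is why Display Preferences, a ruler app, and the monitor itself can all report different numbers. A small sketch of the arithmetic, where the 2x factor and panel size are my assumptions rather than anything official:

```python
# HiDPI scaling sketch: the "looks like" size is what apps and ruler tools
# measure in; the backing store is what OS X actually renders; the panel
# receives that image scaled to its physical pixel count.

PANEL = (3840, 2160)  # physical pixels of a 4K panel like the Samsung U28D590

def backing_store(looks_like):
    """HiDPI modes render at 2x the 'looks like' resolution in each dimension."""
    w, h = looks_like
    return (2 * w, 2 * h)

# "Looks like 2560x1440": rendered at 5120x2880, then downscaled to the panel.
assert backing_store((2560, 1440)) == (5120, 2880)

# "Looks like 1920x1080": rendered at 3840x2160, a pixel-perfect 1:1 match.
assert backing_store((1920, 1080)) == PANEL
```

That explains the confusion above: the monitor and About This Mac report the physical 3840x2160 signal, while Display Preferences and Free Ruler report the 2560x1440 "looks like" size.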
 