
[SOLVED] Nvidia GeForce GTX 1080/1070

That awkward moment when the drivers haven't been released and we're two weeks or so short of six entire months...
 
You will still save energy with the secondary card sitting idle anyway... Not as nice as having it switched off completely, but almost. You just have to disable it in Device Manager and keep both cards connected to the monitor, maybe through a switch.

I don't really love this type of solution, but if it floats your boat, go ahead :)
Cool! I'll try that, thanks. It's not ideal of course, since I'd like to have my 1070 in macOS as well. But for now it's the best option. Luckily my monitor has two DisplayPorts :)
 

It is not ideal in the sense that I would always use 2x 980 Ti in order to have great performance in both macOS and Windows.
With 2x 970 I have yet to find a single game that can't run at Ultra settings at 50 FPS or more.
 
There are still major fixes coming for Metal pretty soon. Deus Ex HR was delayed because of this. So if Nvidia is waiting for Apple to fix something, there might be a very slight possibility.
 
^^ I can confirm that. In WoW I saw a noticeable FPS increase on 10.12.1 already, and it'll get better.
 
So... is there at least a way to adjust the resolution with a 1060, from 1024x768 to 1920x1080? Or nothing at all?
 
I would like to use the 1080 card for machine learning experiments, not for video. Is this possible with OS X/macOS?

Nightsd asked this same question back in August and master_II responded:

I think you need the display drivers in order to use CUDA! This means CUDA uses the Nvidia drivers to communicate with the card.

Is this really true?

The CUDA download page says "developers should use Xcode 7.x with the CUDA Toolkit 8 for compiling their code on macOS 10.12." I thought CUDA Toolkit 8 would support the 1080 card, so if this doesn't mean that I can use the 1080 card with CUDA on macOS 10.12, then I am confused. Can someone please clarify?

master_II and Nightsd, if you are still reading this (is there a way to formally "mention" someone so that they get notified?), I'd love to hear about your current setup.
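
For what it's worth, my understanding is the same as master_II's: the CUDA runtime on macOS still goes through the Nvidia driver to reach the card, so the quickest sanity check is a tiny deviceQuery-style program built with the toolkit. Below is a minimal sketch, assuming CUDA Toolkit 8 is installed and nvcc is on your PATH (the file name check_gpu.cu is just an example, not anything official):

Code:
// check_gpu.cu - list the CUDA devices the driver currently exposes.
// If the driver does not recognise the card, cudaGetDeviceCount
// returns an error instead of reporting it.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        // Typically "no CUDA-capable device is detected" when the
        // driver and the card are not talking to each other.
        std::printf("CUDA error: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s (compute capability %d.%d)\n",
                    i, prop.name, prop.major, prop.minor);
    }
    return 0;
}

Build and run it with something like "nvcc check_gpu.cu -o check_gpu && ./check_gpu". If it lists the 1080, compute work should be possible even without proper display acceleration; if it errors out, the toolkit is installed but the driver isn't exposing the card, which would match what master_II said.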
 
At this point, I'm just going to get a full size motherboard with multi-GPU support and install a supported card in there just for OS X and disable it in Windows.

This would be also my backup plan, except with Linux instead of Windows. Does anyone have a tried-and-true hardware setup that does this?

@NemesisX, which motherboard and OS X supported graphics card are you planning to buy?

I have been doing the same thing for a few months now. Works perfectly!

@Bagklapper, are you doing this with the hardware mentioned in your profile (GA-Z97-D3H | i7 4790K | GTX 750)?
 