Nvidia announces budget friendly GeForce GTX 660 and 650

Article: Nvidia announces budget friendly GeForce GTX 660 and 650

The GTX 650 doesn't have support for SLI

Um, what's the point of this comment? OS X doesn't support SLI, and I think the previous poster wanted a solution for four displays, as only two outputs work even on the higher-end cards...
Also, Nvidia cards without an SLI connector do work in SLI, as it can be done without the SLI bridge; you just get lower performance.
 
Article: Nvidia announces budget friendly GeForce GTX 660 and 650

I'm in the market for a new card. Do you guys think the 2GB GTX 650 or an HD 6870 would work better OOB?

Well, ML doesn't like the 6870s as far as the installation is concerned, but my card works fine otherwise. That said, not everyone is having the same luck, and some cards are said not to be working that well. The 650, on the other hand, should, as I wrote in the new story, be pretty much identical to Apple's GT 650M, so it should work OOB, but you need to set GraphicsEnabler=No for it to work, just like all the other 600-series cards to date.
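
If you're not sure where to set that flag: here's a minimal sketch of /Extra/org.chameleon.Boot.plist, assuming you're booting with Chimera or Chameleon (your existing file will have other keys in the same dict; you can also just type GraphicsEnabler=No at the boot prompt to test it before making it permanent):

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
	<!-- Skip the bootloader's graphics injection; the 600-series
	     cards are picked up by OS X's own Nvidia drivers instead. -->
	<key>GraphicsEnabler</key>
	<string>No</string>
</dict>
</plist>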
 
Article: Nvidia announces budget friendly GeForce GTX 660 and 650

So performance wise the 6870 is still the better buy, or is it too early to tell at this point?

It would seem like the 6870 should be a bit faster than the 650, but I couldn't really say for OS X, as the drivers aren't the same as in Windows. It also depends on what you're going to be using the card for I guess.
 
Article: Nvidia announces budget friendly GeForce GTX 660 and 650

GTX650 Mountain Lion native support???

I'm sorry, was I unclear somewhere? "Price-wise these should be appealing as good CustoMac options, especially as the GTX 650 is pretty much a faster version of the GT 650M that Apple uses in the Retina MacBook Pro."

So yes, it works with ML; if it didn't, neither would the GT 650M that Apple uses.
 
Article: Nvidia announces budget friendly GeForce GTX 660 and 650

I am considering getting one of these cards for my hackintosh. I am under the impression that they work natively; the question I have is, does HDMI work natively too? I currently have a GT 240, which I want to upgrade anyway. I originally had HDMI set up with that card, but now it doesn't work, so this is a good time to switch.

Yes, but you're limited to two outputs simultaneously at the moment.
 
Article: Nvidia announces budget friendly GeForce GTX 660 and 650

OK, I'm tired of waiting and hesitating over a better GFX card; I'll go for a GTX 650. I like the price and the compact form factor. If I decide on something more powerful later, I will not lose much money.

NewEgg has a whole lineup of 650s from different manufacturers: $109 after rebate for the 1GB version and $139 for the 2GB:
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&N=100006662&isNodeId=1&Description=GTX+650&x=0&y=0

$119? NewEgg seems to be "cheating" on the MSRP, maybe because the card is so new.

I have a few questions, if someone wouldn't mind shedding some light...

1- Besides the particular arrangement of output connectors, DVIs vs. HDMIs, are there important differences between manufacturers? Reputation for reliability, maybe?
I am leaning towards the Zotac with its dual DVI and dual HDMI; I find it more flexible if I move this card to another system later...

2- This one will probably look like a stupid question to someone used to playing with cards (not my case):
How important is memory size on a GFX card, 1GB vs. 2GB?
I mean besides gaming performance, which I don't care about. I am only interested in using Apple's pro apps, Aperture and Final Cut. Does bigger memory improve OpenCL?
Also, does screen size make a difference in choosing memory size? I intend to move to a dual 27" monitor setup, as my current 27"/24" setup is a bit of a pain due to the different DPIs. I feel like an arrogant spoiled kid saying that, while the only thing I am sure of is not being a kid! ;O)

[EDIT] Question 3: If I connect 3 screens, how do I select the two that are active? It may sound silly, but I am thinking of switching between two screens from time to time (one being a 27" and the other a color-accurate 19" or so for when I need color accuracy).

Thanks for your time answering me...
 
Article: Nvidia announces budget friendly GeForce GTX 660 and 650

1. Not much of a difference in most cases, but some people are having issues due to slightly different graphics card BIOSes, although it's impossible to say which cards will or won't work before someone has tested them. Zotac actually has a $139 card with a DisplayPort output as well, which looks nice, if a bit expensive.

2. Graphics card memory is a "differentiator" and a bit of a trick to make people buy the cards. Check the clock speed, as cards with more memory sometimes use slower memory. On a card like this, it's unlikely that the extra memory will matter much: even in games, the GPU itself is the limiting factor when you increase the resolution, and higher resolutions are really where you need more graphics memory, since the textures get bigger and you need more room to store them.
For general usage, I have a feeling it makes little to no difference on these cards, unless you're planning on running specific OpenCL/CUDA programs that can take advantage of the extra memory (see the sketch after these answers for a way to check what the card actually exposes).
Screen size, no; screen resolution, yes. If you're planning on a pair of 2560x1440 displays, the GTX 650 isn't going to cut it, as the GPU isn't powerful enough to handle anything 3D-related at those resolutions. It should just about manage general 2D tasks, though.

3. You can't, at least not with the new Nvidia cards on a hack, as currently all the cards are limited to two display outputs.
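
On the OpenCL point in answer 2: if you want to see how much memory OS X actually exposes on the card, here's a minimal sketch using the standard OpenCL C API (the file name query_gpu.c is just an example; on OS X, build with cc query_gpu.c -framework OpenCL):

#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platform;
    cl_device_id devices[4];
    cl_uint ndev = 0;

    /* Grab the first OpenCL platform and up to four GPU devices on it. */
    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) return 1;
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 4, devices, &ndev) != CL_SUCCESS) return 1;

    for (cl_uint i = 0; i < ndev; i++) {
        char name[256];
        cl_ulong mem = 0;
        /* Device name and total global memory as reported by the driver. */
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        clGetDeviceInfo(devices[i], CL_DEVICE_GLOBAL_MEM_SIZE, sizeof(mem), &mem, NULL);
        printf("%s: %llu MB global memory\n", name, (unsigned long long)(mem / (1024 * 1024)));
    }
    return 0;
}

For a sense of scale: a 2560x1440 desktop at 32-bit color is about 2560 x 1440 x 4 bytes, roughly 14 MB per framebuffer, so plain 2D work barely touches 1GB; it's game textures and OpenCL/CUDA buffers that actually eat graphics memory.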
 