
[SOLVED] Nvidia GeForce GTX 1080/1070

Status
Not open for further replies.
Anyone know about the 1070? Good price, but what's the power under the hood?
 
Anyone know about the 1070? Good price, but what's the power under the hood?

Preliminary quotes from nVidia have it matching the Titan X in most tasks, but there are no benchmarks yet.
 
Anyone know about the 1070? Good price, but what's the power under the hood?
1920 CUDA cores, a 1.6 GHz boost clock, 8 GB of GDDR5 (not GDDR5X, at 256 GB/s), and the MSRP are pretty much all we know right now.
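That bandwidth figure checks out on paper. A quick sanity check, assuming GDDR5 at an 8 Gbps effective data rate on a 256-bit bus (the figures reported at launch):

```python
# Memory bandwidth = effective data rate per pin x bus width.
# The figures below are the launch specs as reported, not measured values.
effective_rate_gbps = 8            # Gbps per pin (GDDR5 on the 1070)
bus_width_bits = 256               # memory bus width
bandwidth_gb_s = effective_rate_gbps * bus_width_bits / 8  # bits -> bytes
print(bandwidth_gb_s)              # 256.0 GB/s, matching the quoted figure
```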
 
Oh, it's no secret AMD has been the clear winner in the compute world when the app(s) involved lacked CUDA support. That's never been in dispute. And very few games make actual use of compute power, so nVidia will typically win on the gaming front. A/V editing, though... AMD wins.

But for OS X, while AMD may have good hardware, the drivers are usually in a barely functional state and stop getting bugfixes shortly after the next generation GPU comes out, which makes them miserable to use and very suboptimal relative to the optimizations and bugfixes nVidia puts out.

That said, I'm SO drooling over the GTX 1080. Knowing my luck though I won't be able to afford a new GPU until its Ti part surfaces, which is just fine with me. :)

True things!

Waiting for the 1080 Ti or big Vega is probably a smart move. The 1080, especially the FE cards, is super overpriced: a medium-size, mainstream chip sold as interim high end. Not only that, but it runs hot and thermal throttles very quickly with the reference cooler (a pretty sneaky way to make it look like the reference card can clock at 2+ GHz, when it can barely maintain stock boost clocks!). The 1070 looks to be very, very cut down.

Of course, let's see what actual benching tells us!
 
True things!

Waiting for the 1080 Ti or big Vega is probably a smart move. The 1080, especially the FE cards, is super overpriced: a medium-size, mainstream chip sold as interim high end. Not only that, but it runs hot and thermal throttles very quickly with the reference cooler (a pretty sneaky way to make it look like the reference card can clock at 2+ GHz, when it can barely maintain stock boost clocks!). The 1070 looks to be very, very cut down.

Of course, let's see what actual benching tells us!

What? That seems to be entirely untrue. http://hothardware.com/reviews/nvidia-geforce-gtx-1080-pascal-gpu-review
 

Sorry, I don't want to get too off topic (but I don't think there's anything new to talk about re 10-series compatibility just yet).

The GTX 1080 Founders Edition has some serious issues, especially at its $100 premium. It thermal throttles after 20 minutes, to below its boost clocks! The cooler just isn't very effective, and it's loud. It hits 83 °C and the clocks drop. The PCB doesn't have the power delivery or connectors to support higher wattages, meaning there's a serious limit on overclocking or VBIOS editing (the former more of a Windows issue, the latter applicable for Clover).
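For illustration only, the clamping behavior described above (GPU Boost stepping the clock down once the temperature or power target is exceeded) could be sketched as a toy model. The targets are the figures mentioned in this thread (83 °C, 180 W); the linear scaling of heat and power with clock is a crude assumption, not how GPU Boost actually works:

```python
def boost_clock(requested_mhz, temp_c, power_w,
                temp_target_c=83, power_target_w=180,
                base_mhz=1607, step_mhz=13):
    """Toy model of target-based throttling: step the clock down until
    neither target is exceeded (crudely assuming heat and power scale
    linearly with clock). Real GPU Boost is far more elaborate."""
    clock = requested_mhz
    while clock > base_mhz and (
            temp_c * clock / requested_mhz > temp_target_c or
            power_w * clock / requested_mhz > power_target_w):
        clock -= step_mhz
    return clock

print(boost_clock(1835, temp_c=70, power_w=160))  # 1835: no target exceeded
print(boost_clock(1835, temp_c=90, power_w=185))  # steps down toward base clock
```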

The die itself (GP104) is medium-sized, only a little bigger than 300 mm². In comparison, large-die GPUs (980 Ti, Fury, even Hawaii/290X/390X) generally range from 450 mm² to 600 mm². This is a mainstream chip released as high end. It's powerful, but it's going to seem like a bad purchase once the 1080 Ti or Vega (big die) is released; Nvidia and AMD have bifurcated their releases on this and recent nodes, selling mainstream chips as high end for a short interval before selling the actual big-die chips. Smart marketing/release strategy!
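To put that size gap in numbers (the die areas below are approximate figures from public die-size reports, so treat them as ballpark):

```python
gp104_mm2 = 314  # GP104 (GTX 1080/1070), approximate
big_dies_mm2 = {  # examples of the big dies mentioned above, approximate
    "GM200 (980 Ti)": 601,
    "Fiji (Fury X)": 596,
    "Hawaii (290X/390X)": 438,
}
for name, area in big_dies_mm2.items():
    print(f"{name}: {area / gp104_mm2:.1f}x the GP104 area")
```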

I'm not saying the 1080 is a bad chip. It's great! But I think it's very overpriced given these things, definitely in its current reference (ahem, "Founders Edition") form at a $100 premium. I'd wait for partners to release custom coolers, at minimum. If you can wait for GP102 or Vega, later this year or early next, that's going to be the best bang-for-the-buck time to make a purchase. (Past releases support this notion.)

http://www.computerbase.de/2016-05/...bschnitt_bis_zu_1785_mhz_takt_unter_dauerlast

To quote (translated from German):

The maximum achievable 1,886 MHz on the GeForce GTX 1080 Founders Edition is more of a theoretical value: in games you never see this clock. The highest frequency measured during testing was 1,835 MHz, and even that clock was history after a few seconds. In demanding titles, 1,785 MHz ultimately turned out to be the highest realistic clock.

The Founders Edition often runs too warm
Even with default settings, however, that clock often does not last long. In some titles the card immediately hits its power target; in all games, after a few minutes at the latest, the GPU's temperature target kicks in. The highest permanently sustained clock in a game was 1,785 MHz; Ashes of the Singularity and F1 2015 hit it repeatedly.

On average, the GPU clocked at 1,667 MHz

In Anno 2205, Star Wars: Battlefront, and The Witcher 3, the GeForce GTX 1080 Founders Edition drops to its base clock of 1,607 MHz within a few minutes. Most games, on the other hand, run at a frequency somewhere in the middle between the extremes. On average, the test card runs at 1,667 MHz.

If that's not enough, you can raise the power and temperature targets without voiding the warranty. With both maximized, the GeForce GTX 1080 sustains the full 1,785 MHz in 17 of 21 games. In Anno 2205 you have to settle for 1,721 MHz, the lowest value. In Star Wars: Battlefront, The Talos Principle, and The Witcher 3 the full clock is also not reached. In all four cases, the power and temperature targets are then no longer the limiting factor.

The higher the resolution, the lower the clock
What the table does not show is the somewhat varying clock behavior across resolutions. The higher the resolution, and therefore the GPU utilization, the lower the clock falls. In other words: one and the same game clocks higher at 1,920×1,080 than at 2,560×1,440, and lower again at 3,840×2,160. The frequency differences are mostly small, but reproducible.
 
BTW (because it's hard to tell over the Intertubes): I hope I'm not coming off as argumentative or aggressive. That's not my goal at all!

I just think that at the moment it's easy to make a buying mistake with the new GTX 1080/1070 FE. Of course, if you can afford to upgrade on each release, it's no issue at all, and there are no mistakes to be made! But if you follow the 680, 780 (Ti), 980 (Ti) sequence, there were some real pitfalls in medium vs. big die (or full die vs. cut down, with respect to the x70 versions).
 
I'm actually quite surprised to see thermal throttling on a GPU with just a 180 W TDP.
My reference GTX 780 (TDP: 250 W!) can maintain boost clocks all day without throttling, even slightly overclocked. One would assume that this (extraordinarily expensive) "Founders Edition" isn't any worse than the old "Titan cooler" sitting on my 780...?!

Or is the TDP rated at base clocks, so the boost states will draw more power? The 1080 has just one 8-pin connector, though, so it should never draw more than 225 W...
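That 225 W ceiling follows directly from the PCIe spec limits, assuming the card sticks to them:

```python
# Maximum board power by spec (these are connector limits, not measured draw):
pcie_slot_w = 75      # a PCIe x16 slot can deliver up to 75 W
eight_pin_w = 150     # one 8-pin PEG connector is rated for 150 W
max_board_w = pcie_slot_w + eight_pin_w
print(max_board_w)    # 225 W, the ceiling mentioned above
```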

I'm also a little shocked by the official prices. Almost €800 in Germany for a reference Gx104 card is just insane, considering this isn't even the high-end model (the 1080 Ti and Titan are still to come). Let's hope that AMD will finally come up with some competitive products (and that Apple will get their drivers fixed...)
 
Today is release day for the Founders Edition of the GTX 1080! I wanted to propose a thread (or perhaps we can continue to use this one) for anyone who has picked up a card and is testing / troubleshooting compatibility! I look forward to hearing some good news *fingers crossed*.
 
Today is release day for the Founders Edition of the GTX 1080! I wanted to propose a thread (or perhaps we can continue to use this one) for anyone who has picked up a card and is testing / troubleshooting compatibility! I look forward to hearing some good news *fingers crossed*.

It's been released, but it's sold out everywhere it seems.
 