It looks like the Nvidia (NVDA) GTX590/dual GPU slides are starting to leak, and just as we said a year ago, Nvidia is bitten by their woeful power use. In the end, the GTX590 loses to the 6990, but you would never know that from the curious set of benchmarks Nvidia released.
If you want the raw slides, they are on item.taobao.com, taken from a Chinese-language presentation by Gigabyte. The numbers are perfectly readable though, and the specs are just what everyone expected. Short story: two seriously downclocked GF110/GTX580s on a card with a central fan, just like the HD6990.
Because Nvidia can’t keep power under control, not even close, the 590 is clocked at 607MHz compared to the 772MHz of the 580. That is a bit over a 21% drop for the core, and memory drops by almost 15%, 855MHz vs 1002MHz on the 580. Luckily, no shaders were fused off, so it looks to be a complete 512 × 2 = 1024 shader card.
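For those who want to check the math, the downclock percentages fall straight out of the clocks quoted above (a quick back-of-the-envelope calculation, not anything from the slides themselves):

```python
# Sanity check on the GTX590's downclocks relative to a stock GTX580.
core_590, core_580 = 607.0, 772.0  # core clocks in MHz, per the leaked slides
mem_590, mem_580 = 855.0, 1002.0   # memory clocks in MHz

core_drop = (1 - core_590 / core_580) * 100  # ~21.4% lower core clock
mem_drop = (1 - mem_590 / mem_580) * 100     # ~14.7% lower memory clock

print(f"core clock down {core_drop:.1f}%, memory down {mem_drop:.1f}%")
```

That roughly 21% core deficit per GPU is the price of squeezing two GF110s under one power budget.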
Nvidia obviously cherry-picked the heck out of the GF110 parts bin; even so, the claimed 365W power draw is laughably low. Given the stories about how the GF100 achieved the official 250W rating, and what an unthrottled GTX580 draws, we don’t expect this number to hold up to even the most cursory of tests once the cards are in the wild.
That said, if you buy one that has decent power delivery, it should OC pretty well as long as the dual vapor chambers don’t dry out. We expect this not to be a problem; the RFQs Nvidia put out a few months ago seem to have decent headroom on them.
How does it perform? That depends on what you look at. The official numbers show the GTX590 beating an HD6990, even in its higher performance mode, by a hair. The benchmarks in the slides linked above have one really curious problem however: they are painfully low resolution. This is not by accident.
How painfully low? How does 1280*720 with no AA/AF sound on a card like this? How about not a single one of the 11 tests listed going above 1920*1200? Two of the tests are at 1280*1024/720, and the rest are all at 1920*1200/1080. Why is this a big deal? Easy: a single GTX580 or HD6970 is more than powerful enough to drive any single 2MP (1920*1200) class screen at playable frame rates, and a dual card is more than enough to spit out frames faster than a 60Hz panel can show them.
This is the long way of saying that both the GTX590 and HD6990 are a total waste of money on 2MP screens. You need multiple 2MP screens, 3D/120Hz, or a 4MP (2560*1600) panel before there is anything close to a difference between these two cards, or between a dual card and the single-GPU variant. Realistically, if you are not using a 2560 panel or three 1920 panels, stick with a single GPU; you won’t really miss a thing.
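The pixel counts behind the "2MP class" and "4MP" labels above make the point plainly; this is just simple arithmetic on the resolutions named in the article:

```python
# Pixel counts for the resolutions discussed, in megapixels.
resolutions = {
    "1280x720":  (1280, 720),    # the lowest benchmarked setting
    "1280x1024": (1280, 1024),
    "1920x1080": (1920, 1080),
    "1920x1200": (1920, 1200),   # top of the "2MP class"
    "2560x1600": (2560, 1600),   # the 4MP panel where dual cards matter
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} MP")
```

Every resolution Nvidia tested sits at or below 2.3MP, territory a single high-end GPU already saturates, while the 4.1MP panel that would actually separate these cards is nowhere in the slide deck.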
Which brings us back to the benchmarks that Nvidia put out. In a surprising twist, they lose 5 of the 11 benchmarks that they themselves published. That is bad. What is worse is what will happen when you jack the resolution up. ATI cards tend to gain performance relative to their Nvidia counterparts as resolution increases, which is why Nvidia is desperately trying to show benchmarks in as low a rez as they can get away with.
Should they show the same tests on the same system on a 2560/4MP panel, it is doubtful that they would win a single test. Three 2MP panels are worse for the green team, and let’s not talk about three 4MP panels, much less 5 or 6, not that they have the capability.
In the end, Nvidia is hammered by heat, just like we said a year or more ago. The GF100 base architecture is broken, and there is nothing the company can do to change that. Arguing with the laws of physics, no matter how much you scream and how many flecks of spit fly from your mouth, is ultimately fruitless.
Nvidia does not seem to understand this, and the architectures they are floating for Kepler and Maxwell are aiming in the wrong direction. Until the company does a 180 degree turn, they are lost in graphics. The GTX590 is a shining example of this: it runs headlong into the power wall with a dull thud. It may be slower, but it costs more than the competition, and things are only going to get worse in the next generation.S|A
Latest posts by Charlie Demerjian