SemiAccurate Forums  

 

GPUs: Talk about graphics, cards, chips and technologies

  #31  
Old 11-15-2010, 05:28 PM
distinctively
2^11
 
Join Date: Sep 2009
Location: Waterloo, Ontario
Posts: 2,517
distinctively will become famous soon enough

Quote:
Originally Posted by Palliot
Which is exactly the same as with previous-gen ATI cards. Well, almost the same; in the case of the 580 you gotta hack the drivers.




Really? So you are telling me AMD is listing maximum power consumption for unknown thermal viruses for all their GPUs and CPUs? Gee, I thought they listed TDP and ACP, just like Intel, Nvidia and everyone else. And I think they define it slightly differently too.
There's a pile of measurements of these cards' power draw all over the internet. AMD's numbers are legit; they've been tested and are reliable. Nvidia's are ridiculous, hence the coined question "are those regular watts or nVidia watts?"

The 580's potential power draw is beyond sane, and it's only one chip. Good luck getting any life out of the card. As I've asked before, is this trend going to be followed with the 560?
__________________
There are two kinds of people in the world: Those who like bacon and those who will be used as fodder in the case of a zombie apocalypse.
  #32  
Old 11-15-2010, 05:34 PM
APU_Fusion
2^10
 
Join Date: Oct 2010
Location: Oregon coast
Posts: 1,180
APU_Fusion is on a distinguished road

Quote:
Originally Posted by distinctively
There's a pile of measurements of these cards' power draw all over the internet. AMD's numbers are legit; they've been tested and are reliable. Nvidia's are ridiculous, hence the coined question "are those regular watts or nVidia watts?"

The 580's potential power draw is beyond sane, and it's only one chip. Good luck getting any life out of the card. As I've asked before, is this trend going to be followed with the 560?
What do you mean by "is this trend going to be followed with the 560"? I don't think even a fully enabled 560 (460) would come close to the 300 watt barrier. Why would they need a limiter? Do you mean that they would provide one so the reviewers' numbers look better than they really are, for marketing purposes? Just curious as to what you mean specifically (not questioning the validity of the statement).
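
(For reference, the 300 watt figure comes from the PCI Express power delivery limits: up to 75 W from the slot, 75 W from a 6-pin connector and 150 W from an 8-pin connector. A card with one of each, like the 580, therefore tops out at 75 + 75 + 150 = 300 W within spec.)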
__________________
What you talkin bout Willis?
  #33  
Old 11-15-2010, 05:53 PM
distinctively
2^11
 
Join Date: Sep 2009
Location: Waterloo, Ontario
Posts: 2,517
distinctively will become famous soon enough

Quote:
Originally Posted by APU_Fusion
What do you mean by "is this trend going to be followed with the 560"? I don't think even a fully enabled 560 (460) would come close to the 300 watt barrier. Why would they need a limiter? Do you mean that they would provide one so the reviewers' numbers look better than they really are, for marketing purposes? Just curious as to what you mean specifically (not questioning the validity of the statement).
What I mean is that when the 560 comes out, it's going to have big power draw again, not improved. The throttle is an artificial means of controlling the TDP; the chip's thermals have not been improved at all. If this is true, then it means that nVidia hasn't figured out how to reduce power draw for the architecture in its entirety. That doesn't look like a good avenue for cheaper cards or potential future mobile products.
__________________
There are two kinds of people in the world: Those who like bacon and those who will be used as fodder in the case of a zombie apocalypse.
  #34  
Old 11-15-2010, 06:00 PM
rambaldi
2^11
 
Join Date: Jan 2010
Location: Auckland, New Zealand
Posts: 2,435
rambaldi will become famous soon enough

Quote:
Originally Posted by distinctively
Uh, the 580 is supposed to be a fixed GPU addressing thermal problems. This was the best they could do? The X2 was ATi's first dual-GPU card. Yup, it was hot and power hungry. They improved upon that with the 5970, and yup, that's a hot card too. Of course, you don't hear AMD bragging about better thermals with it.
The 3870 X2 wishes to disagree with you. And the 2600 XT X2 says "pick me first," while the Sapphire X1950 Pro Dual wonders if it counts.

Also, it seems my post responding to Redpriest has been lost.

The gist was that the 700 generation uses software limits, initially looking for furmark.exe and, after that, looking at the ratio of ALU to texture unit computations. The 800 generation monitors the VRMs for specific overload conditions (in hardware, I think); if they occur, it drops the card down a PowerPlay level.
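
To make that concrete, here's a very rough sketch of what those two approaches could look like on the driver side. Every name, threshold and helper in it is invented for illustration; this is not AMD's actual code, just the shape of the logic described above.

Code:
#include <stdbool.h>
#include <string.h>

/* Illustrative only: invented PowerPlay levels and thresholds. */
enum powerplay_level { PP_LOW, PP_MID, PP_HIGH };

/* 700-generation style: software heuristics in the driver.
 * First pass matched the executable name; later revisions flagged
 * workloads whose ALU-to-texture ratio looks nothing like a game. */
static bool looks_like_power_virus(const char *exe_name,
                                   double alu_ops, double tex_ops)
{
    if (strcmp(exe_name, "furmark.exe") == 0)
        return true;
    return tex_ops > 0.0 && (alu_ops / tex_ops) > 50.0;  /* made-up cutoff */
}

/* 800-generation style: hardware raises a VRM overload flag and the
 * card is stepped down one PowerPlay level in response. */
static enum powerplay_level step_down_if_overloaded(bool vrm_overload,
                                                    enum powerplay_level cur)
{
    if (vrm_overload && cur > PP_LOW)
        return (enum powerplay_level)(cur - 1);
    return cur;
}

int main(void)
{
    enum powerplay_level level = PP_HIGH;

    if (looks_like_power_virus("furmark.exe", 900.0, 10.0))
        level = step_down_if_overloaded(true, level);

    return level;   /* PP_MID after one step down */
}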
__________________
E-Peen: AMD Phenom II x4 965 (stock) | MSI 790FX-GD70 | ATI HD 5970 (stock) | Kingston HyperX DDR3 1333 CL7 2x2GB | Seagate 7200.12 4x1TB | Ikonic RaX10 Liquid
  #35  
Old 11-15-2010, 06:05 PM
APU_Fusion
2^10
 
Join Date: Oct 2010
Location: Oregon coast
Posts: 1,180
APU_Fusion is on a distinguished road

Quote:
Originally Posted by distinctively
What I mean is that when the 560 comes out, it's going to have big power draw again, not improved. The throttle is an artificial means of controlling the TDP; the chip's thermals have not been improved at all. If this is true, then it means that nVidia hasn't figured out how to reduce power draw for the architecture in its entirety. That doesn't look like a good avenue for cheaper cards or potential future mobile products.
Ah, gotcha. I wouldn't be surprised if the 560 used more power when fully enabled, but I would be very surprised if they used a limiter, as it wouldn't be needed (at least for the 300 watt issue). I wonder what real-world impact the limiter will have if games never actually trigger it. If the limited card never shows itself going over 300 watts, does it make any difference in the end for the consumer, since they are going to trade off heat and power for performance anyway, à la the overclocked 5970 (and over 300 watts)? I would imagine delimiting the card and running it over 300 watts consistently could shorten its lifespan, but by how much? Interesting stuff.
__________________
What you talkin bout Willis?
  #36  
Old 11-15-2010, 06:09 PM
fusion
640k who needs more?
 
Join Date: Oct 2009
Posts: 745
fusion is on a distinguished road

Unless you bought your video card to run Furmark, it's a non-issue. If it operates way above TDP in games or regular applications, then there's a problem. Otherwise this is exactly like the 4870 X2: small backlash, then forgotten.
  #37  
Old 11-15-2010, 06:22 PM
TakeANumber
2^11
 
Join Date: Dec 2009
Posts: 2,535
TakeANumber will become famous soon enough

I think we saw an unaccounted-for danger of a driver-throttled card when StarCraft II first came out paired with a driver that was destroying cards. The point is that some unexpected combinations may lead to Furmark-like results without anyone intending it.
  #38  
Old 11-15-2010, 06:30 PM
distinctively
2^11
 
Join Date: Sep 2009
Location: Waterloo, Ontario
Posts: 2,517
distinctively will become famous soon enough

Quote:
Originally Posted by rambaldi
The 3870 X2 wishes to disagree with you. And the 2600 XT X2 says "pick me first," while the Sapphire X1950 Pro Dual wonders if it counts.

Also, it seems my post responding to Redpriest has been lost.

The gist was that the 700 generation uses software limits, initially looking for furmark.exe and, after that, looking at the ratio of ALU to texture unit computations. The 800 generation monitors the VRMs for specific overload conditions (in hardware, I think); if they occur, it drops the card down a PowerPlay level.
Oops. The brain broke down there. I stand corrected.
__________________
There are two kinds of people in the world: Those who like bacon and those who will be used as fodder in the case of a zombie apocalypse.
  #39  
Old 11-15-2010, 06:46 PM
fusion
640k who needs more?
 
Join Date: Oct 2009
Posts: 745
fusion is on a distinguished road

Quote:
Originally Posted by TakeANumber
I think we saw an unaccounted-for danger of a driver-throttled card when StarCraft II first came out paired with a driver that was destroying cards. The point is that some unexpected combinations may lead to Furmark-like results without anyone intending it.
Well, I would say it serves them right for buying SC2, but then again I hate Activision.

But yeah, if it happens in games, then I'd say it's an issue. And if it's just throttling based on the executable name, it's not that great of a solution. Though honestly, menus should not be made to even remotely tax a GPU anyway, let alone more than the game does. I guess that's another subject though.
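
On that note, keeping a menu from behaving like a power virus is trivial on the application side: just cap the frame rate. Here's a minimal sketch of the idea, assuming a POSIX system; render_menu_frame() and the 60 fps target are placeholders, not anything from an actual game.

Code:
#define _POSIX_C_SOURCE 199309L
#include <time.h>

/* Placeholder for the game's real menu drawing. */
static void render_menu_frame(void) { }

/* Render menu frames at a capped rate: draw, then sleep out the rest
 * of the frame budget so the GPU isn't running flat out on a menu. */
static void menu_loop_capped(int target_fps, int frames)
{
    const long frame_ns = 1000000000L / target_fps;
    struct timespec start, end;

    for (int i = 0; i < frames; i++) {
        clock_gettime(CLOCK_MONOTONIC, &start);
        render_menu_frame();
        clock_gettime(CLOCK_MONOTONIC, &end);

        long elapsed_ns = (end.tv_sec - start.tv_sec) * 1000000000L
                        + (end.tv_nsec - start.tv_nsec);
        if (elapsed_ns < frame_ns) {
            struct timespec rest = { 0, frame_ns - elapsed_ns };
            nanosleep(&rest, NULL);   /* idle away the leftover time */
        }
    }
}

int main(void)
{
    menu_loop_capped(60, 600);   /* ten seconds of menu at 60 fps */
    return 0;
}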
  #40  
Old 11-15-2010, 06:56 PM
ShinyShoes
2^11
 
Join Date: Sep 2009
Location: New Zealand
Posts: 3,752
ShinyShoes has a spectacular aura about

Quote:
Originally Posted by distinctively
Uh, the 580 is supposed to be a fixed GPU addressing thermal problems. This was the best they could do?
It seems the problems with GF100 were so severe that not even a base-layer respin and some other minor tweaking over the last 6-10 months have been able to improve it more than marginally.

Just holding my breath until we see the reality of Cayman now. nVidia have no more bullets left in the clip.
__________________
Athlon II X3 455 (Rana) @ 3.83 GHz, GA-MA78G-DS3H rev 1.0, 4GB DDR2, XFX HD 5850 @ 801/1099 MHz, Win XP Pro-32 SP3, Seagate 7200.11 Barracuda 500GB (OS) - recovered from BSY error on SD1A firmware, Vantec Ion2+ 550W, NZXT Hush