Nvidia will crater GTX260 and GTX275 prices soon

Shortages are completely artificial

WHAT DO YOU DO when you have nothing, are facing quarters of buying market share, and have no competitive products on the horizon? If you are Nvidia, you spin, and use the F, fear, U, uncertainty, and D, doubt, in FUD to pretend there are shortages.

What is the deep dark secret that Nvidia doesn’t want you to know? There is no shortage of GPUs; it is all artificial, and there are two reasons for it. Bear with me for a tour through the deep dark underbelly of computer parts marketing, how it gets you to spend when you should be saving, and fear things you should be happy about.

In the last few days, there have been reports of shortages of Nvidia 55nm parts, especially the 260 and 275 models. Reports like this one ‘leaked’ out on a few pro-Nvidia sites, then moved on to semi-official PR outlets like Digitimes, then to mainstream sites like Tech Report. As with almost all ‘leaks’ like this, there are financial motives behind the spin and FUD.

When a chip is shrunk, it tends to take up about half the area of its older sibling. A CPU shrunk from 65nm to 45nm is a perfect example: if wafer costs and yields are about the same, the 45nm part costs about 50% as much to make. While this is a simplification on several levels, once the new process is up to speed, it is not at all unreasonable.

Right now, a 55nm wafer at TSMC costs about 80% of what a 40nm one does. Shrink the same GPU to 40nm and it takes roughly half the area, so you fit about twice as many chips on a wafer, and even with the pricier wafer, the chip costs about 2/3 as much to make on 40nm as on 55nm. You can either take that reduced cost per transistor and use it to make more transistors at the same cost, or make the same number at a much reduced cost.
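
For the numerically inclined, here is a back-of-the-envelope sketch of that 2/3 figure. The 80% wafer cost ratio is from the paragraph above, and the "a shrink halves the die area" assumption is the simplification being described, not measured data.

```python
# Rough per-chip cost after a 55nm -> 40nm shrink.
# Assumptions: a full shrink roughly halves die area, so roughly twice
# as many die candidates fit on a wafer, and a 55nm wafer costs ~80%
# of a 40nm wafer, per the text above.

wafer_cost_55nm = 1.00          # normalized
wafer_cost_40nm = 1.00 / 0.80   # the 40nm wafer costs ~25% more

dice_per_wafer_55nm = 1.00      # normalized
dice_per_wafer_40nm = 2.00      # half the area -> roughly twice the dice

cost_per_chip_55nm = wafer_cost_55nm / dice_per_wafer_55nm
cost_per_chip_40nm = wafer_cost_40nm / dice_per_wafer_40nm

print(cost_per_chip_40nm / cost_per_chip_55nm)  # ~0.625, i.e. about 2/3
```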

ATI, when it went from the RV770 to Evergreen, did both. Cypress, the high end, took the 2x transistors route, and the mid-range Juniper took the same-count-at-a-lower-price route. Actually, both chips are bigger than their immediate predecessors, in area and transistor count, but the general rule holds.

This leads to what is called a pricing waterfall. The cards that were at the top end of the older generation are now about as powerful as the middle of the newer generation. That means they can only be sold for about half of what they debuted at, say $200 instead of $400. Normally, prices drop over the lifespan of a part in a fiercely competitive market like graphics cards, but normally not 50%.

This generation, the G200 vs R7xx battle, is an exception, and the current ATI cards are already at the $200 and 50% mark. If they were not, then as soon as Juniper launched, the new cards would not only obsolete the older ones, but force them to drop in price, usually to a price/performance point relative to the new cards. If the new cards are 10% faster, the older ones have to be priced at 90% of the newer ones, or less if there are any killer new features.

A manufacturer tries to time things to stop making the older ones just in time for the debut of the newer versions. The term for this is product planning. If they screw up their product planning, there will be a gap where they don’t have product on store shelves, and that is very bad. Far worse, however, is if they time it wrong the other way and have tons of product on store shelves.

Retailers don’t like it when a company obsoletes the product they have on the shelf. If a retailer has to eat money by dropping prices, they take it out on the vendor somehow, usually by not purchasing from them again. If the card makers tell the retailer that the new ones are coming, the retailer stops buying and restocking to protect against this, and the shelves end up bare. If they keep ordering, the card makers have to make it up to the retailer, and that making up is called price protection.

Price protection works like this. Imagine that a card maker sets a retail price of $249 for its ABC GPU. Months later, it introduces the next generation XYZ at $199 with the same performance, or just cuts the price on ABC. ABC is suddenly worth $199 or less, but the retailer paid wholesale for a $249 card, not a $199 card. When a board maker price protects the channel, it basically gives the retailer a check for the difference. This makes the retailers happy.

If you have 50 cards in stock and the price drops by $50, the card maker will cut you a check for $2500, or more likely, give you $2500 off your next order. This tends to be VERY expensive, much more expensive than just devaluing the GPU chips themselves in inventory and writing off the loss. Once they are put on a board, you have to write down the whole thing. Chip makers hate to price protect more than just about anything else. They do it because they have to, since the consequences of not doing it are worse.
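
As a minimal sketch, assuming a flat per-unit rebate equal to the price cut, the liability is just cards in the channel times the drop; the $2500 example above is this formula with 50 cards and a $50 cut.

```python
# Price protection liability: cards sitting in the channel times the
# per-card price drop. The $2,500 example above is 50 cards x $50.

def price_protection_cost(cards_in_channel: int, price_drop: float) -> float:
    """Check (or credit on the next order) owed to the retailer."""
    return cards_in_channel * price_drop

print(price_protection_cost(50, 50.0))  # 2500.0
```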

So to minimize price protection costs, the chip makers balance the number of parts in the channel, keeping them to a bare minimum around product or price transition times. It is an art: half black art, half luck, with a sprinkle of science topped off with a bit of competitive intelligence.

Why is this important to the Nvidia FUD? ATI has Evergreen, and the high end Cypress/5870/5850 parts just crushed the life out of the high end Nvidia G200 parts, and sell for far less money. Nvidia had to crater the price of its GTX 285 and 295 boards. The 295 cards recently dropped by more than $50; they are now a hair above $450. The 285 cards suffered a similar drop, and you can now get them for a little over $300.

Price protection is first seen as rebates, then a full-on price drop once inventory sold to retailers at the higher price works its way through the system. Nvidia is eating a lot of cash on the 285 and 295 parts now. Since it sells 295 chips to the board makers for about $375, if Nvidia didn’t kick a lot of cash back and crater the price after that, every partner and reseller would just stop buying.

Having the entire value chain downstream of Nvidia, that is board makers, distributors and retailers, go from splitting more than $125 to less than $75 is a recipe for revolution. Add in that the competition is selling a better board, the 5870, for about $75 less than that, and you compound the pain.

What does that have to do with a shortage? Glad you asked. Cypress/5870/5850 is out, and that crushed Nvidia’s very profitable high end. Juniper, the mid-range part, is about to debut; it will be here before the Windows Me II SP7 launch on October 22. It is rumored to be about as fast as a 4890 or a GTX275, give or take a little, and the cut-down version is right on top of a GTX260. The Juniper parts are rumored to sell for well under $200 for the high end variant, and notably below that for the cut-down variant.

On Newegg, the cheapest you can get a GTX275 for is $220, and the cheapest GTX260 is $165. Let’s just make an educated guess that the Juniper XT (high end) will have an MSRP of $175, and the Juniper Pro (low end) will cost $125. These are completely made-up prices, but in line with past industry pricing tiers.

Nvidia now has to price protect the channel for anything that isn’t sold in the next 2-3 weeks. Worse yet, if people know the price is about to drop, they wait for it before they buy. Hint: Nvidia is about to crater the price of the 260 and 275. By a lot. More than it is comfortable with, in fact.
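
Plug the Newegg prices above into the guessed Juniper MSRPs and the size of the coming cut falls out. Treat the output as illustrative only, since the Juniper prices are made up.

```python
# Implied per-card price cuts if Nvidia has to match the guessed Juniper
# MSRPs. Current prices are the Newegg figures above; the targets are the
# made-up $175/$125 estimates.

current = {"GTX275": 220.0, "GTX260": 165.0}
target  = {"GTX275": 175.0, "GTX260": 125.0}

for card in current:
    print(card, current[card] - target[card])  # $45 and $40 per card
```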

So it spins. If it can make people think there are going to be shortages, anyone thinking of getting one will be much more likely to buy the first card they see, and feel lucky that they got it. Until the price drops days later and they feel used. That isn’t Nvidia’s problem; it loves its customers enough to still not name the Bumpgate defective products. With any luck, these new victims won’t remember being played when it comes time to buy the next card.

The first secret about the ‘shortage’ news is that Nvidia is trying to clear out the channel by manipulating people into thinking there is a real shortage. This time, the press is its tool, and since no one seems to question messages anymore, it is an easy tool for Nvidia to use. This saves it a lot of money, probably about $50 a card. The worst thing that can happen to it is word getting out that prices are going to drop heavily in the next week or two, three at the outside.

A bigger problem for Nvidia is the cost of those GPUs. The G200 is a monster of a chip. I mean that as a size, and therefore cost, problem. The 512b memory interface (448b in the 260/275) is far too wide for a part selling in the sub-$200 market. It requires many more board layers, adding huge expense, and more memory chips. GDDR3 at 1GHz (2GHz effective) costs about $3.25 per chip. GDDR5 at the same 1GHz is 4GHz effective, twice the bandwidth, but only costs about 10% more. 1.25GHz/5GHz is about 10% more than that.
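
To put a rough dollar figure on the memory part of that, here is a sketch assuming 32-bit-wide DRAM chips, which is typical for GDDR3/GDDR5 of this era, and the per-chip prices quoted above. The 128-bit GDDR5 comparison bus is an assumed mid-range configuration, not a confirmed spec for any particular competing card.

```python
# Rough memory-subsystem cost. Per-chip prices are from the text; the
# 32-bit chip width is a typical figure, and the 128-bit GDDR5 bus is an
# assumed mid-range configuration, not a confirmed competitor spec.

GDDR3_1GHZ = 3.25          # $/chip, 2GHz effective
GDDR5_1GHZ = 3.25 * 1.10   # ~10% more, 4GHz effective

def chips_needed(bus_width_bits: int, chip_width_bits: int = 32) -> int:
    return bus_width_bits // chip_width_bits

gtx260_chips = chips_needed(448)    # 14 chips on a 448-bit bus
midrange_chips = chips_needed(128)  # 4 chips on a 128-bit bus

print(gtx260_chips * GDDR3_1GHZ)    # ~$45.50 of GDDR3
print(midrange_chips * GDDR5_1GHZ)  # ~$14.30 of GDDR5
```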

Nvidia is currently selling GTX275 kits to AIBs (Add In Board makers) at around $160 for the 1792MB version, $135 for the 896MB. GTX 260/216 kits sell for about $110 for the kit, $85 for the bare GPU. These ‘kits’ are a GPU, RAM, and a few support chips, but do NOT include the board, connectors, heat sinks, fans, and passive components.

At a bare minimum, those additional items cost $25 for a card of this level, and that is being generous. If an AIB can buy the rest of the parts for a G200 based card and assemble, box and ship it for only $50 total, it is in very good shape. If you look at the price difference between the two 275 kits, that $25 is almost exactly the difference in RAM costs, and the same goes for the 260 kit vs the bare chip.

If Nvidia has to set 275 pricing at $175, and it sells the kit for $10 less than it does now, no one will make money. If AIBs, distributors and retailers have no potential for profit, they will take up a new hobby, say fishing, knitting, or selling profitable lines of GPUs. So Nvidia has to drop the kit price even further, to the sub-$100 range on the GTX275/1792MB, in order for anyone to make money.
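
Here is the margin squeeze as a sketch, using the kit prices above, the generous $50 build cost from the text, and the guessed $175 retail target, which is a made-up number, not a published price.

```python
# Margin left for the AIB, distributor and retailer if the GTX275/896MB
# has to retail at a guessed $175. Kit prices and the ~$50 build cost are
# the estimates from the text; the retail target is made up.

kit_price_now = 135.0   # GTX275/896MB kit price to AIBs today
build_cost = 50.0       # board, cooler, components, assembly, box, ship
retail_target = 175.0   # guessed price needed to answer Juniper XT

for kit in (kit_price_now, kit_price_now - 10.0):
    margin = retail_target - (kit + build_cost)
    print(kit, margin)  # 135 -> -10.0, 125 -> 0.0: nothing left for the chain
```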

The other card kits have to go lower than that, simply based on RAM prices. Samsung won’t drop GDDR3 prices to below cost to subsidize Nvidia, trust us there. That means the 275/896MB will have to be in the $75 or less area to compete with ATI, and the 260 way below that. This is below the cost of a current GTS250, I mean 9800GTX, I mean 9600GSO, I mean 8800GT kit. FWIW, those currently sell for a hair under $90. Where does this leave these G92 based parts? Similarly under water; the term pricing waterfall could not be more apt.

Looking at raw wafer and silicon costs, a TSMC 55nm wafer costs about $4000, and you can get about 110 G200 die candidates on one. That puts the raw silicon cost at about $35 for any GTX260/275/285 card, and if you are being generous, add only $5 for packaging and testing. That means there is at least $40 worth of silicon in the G200b based GPUs.

On the cards, there is at least another $30 worth of components like RAM and HDMI chips, so there is no way these can be sold at the prices ATI will force and still turn a profit. The chip is too big, the boards are too complex, and the performance simply isn’t there. There is no way the 260 and 275 can make money if ATI prices Juniper at $125 and $175 for the low and high end variants respectively.

The problem in a nutshell: Nvidia is fielding a card based on a 484mm^2 die against one based on a 185mm^2 die with a much cheaper board and memory subsystem. Basically, ATI can make cards much more cheaply than Nvidia can and sell them at a profit, while Nvidia can’t match the price, performance, or even value and still make money.
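
The die-candidate and silicon-cost figures above fall out of simple geometry. This is a sketch, assuming a 300mm wafer, the die sizes and wafer prices quoted in the text, and a crude edge-loss rule of thumb; it ignores yield, which in reality punishes the big die far more than the small one.

```python
import math

# Crude die-candidates-per-wafer estimate and per-die silicon cost.
# Die sizes, the $4000 55nm wafer price, and the ~80% 55nm/40nm wafer
# price ratio are from the text; the edge-loss correction is a rough
# industry rule of thumb, not TSMC data, and yield is ignored.

def die_candidates(die_area_mm2, wafer_diameter_mm=300.0):
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    # Subtract dice lost around the wafer edge.
    return int(wafer_area / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

cost_g200b   = 4000.0 / die_candidates(484.0)            # ~$35 on 55nm
cost_juniper = (4000.0 / 0.80) / die_candidates(185.0)   # ~$15 on 40nm

print(die_candidates(484.0))       # ~115, in the ballpark of the ~110 quoted
print(cost_g200b, cost_juniper)    # roughly $35 vs $15 of raw silicon
```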

Nvidia is in a world of hurt, and there is nothing it can do. Losing money on every part in a line and making it up in volume is tolerable if you can subsidize things with high end GPU profits. Unfortunately, Nvidia just lost those too. There is no money to be had in the G200 based cards any more; they are lame ducks that have to have a $20 bill wrapped around each one in order for a buyer to take them from the warehouse.

If ATI sells its parts for less than $175, Nvidia is in more trouble; above $175, Nvidia has a bit more breathing room, but ATI launched the very similar RV740/HD4770 for $109 six months ago. The formerly high end Nvidia cards are now relegated to the mid-market, and with it comes the waterfall to mid-market pricing. This is pain for any silicon maker, and Nvidia has at least two quarters of it ahead.

What do you do in this situation? You extend the artificial shortage. Nvidia has to keep selling cards in some quantity or it looks even dumber than usual. Until the cut-down GT300/Fermi boards come out, there is no chance that Nvidia can make money in the mid-range or the high end. The low end waterfalls down in Q1 when Cedar and Redwood hit the streets, and then it is game over for Nvidia. Top to bottom, it will have nothing to sell without hard cash subsidies.

In the end, the shortage is artificial, both short term and long term. The short term is FUD to get you to clear the shelves while Nvidia can still make money. Longer term, it is about limiting quantity to stem the bleeding while the company hunkers down and tries to last out the storm. Given its execution of late, Nvidia looks unlikely to have anything substantial out before Northern Islands kicks it in the teeth once again. I just feel sorry for the partners that trusted Nvidia. I wonder if they realize they are being played too now? S|A
