JUST WHEN YOU thought things could not get any worse, Nvidia’s Jen-Hsun Huang gets in front of the press and blames TSMC for Fermi’s problems. Yes, you heard it right, Fermi’s delays, performance misses, and problems were all the fault of TSMC.
Golem.de put up a pretty stunning interview and story; the video embedded in the story is where the interesting bits are, with the accusation coming at about 2:00-2:25.
“The parasitic, umm, characterization from our foundries and the tools and the reality are simply not related. At all. We found that a major breakdown between the models, the tools and reality.” Does that sound like Nvidia made mistakes or those darn incompetent TSMC employees caused Fermi to fail?
If you didn’t know better, that and the fabric explanation would be a good story. Unfortunately, the problems with the early Fermis, the massively late tape out, 2% yields on first silicon, an unmanufacturable design, and lots more are not related to bad interconnect characterization. The fabric may be metal layers, but you don’t fix runaway power consumption, low clocks, and cracking vias with metal layer spins.
This is not to say that the characterization problems mentioned by Jen-Hsun were not real, but if they stemmed from faulty fab data, then everyone out there would have been affected. ATI wasn’t. None of the FPGA guys were. In fact, far from a rash of design failures related to bad fab process characterization data, there were none. OK, one, but that is kind of a given.
In over a year of following the TSMC 40nm and Nvidia GT300/GF10x problems, SemiAccurate has talked to dozens of engineers, fab workers, and related designers working on TSMC 40nm chip designs. Guess how many mentioned bad fab characterization data from TSMC as a problem? Yes, zero.
Fermi may be a failed architecture on a level that will have reverberations in the industry for years, but pointing the finger at TSMC is the wrong thing to do. Nvidia has never taken the blame for anything, but the evidence that it is not TSMC’s fault is overwhelming once again. At least the characterization data of Nvidia management seems accurate.S|A