AMD & ATI: The Acquisition from all Points of View
by Anand Lal Shimpi on August 1, 2006 10:26 PM EST - Posted in CPUs
Our Thoughts: Will AMD manufacture ATI GPUs?
One of the most intriguing aspects of this proposed acquisition is the potential for ATI to cease to be a fabless GPU manufacturer. In the past, fabless companies like ATI and NVIDIA have had to rely on foundries like TSMC or UMC to manufacture their chip designs. By relying on a 3rd party to manufacture all of their chips, ATI and NVIDIA avoid having to invest in multi-billion dollar fabs that require continued multi-billion dollar investments to stay up to date. As a point of reference, AMD's Fab 36 cost $2.5 billion, and that's before it has even been fully converted to 65nm.
The upside of being a fabless manufacturer is obviously that you leave manufacturing to those who are good at it; the downside is that you lose manufacturing as a competitive advantage. The other downside is that foundries like TSMC aren't as quick to transition to new process technologies as companies like AMD and Intel. While Intel has been shipping 65nm product for quite a while now, ATI and NVIDIA GPUs are still being built using 90nm transistors. If the buyout does go through, ATI could potentially gain a manufacturing advantage over NVIDIA since AMD has already invested in its own fab plants.
On the surface, the potential for ATI to have access to its own fab is tremendous; however, there are a number of limitations worth talking about. For starters, AMD isn't Intel, and AMD's manufacturing processes have always lagged behind Intel's, which is why AMD is still shipping 90nm CPUs while Intel is in the middle of its 90nm to 65nm transition. So the advantage that ATI would gain from having access to its own fab would not be as significant as if Intel were the buyer. AMD should still be slightly ahead of TSMC and UMC, but Intel is almost a year ahead of AMD right now on process transitions.
The other problem is that AMD is significantly capacity constrained as is. AMD currently has two fabs of its own, Fab 30 and Fab 36, both in Dresden and both currently producing 90nm CPUs. Fab 30 is set up for production on 200mm wafers while Fab 36 uses 300mm. AMD just recently started shipping revenue-generating parts through Chartered Semiconductor, a 3rd party foundry that is manufacturing Athlon 64 CPUs for AMD to help relieve some of its capacity constraints. So with Fab 30, Fab 36 and Chartered all working just to meet demand for AMD's CPUs, it's not like AMD has a lot of extra capacity to use to manufacture ATI GPUs.
AMD's new fab in New York State will help deal with some of the capacity issues, but construction won't even begin until July 2007 at the earliest, and we're looking at another 3 - 5 years before it's operational. Without tons of excess capacity, it doesn't seem like ATI will be able to benefit from AMD's manufacturing facilities in the production of GPUs. Chipsets are a different story, as they would give AMD something to do with older fabs: chipsets are nowhere near the size of modern day GPUs and are already produced on an n-1 manufacturing process (e.g. Intel's Core 2 processors are 65nm while Intel's P965 chipset is built on a 90nm process). GPUs could also be manufactured at older fab plants as newer ones are upgraded, though the sheer size of modern day GPUs makes this an uncertain option. More than likely, the only GPUs manufactured at AMD's plants would be those integrated into chipsets or found on-die/on-package with AMD CPUs, meaning that they'd be very low end, low margin parts.
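To put the die-size argument in rough numbers, here is a back-of-envelope sketch (ours, not from AMD or ATI) using the standard gross-dies-per-wafer approximation. The 100 mm² chipset and 350 mm² GPU die areas are illustrative assumptions rather than actual 2006 figures, but they show why small chipset dies are a much better fit for older, smaller-wafer fabs than large GPU dies.

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Rough gross-die estimate: wafer area / die area, minus an edge-loss
    term proportional to wafer circumference over die edge length."""
    die_edge = math.sqrt(die_area_mm2)  # assume a roughly square die
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    dies = wafer_area / die_area_mm2 - (math.pi * wafer_diameter_mm) / (math.sqrt(2) * die_edge)
    return int(dies)

# Assumed (illustrative) die sizes, not actual 2006 figures:
chipset_area = 100.0   # mm^2, a typical northbridge-class chipset
gpu_area = 350.0       # mm^2, a high-end GPU of the era

for diameter in (200, 300):  # Fab 30-style vs Fab 36-style wafers
    print(f"{diameter}mm wafer: "
          f"~{gross_dies_per_wafer(diameter, chipset_area)} chipset dies vs "
          f"~{gross_dies_per_wafer(diameter, gpu_area)} GPU dies")
```

Under these assumptions a 200mm wafer yields on the order of a few hundred chipset dies but only a few dozen GPU dies, which is why filling an older fab with chipsets is far more plausible than filling it with high-end GPUs.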
AMD has already announced that, at least for the next 1 - 2 years, there will be no changes in manufacturing for AMD or ATI, meaning that ATI will continue to produce GPUs and chipsets at TSMC and UMC, while AMD will continue to produce CPUs at Fab 30, Fab 36 and Chartered. After that period of time, we'd venture a guess that AMD would start bringing some manufacturing in-house (e.g. chipsets) or start producing CPUs with integrated graphics (either on-die or on-package).
What AMD is doing with the proposed ATI acquisition is taking one step towards becoming more like Intel. Intel currently has four 200mm 130nm fabs producing chipsets (anything older than Intel's 965 chipset); those fabs would have been useless for CPU production, so the fact that Intel can get some additional life out of them by using them for chipset production helps amortize their high construction costs. When the chipsets are ready to transition from 130nm to 90nm, these fabs can then be upgraded to 90nm to get ready for the next wave of chipsets they will be producing.
While ATI will give AMD something to do with older fabs, there's also the argument that AMD could have become a chipset manufacturer on its own without having to pay $5.4B for ATI. AMD has manufactured chipsets in the past, and one would think that it would be cheaper to hire engineers and build its own chipset team than it would be to purchase ATI. Obviously there's additional value that ATI brings to the table above and beyond the chipset, so it may just be that the sum of all of ATI's advantages is what makes the acquisition sensible to AMD.
61 Comments
johnsonx - Thursday, August 3, 2006 - link
Yep, you two are both old. Older than me. Heath H8? I didn't think selling candy bars would pay for college. You actually had to build candy bars from a kit back then? Wow. ;)
Mostly the 'kids' comment was directed at your esteemed CEO, and maybe Kubicki too (who I'm well aware is with Dailytech now), and was of course 99.9% joke. Anand may be young, but he's already accomplished a lot more than many of us ever will.
Gary Key - Thursday, August 3, 2006 - link
where is the edit button... led to
PrinceGaz - Wednesday, August 2, 2006 - link
Well, according to ATI's investor relations webby and also Wikipedia, they were founded in 1985 and started by making integrated-graphics chips for the likes of IBM's PCs, and by 1987 had started making discrete graphics cards (the EGA Wonder and VGA Wonder). Yes, they quite obviously do predate the 3D revolution by many years. VGA graphics date from 1987 and no doubt the VGA Wonder was one of the first cards supporting it. I imagine the EGA Wonder card they also made in 1987 would have had the 9-pin monitor connection you mention, as that is the EGA standard (I've never used it but that's what the Wiki says).
All useless information today really, but a bit of history is worth knowing.
johnsonx - Wednesday, August 2, 2006 - link
Yep, I stuck quite a few EGA and VGA Wonder cards in 386's and 486's back then. They were great cards because they could work with any monitor. Another minor historical point: Monochrome VGA was common in those days too - better graphics ability than old Hercules Mono, but hundreds of $ less than an actual color monitor.
yacoub - Wednesday, August 2, 2006 - link
Your comment should get rated up b/c you correctly state that ATI has been around for some time. Let us also not forget that NVidia bought 3dfx, 3dfx did not simply disappear. And Matrox, while mostly focused in the graphic design / CAD market with their products, has also survived their forays into the gaming market with products like the G200 and G400. Perhaps something about basing your graphics card company in Canada is the trick? :)
johnsonx - Wednesday, August 2, 2006 - link
Well, 3dfx was dead. NVidia was just picking at the carcass. Matrox survives only because they make niche products for professional applications. Their 3D products (G200/G400/G450, Parhelia) were hotly anticipated at the time, but quickly fell flat (late to market, surpassed by the competition by the time they arrived, or very shortly after).
mattsaccount - Wednesday, August 2, 2006 - link
>>NVIDIA also understands that dining with Intel is much like dining with the devil: the food may be great but you never know what else is cooking in the kitchen.
The food in Intel's cafeteria is actually quite good :)
stevty2889 - Wednesday, August 2, 2006 - link
Not when you work nights.. it really sucks then..
dev0lution - Thursday, August 3, 2006 - link
But the menu changes so often you don't get bored ;)
NMDante - Wednesday, August 2, 2006 - link
Night folks get shafted with cafe times. That's probably why there's so many 24 hr. fast food offerings around the RR site. LOL