AMD & ATI: The Acquisition from all Points of View
by Anand Lal Shimpi on August 1, 2006 10:26 PM EST - Posted in CPUs
Our Thoughts: Will AMD manufacture ATI GPUs?
One of the most intriguing aspects of this proposed acquisition is the potential for ATI to cease to be a fabless GPU manufacturer. In the past, fabless companies like ATI and NVIDIA have had to rely on foundries like TSMC or UMC to manufacture their chip designs. By relying on a third party to manufacture all of their chips, ATI and NVIDIA avoid having to invest in multi-billion dollar fabs that require continued multi-billion dollar investments to stay up to date. As a point of reference, AMD's Fab 36 cost $2.5 billion, and that's before it has even been fully converted to 65nm.
The upside of being a fabless manufacturer is obviously that you leave manufacturing to those who are good at it; the downside is that you lose manufacturing as a competitive advantage. The other downside is that foundries like TSMC aren't as quick to transition to new process technologies as companies like AMD and Intel. While Intel has been shipping 65nm product for quite a while now, ATI and NVIDIA GPUs are still being built using 90nm transistors. If the buyout does go through, ATI could potentially gain a manufacturing advantage over NVIDIA since AMD has already invested in its own fab plants.
On the surface, the potential for ATI to have access to its own fab is tremendous; however, there are a number of limitations worth talking about. For starters, AMD isn't Intel, and AMD's manufacturing processes have always lagged behind Intel's, which is why AMD is still shipping 90nm CPUs while Intel is in the middle of its 90nm to 65nm transition. So the advantage that ATI would gain from having access to its own fab would not be as significant as if Intel were the buyer. AMD should still be slightly ahead of TSMC and UMC, but Intel is almost a full year ahead of AMD right now on process transitions.
The other problem is that AMD is significantly capacity constrained as it is. AMD currently has two fabs of its own, Fab 30 and Fab 36, both in Dresden and both currently producing 90nm CPUs. Fab 30 is set up for production on 200mm wafers while Fab 36 uses 300mm. AMD just recently started shipping revenue-generating parts through Chartered Semiconductor, a third-party foundry that is manufacturing Athlon 64 CPUs for AMD to help relieve some of its capacity constraints. So with Fab 30, Fab 36 and Chartered all working just to meet demand for AMD's CPUs, it's not as if AMD has a lot of extra capacity to use to manufacture ATI GPUs.
AMD's new fab in New York State will help deal with some of the capacity issues, but construction won't even begin until July 2007 at the earliest, and we're looking at another 3 - 5 years before it's operational. Without tons of excess capacity, it doesn't seem like ATI will be able to benefit from AMD's manufacturing facilities for the production of GPUs. Chipsets are a different story, as they would give AMD something to do with older fabs: chipsets are nowhere near the size of modern-day GPUs and are already produced on an n-1 manufacturing process (e.g. Intel's Core 2 processors are 65nm while Intel's P965 chipset is built on a 90nm process). GPUs could also be manufactured at older fab plants as newer ones are upgraded, though the sheer size of modern-day GPUs makes this an uncertain option. More than likely, the only GPUs manufactured at AMD's plants would be those integrated into chipsets or found on-die/on-package with AMD CPUs, meaning that they'd be very low end, low margin parts.
AMD has already announced that, at least for the next 1 - 2 years, there will be no changes in manufacturing at AMD or ATI, meaning that ATI will continue to produce GPUs and chipsets at TSMC and UMC, while AMD will continue to produce CPUs at Fab 30, Fab 36 and Chartered. After that period, we'd venture a guess that AMD would start bringing some manufacturing in-house (e.g. chipsets) or start producing CPUs with integrated graphics (either on-die or on-package).
What AMD is doing with the proposed ATI acquisition is taking one step towards becoming more like Intel. Intel currently has four 200mm, 130nm fabs producing chipsets (anything older than Intel's 965 chipset); those fabs would have been useless for CPU production, so the fact that Intel can get some additional life out of them through chipset production helps amortize their high construction costs. When the chipsets are ready to transition from 130nm to 90nm, these fabs can then be upgraded to 90nm to get ready for the next wave of chipsets they will be producing.
While ATI will give AMD something to do with older fabs, there's also the argument that AMD could have become a chipset manufacturer on its own without having to pay $5.4B for ATI. AMD has manufactured chipsets in the past, and one would think that it would be cheaper to hire engineers and build its own chipset team than it would be to purchase ATI. Obviously there's additional value that ATI brings to the table above and beyond chipsets, so it may just be that the sum of all of ATI's advantages is what makes the acquisition sensible to AMD.
61 Comments
HopJokey - Wednesday, August 2, 2006 - link
I beg to differ. It gets old after a while:(
Regs - Tuesday, August 1, 2006 - link
The distant future looks good, though we have yet to see any more green slides about new core technologies from AMD. It almost seems AMD will be making baby steps for the next 5 or so years to try to compete with the performance Intel is now offering. For stockholders - let's just hope AMD can pull something off to gain revenue from other markets with the help of Dell and ATi. Their growing capital spending and recent acquisition need some definite profits to pay them off.
AnandThenMan - Tuesday, August 1, 2006 - link
I think it's fair to say the article has a very strong pro-Intel and pro-NVIDIA slant. For starters, it needs to be pointed out that ATI is actually the #2 graphics maker, not NVIDIA. Saying that NVIDIA is #1 in the desktop space covers only part of the market, so why state it that way? Trying to make NVIDIA look good, of course... And this:
This statement is just dumb. Unless the planet is destroyed by an asteroid, the deal is pretty much done. It is HIGHLY unlikely that the deal will not happen.
defter - Wednesday, August 2, 2006 - link
The desktop market is a very important market since most of the profits are made in the high-end desktop market. For example, ATI has a much bigger overall market share than NVidia (27.6% vs 20.3%) and has a lot of presence in other markets (consumer electronics, handhelds). Still, NVidia has bigger revenue, meaning that the ASP of NVidia chips is much higher.
If you look at profits, the difference is even bigger: during the last quarter, NVidia made three times as much profit as ATI. Thus the high-end desktop market is definitely very important.
Here are some GPU market share numbers for Q2:
http://www.xbitlabs.com/news/video/display/2006073...
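(A minimal back-of-the-envelope sketch of the share-versus-revenue arithmetic in the comment above, in Python; the revenue and total-unit figures are hypothetical placeholders, not actual Q2 2006 financials - only the unit-share percentages are taken from the comment.)

def implied_asp(revenue_musd, unit_share, total_units_m):
    # Average selling price = revenue / units shipped (revenue in $M, units in millions).
    units_m = unit_share * total_units_m
    return revenue_musd / units_m

TOTAL_MARKET_UNITS_M = 80.0  # hypothetical total GPUs shipped in the quarter, in millions

ati_asp = implied_asp(revenue_musd=600.0, unit_share=0.276, total_units_m=TOTAL_MARKET_UNITS_M)
nv_asp = implied_asp(revenue_musd=700.0, unit_share=0.203, total_units_m=TOTAL_MARKET_UNITS_M)

print(f"ATI implied ASP:    ${ati_asp:.0f}")   # larger unit share, lower average price per chip
print(f"NVidia implied ASP: ${nv_asp:.0f}")    # smaller unit share, higher average price per chip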
PrinceGaz - Wednesday, August 2, 2006 - link
Most of the profits are not made in the high-end desktop market; in fact, the very high end probably struggles just to break even due to the relatively tiny number of units shipped compared to development costs. Most of the money in discrete graphics is actually made in the low-end segment, with cards like the 7300 and the X1300.
defter - Wednesday, August 2, 2006 - link
This is like saying: "most of the revenue is made on $100 CPUs instead of FX/Opteron parts..." The revenue can be higher at the low end of the market, but GPUs like the 7300/X1300 are selling for $20 or less, and profit margins on those can't be very high. High-end chips like the 7900/X1900 are selling for about $100 and the margins are much higher. (Compare the die sizes of the 7900 and 7300; the difference isn't THAT big.)
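(A similar rough sketch of the die-size and margin argument above; the wafer cost, die counts, yields, and ASPs below are all illustrative guesses rather than real figures.)

def gross_margin_per_chip(asp, wafer_cost, dies_per_wafer, yield_rate):
    # Per-chip margin = selling price minus the manufacturing cost of one good die.
    cost_per_good_die = wafer_cost / (dies_per_wafer * yield_rate)
    return asp - cost_per_good_die

# Hypothetical low-end part: small die, many dies per wafer, ~$20 ASP
low_end = gross_margin_per_chip(asp=20.0, wafer_cost=5000.0, dies_per_wafer=500, yield_rate=0.90)
# Hypothetical high-end part: bigger die, fewer dies per wafer, ~$100 ASP
high_end = gross_margin_per_chip(asp=100.0, wafer_cost=5000.0, dies_per_wafer=180, yield_rate=0.80)

print(f"Low-end margin per chip:  ${low_end:.2f}")   # roughly $9 with these guesses
print(f"High-end margin per chip: ${high_end:.2f}")  # roughly $65 with these guesses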
JarredWalton - Wednesday, August 2, 2006 - link
Hey, I'm a skeptic and you can blame me for the comment. Still, until the deal is well and truly done we have a proposed merger. Government interference, cold feet, whatever other setback you want... these things can and do happen. Do I think the deal *won't* happen? Nope - no more than I think the deal *will* happen. If you had asked me three months ago when I first heard the rumors, I think I would have been about 90% sure it wouldn't happen, so obviously I'm less skeptical now than before.

As for the NVIDIA and Intel slant, the NVIDIA perspective is their view. That doesn't mean it's correct, any more than the ATI, AMD, or Intel perspectives. However, ATI is #2 for the same reason Intel is #1: integrated graphics, specifically on laptops, and again we're talking about the underpowered, mediocre kind that will choke on Vista's Glass GUI. Wipe out all of the low-end GPUs, and NVIDIA has a clear lead in the market. Not in performance, necessarily, but in mindshare and brand recognition? Definitely. We are an enthusiast website, and so we're looking at the stuff that moves the market forward, not just what suffices to run office apps.
AnandThenMan - Wednesday, August 2, 2006 - link
Being #1 in one market is not good enough anymore. NVIDIA NEEDS to be in the integrated graphics sector, the ultra thin mobile sector, the console market, the HD devices market etc. etc. This is where ATI is much more diverse than NVIDIA.
The article is about the implications of AMD/ATI and how it affects Intel, NVIDIA, and the whole industry. I understand what you are saying about the discrete enthusiast market, and naturally this is the most interesting and desirable segment we all like to talk about. But the merger is about much more than that. IMO, NVIDIA has to re-invent itself to be capable of taking on AMD/ATI. NVIDIA has come out and bragged about how they are now the "last man standing," but this is marketing spin at best. NVIDIA went on record years ago saying they want to "be wherever there is a pixel," but honestly, AMD/ATI is far better positioned to deliver on that than NVIDIA, IMO.
defter - Wednesday, August 2, 2006 - link
Care to elaborate? NVidia is doing fine financially, so why does it NEED to be strongly present in those sectors?
NVidia has been in the console market since 2001.
Calin - Wednesday, August 2, 2006 - link
NVidia IS in the integrated graphics sector - if you are referring to the "enthusiast" integrated graphics sector.