AMD & ATI: The Acquisition from all Points of View
by Anand Lal Shimpi on August 1, 2006 10:26 PM EST - Posted in
- CPUs
ATI's Position
Obviously ATI is also very excited about the acquisition, but from ATI's perspective the motivations for and benefits of the acquisition are a bit different.
ATI's goal is to continue to grow at a rate of 20% per year, but maintaining that growth rate becomes increasingly difficult as an independent GPU manufacturer. The AMD acquisition will give ATI the ability to compete in areas that it hasn't before, while also giving the company the stable footing it needs to maintain aggressive growth.
From ATI's position, it's NVIDIA that is left out in the cold, as Intel is surely not going to support NVIDIA enough to be a truly great partner. ATI will have AMD, and Intel is content being fairly self-sufficient, so unless NVIDIA becomes a CPU manufacturer, its future is bleak according to ATI.
Preparing for the Inevitable Confrontation with Intel
From ATI's standpoint, it's only a matter of time before the GPU becomes general purpose enough that it could be designed and manufactured by a CPU maker. Taking the concern one step further, ATI is worried that in the coming years Intel will introduce its own standalone GPU and really turn up the heat on the remaining independent GPU makers. By partnering with AMD, ATI believes it will be better prepared for what it sees as the inevitable confrontation with Intel. From ATI's perspective, Intel is simply too strong in CPU design, manufacturing and marketing to compete against alone when that move into the GPU space occurs.
Competing with NVIDIA is Tough, this Makes it Easier
It's no surprise to anyone that competing with NVIDIA isn't easy; the easiest time ATI has had competing with NVIDIA in recent history was during the Radeon 9700 Pro days, but since then NVIDIA has really turned up the heat and currently enjoys greater desktop market share. NVIDIA also enjoys greater profit margins per GPU sold, thanks to smaller die sizes. By being acquired by AMD, ATI gets a bit of relief from the competition with NVIDIA, as well as some potential advantages. Those advantages include the potential to build and execute better AMD chipsets, as well as gaining greater overall graphics market share by shipping more platforms with integrated graphics (either on CPU or on chipset). Intel is actually the world's largest graphics manufacturer, since the vast majority of Intel systems sold ship with some form of Intel integrated graphics; through this acquisition, AMD can use ATI to do the same, which should increase ATI's overall market share.
Making Better AMD Chipsets
ATI has struggled to design, manufacture and execute a chipset that could compete with NVIDIA's nForce line. To date, ATI has come close but has not been able to close the deal, despite trying for years. In theory, with better access to AMD engineers and designers, the ability to leverage AMD's IP (e.g. CrossFire implemented over HyperTransport) and eventually the use of AMD's fabs, ATI could design a truly competitive platform for AMD processors. As long as the product is decent, AMD would also be able to significantly increase sales simply by offering attractive platform bundles similar to what Intel does today. Whether the approach is more similar to Centrino, where AMD requires that you purchase only AMD silicon, or more like how Intel does business on the desktop side, where AMD makes sure that only its chipsets are available at launch, has yet to be seen.
The Manufacturing & Design Advantage
Currently both ATI and NVIDIA have to turn to third party manufacturers to produce both their chipsets and GPUs. If this acquisition were to go through, AMD could eventually begin manufacturing some chipsets or GPUs for ATI. By manufacturing components in house, ATI would be able to enjoy a cost advantage over competing NVIDIA products (especially if ATI is simply using leftover capacity at older fabs that are awaiting transition to smaller manufacturing processes). ATI could potentially begin to release GPUs using newer process technologies before the competition as well, reducing die size and increasing clock speeds at the same time.
Manufacturing aside, there's also this idea that companies like AMD and Intel are better at designing silicon because they work on a more granular level with the design. There's far more custom logic in Intel's Core 2 Duo than in NVIDIA's GeForce 7900 GTX; ATI would gain access to AMD's entire portfolio of custom logic and may be able to implement some of it in its upcoming GPUs, giving ATI a performance and efficiency advantage over NVIDIA.
It Makes Financial Sense
Of course the acquisition itself is very beneficial to ATI's investors, as the deal is mostly cash and thus little risk is assumed by ATI's investors. ATI's stock has been doing quite well since the announcement, and why shouldn't it? The #2 x86 microprocessor maker wants to buy ATI.
What about Intel Chipsets?
Currently 60-70% of ATI's chipset revenues come from Intel platforms, but ATI expects that number to decline significantly over the coming months. While the current six-month roadmap won't change, beyond that ATI is not counting on strong support from Intel, so ATI will begin focusing its efforts exclusively on AMD platforms at that point. If Intel wants ATI chipsets, ATI will supply them. And if you're wondering, CrossFire will continue to work on Intel chipsets.
Keep in mind that when we say 60-70% of ATI's chipset revenues come from Intel platforms, that doesn't actually mean ATI is selling a ton of chipsets. ATI accounts for slightly less than 10% of Intel platform chipsets sold recently, and about one fourth of AMD platform chipsets. However, even though ATI sells a decent number of chipsets, the quality of its chipsets has been considered something of a distant third place, with Intel and NVIDIA in the lead. ATI could lose all of its Intel chipset sales and still come out ahead if it can become the dominant chipset maker for AMD platforms.
61 Comments
johnsonx - Thursday, August 3, 2006 - link
Yep, you two are both old. Older than me. Heath H8? I didn't think selling candy bars would pay for college. You actually had to build candy bars from a kit back then? Wow. ;)
Mostly the 'kids' comment was directed at your esteemed CEO, and maybe Kubicki too (who I'm well aware is with Dailytech now), and was of course 99.9% joke. Anand may be young, but he's already accomplished a lot more than many of us ever will.
Gary Key - Thursday, August 3, 2006 - link
where is the edit button... led to
PrinceGaz - Wednesday, August 2, 2006 - link
Well, according to ATI's investor relations webby and also Wikipedia, they were founded in 1985 and started by making integrated-graphics chips for the likes of IBM's PCs, and by 1987 had started making discrete graphics cards (the EGA Wonder and VGA Wonder). Yes, they quite obviously do predate the 3D revolution by many years. VGA graphics date from 1987 and no doubt the VGA Wonder was one of the first cards supporting it. I imagine the EGA Wonder card they also made in 1987 would have had the 9-pin monitor connection you mention, as that is the EGA standard (I've never used it but that's what the Wiki says).
All useless information today really, but a bit of history is worth knowing.
johnsonx - Wednesday, August 2, 2006 - link
Yep, I stuck quite a few EGA and VGA Wonder cards in 386's and 486's back then. They were great cards because they could work with any monitor. Another minor historical point: monochrome VGA was common in those days too - better graphics ability than old Hercules Mono, but hundreds of $ less than an actual color monitor.
yacoub - Wednesday, August 2, 2006 - link
Your comment should get rated up b/c you correctly state that ATI has been around for some time. Let us also not forget that NVidia bought 3dfx; 3dfx did not simply disappear. And Matrox, while mostly focused in the graphic design / CAD market with their products, has also survived their forays into the gaming market with products like the G200 and G400. Perhaps something about basing your graphics card company in Canada is the trick? :)
johnsonx - Wednesday, August 2, 2006 - link
Well, 3dfx was dead. NVidia was just picking at the carcass. Matrox survives only because they make niche products for professional applications. Their 3D products (G200/G400/G450, Parhelia) were hotly anticipated at the time, but quickly fell flat (late to market, surpassed by the competition by the time they arrived, or very shortly after).mattsaccount - Wednesday, August 2, 2006 - link
>>NVIDIA also understands that dining with Intel is much like dining with the devil: the food may be great but you never know what else is cooking in the kitchen.
The food in Intel's cafeteria is actually quite good :)
stevty2889 - Wednesday, August 2, 2006 - link
Not when you work nights... it really sucks then...
But the menu changes so often you don't get bored ;)
NMDante - Wednesday, August 2, 2006 - link
Night folks get shafted with cafe times. That's probably why there are so many 24 hr. fast food offerings around the RR site. LOL