AMD & ATI: The Acquisition from all Points of View
by Anand Lal Shimpi on August 1, 2006 10:26 PM EST
Posted in: CPUs
ATI's Position
Obviously ATI is also very excited about the acquisition, but from ATI's perspective the motivations for and benefits of the acquisition are a bit different.
ATI's goal is to continue growing at a rate of 20% per year, but maintaining that growth rate becomes increasingly difficult as an independent GPU manufacturer. The AMD acquisition will give ATI the ability to compete in areas it hasn't before, while also giving the company the stable footing it needs to maintain aggressive growth.
From ATI's position, it's NVIDIA that is left out in the cold, as Intel is surely not going to support NVIDIA enough to be a truly great partner. ATI will have AMD, and Intel is content being largely self-sufficient, so unless NVIDIA becomes a CPU manufacturer, its future is bleak, according to ATI.
Preparing for the Inevitable Confrontation with Intel
From ATI's standpoint, it's only a matter of time before the GPU becomes general purpose enough that it could be designed and manufactured by a CPU maker. Taking the concern one step further, ATI worries that in the coming years Intel will introduce its own standalone GPU and really turn up the heat on the remaining independent GPU makers. By partnering with AMD, ATI believes it will be better prepared for what it sees as an inevitable confrontation with Intel. From ATI's perspective, Intel is too strong in CPU design, manufacturing and marketing to compete against when that move into the GPU space occurs.
Competing with NVIDIA is Tough; this Makes it Easier
It's no surprise to anyone that competing with NVIDIA isn't easy; the easiest time ATI had competing with NVIDIA in recent history was back during the Radeon 9700 Pro days, but since then NVIDIA has really turned up the heat and currently enjoys greater desktop market share. Not only does NVIDIA have greater desktop market share, it also enjoys greater profit margins per GPU sold thanks to smaller die sizes. By being acquired by AMD, ATI gets a bit of relief from the competition with NVIDIA, as well as some potential advantages. Those advantages include the potential to build and execute better AMD chipsets, as well as gaining greater overall graphics market share by shipping more platforms with integrated graphics (either on the CPU or in the chipset). Intel is actually the world's largest graphics manufacturer, since the vast majority of Intel systems sold ship with some form of Intel integrated graphics; through this acquisition, AMD can use ATI to do the same, which should increase ATI's overall market share.
Making Better AMD Chipsets
ATI has struggled to design, manufacture and execute a chipset that could compete with NVIDIA's nForce line. To date, ATI has come close but has not been able to close the deal, despite trying for years. In theory, with better access to AMD engineers and designers, the ability to leverage AMD's IP (e.g. CrossFire implemented over HyperTransport) and eventually the ability to use AMD's fabs, ATI could design a truly competitive platform for AMD processors. As long as the product is decent, AMD would also be able to significantly increase sales by simply offering attractive platform bundles similar to what Intel does today. Whether the approach ends up more like Centrino, where AMD requires that you purchase only AMD silicon, or more like how Intel does business on the desktop side, where AMD simply makes sure that only its chipsets are available at launch, has yet to be seen.
The Manufacturing & Design Advantage
Currently both ATI and NVIDIA have to turn to third party manufacturers to produce both their chipsets and GPUs. If this acquisition were to go through, AMD could eventually begin manufacturing some chipsets or GPUs for ATI. By manufacturing components in house, ATI would be able to enjoy a cost advantage over competing NVIDIA products (especially if ATI is simply using leftover capacity at older fabs that are awaiting transition to smaller manufacturing processes). ATI could potentially begin to release GPUs using newer process technologies before the competition as well, reducing die size and increasing clock speeds at the same time.
Manufacturing aside, there's also the idea that companies like AMD and Intel are better at designing silicon because they work at a more granular level of the design. There's far more custom logic in Intel's Core 2 Duo than in NVIDIA's GeForce 7900 GTX; ATI would gain access to AMD's entire portfolio of custom logic and may be able to implement some of it in upcoming GPUs, giving ATI a performance and efficiency advantage over NVIDIA.
It Makes Financial Sense
Of course the acquisition itself is very beneficial to ATI's investors, as the deal is mostly cash and thus little risk is assumed by ATI's shareholders. ATI's stock has been doing quite well since the announcement, and why shouldn't it? The #2 x86 microprocessor maker wants to buy ATI.
What about Intel Chipsets?
Currently 60-70% of ATI's chipset revenues come from Intel platforms, but ATI expects that number to decline significantly over the coming months. While the current six-month roadmap won't change, beyond that ATI is not counting on much support from Intel, so it will begin focusing its chipset efforts exclusively on AMD platforms. If Intel wants ATI chipsets, ATI will supply them. And if you're wondering, CrossFire will continue to work on Intel chipsets.
Keep in mind that when we say 60-70% of ATI's chipset revenues come from Intel platforms, that doesn't actually mean ATI is selling a ton of chipsets. ATI accounts for slightly less than 10% of Intel platform chipsets sold recently, and about one fourth of AMD platform chipsets. However, even though ATI sells a decent number of chipsets, the quality of its chipsets has generally been considered a distant third, with Intel and NVIDIA in the lead. ATI could lose all of its Intel chipset sales and still come out ahead if it can become the dominant chipset supplier for AMD platforms.
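The break-even arithmetic behind that last sentence is easy to sketch. The short Python example below is a back-of-envelope illustration only: the total platform volume, the Intel/AMD platform split, and the average selling price are hypothetical placeholders assumed for illustration, while the attach rates (just under 10% of Intel platforms, roughly one fourth of AMD platforms) come from the paragraph above.

```python
# Back-of-envelope sketch of the chipset revenue argument above.
# All absolute numbers are hypothetical; only the attach rates
# (just under 10% of Intel platforms, ~25% of AMD platforms) come from the article.

TOTAL_PLATFORMS = 100_000_000     # hypothetical annual x86 platform volume
INTEL_PLATFORM_SHARE = 0.80       # assumed Intel vs. AMD platform split
AMD_PLATFORM_SHARE = 1.0 - INTEL_PLATFORM_SHARE
ASP = 20.0                        # assumed average selling price per chipset, USD

def ati_chipset_revenue(intel_attach: float, amd_attach: float) -> float:
    """Revenue from ATI chipsets given its attach rate on each platform."""
    intel_units = TOTAL_PLATFORMS * INTEL_PLATFORM_SHARE * intel_attach
    amd_units = TOTAL_PLATFORMS * AMD_PLATFORM_SHARE * amd_attach
    return (intel_units + amd_units) * ASP

# Today (per the article): just under 10% of Intel platforms, ~25% of AMD platforms.
today = ati_chipset_revenue(intel_attach=0.09, amd_attach=0.25)

# Post-acquisition scenario: all Intel-platform sales lost, but ATI becomes
# the dominant AMD-platform chipset (say 70% attach).
post = ati_chipset_revenue(intel_attach=0.0, amd_attach=0.70)

print(f"today: ${today:,.0f}  post-acquisition: ${post:,.0f}")
```

Under these placeholder assumptions the post-acquisition scenario edges out today's revenue, and the break-even AMD-platform attach rate works out to roughly 60%, which is why becoming the dominant AMD chipset supplier matters so much to this argument.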
61 Comments
jjunos - Wednesday, August 2, 2006 - link
I believe that the high % ATI is getting here is because of a shortage of chipsets that Intel was experiencing in the last quarter; as the article states, ATI chipset market share increased 400%. As such, I wouldn't necessarily take this breakdown as a permanent future market share split.
JarredWalton - Wednesday, August 2, 2006 - link
I added some clarification (the quality commentary). Basically, Intel is the king of Intel platform chipsets, and NVIDIA rules the AMD platform. ATI sells more total chipsets than NVIDIA at present, but a lot of those go into laptops, and ATI chipset performance on Intel platforms has never been stellar. Then again, VIA chipset performance on Intel platforms has never been great, and they still provide 15% of all chipsets sold.
Of course, the budget sector ships a TON of systems, relatively speaking. That really skews the numbers. Low average profit, lower quality, lower performance, but lots of market share. That's how Intel remains the #1 GPU provider. I'd love to see numbers showing chipset sales if we remove all low-end "budget" configurations. I'm not sure if VIA or SiS would even show up on the charts if we only count systems that cost over $750.
jones377 - Wednesday, August 2, 2006 - link
Before Nvidia bought ULi they had about 5% marketshare IIRC. And I bet of those current 9% Nvidia share, ULi still represents a good portion (probably on the order of 3-4%), and it's all in the low-end. For some reason, Nvidia really sucks at striking OEM deals compared to ATI, which has historically been very good at it. The fact that Intel is selling motherboards with ATI chipsets is really helping ATI though.
According to my previous link, ATI has 28% of the AMD platform market despite being a relative newcomer there compared to Nvidia. I think even without this buyout ATI would have continued to grow this share; now it's a certainty. I think Nvidia also realised this a while back, because they started pushing their chipsets for the Intel platform much more.
Still, Nvidia will have an uphill struggle in the long term. If Intel chooses a new chipset partner in the future (they might just go it alone instead), they are just as likely to pick SiS or VIA (though I doubt they will go VIA) over Nvidia, and SiS already has a massive marketshare advantage over Nvidia there (well, since Nvidia has almost none, everyone has). So while ATI will likely lose most/all of their Intel chipset marketshare eventually, I doubt Nvidia will gobble up all of that. They will face stiff competition from Intel, SiS and VIA, in that order. The one bright spot for Nvidia is that they should continue to hold on to the profitable high-end chipset market for the AMD platform and grow it for the Intel platform. Still, overall this market is very small.
And let's not even mention the mobile market... Intel has that one all gobbled up for itself with the Centrino brand, and this AMD/ATI deal will ensure that AMD has something similar soon. Given the high power consumption of Nvidia's chipsets, they are already unsuited for the mobile market; even if they come out with an optimised mobile chipset, their window of opportunity there is all but gone now.
Sunrise089 - Wednesday, August 2, 2006 - link
In terms of style, I don't think the initial part of the article (the part with each company's slant) was very clear. Was it pure PR speak (which is how the NVIDIA part read) or AT's targeted analysis (how the Intel part read)?
Second, I think quotes like this:
"Having each company operate entirely independently makes no sense, since we've already discussed that it's what these two can do together that makes this acquisition so interesting."
continue to show you guys are great tech experts, but may also suggest that you guys aren't the best business writers on the web (not saying I am either, of course). A lot of companies are acquired by another company solely for the purpose of making money, not for any sort of integration or creation of a competitive advantage. If a company is perceived to be undervalued, and another company feels it's currently in a good financial situation, it's a smart move to spend the $$$ on an acquisition if you feel the acquired company's long-term growth may outpace that of the currently wealthier company. Yahoo and Google do this sort of thing all the time, and big companies like Berkshire Hathaway do it as well. Do you think Warren Buffett really wanted to invest in Gillette to allow his employees to obtain cheaper shaving products? No, he simply felt he had money available and buying a share of Gillette would net him money in the long term.
If the ATI/AMD merger only creates the possibility of major collaboration between the two companies in the future (basically as a hedge for both companies against unexpected or uncertain changes in the marketplace), but ATI continues to turn a profit when seen as an individual corporate entity, then the acquisition was the correct thing to do so long as AMD had no better use for the $$$ it will spend on the purchase.
defter - Wednesday, August 2, 2006 - link
If the deal goes through, ATI won't be an individual corporate entity. There will be just one company, and it will be called "AMD".
That's why Anand's comment makes sense. It would be quite silly, for example, for the marketing teams to operate independently. Imagine: first AMD's team goes to an OEM to sell the CPU, then the "ATI" team goes to the same OEM to sell the GPU/chipset. Isn't it much better to combine those teams so they can offer CPU+GPU/chipset to the OEM at once?
jjunos - Wednesday, August 2, 2006 - link
I can't see ATI simply throwing away their name. They've spent way too much money and time building up their brand, why throw it away now?
Sunrise089 - Wednesday, August 2, 2006 - link
I'm not sold on the usefulness of combining the marketing at all. Yes, to OEMs you would obviously do it, but why in the retail channel? ATI has massive brand recognition; AMD has none in the GPU marketplace. Even if the teams are the same people, using the ATI name would not at all be a ridiculous notion. Auto manufacturers do this all the time: Ford owns Mazda, and for economies-of-scale purposes builds the Escape and Tribute at the same plant. When selling to a rental company, both would be sold through corporate fleet sales, but when sold to the public, Ford and Mazda products are marketed completely independently of one another.
Once again, even if both companies are owned by AMD, it is not impossible to keep the two divisions fairly distinct, and that's where my "when seen as an individual corporate entity" comment came from.
johnsonx - Wednesday, August 2, 2006 - link
Obviously you are referring to ATI and NVIDIA. The 3D revolution certainly did spawn NVIDIA, but my recollection says that ATI has been around far longer than that. I think I still have an ATI Mach32 EISA-bus graphics card in a box somewhere, and that was hardly ATI's first product. ATI products even predate the 2D graphics accelerator, and even predate VGA if I recall correctly (anyone seen any 9-pin monitor plugs lately?). I do suppose your statement is correct in the sense that there were far more graphics chip players in the market back then; today there really are just two giants and about three also-rans (Matrox, SiS, XGI). ATI was certainly one of the big players back then; indeed, it took me (and ATI too, for that matter) quite some time to figure out that in the 3D market ATI was a mere also-ran itself for a while; the various 3D Rage chips were rather uncompetitive versus the Voodoo and TNT series of the time.
No offense to you kids who write for AT, but I actually remember the pre-3D days. I sold and serviced computers with Hercules Monochrome graphics adapters, IBM CGA and EGA cards, etc. The advent of VGA was a *BIG DEAL*, and it took quite some time before it was at all common, as it was VERY expensive. I remember many of ATI's early VGA cards had mouse ports on them too (and shipped with a mouse), since the likely reason to even want VGA was to run Aldus Pagemaker which of course required a mouse (at least some versions of it used a run-time version of Windows 1.0... there was also a competing package that used GEM, but I digress).
To make a long story short, in turn by making it even longer, ATI was hardly 'spawned' by the 3D revolution.
Now I'll just sit back and wait for the 'yeah, well I remember farther back than you!' flames, along with the 'shut up geezer, no one cares about ancient history!' flames.
Wesley Fink - Wednesday, August 2, 2006 - link
Not everyone at AT is a kid. My three children are all older than our CEO, and Gary Key has been around in the notebook business since it started. If I recall, I was using a CP/M-based Cromemco when Bill Gates was out pushing DOS as a cheaper alternative to expensive CP/M. I also had every option possible in my earlier TI-99/4A expansion box. There may even be a Sinclair in a box in the attic, next to the first Apple.
You are correct in that ATI predates 3D and had been around eons before NVIDIA burst on the scene with the TNT. I'm teaching this to my grandchildren so they won't grow up assuming, like some of our readers, that anyone older than 25 is computer illiterate. All my kids are in the computer business and they all still call me for advice.
Gary Key - Thursday, August 3, 2006 - link
I am older than dirt. I remember building and selling Heath H8 kits to pay for college expenses. The days of programming in Benton Harbor Basic and then moving up to HDOS and CP/M were exciting times, LOL. My first computer science course allowed me to learn the basics to program the H10 paper tape reader and punch unit, and I sold a number of units into the local Safeway stores with the H9 video/modem (1200 baud) kit. A year later I upgraded everyone to the H-17 drive units, dual 5.25" floppy drives ($975) that required 16K ($375) of RAM to operate (the base machine had 4K of RAM).
Anyway, NVIDIA first started with the infamous NV1 (VRAM) or STG2000 (DRAM) cards, which featured 2D/3D graphics and an advanced audio engine (it far exceeded Creative Labs' offerings). Of course, Microsoft failed to support Quadratic Texture Maps in the first version of Direct3D, which effectively killed the cards. I remember having to dispose of several thousand Diamond EDGE 3D cards at a former company. They rebounded of course with the RIVA 128 (after spending a lot of time on the ill-fated NV2 for Sega, but it paid the bills) and the rest is history.
While ATI predated most graphics manufacturers, they were still circling the drain from a consumer standpoint and starting to lose OEM contracts (except for limited Rage Pro sales due to multimedia performance) in 1997, when they acquired Tseng Labs. Thanks to those engineers, the Rage 128 became a big OEM hit in 1998/1999; although driver performance on the consumer 3D side was still terrible even though the hardware was competitive, it led to yet another OEM hit, the Radeon 64. The biggest break came in 2000 when they acquired ArtX, and a couple of years later we had the R300, aka the Radeon 9700, and the rest is history. If S3 had not failed so badly with driver support and buggy hardware releases in late 1998 with the Savage 3D, ATI very well could have gone the way of Tseng, Trident, and others, as S3 was taking in significant OEM revenue from the Trio and ViRGE series chipsets.
Enough old fart history for tonight, back to work..... :)