AMD sent us word that tomorrow they will be hosting a Livecast celebrating 30 years of graphics and gaming innovation. Thirty years is a long time, and certainly we have a lot of readers who weren't even around when AMD had its beginnings. Except we're not really talking about the founding of AMD itself; the company got its start in 1969. It appears this is instead a celebration of their graphics division, formerly ATI, which was founded in… August, 1985.

AMD is apparently looking at a year-long celebration of the company formerly known as ATI, Radeon graphics, and gaming. While they're being a bit coy about the exact contents of the Livecast, we do know that three game developers will be participating along with a live overclocking event. If we're lucky, maybe AMD even has a secret product announcement planned, but if so they haven't provided any details. And while we can now look forward to a year of celebrating AMD graphics, most likely capped off with a big wrap-up party come next August, why not begin with a brief look at where AMD/ATI started and where they are now?

[Image: Commodore 64 computer. Source: Wikimedia, Evan-Amos]

I'm old enough that I may have owned one of ATI's first products, as I began my career as a technology enthusiast (addiction might be the better word) way back in the hoary days of the Commodore 64. While the C64 started shipping a few years before ATI was founded, Commodore was one of ATI's first customers, and it was largely responsible for an infusion of money that kept ATI going in the early days.

By 1987, ATI began moving into the world of PC graphics with their "Wonder" brand of chips and cards, starting with an 8-bit PC/XT-based board supporting monochrome or 4-color CGA output. Over the next several years ATI would move to EGA (640x350 with an astounding 16 colors) and VGA (16-bit ISA and 256 colors). If you wanted a state-of-the-art video card like the ATI VGA Wonder in 1988, you were looking at $500 for the 256K model or $700 for the 512K edition. But all of this is really old stuff; where things start to become interesting is the early '90s, with the launch and growing popularity of Windows 3.0.

[Image: ATI Mach 8 ISA card. Source: Wikimedia, Misterzeropage]

The Mach 8 was ATI's first true graphics processor. It could offload 2D graphics functions from the CPU and render them independently, and at the time it was one of the few video cards that could do this. Sporting 512K-1MB of memory, it was still an ISA card (or available in MCA form if you happened to own an IBM PS/2).

Two years later the Mach 32 came out, ATI's first 32-bit capable chip, with support for ISA, EISA, MCA, VLB, and PCI slots. The Mach 32 shipped with either 1MB or 2MB of DRAM/VRAM and added high color (15-bit/16-bit) and later True Color (the 24-bit color we're still mostly using today) to the mix, along with a 64-bit memory interface. Two years after that came the Mach 64, which brought support for up to 8MB of DRAM, VRAM, or the new SGRAM. Later variants of the Mach 64 also started including 3D capabilities (and were rebranded as Rage, see below), and we're still not even in the "modern" era of graphics chips yet!


[Image: Rage Fury MAXX]

Next in line was the Rage series, ATI's first line of graphics chips built with 3D acceleration as one of the key features. We could talk about competing products from 3dfx, NVIDIA, S3, and others here, but let's just stick with ATI. The Rage line appropriately began with the 3D Rage I in 1996, which was mostly an enhancement of the Mach64 design with 3D support added. The 3D Rage II was another Mach64-derived design, with up to twice the performance of the 3D Rage. The Rage II also found its way into some Macintosh systems, and while it was initially a PCI part, the Rage IIc later added AGP support.

That was followed by the Rage Pro, which is when graphics chips first started handling geometry processing (circa 1998 with DirectX 6.0 if you're keeping track), and you could get the Pro cards with up to 16MB of memory. There were also low-cost variations of the Rage Pro in the Rage LT, LT Pro, and XL models. The Rage XL may hold the distinction of being one of the longest-used graphics chips in history; even in 2005 or thereabouts there were many servers still shipping with that chip on the motherboard providing graphics output. In 1998 ATI released the Rage 128 with AGP 2X support and up to 32MB of RAM (the enhanced Rage 128 Pro added AGP 4X support among other things a year later). The Rage 128 Ultra even supported 64MB in its top configuration, but that wasn't the crowning achievement of the Rage series. No, the biggest achievement for Rage was the Rage Fury MAXX, ATI's first dual-GPU card, which used alternate frame rendering to provide up to twice the performance.
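
To make the alternate frame rendering (AFR) idea a bit more concrete, here's a minimal Python sketch of how the work gets divided: whole frames are handed to the two GPUs in round-robin fashion, so each chip only has to render every other frame. This is purely illustrative, not ATI driver code, and the GPU labels and frame count are made up.

    # Minimal sketch of alternate frame rendering (AFR) scheduling.
    # Hypothetical GPU labels; a real driver also has to handle frame
    # pacing and synchronization, which is glossed over here.
    from itertools import cycle

    def afr_schedule(num_frames, gpus=("GPU0", "GPU1")):
        """Assign each frame to the next GPU in turn (round-robin)."""
        gpu_cycle = cycle(gpus)
        return [(frame, next(gpu_cycle)) for frame in range(num_frames)]

    if __name__ == "__main__":
        for frame, gpu in afr_schedule(6):
            print(f"frame {frame} -> {gpu}")
        # With two GPUs, each chip renders every other frame, which is why
        # AFR can approach (but rarely reach) twice the frame rate.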


[Image: Radeon 9700 Pro]

And last but not least we finally enter the modern era of ATI/AMD video cards with the Radeon line. Things start to get pretty dense in terms of releases at this point, so we'll mostly gloss over things and just hit the highlights. The first-generation Radeon brought support for DirectX 7 features, the biggest being hardware transform and lighting (T&L), basically a way of offloading additional geometry calculations from the CPU. The second-generation Radeon chips (sold as the Radeon 8000 series and some of the lower-numbered 9000 models) added DirectX 8 support, marking the first appearance of programmable pixel and vertex shaders in ATI's GPUs.
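
For anyone who never dealt with the fixed-function pipeline, here's a minimal Python sketch of the kind of per-vertex math that hardware T&L moved off the CPU: a 4x4 matrix transform of the vertex position plus a simple diffuse (Lambertian) lighting term. The matrix, vertex, and light values below are made-up illustrative numbers, not anything pulled from an actual Radeon driver.

    # Minimal sketch of fixed-function transform and lighting (T&L) per vertex.
    # All values are illustrative; a real pipeline uses a combined
    # world-view-projection matrix and normalized vectors.

    def transform(matrix, vertex):
        """Multiply a 4x4 row-major matrix by a homogeneous (x, y, z, w) vertex."""
        return tuple(sum(matrix[row][col] * vertex[col] for col in range(4))
                     for row in range(4))

    def diffuse(normal, light_dir, light_color=(1.0, 1.0, 1.0)):
        """Lambertian lighting: color scaled by max(0, N . L)."""
        n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
        return tuple(c * n_dot_l for c in light_color)

    if __name__ == "__main__":
        mvp = [[1, 0, 0, 1],   # translate x by 1
               [0, 1, 0, 2],   # translate y by 2
               [0, 0, 1, 3],   # translate z by 3
               [0, 0, 0, 1]]
        position = (0.5, 0.5, 0.0, 1.0)
        normal = (0.0, 0.0, 1.0)
        light_dir = (0.0, 0.0, 1.0)  # light shining straight at the surface
        print("transformed position:", transform(mvp, position))
        print("diffuse color:", diffuse(normal, light_dir))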

Perhaps the title of best of the Radeon breed goes to the R300 line, with the Radeon 9600/9700/9800 series cards delivering DirectX 9.0 support and, more importantly, holding onto a clear performance lead over chief competitor NVIDIA for nearly two solid years! It's a bit crazy to realize that we're now into our tenth (or eleventh, depending on how you want to count) generation of Radeon GPUs, and while the overall performance crown is often hotly debated, one thing is clear: games and graphics hardware wouldn't be where they are today without the input of AMD's graphics division!

That's a great way to finish things off, and tomorrow I suspect AMD will have much more to say on the subject of the changing landscape of computer graphics over the past 30 years. It's been a wild ride, and when I think back to the early days of computer games and then look at modern titles, it's pretty amazing. It's also interesting to note that people often complain about spending $200 or $300 on a reasonably high-performance GPU, when the reality is that the top-performing video cards have often cost several hundred dollars; I remember buying an early 1MB True Color card for $200 back in the day, and that was nowhere near the top-of-the-line offering. The amount of compute performance we can now buy for under $500 is awesome, and I can only imagine what the advances of another 30 years will bring us. So, congratulations to AMD on 30 years of graphics innovation, and here's to 30 more years!

Source: AMD Livecast Announcement

Comments

  • atlantico - Saturday, August 23, 2014

    Well done people of AMD, 30 years in tech is very impressive, few make it that far and stay on the cutting edge like you have done. Here's to another 30.
  • HisDivineOrder - Sunday, August 24, 2014

    So, uh, when is your nVidia retrospective that ignores any discussion of the competition in the same timeframe? ;)

    Ahem. Wow, look at all the red buttons and titling and whats-it's.

    AMD must be throwing all the money they used to put toward new R&D (not trickling out GPU designs completed pre-1Q2013) into their Anandtech Red Section Payment Department.
