
Two CPU Designers Who Changed The World

If quotation frequency were a measure of significance, Gordon Moore would surely be the most important semiconductor engineer in history. Moore's Law – which observes that the number of transistors on a chip doubles roughly every two years – has been Silicon Valley canon for 40 years.

However, Moore’s Law has nothing to do with engineering and everything to do with marketing. The cynical view of Moore’s saying is that technology is growing so fast that you have to keep upgrading. Unless you are reading this on the very first computer you ever owned, you are proof of that truism.

Despite Moore's noteworthy accomplishments, lesser-known men have been far more important players in CPU history. Some have achieved great success and significant stature in academia, yet are known only to students of the trade. Among those unsung (or less-often-sung) heroes is David Patterson, the U.C. Berkeley professor who led the development of the RISC architecture and did pioneering research on RAID storage. Patterson's book Computer Architecture: A Quantitative Approach has been standard issue for computer science students for years. He co-wrote it with John Hennessy, who was a Stanford computer science professor when the book first appeared and is now the acclaimed university's president.

While engineering is never a one-man show, two engineers at competing companies drove the evolution of the x86 architecture at their respective employers: Pat Gelsinger and Derrick "Dirk" Meyer. Gelsinger was a 29-year veteran of Intel Corp., starting with the company right out of high school, before he departed for EMC; currently he is CEO of VMware. Meyer began as an engineer with Digital Equipment Corp. (DEC) before jumping to Advanced Micro Devices and becoming its CEO. He left in 2011 and is currently enjoying time off from the grind of the semiconductor world.

Let me explain why I think these gentlemen deserve more attention.

After more than 30 years, the x86 architecture continues to grow, even as most technologies go obsolete within a decade. That's because it is an additive technology: Intel kept bolting new capabilities onto its CPU architecture, but it never took the old ones away.

“On the original 8086 there was no way to context switch. You couldn't run a multiprocessing OS on it,” said John Culver, who runs the CPUShack Museum. “Then Intel added protected mode on the 286 and 386 for multiprocessing. That's how x86 and other architectures have morphed.”
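That additive design is still visible to software today. As a rough illustration (my sketch, not Culver's), a program can ask an x86 chip which of those accumulated extensions it supports through the CPUID instruction, which Intel itself added to the architecture in the early 1990s. A minimal example in C, assuming a GCC or Clang toolchain on an x86 machine:

```c
/* Probe CPUID leaf 1 and report a few of the feature flags that have
   accumulated in x86 over the years. Bit positions follow Intel's manuals. */
#include <stdio.h>
#include <cpuid.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID leaf 1 not supported");
        return 1;
    }

    /* Each later extension is just another flag; the old ones never go away. */
    printf("MMX  : %s\n", (edx & (1u << 23)) ? "yes" : "no");
    printf("SSE  : %s\n", (edx & (1u << 25)) ? "yes" : "no");
    printf("SSE2 : %s\n", (edx & (1u << 26)) ? "yes" : "no");
    printf("AVX  : %s\n", (ecx & (1u << 28)) ? "yes" : "no");
    return 0;
}
```

A recent chip will typically answer yes to all of these while still powering up in the 16-bit real mode of the original 8086 – exactly the accumulation Culver describes.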

No one would argue that x86 is the superior architecture, but it was the most supported, and that has led to its success as well. “You have a design that they came up with in 1978 and IBM used it in the IBM PC, so you ended up with developers that knew how to code x86 assembly,” said Culver. “So you ended up with such a huge amount of existing code and programmers that knew how to use it; and those people aren't going to change their habits very much.”

Pat Gelsinger: the x86’s public face

Gelsinger is the more famous and recognized of the two engineers because he was with Intel for so long and was often its public face, especially after the retirement of Intel’s charismatic CEO, Andy Grove.

Gelsinger joined the company in 1979 right out of high school. The recruiter's note on him is somewhat legendary: "Smart, arrogant, aggressive – he’ll fit right in." He did. Gelsinger worked his way up through the company’s ranks, and also through school. Intel would cover his tuition so long as he maintained a B average.

While working on the 80286 project, Gelsinger earned a B.S. in electrical engineering, and he earned a master's degree from Stanford while working on the 386. When Gelsinger was 25, Andy Grove offered to let him lead the 80486 project to keep him from quitting the company; Gelsinger had wanted to go to Stanford full time to earn a PhD in electrical engineering.

Back when IBM outsourced the CPU for its Personal Computer to Intel, it was leery of relying on a single vendor for so vital a part. So Intel was forced to strike a deal with AMD to act as a second source for its 8086 chips. IBM's goal at the time was to get as many players into the PC market as possible, so it did not pursue legal action against vendors that reverse-engineered the 8086 or the BIOS.

Before Intel knew it, it was competing with AMD, Cyrix, NEC, IBM, and Fujitsu on x86 chips. It faced the very real threat of bankruptcy, but had pinned its hopes on a new 32-bit processor, the iAPX 432.

Most of Intel's top talent worked on the iAPX 432 project, but Gelsinger and John Crawford worked on a side project, the 80386. It's a good thing, too, because the iAPX 432 was a dud. While it was ambitious and attempted many things modern CPUs do – including object-oriented, capability-based memory, garbage collection, multitasking, interprocess communication, and multiprocessing – its performance was atrocious.

It was the two junior engineers, Gelsinger and Crawford, who saved Intel by giving the company a fallback technology, the 80386. “Intel management told those two, ‘You figure out how to keep the 286 alive as an interim product.’ Then the 432 imploded and the 386 had to carry the company forward. Fortunately, because of the extensions and improvements they made, it did a pretty good job of keeping Intel in the market," said Nathan Brookwood, research fellow with Insight 64, who follows the semiconductor market.

As lead of the 486 project, Gelsinger introduced features such as an on-chip L1 cache built from SRAM and an integrated math co-processor. The 486 family would also be the first to use clock doubling, where the CPU's internal clock runs faster than the I/O bus – the 486DX2-66, for example, ran its core at 66MHz on a 33MHz bus.

After that project, Gelsinger moved up into management, with more good results than bad. He ran the division that produced the Pentium Pro and other CPUs, but stumbled with Intel's attempt at a video conferencing product called ProShare. As CTO, Gelsinger created the Intel Developer Forum, an event which now attracts thousands every year at its multiple global locations.

Many industry pundits felt Gelsinger would become CEO of the company, and he made it quite clear he wanted the job. However, Gelsinger suddenly left the company in 2009 for the Chief Operating Officer position with EMC.

Dirk Meyer: Mr. 64-bit

Meyer can rightfully be nicknamed "Mr. 64-bit" because that's where he made his mark. As an engineer at DEC, Meyer co-architected the second 64-bit CPU on the market, the Alpha, which shipped in 1992. (MIPS was first, with the R4000 the prior year.)

The Alpha was intended as DEC's replacement for the 32-bit processor used in its VAX line of servers, and it was quite an innovation for its time. It was a very fast processor, running at hundreds of megahertz when most CPUs, RISC or CISC, were still at double-digit clock speeds. The Alpha line also pioneered a large on-chip secondary cache and a complex out-of-order execution microarchitecture, and a later generation was the first high-performance processor with an on-chip memory controller.

But it went nowhere, which perhaps is why Meyer’s name isn’t a (geek) household word. "Alpha was such a small part of the market. It was just a tiny segment year after year. There just wasn’t that much drive for it," said Culver.

Meyer joined AMD in 1996 and led the development of what would become the Athlon processor. Intel at this point had gone from being an innovative company to “almost a patent troll,” said Culver. Intel was spending more time suing competitors than innovating, he pointed out.

Meyer's history with DEC extends into AMD even more than most people realize, said Culver. The Athlon came out in a Slot A processor package that was also used on Alpha hardware. The original design of the Athlon, called Slot B, would run on both DEC and AMD hardware, and AMD even made chipsets that supported both DEC and AMD processors.

Why would DEC give away a secret sauce like this? Because, like Intel when it struck its deal with IBM for the original PC, DEC was in a pinch. "DEC always struggled with having enough support of their processors," said Culver. "They had to do stuff in house, like making motherboards. If they could get others to support their architecture, it could make their hardware more popular. There's a fine line to keeping a closed shop and getting other people to support you."

And Meyer wasn't interested in making another chip no one used. "Dirk told me he never was going to build another chip that didn't have a large installed base of software ready to run on it,” said Brookwood. “With the Alpha, he did all these great optimizations for it and the Alpha ran great for all three apps that were available on it.”

With the Athlon line, Meyer and AMD caught Intel flat-footed. The K8-generation Athlon 64 moved the memory controller off a separate chip – previously reached over the front-side bus – and onto the CPU die, and AMD followed with dual-core parts. Most important, it was 64-bit, which meant it could address more than 4GB of memory, the ceiling of a 32-bit processor.
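That 4GB ceiling follows directly from address width: 32 bits can name 2^32 distinct bytes, while 64 bits can in principle name 2^64. A minimal sketch of the arithmetic in C (my illustration, not from the interviews):

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* 32-bit addresses distinguish 2^32 bytes: the 4GB limit. */
    uint64_t space32 = (uint64_t)1 << 32;
    printf("32-bit: %llu bytes = %.0f GB\n",
           (unsigned long long)space32,
           space32 / (1024.0 * 1024.0 * 1024.0));

    /* 64-bit addresses distinguish 2^64 bytes (about 16 exabytes),
       though real chips wire up fewer physical address bits than that. */
    double space64 = (double)space32 * (double)space32;  /* 2^64 */
    printf("64-bit: about %.0f exabytes\n",
           space64 / (1024.0 * 1024.0 * 1024.0 * 1024.0 * 1024.0 * 1024.0));
    return 0;
}
```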

AMD quickly modified the Athlon design for servers and called it the Opteron. Then came the needed software: when Microsoft released a 64-bit x86 version of Windows Server 2003, it became the first major software company to embrace AMD's 64-bit extensions.

Athlon and Opteron forced Intel's hand. It got to work on 64-bit and multicore processors; the on-die memory controller came last. Some of that technology already existed on high-end Intel chips, but not in the mass-market products.

"Suddenly it went from a server and high-end desktop to anyone could have it, and that was something Intel wasn't ready for. They wanted to keep it in the high-end market and AMD brought it to the consumer market," said Culver.

Intel had gone down the NetBurst technology road, which turned into a dead end, said Brookwood. Then a design team in Israel showed off Banias, a variation of the P6 architecture that performed better and ran cooler. Eventually the Israeli design team came out with Conroe, a single architecture that spanned from laptops to Xeon servers. That group was led by David "Dadi" Perlmutter, whose star rose quickly inside Intel just as Gelsinger's had.

AMD would go from zero market share in the server market to 20% almost overnight. It also grabbed about 30% desktop share with the Athlon. However, a series of missteps in 2007 and a reinvigorated Intel reversed almost all of those gains inside of five years. Meyer would become AMD CEO in 2008, only to be pushed out in 2011, unable to stem its slide.

So what happened to AMD? Its reach exceeded its grasp, according to Culver. “Intel vastly outspends them in R&D. When you are talking chips with a billion transistors you can't do anything cheap. AMD started promoting engineers into management and became too big a company to innovate and adapt. Intel caught them with larger R&D budget,” he said.

But he credits AMD with waking up a lazy giant. "If it wasn’t for them, Intel would have had no motivation to do anything. Why would Intel make a better processor if there was no drive for it?" said Culver.

Both companies lost their way around the time Gelsinger and Meyer moved from engineering to management. Nonetheless, Brookwood thinks the management path was a good idea for both men.

"I think you want to have guys at the top [in tech companies] who appreciate what the technology is all about and can sometimes push the technologists in the organization to do more than they think they can do. So I don’t fault either of those guys for trying to move on," he said.

The future now looks more like a war between x86 and ARM, the processor architecture designed by the British firm ARM Holdings, than a battle between Intel and AMD. ARM makes no chips of its own; it designs processor cores and licenses the designs, letting each licensee build its own version. It has more than 100 licensees, including Samsung, Qualcomm, Nvidia, and Marvell.

So it seems odd that the two engineers who advanced x86 the most are out of the game: Gelsinger is with a software firm and Meyer is in semi-retirement. Their companies might need them more than ever. Or perhaps it's time for new blood to step up in this new world, in which our smartphones have more computing power than the retired Space Shuttle.
