It was in the clay room, a closet filled with plastic bags of gray muck at the back of Mr. Ziska’s art room, where I made my move. For the first time ever, I found myself standing alone with Nancy Wilkins, the love of my life, the girl of my dreams. She was a vision in her green and black plaid skirt and white blouse, with little flecks of clay dusted across her glasses. Her blonde hair was in a ponytail, her teeth were in braces, and I was sure—well, pretty sure—that she was wearing a bra.

“Run away with me, Nancy,” I said, wrapping my arms around her from behind. Forget for a moment, as I obviously did, that we were both 13 years old, trapped in the eighth grade, and had nowhere to run away to.

“Why would I want to run away?” Nancy responded, gently twisting free. “Let’s stay here and have fun with everyone else.”

It wasn’t a rejection, really. There had been no screams, no slaps, no frenzied pounding on the door by Earl Ziska, eager to throw his 120 pounds of fighting fury against me for making a pass at one of his art students. And she’d used the word let’s, so maybe I had a chance. Still, Nancy’s was a call to mediocrity, to being just like all the other kids.

Running away still sounded better to me.

What I really had in mind was not running away but running toward something, toward a future where I was older (16 would do it, I reckoned) and taller and had lots of money and could live out my fantasies with impunity, Nancy Wilkins at my side. But I couldn’t say that. It wouldn’t have been cool to say, “Come with me to a place where I am taller.”

We never ran anywhere together, Nancy and I. It was clear from that moment in the clay room that she was content to live her life in formation with everyone else’s and to limit her goals to within one standard deviation on the upside of average. Like nearly everyone else in school and in the world, she wanted more than anything else to be just like her best friends. Only prettier, of course.

Fitting in is the root of culture. Staying here and having fun with everyone else is what allows societies to function, but it’s not a source of progress. Progress comes from discord—from doing new things in new ways, from running away to something new, even when it means giving up that chance to have fun with the old gang. To engineers—really good ones, interested in making progress—the best of all possible worlds would be one in which technologies competed continuously and only the best technologies survived. Whether the good stuff came from an established company, a start-up, or even from Earl Ziska wouldn’t matter. But it usually does matter because the real world, the one we live in, is a world of dollars, not sense. It’s a world where commercial interests are entrenched and consumers typically pay closer attention to what everyone else is buying than to whether what they are buying is any good. In this real world, then, the most successful products become standards against which all other products are measured, not for their performance or cleverness but for the extent to which they are like that standard.

In the standards game, as in penmanship, the best grades often go to the least interesting people.

In 1948, CBS introduced the long-playing record album—the LP. The new records spun at 33⅓ revolutions per minute rather than the 78 RPM that had been the standard for forty years. This slower speed, combined with the fact that the smaller needle allowed the grooves to be closer together than on the old 78s, made it possible to put more music than ever before on each side of a record. The sound quality of the LPs was better, too. They called it “stereo high fidelity.”

The smaller needle used to play an LP and its light tracking weight meant that records wouldn’t wear out as quickly as they had with the old steel needles. And the light needles meant that LPs could be made out of unbreakable vinyl rather than the thick, brittle plastic that had been used before.

LPs were better in every way than the old 78s they replaced. Sure, listeners would have to buy new record players, and LPs might cost more to buy, but those were minor penalties for the glories of true high fidelity.

Also in 1948, at about the same time that CBS was introducing the LP, RCA was across town bringing out the first 45 RPM single. The 45 had a better sound than the old 78s, too, though not as good as the LP and not in stereo. But where the LPs put twenty minutes of music on one record side, the 45s opted for a minimalist solution—one song per side—which made 45s cheaper than the 78s they replaced, and lots cheaper than LPs. Forty-fives worked well in jukeboxes, too, because their large center holes made life easier for robot fingers.

The 45s were pretty terrific, though you still had to buy a new record player.

So here it was 1948. One war was over, and the next one was not even imagined. America and American tastes ruled the world, and the record industry had just offered up its two best ideas for how music should be sold for the next forty years. What happened?

The recording industry immediately entered a four-year slump as Americans, who couldn’t decide what type of record to buy, decided not to buy any records at all.

What happened to the record industry in 1948 was the result of two major players’ deciding to promote new technical standards at exactly the same time.

“You’ll sell millions of 45s,” the RCA salesmen told record store owners.

“Just listen to the music,” said the CBS salesmen.

“Who’s going to pay six bucks for one record?” asked RCA.

“Think profit margins,” ordered CBS.

“Think sales volume!”

Who could think? So they didn’t, and the industry fumbled along until an act of God or Elvis Presley decided which standard would dominate what parts of the business. Forty-fives eventually gained the youth vote, while LPs took the high end of the market. In time, machines were built that could play both types of records, and the two technical standards were eventually marketed in a manner that made them complementary. But that wasn’t the original intention of their inventors, each of whom wanted to have it all.

Markets hate equality. That was the problem with this battle between LPs and 45s: both were better than the old standard, and each had advantages over the other. In the world of music, circa 1948, it just wasn’t immediately clear which standard would be dominant, so the third parties in the industry did not know how to align themselves. If either CBS or RCA had come to market a couple of years after the other, the market would have had a chance to adopt the first new standard and then consider the second. Everybody would have been listening to more music.

In any major market, there are always two standards, and generally only two, because people are different enough that they won’t all be satisfied with the same thing, yet consumers naturally align themselves into either the “us” or “them” camp. No triangles. Even the Big Three U.S. automakers don’t constitute a triangle because they have all chosen to support the same standard—the passenger automobile. For all the high school bickering I remember about whether a Ford was better than a Chevy, the alternative standard to a Mustang is not a Camaro; it’s a pickup truck.

Just as there are always two standards, one of those standards is always dominant. Eighty-five percent of the folks who go shopping for a passenger vehicle come home with a car, while 15 percent come home with a truck. Eighty-five percent of the home videocassette recorders in America are VHS, while 15 percent are Betamax. Those numbers—85 and 15—keep coming back again and again. Maybe that’s the natural relationship between primary and secondary standards, somehow determined by the gods of consumer products.

In the personal computer business today, about 85 percent of the machines sold are IBM compatible, and 15 percent are Apple Macintoshes. Sure, there are other brands—Commodore Amigas, Atari STs, and weird boxes built in England that function in ways that make sense only to English minds—and even the makers of these machines complain that somehow they have trouble getting noticed by anything but the hobbyist market. The mainstream American market—the business market—just doesn’t see these machines as computers, even though some of them offer superior features. It’s not that they aren’t good; it’s that they are third.

When IBM introduced its Personal Computer, the world was ready for a change. The 8-bit computers of the time were doing their best to imitate the battle between LPs and 45s. There just wasn’t much of a qualitative difference between the Apple IIs, TRS-80s, and CP/M boxes of the time, so no one standard had broken out, taking the overall market to new heights with it. The market needed differentiation, and that was provided by the entry of IBM, raising its 16-bit standard.

Eight-bit partisans looked down their noses at the new PC, said that it was overpriced and underpowered, and asked who would ever need that much memory, anyway. With 3,000 Apple II applications and 5,000 CP/M applications on the market, sheer volume of software would keep IBM and PC-DOS from succeeding, they argued. Their letters of protest in InfoWorld had a note of shrillness, though, as if the writers were suddenly and for the first time aware of their own mortality. That’s the way it is with soon-to-be passing standards. Collectors of 78s sounded that way too until they vanished.

In the world of standards, ubiquity is the last step before invisibility.

The new standard was going to be 16-bit computing, that was clear, but what wasn’t immediately clear was that the new standard would be 16-bit computing using IBM hardware and the PC-DOS operating system. Many companies saw as much opportunity to build the new 16-bit standard computing with their hardware and their operating system as with IBM’s.

There were lots of IBM competitors. There was the Victor 9000, sold by Kidde, an old-line office machine company. The Victor had more power, more storage, more memory, and better graphics than the IBM PC, and for less money. There was the Zenith Z-100, which had two processors, so it could run 8-bit or 16-bit software, and it too was a little cheaper than the IBM PC. There was the Hewlett-Packard HP-150, which had more power, more storage, more memory than the IBM PC, and a nifty touchscreen that let users make choices by pointing at the screen.

There was the DEC Rainbow 100, which had more power, more storage, and the DEC name. There was a Xerox computer, a Wang computer, and a Honeywell computer. There were suddenly lots of 16-bit computers hoping to snatch the mantle of de facto industry standard away from IBM, through either superior technology or pricing.

One reason that all these players were trying to take on IBM was that Microsoft encouraged them to. Bill Gates, too, was uncertain that IBM’s PC-DOS would become the new standard, so he urged all the other companies doing 16-bit computers with Intel processors to implement their own versions of DOS. And it was good business, too, since Microsoft programmers were doing the work of making MS-DOS work on each new platform. No matter which company set the standard, Microsoft was determined that it would involve a version of their operating system.

But there was another reason for Microsoft to encourage IBM’s competitors to commission their own versions of DOS. Charles Simonyi and friends had been working up a suite of MS-DOS applications with these varied platforms specifically in mind. Multiplan, the spreadsheet; Multiword, later called just Word; and all the other early Microsoft applications were designed to be quickly ported to strange operating systems and new hardware.

The idea was that Bill Gates would convince, say, Zenith to commission a custom version of MS-DOS. Once that project was underway, it was time to remind Zenith that this new DOS version might not work with all (or any) of the other DOS applications on the market, most of which were customized for the IBM PC.

Panic time at Zenith headquarters in Illinois, where it became imperative to find some applications quickly that would work with its new version of DOS. Son-of-a-gun, Microsoft just happened to have a few portable applications lying around, written in a pseudocode that could be quickly adapted to almost any computer. They weren’t very good applications, but they sure were portable. And so Zenith, having been encouraged by Microsoft to do hardware incompatible with IBM’s, then suckered into commissioning a custom version of MS-DOS, finally ended up having to pay Microsoft to adapt its applications, too. With all his costs covered, Bill Gates could start to make money even before the first copy of Multiplan or Word for Zenith was sold.

This squeeze play happened for every new platform and every new version of MS-DOS and was just the first of many instances when Microsoft deliberately coordinated its operating system and application strategies, something the company continues to claim it never did.

As for the Victor 9000, the Z-100, the HP-150, the DEC Rainbow 100, and all the other early MS-DOS machines, those computers are gone now, dead and mainly forgotten. We can come up with all sorts of individual reasons why each machine failed, but at bottom they all failed because they were not IBM PC compatible. When the IBM PC, for all its faults, instantly became the number one selling personal computer, it became the de facto industry standard, because de facto standards are set by market share and nothing else. When Lotus 1-2-3 appeared, running on the IBM, and only on the IBM, the PC’s role as the technical standard setter was guaranteed not just for this particular generation of hardware but for several generations of hardware.

The IBM PC defined what it meant to be a personal computer, and all these other computers that were sorta like the IBM PC, kinda like it, were doomed to eventual failure. They didn’t even qualify as the requisite second standard—the pickup truck rather than the car—because although they were all different from the IBM PC, they weren’t different enough to qualify for the number two spot.

Even the Grid Compass, the first laptop computer, was a failure because of a lack of IBM compatibility. Brilliant technology but different graphics and storage standards meant that Grid needed a version of 1-2-3 different from the one that worked on the IBM PC. When Grid supplied its own applications with the computer, including a spreadsheet, it still wasn’t enough to attract buyers who wanted their 1-2-3. It was back to the drawing board to develop a second-generation laptop that was IBM compatible.

Entrepreneurs often lack the discipline to keep their new products tightly within a technical standard, which was why the idea of 100 percent IBM compatibility took so long to be accepted. “Why be compatible when you could be better?” the smart guys asked on their way to bankruptcy court.

IBM compatibility quickly became the key, and the level to which a computer was IBM compatible determined its success. Some long-established microcomputer makers learned this lesson slowly and expensively. Hewlett-Packard actually paid Lotus to adapt 1-2-3 to the HP-150, but the computer was still doomed by its lack of hardware compatibility (you couldn’t put an IBM circuit card in an HP-150 computer). The other problem with the HP-150 was what was supposed to have been its major selling point—the touchscreen, which was a clever idea nobody really wanted. Not only was it hard to get software companies to make their products work with HP’s touchscreen technology, but users didn’t like it either. Secretaries, who apparently measure their self-worth by typing speed, didn’t want to take their fingers off the keys. Even middle managers, who were the intended users of the system, didn’t like the touchscreen. The technology was clever, but the fact that HP’s own engineers chose not to use the system should have been a tip-off. You could walk through the cavernlike open offices at HP headquarters in those days without seeing a single user pointing at his or her touchscreen.

The best and most powerful computers come from designers who actually use their technologies—whose own tastes model those of intended users. Ivory towers, no matter how high, don’t produce effective products for the real world.

Down at Tandy Corp. headquarters in Fort Worth, where ivory towers are unknown, Radio Shack’s answer to the IBM PC was the Model 2000, another workalike, which appeared in the fall of 1983. The Model 2000 was intended to beat the IBM PC with twice the speed, more storage, and higher-resolution graphics. The trick was a more powerful processor, the Intel 80186, which could run rings around IBM’s old 8088.

Because Tandy had its own distribution through 5,000 Radio Shack stores and through a chain of Tandy Computer Centers, the company thought for a long time that it was somehow immune to the influence of the IBM standard. They thought of their trusty Radio Shack customers as Albanians who would loyally shop at the Albanian Computer Store, no matter what was happening in the rest of the world. But Radio Shack’s white-collar customer list turned out to include very few Albanians.

Bill Gates was a strong believer in the Model 2000 because it was the only personal computer powerful enough to run new software from Microsoft called Windows without being embarrassingly slow. Windows was an attempt to bring a Xerox Alto-style graphical user interface to personal computers. But Windows took a lot of power to run and was a real dog on the IBM PC and the other computers using 8088 processors. For Windows to succeed, Bill Gates needed a computer like the Model 2000. So young Bill, who handled the Tandy account himself, predicted that the computer would be a grand success—something the boys and girls in Fort Worth wanted badly to hear. And Gates made a public endorsement of the Model 2000, hoping to sway customers and promote Windows as well.

Still, the Model 2000 failed miserably. Nobody gave a damn about Windows, which didn’t appear until 1985, and even then didn’t work well. The computer wasn’t hardware compatible with IBM. It wasn’t very software compatible with IBM either, and the most popular IBM PC programs—the ones that talked directly to the PC’s memory and so worked lots faster than those that allowed the operating system to do the talking for them—wouldn’t work at all. Even the signals from the keyboard were different from IBM’s, which drove software developers crazy and was one of the reasons that only a handful of software houses produced 2000-specific versions of their products. Oh, and the Intel 80186 processor had bugs, too, which took months to fix.

Today the Model 2000 is considered the magnum opus of Radio Shack marketing failures. Worse, a Radio Shack computer buyer in his last days with the company for some reason ordered 20,000 more of the systems built even when it was apparent they weren’t selling. Tandy eventually sold over 5,000 of those systems to itself, placing one in each Radio Shack store to track inventory. Some leftover Model 2000s were still in the warehouse in early 1990, almost seven years later.

Still, the Model 2000’s failure was Bill Gates’s gain. Windows was a failure, but the head of Radio Shack’s computer division, Jon Shirley, the very guy who’d been duped by Bill Gates into doing the Model 2000 in the first place, sensed that his position in Fort Worth was in danger and joined Microsoft as president in 1983.

Big Blue’s share of the personal computer market peaked above 40 percent in the early 1980s. In 1983, IBM sold 538,000 personal computers. In 1984, it sold 1,375,000.

IBM wasn’t afraid of others’ copying the design of the PC, although nearly the entire system was built of off-the-shelf parts from other companies. Conventional wisdom in Boca Raton said that competitors would always pay more than IBM did for the parts needed to build a PC clone. To compete with IBM, another company would have to sell its PC clone at such a low price that there would be no profit. That was the theory.

In one sense, nothing could have been easier than building a PC clone, since IBM was so generous in supplying technical information about its systems. Everything a good engineer would need to know in order to design an IBM PC copy was readily available. While it seems like this would encourage copying, it was intended to do just the opposite because a trap lay in IBM’s technical documentation. That trap was the complete code listing for the IBM PC’s ROM-BIOS.

Remember, the ROM-BIOS was Gary Kildall’s invention that allowed the same version of CP/M to operate on many different types of computers. The basic input/output system (BIOS) was special computer code that linked the generic operating system to specific hardware. The BIOS was stored in a read-only memory chip—a ROM—installed on the main computer circuit board, called the motherboard. To be completely compatible with the IBM PC, a clone machine either would have to use IBM’s ROM-BIOS chip, which wasn’t for sale, or devise another chip just like IBM’s. But IBM’s ROM-BIOS was copyrighted. The lines of code burned into the read-only memory were protected by law, so while it would be an easy matter to take IBM’s published ROM-BIOS code and use it to prepare an exact copy of the chip, doing so would violate IBM’s copyright and incur the legendary wrath of Armonk.
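The division of labor that made the BIOS valuable can be sketched in modern terms as a hardware-abstraction layer. The names below are purely illustrative, not IBM's or Kildall's actual interfaces: the point is that the operating system is written against a fixed set of services, and each machine ships its own implementation of them.

```python
# Hypothetical sketch of the ROM-BIOS idea: a generic OS talks only to a
# fixed service interface; each vendor supplies the hardware-specific part.

class BIOS:
    """The generic interface the operating system is written against."""
    def put_char(self, c):
        raise NotImplementedError

class AcmeBIOS(BIOS):
    """One vendor's machine-specific implementation (hypothetical)."""
    def __init__(self):
        self.screen = []            # stands in for this machine's video memory
    def put_char(self, c):
        self.screen.append(c)       # real code would poke video RAM directly

def os_print(bios, text):
    # The "generic operating system": identical on every machine,
    # because all hardware differences hide behind the BIOS.
    for c in text:
        bios.put_char(c)

bios = AcmeBIOS()
os_print(bios, "hi")
print("".join(bios.screen))         # -> hi
```

Swap in a different vendor's `BIOS` subclass and `os_print` runs unchanged, which is exactly why the same CP/M (and later MS-DOS) could run on many different boxes.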

The key to making a copy of the IBM PC was copying the ROM-BIOS, and the key to copying the ROM-BIOS was to do so without reading IBM’s published BIOS code.


As we saw with Dan Bricklin’s copyright on VisiCalc, a copyright protects only the specific lines of computer code, not the functions that those lines of code make the computer perform. The IBM copyright did not protect the company from others who might write their own completely independent code that just happened to perform the same BIOS function. By publishing its copyrighted BIOS code, IBM was making it very hard for others to claim that they had written their own BIOS without being exposed to or influenced by IBM’s.

IBM was wrong. Welcome to the world of reverse engineering.

Reverse engineering is the science of copying a technical function without copying the legally protected manner in which that function is accomplished in a competitor’s machine. Would-be PC clone makers had to come up with a chip that would replace IBM’s ROM-BIOS but do so without copying any IBM code. The way this is done is by looking at IBM’s ROM-BIOS as a black box—a mystery machine that does funny things to inputs and outputs. By knowing what data go into the black box—the ROM—and what data come out, programmers can make intelligent guesses about what happens to the data when they are inside the ROM. Reverse engineering is a matter of putting many of these guesses together and testing them until the cloned ROM-BIOS acts exactly like the target ROM-BIOS. It’s a tedious and expensive process and one that can be accomplished only by virgins—programmers who can prove that they have never been exposed to IBM’s ROM-BIOS code—and good virgins are hard to find.
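The probe-guess-verify loop can be shown in miniature. The "ROM routine" below is a toy stand-in, not actual BIOS code: the clone team may call it and observe its outputs, but never read its source.

```python
# Toy black-box reverse engineering: probe the target, hypothesize an
# independent implementation, then verify it matches on every tested input.

def target_rom(x):
    # The protected original. To the clone team this is a sealed black box:
    # callable, but its listing is off-limits.
    return (x * 2 + 3) % 256

# Step 1: probe the black box and record input/output pairs.
observations = {x: target_rom(x) for x in range(256)}

# Step 2: write an independent implementation from observed behavior only.
# Here the team guesses an affine form and fits it from two observations.
a = (observations[1] - observations[0]) % 256
b = observations[0]

def clone_rom(x):
    return (a * x + b) % 256

# Step 3: test until the clone acts exactly like the target.
mismatches = [x for x in range(256) if clone_rom(x) != observations[x]]
print("mismatches:", len(mismatches))   # -> mismatches: 0
```

A real BIOS has hundreds of entry points and corner cases, which is why the actual job took fifteen senior programmers months rather than an afternoon, but the discipline is the same: only observed behavior, never the protected listing, may inform the clone.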

Reverse engineering the IBM PC’s ROM-BIOS took the efforts of fifteen senior programmers over several months and cost $1 million for the company that finally did it: Compaq Computer.

Compaq is the computer company with good penmanship. There was so little ego evident around the table when Rod Canion, Jim Harris, and Bill Murto were planning their start-up in the summer of 1981 that the three couldn’t decide at first whether to open a Mexican restaurant, build hard disk drives for personal computers, or manufacture a gizmo that would beep on command to help find lost car keys. Oh, and they also considered starting a computer company. The computer company idea eventually won out, and the concept of the Compaq was first sketched out on a placemat at a House of Pies restaurant in Houston.

All three founders were experienced managers from Texas Instruments. TI was the company that, throughout the late 1970s and early 1980s, many computer professionals expected eventually to dominate the microcomputer business with its superior technology and management, only that never happened. Despite having the best chips, the brightest engineers, and Texas-sized ambition, the best TI could manage was a disastrous entry into the home computer business that eventually lost the company hundreds of millions of dollars. Later there was also an incompatible MS-DOS computer that came and went, suffering the same problem of attracting software as all the other rogue machines. Eventually TI produced a modest line of PC clones.

Unlike most of the other would-be IBM competitors, the three Compaq founders realized that software, and not hardware, was what really mattered. In order for their computer to be successful, it would have to have a large library of available software right from the start, which meant building a computer that was compatible with some other system. The only 16-bit standard available that qualified under these rules was IBM’s, so that was the decision—to make an IBM-compatible PC—and to make it damn compatible—100 percent. Any program that would run on an IBM PC would run on a Compaq. Any circuit card that would operate in an IBM PC would operate in a Compaq. The key to their success would be leveraging the market’s considerable investment in IBM.

Crunching the numbers harder than IBM had, the Compaq founders discovered that a smaller company with less overhead than IBM’s could, in fact, bring out a lower-priced product and still make an acceptable profit. This didn’t mean undercutting IBM by a lot but by a significant amount—about $800 on the first Compaq model compared to an IBM PC with equivalent features.

Compaq, like any other company pushing a new product, still had to ride the edges of an existing market, offering additional reasons for customers to choose its computer over IBM’s. Just to be different, the first Compaq models were 28-pound portables—luggables, they came to be called. People didn’t really drag these sewing machine-sized units around that much, but since IBM didn’t make a luggable version of the PC, making theirs portable gave Compaq a niche to sell in right next to IBM.

Compaq appealed to computer dealers, even those who already sold IBM. Especially those who already sold IBM. For one thing, the Compaq portables were available, while IBM PCs were sometimes in short supply. Compaq pricing allowed dealers a 36 percent markup compared to IBM’s 33 percent. And unlike IBM, Compaq had no direct sales force that competed with dealers. A third of IBM’s personal computers were sold direct to major corporations, and each of those sales rankled some local dealer who felt cheated by Big Blue.

Just like IBM, Compaq first appeared in Sears Business Centers and ComputerLand stores, though a year later, at the end of 1982. With the Compaq’s portability, compatibility, availability, and higher profit margins, signing up both chains was not difficult. Bill Murto made the ComputerLand sale by demonstrating the computer propped on the toilet seat in his hotel bathroom, the only place he could find a three-pronged electrical outlet.

Just like IBM’s, Compaq’s dealer network was built by Sparky Sparks, who was hired away from Big Blue to do a repeat performance, selling similar systems to a virtually identical dealer network, though this time from Houston rather than Boca Raton.

By riding IBM’s tail while being even better than IBM, Compaq sold 47,000 computers worth $111 million in its first year—a start-up record.

With the overnight success of Compaq, the idea of doing 100 percent IBM-compatible clones suddenly became very popular (“We’d intended to do it this way all along,” the clone makers said), and the IBM workalikes quickly faded away. The most difficult and expensive part of Compaq’s success had been developing the ROM-BIOS, a problem not faced by the many Compaq impersonators that suddenly appeared. What Compaq had done, companies like Phoenix Technologies could do too, and did. But Phoenix, a start-up from Boston, made its money not by building PC clones but by selling IBM-compatible BIOS chips to clone makers. Buying Phoenix’s ROM-BIOS for $25 per chip, a couple of guys in a warehouse in New Jersey could put together systems that looked and ran just like IBM PCs, but cost 30 percent less to buy.

For months, IBM was shielded from the impact of the clone makers, first by Big Blue’s own shortage of machines and later by a scam perpetrated by dealers.

When IBM’s factories began churning out millions and millions of PCs, the computer giant set in place a plan that offered volume discounts to dealers. The more computers a dealer ordered, the less each computer cost. To make their cost of goods as low as possible, many dealers ordered as many computers as IBM would sell them, even if that was more computers than they could store at one time or even pay for. Having got the volume price, these dealers would sell excess computers out the back door to unauthorized dealers, at cost. Just when the planners in Boca Raton thought dealers were selling at retail everything they could make, these gray market PCs were being flogged by mail order or off the back of a truck in a parking lot, generally for 15 percent under list price.

Typical of these gray marketeers was Michael Dell, an 18-year-old student at the University of Texas with a taste for the finer things in life, who was soon clearing $30,000 per month selling gray market PCs from his Austin dorm room. Today Dell is a PC clone-maker, selling $400 million worth of IBM compatible computers a year.

Seeing this gray market scam as incessant demand, IBM just kept increasing production, increasing at the same time the downward pressure on gray market prices until some dealers were finally selling machines out of the back door for less than cost. That’s when Big Blue finally noticed the clones.

For companies like IBM, the eventual problem with a hardware standard like the IBM PC is that it becomes a commodity. Companies you’ve never heard of in exotic places like Taiwan and Bayonne suddenly see that there is a big demand for specific PC power supplies, or cases, or floppy disk drives, or motherboards, and whump! the skies open and out fall millions of Acme power supplies, and Acme deluxe computer cases, and Acme floppy disk drives, and Acme Jr. motherboards, all built exactly like the ones used by IBM, just as good, and at one-third the price. It always happens. And if you, like IBM, are the caretaker of the hardware standard, or at least think that you still are, because sometimes such duties just drift away without their holder knowing it, the only way to fight back is by changing the rules. You’ve got to start selling a whole new PC that can’t use Acme power supplies, or Acme floppy disk drives, or Acme Jr. motherboards, and just hope that the buyers will follow you to that new standard so the commoditization process can start all over again.

Commoditization is great for customers because it drives prices down and forces standard setters to innovate. In the absence of such competition, IBM would have done nothing. The company would still be building the original PC from 1981 if it could make enough profit doing so.

But IBM couldn’t keep making a profit on its old hardware, which explains why Big Blue, in 1984, cut prices on its existing PC line and then introduced the PC-AT, a completely new computer that offered significantly higher performance and a certain amount of software compatibility with the old PC while conveniently having no parts in common with the earlier machine.

The AT was a speed demon. It ran two to three times faster than the old PCs and XTs. It had an Intel 80286 microprocessor, completely bypassing the flawed 80186 used in the Radio Shack Model 2000. Instead of a 360K floppy disk drive, the AT used a special 1.2-megabyte floppy, and every machine came with at least a 20-megabyte hard disk.

At around $4,000, the AT was also expensive, it wasn’t able to run many popular PC-DOS applications, and sometimes it didn’t run at all because the Computer Memories Inc. (CMI) hard disk used in early units had a tendency to die, taking the first ten chapters of your great American novel with it. IBM was so eager to swat Compaq and the lesser clone makers that it brought out the AT without adequate testing of the CMI drive’s controller card built by Western Digital. There was no alternative controller to replace the faulty units, which led to months of angry customers and delayed production. Some customers who ordered the PC-AT at its introduction did not receive their machines for nine months.

The 80286 processor had been designed by Intel to operate in multi-user computers running a version of AT&T’s Unix operating system called Xenix and sold by Microsoft. The chip was never intended to go in a PC. And in order to run Xenix efficiently, the 286 had two modes of operation—real mode and protected mode. In real mode, the 286 operated just like a very fast 8086 or 8088, and this was the way it could run some, but not all, MS-DOS applications. But protected mode was where the 286 showed its strength. In protected mode, the 286 could emulate several 8086s at once and could access vast amounts of memory. If real mode was impulse power, protected mode was warp speed. The only problem was that you couldn’t get there from here.

The 286 chip powered up in real mode and then could be shifted into protected mode. This was the way Intel had envisioned it working in Xenix computers, which would operate strictly in protected mode. But the 286 was a chip that couldn’t downshift; it could switch from real to protected mode but not from protected mode to real mode. The only way to get back to real mode was to turn the computer off, which was fine for a Xenix system at the end of the workday but pretty stupid for a PC that wanted to switch between a protected mode application and a real mode application. Until most applications ran in protected mode, then, the PC-AT would not reach its full potential.

And not only was the AT flawed, it was also late. The plan had been to introduce the new machine in early 1983, eighteen months after the original IBM PC and right in line with the trend of starting a new microcomputer generation every year and a half. But IBM’s PC business unit was no longer able to bring a product to market in only eighteen months. They’d done the original PC in a year, but that had been in the time of gods, not men, before reality and the way that things have to be done in enormous companies had sunk in. Three years was how long it took IBM to invent a new computer, and the marketing staff in Boca Raton would just have to accept that and figure out clever ways to keep the clones at bay for twice as long as anyone had expected.

Still, the one-two punch of lowering PC prices and then introducing the AT took a toll on the clone makers, who had their already slim profit margins hurt by IBM’s lower prices while simultaneously having to invest in cloning the AT.

The market loyally followed IBM to the AT standard, but life was never again as rosy for IBM as it had been in those earlier days of the original PC. Compaq, in a major effort, cloned the AT in only six months and shipped 10,000 of its Deskpro 286 models before IBM had solved the CMI drive problem and resumed its own AT shipments. But in the long term, Compaq was a small problem for IBM, compared to the one presented by Gordie Campbell.

Gordon Campbell was once the head of marketing at Intel. Like everyone else of importance at the monster chip company, he was an engineer. And as only an engineer could, one day Gordie fell in love with a new technology, the electrically erasable programmable read-only memory, or EEPROM, which doesn’t mean beans to you or me but to computer engineers was a dramatic new type of memory chip that would make possible whole new categories of small-scale computer products. But where Gordie Campbell saw opportunity, the rest of Intel saw only a major technical headache because nobody had yet figured out how to manufacture EEPROMs in volume. Following a long Silicon Valley tradition, Campbell walked away from Intel, gathered up $30 million in venture capital, and started his EEPROM company—SEEQ Technologies. Who knows where they get these names?

With his $30 million, Campbell built SEEQ into a profitable company over the next four years, led the company through a successful public stock offering, and paid back the VCs their original investment, all without selling any EEPROMs, which were always three months away from being a viable technology. Still, SEEQ had its state-of-the-art chip-making facility and was able to make enough chips of other types to be profitable while continuing to tweak the EEPROM, which Campbell was sure would be ready Real Soon Now (a computer industry expression that means “in this lifetime, maybe”).

Then one day Campbell came in to work at SEEQ’s stylish headquarters only to find himself out of a job, fired by the company’s lead venture capital firm, Kleiner Perkins Caufield & Byers. Kleiner Perkins had the votes and Gordie, who held less than 3 percent of SEEQ stock, didn’t, so he was out on the street, looking for his next start-up.

What happened to Campbell was that he came up against the fundamental conflict between venture capitalists and entrepreneurs. Like all other high-tech company founders, Campbell mistakenly assumed that Kleiner Perkins was investing in his dream, when, in fact, Kleiner Perkins was investing in Kleiner Perkins’s dream, which just happened to involve Gordie Campbell. Sure SEEQ was already profitable and the VC’s original investment had been repaid, but to an aggressive venture capitalist, that’s just when real money starts to be made. And to Kleiner Perkins, it looked as if Gordie Campbell, for all his previous success, was making some bad decisions. Bye-bye, Gordie.

Campbell walked with $2 million in SEEQ stock, licked his wounds for a few months, and thought about his next venture. It had to be another chip company, he knew, but the question was whether to start a company to make general-purpose or custom semiconductors. General-purpose semiconductor companies like Intel, National Semiconductor, and Advanced Micro Devices took two to three years to develop chips, which were then sold in the millions for use in all sorts of electronic equipment. Custom chip companies developed their products in only a few months through the use of expensive computer design tools, with the result being high-performance chips that were sold in very small volumes, mainly to defense contractors at astronomical prices.

Campbell decided to follow an edge of the market. He would apply to general-purpose chip development the computer-intensive design tools of the custom semiconductor makers. Just as Compaq could produce a new computer in six months, Campbell wanted to start a semiconductor company that could develop new chips in that amount of time and then sell millions of them to the personal computer industry.

The investment world was doubtful. Becoming increasingly convinced that he had been blackballed by Kleiner Perkins, Campbell traveled the world looking for venture capital. His pitch was rejected sixty times. The new company, Chips & Technologies, finally got moving on $1.5 million from Campbell and a friend who was a real estate developer. Nearly all the money went into leasing giant IBM 3090 and Amdahl 470 mainframes used to design the new chips. When that money was gone, Campbell depleted his savings and then borrowed from his chief financial officer to make payroll. Broke again, and with still no chip designs completed, he finally went to the Far East to look for money, financing the trip on his credit cards. On his last day abroad, Campbell met with Kay Nishi, who then represented Microsoft in Japan. Nishi put together a group of Japanese investors who came up with another $1.5 million in exchange for 15 percent of the company. This was all the money Chips & Technologies ever raised—$3 million total.

At SEEQ, most of the $30 million in venture capital had been spent building a semiconductor factory. That’s the way it was with chip companies, where everyone thought that they could do a better job than the other guys at making chips. But Chips & Technologies couldn’t afford to build a factory. Then Campbell discovered that all the chip makers with edifice complexes had produced a glut of semiconductor production capacity. He could farm out his chip production cheaper than doing it in-house.

As always, the real value lay in the design—in software— not in hardware. There was nothing sacred about a factory.

The first C&T product was a set of five chips that hit the market in the fall of 1985. These five chips, which sold then for $72.40, replaced sixty-three smaller chips on an IBM PC-AT motherboard. Using the C&T chip set, clone makers could build a 100 percent IBM-compatible AT clone with 256K of memory using only twenty-four chips. They could buy 100 percent IBM compatibility. Their personal computers could suddenly be smaller, easier to build, more reliable, even faster than a real IBM AT. And because they weren’t having to buy all the same individual parts as IBM, the clone makers could put together AT clones for less than it cost IBM, even with Big Blue’s massive buying power, to build the real thing.

Chips & Technologies was an overnight success, getting the world back on the traditional track of computers doubling in power and halving in price every eighteen months. Venture capital firms—the same ones that rejected Campbell sixty times in a row—immediately funded half a dozen companies just like Chips.

The commoditization of the PC-AT was complete, and though it didn’t know it at the time, IBM had lost forever its control of the personal computer business.