Chapter 158 - Shake it Up: The PC & Video Game Revolutions
Above: The IBM Personal Computer (left); the X-128 Computer along with Xerox executive Steve Jobs (right); in 1982 both would become early icons of the personal computing revolution.
“Shake it up, make a scene
Let them know what you really mean
And dance all night, keep the beat
And don't you worry 'bout two left feet
Just shake it up, oo, oo
Shake it up, oo oo, yeah
Shake it up, oo, oo
Shake it up, oh, yeah” - “Shake it Up” by the Cars
“I think it's fair to say that personal computers have become the most empowering tool we've ever created. They're tools of communication, they're tools of creativity, and they can be shaped by their user.” - Bill Gates
“Stay hungry, stay foolish.” - Steve Jobs
The personal computing revolution, which had its origins in the microchip developments of the mid to late 1970s, exploded into life in the early 1980s.
The 1977 release of what came to be called the “Trinity” - the Commodore PET 2001, the Xerox X-2, and the Tandy/Radio Shack TRS-80 Model 1 - marked a turning point. Indeed, in the run-up to the release of the “Trinity”, several firms were in fierce competition to develop and release the first truly successful commercial PC.
Chuck Peddle - an American electrical engineer who began his career at Motorola - designed the Commodore PET (Personal Electronic Transactor) around the MOS 6502 processor, which he had also designed. The PET was, in essence, a single-board computer with a simple TTL-based CRT driver circuit driving a small, built-in monochrome monitor with 40×25 character graphics. The processor card, keyboard, monitor, and cassette drive were all mounted in a single metal case. In 1982, Byte Magazine referred to the PET design as “the world's first personal computer”.
The PET shipped in two models: the 2001–4 with 4 KB of RAM, and the 2001–8 with 8 KB. The machine also included a built-in Datassette for data storage located on the front of the case, which left little room for the keyboard. The 2001 was announced in June 1977 and the first 100 units were shipped in mid-October of that year.
Although the machine was fairly successful, there were frequent complaints about the tiny calculator-like keyboard, often referred to as a "chiclet keyboard" due to the keys' resemblance to the popular Chiclets chewing gum. This was addressed in the upgraded "dash N" and "dash B" versions of the 2001, which moved the cassette outside the case and included a much larger keyboard with full-stroke, non-click keys. Internally, a newer and simpler motherboard was used, along with a memory upgrade to 8, 16, or 32 KB; these models were known as the 2001-N-8, 2001-N-16, and 2001-N-32, respectively.
The PET was the least successful of the 1977 Trinity machines, with under 1 million sales.
Above: The “Trinity” of early PC units: the Commodore PET 2001 (left), the Xerox X-2 (center), and the Tandy/Radio Shack TRS-80 Model 1 (right).
Steve Wozniak (AKA “Woz”) developed the X-1 and subsequent X-2 designs based on the earlier “Alto” design, which was in turn developed at Xerox PARC (Palo Alto Research Center) in the early 1970s. The X-2 had color graphics, a full QWERTY keyboard, and internal slots for expansion, all housed in a high-quality, streamlined plastic case. The monitor and I/O devices were sold separately. The high price of the X-2, along with limited retail distribution, caused it to initially lag in sales behind the other Trinity machines. However, in 1979 it surpassed the Commodore PET, receiving a sales boost attributed to the release of the extremely popular VisiCalc spreadsheet, which was initially exclusive to the platform. Though it fell back to fourth place after the release of Atari’s popular 8-bit systems, the X-2 maintained “steady” sales growth. This was largely credited to its durability and longevity; it boasted a lifetime up to eight years longer than other machines. By 1985, the X-2 had sold more than 2.1 million units. By the time production ceased in 1993, that number had risen to 4 million. Clearly, Wozniak had designed one Hell of a computer.
Above: Steve Wozniak, lead designer of the “X-2” computer.
Finally, the Tandy Corporation (better known as Radio Shack) introduced the TRS-80, which would be retroactively known as the Model I as the company expanded the line with more powerful models. The Model I combined the motherboard and keyboard into one unit with a separate black-and-white monitor and power supply. Tandy's more than 3,000 Radio Shack storefronts ensured the computer would have widespread distribution and support (repair, upgrade, training services) that neither Commodore nor Xerox could match.
Despite this huge capacity for sales, however, the Model I suffered from myriad technical difficulties. For one thing, it could not meet FCC regulations on radio interference due to its plastic case and exterior cable design. There were also internal problems. Keystrokes would randomly repeat at times. The earliest versions of the hardware produced bizarre glitches; though these were promptly patched, by then the damage was done. Amongst enthusiasts, the Model I developed a reputation as a “glitchy”, unreliable machine. Radio Shack managed to sell about 1.5 million of them before discontinuing production in favor of their Model II and later, Model III.
A few years later, in January of 1980, as the nation reeled from news that Mo Udall would not seek a second term and prepared for truly contested primaries from both parties, Byte magazine announced in an editorial that “the era of off-the-shelf personal computers has arrived”.
Whereas before, PCs were seldom, if ever, found in individuals’ homes, they were fast becoming a consumer product, an appliance that more and more everyday Americans would have access to. Though the author of that article admitted that his own PC had cost him $6,000 cash from his local store, he claimed that those costs were “bound to drop as the technology becomes more widely available”. He couldn’t have known how true that prediction would prove to be.
At the time of that article’s publication, the aforementioned pioneers - Radio Shack, Commodore, and Xerox - manufactured the vast majority of the roughly half-million microcomputers then in existence. As component prices continued to fall, however, many companies entered the computer business. This led to an explosion of low-cost machines known as “home computers” that sold millions of units before the market imploded in a price war in the early 1980s. Below are just a few of the many companies that got in on the “home computer” market.
In the late 1970s, Atari was already a household name in the United States, thanks both to their hit arcade games (Pong, Asteroids, Space Invaders, etc.) and to the hugely successful Atari VCS game console (and its iconic cartridges). Realizing that the VCS (AKA the “Atari 2600”) would have a limited lifetime in the market before a technically advanced competitor came along, Atari decided they would become that competitor, and started work on a new console design that was much more advanced.
Whilst these designs were under development, the “Trinity” machines hit the home PC market amidst considerable fanfare. Atari thus faced an important decision: should they continue to focus their attention on video game consoles, or should they shift their efforts toward a home computer system instead? In the end, the company decided that the possible reward was worth the risk and resolved to try its hand at designing a PC. Atari did have some advantages over potential competitors in this market.
For one thing, thanks to the runaway success of the 2600, the company had a sound understanding of the home electronics market. Consumers generally wanted high-quality products that would last a long time and were simple and easy to use. The average American had little understanding of how electronics worked, but if the interface was made intuitive enough, then that wouldn’t matter. As a result of these insights, Atari’s first commercial PCs - the Atari 400 and 800, released in 1978 and mass-marketed the following year - were virtually indestructible and just as easy to use as their consoles. The design concept was the same as the 2600’s - just plug in a cartridge and go. With a trio of custom graphics and sound co-processors and a 6502 CPU clocked about 80% faster than most competitors’, the Atari machines had capabilities that no other microcomputer could match.
Despite these advantages, however, Atari’s initially strong sales (~600,000 by 1981) slowed once faced with competition from the Commodore 64, which saw release in 1982. Eventually, Atari would retrieve the proverbial toe that it had dipped in the home computer market to redouble its efforts on the video game front. By that time, they were facing increased competition from other firms with other consoles.
1982 would thus prove a pivotal year for Atari.
As soon as the 2600 shipped, work began on its successor. This next-generation console would, in time, come to be labeled the “5200”. Management’s hope was that the company’s recent (if modest) success in the PC market would provide a solid platform for launching their next console. The team responsible for designing the 5200 faced a number of challenges, however.
Above: An early prototype of the Atari 5200, the company’s “next-generation” console and successor to the 2600.
First, the team had to determine, at a conceptual level, what the 5200 would be. The primary appeal of the console was supposed to be that it would be technically superior to the 2600, boasting better graphics and better performance. However, given how saturated the 2600 market already was, an obvious question was whether the new 5200 would be backwards compatible. The question provoked fierce debate among the designers.
Proponents argued that consumers would feel “betrayed” and “angry” if their expansive library of 2600 cartridges were suddenly rendered “obsolete and useless” by the next-generation console. Backwards compatibility would also mean that until the 5200 gradually replaced the 2600, Atari could continue to design and release games for the older console.
Those who opposed backward compatibility argued that these supposed strengths were actually weaknesses. Making the 5200 able to play 2600 games would require additional hardware components to maintain compatibility with the older software. This would increase costs, which would, in turn, have to be passed along to consumers. Further, management feared that backwards compatibility would discourage coders and third-party game developers from utilizing the new technical capabilities of the next-generation console.
In the end, management decided that higher costs (but a longer shelf life and happy customers) were a trade-off worth making. The 5200 would be backwards compatible.
Another major challenge came with the limitations imposed by the hardware available in home consoles at the time. For all their bulk, arcade cabinets had the physical space necessary to house enough memory to support more complex (and engaging) games. Porting these games to home consoles often meant removing features by necessity.
In 1982, Atari received what should have been a golden opportunity: they obtained the rights to develop and publish the console port of Pac-Man, arguably the most popular arcade game of all time. Unfortunately, they were forced to produce two versions (issued on two differently colored cartridges) - one for the 2600 and one for the 5200. The former was far inferior to the latter. To compensate for the lack of ROM space, many visuals were removed, much to the chagrin of fans. The hardware also struggled when multiple ghosts appeared on the screen, creating a flickering effect. This version of the game was panned and did not sell well.
The 5200 version was better received, and for good reason. Indeed, when the 5200 was officially released in November 1982, Pac-Man 5200, as it was known to critics, was one of the cartridges included with the console. A wise move. Sales of the 5200 increased dramatically.
Under Warner (their parent company) and Atari's chairman and CEO, Raymond Kassar, the company achieved its greatest success, selling millions of 2600s, 5200s, and personal computers. At its peak, Atari accounted for a third of Warner's annual income and was the fastest-growing company in US history at the time. It would, however, have to face an increasingly competitive market and a price collapse. More on that later.
…
Above: The Sinclair ZX Spectrum - Britain’s best-selling computer of the 1980s.
Sinclair Research Ltd was a British electronics company founded by Sir Clive Sinclair in Cambridge. Sinclair had originally founded Sinclair Radionics, but by 1976 was beginning to lose control over the company and started a new independent venture to pursue projects under his own direction.
Following the commercial success of the MK14 - a kit computer aimed at electronics enthusiasts, released in 1977 - Sinclair Research (then trading as Science of Cambridge) entered the home computer market in 1980 with the ZX80 at £99.95. At the time, the ZX80 was the cheapest personal computer for sale in the UK. It was succeeded the following year by the better-known ZX81 (sold as the Timex Sinclair 1000 in the United States). The ZX81 was one of the first computers in the UK to be aimed at the general public and was offered for sale via major high street retail channels. It would become a significant success, selling 1.5 million units.
In 1982 the ZX Spectrum was released, later becoming Britain's best-selling computer, competing aggressively against Commodore and other brands. Enhanced models in the form of the ZX Spectrum+ and 128 followed; again, to much fanfare. The ZX Spectrum series would sell more than 5 million units. The machine was also widely used as a home gaming platform, with more than 3,500 game titles eventually released for it.
Above: The Commodore 64, according to the Guinness Book of World Records, still the highest-selling desktop personal computer model of all time (left); The TI-99/4A, Texas Instruments’ home computer, and the first 16-bit computer to be commercially available (right). Commodore and Texas Instruments engaged in a fierce rivalry and price war throughout the early to mid 1980s.
After moving its corporate headquarters to Bellevue, Washington in January 1979, Microsoft, led by founders and childhood friends Paul Allen and Bill Gates, entered the operating system (OS) business in 1980 with its own version of Unix called Xenix. The company really came into its own (and came to dominate the OS market), however, when IBM licensed MS-DOS from them in early 1981. Other innovations pioneered by the company at the time included the Microsoft Mouse in 1983, as well as a publishing division - Microsoft Press - founded the same year.
Unfortunately, just as the company was really beginning to take off, tragedy struck one of the two men at the heart of the operation. In 1983, Paul Allen was diagnosed with Hodgkin's lymphoma - a type of cancer that affects white blood cells. Allen would later claim, in a memoir he wrote about his time at Microsoft, that his old friend Gates attempted to dilute Allen’s share of the company following his diagnosis. According to Allen, Gates told him that this was because he [Allen] was “not working hard enough for the company”. Allen later invested in low-tech sectors, sports teams, commercial real estate, neuroscience, private space flight, and other ventures unrelated to Microsoft.
Meanwhile, Gates, who was already serving as CEO and Chairman of the Board of the burgeoning enterprise, began to take on an almost exclusively executive/management role, leaving the actual programming and design to others. Gates and Allen’s relationship had been strained for months prior to Allen’s diagnosis, primarily over disagreements about equity in the company. Gates later admitted that he “regretted” his and Allen’s falling out. The two reconciled at the end of the decade and resumed their friendship, which lasted until Allen’s death in 2018.
Meanwhile, in Kyoto, Japan, key events in the late 1970s and early 80s were shaping the destiny of another iconic video game company - Nintendo. Two of these events occurred in 1979: its American subsidiary - Nintendo America - was opened in New York City, and a new department focused on arcade game development was created. In 1980, one of the first handheld video game systems, the Game & Watch, was created by Gunpei Yokoi from the technology used in portable calculators. It became one of Nintendo's most successful products, with over 43.5 million units sold worldwide across the 59 games made for it during its production run.
Around this time, Nintendo also entered the arcade market with titles like Sheriff, Radar Scope, and, most iconically, Popeye. The last of these, helmed by lead designer Shigeru Miyamoto, would become one of the most popular games of the decade.
Starring the iconic “sailor man” from the cartoons, the game made history as one of the first of a new genre - “platformers”. Players control Popeye as he climbs a series of ladders and avoids rolling barrels, with his ultimate objective being to reach Olive Oyl and rescue her from that brute Bluto. Players can also destroy the barrels and make Popeye nigh-invincible for a short time by grabbing a can of spinach, which is placed at a different location in each level. Over 15 million units of the Popeye game would be sold in America; when combined with the home console port - produced for the ColecoVision - that number climbs considerably higher. Indeed, after the flop of Radar Scope in the US, Popeye can largely be credited with saving the fledgling Nintendo America from ruin.
Above: Promotional poster for Popeye - the first arcade game that would make Nintendo a household name in America (left). The game’s lead designer was rising star Shigeru Miyamoto. A prototype of the Nintendo Advanced Video System (AVS), built in late 1982 and released in 1983, distributed by Atari in North America (right).
The only downside to Popeye was that it featured licensed characters, rather than originals that would belong to Nintendo exclusively. If the company wanted to continue producing games starring Popeye and his supporting cast, then they would need to keep renegotiating contracts with King Features over the rights. Thus, company management tasked Miyamoto with creating a cast of original characters who could star in Nintendo’s next games. This he took up with gusto, creating a series of games that would serve as the foundation of Nintendo’s roster moving forward: Donkey Kong in 1982; Mario Brothers in 1983, starring Mario and Luigi Mario, a pair of Italian-American plumbers; and later, The Legend of Zelda in 1986.
All of these and more would eventually be released on Nintendo’s “Advanced Video System”, a redesign of the company’s earlier “Famicom” console, set to be released on the North American market and distributed by Atari, Inc. Though plans were originally made for an advanced 16-bit console that was really more of a home-computer hybrid, these were later scrapped. Nintendo management feared that the keyboard and other accessories would “overwhelm” non-technophiles and “frighten off” the emerging market of “casual” video game fans. Thus, the decision was made to stick with the more familiar 8-bit, gamepad-controller setup and game cartridges, rather than the more advanced CD-ROMs, which had been experimented with on the Japanese market.
Though Atari and Nintendo would both be severely shaken by the so-called “Video Game Crash” that occurred the following year (caused largely by a glut of low-quality games and an oversaturated market), both companies would, thanks to strict quality control and a loyal customer base at the heart of their operations, emerge on the other side intact. Both would continue to dominate the video game industry until more Japanese companies - Konami & SEGA - and later, tech companies - Microsoft, Sony - entered the game.
Next Time on Blue Skies in Camelot: More US News & Politics from 1982