All good things come to an end: an iMac retrospective

Photo: The Original iMac

By the mid-1990s, Apple was in trouble. It was losing money and market share. Apple computers were expensive compared to generic beige-box PCs, not by a little, but by a lot. Windows 95 made IBM-compatible PCs truly easy to use for the first time, making it increasingly difficult for consumers to understand why they should pay more for an Apple computer when they could get everything they wanted in a PC costing half as much. If there was a “Macintosh advantage”, a succession of uninspiring corporate officers had been unable to get that message out, while Microsoft aggressively pushed its Windows operating system in the key home and office markets.

The computers did little to sell themselves: the PowerMacs were fine machines in many ways, but each generation of Pentium processors always seemed that bit faster. Maybe it was just the megahertz thing, where, as Apple claimed, a Pentium had to be clocked that much higher just to keep up with a PowerPC chip. At the consumer end of the market, a bunch of very ordinary Performa computers simply failed to ignite any sort of user demand. Many were deliberately crippled so they wouldn’t compete with the more expensive PowerMac models, and some were just plain bad by any standard: ugly, slow, apt to crash, and pathetically poor gaming machines with minimal graphics acceleration, if any at all. Assuming Apple still managed to sell you a Mac, you were tied to proprietary (read: expensive) peripherals like printers and joysticks, while PC owners could choose from ten times the range at half the price. Above all, the jewel in the crown, the Mac operating system, had some serious problems. System 7.5 will probably be remembered by Mac users of the time as being both innovative and incredibly buggy. It shipped on something like twenty floppy disks, so you could easily spend an afternoon installing the software, and once up and running you’d be thrilled to spend yet more time downloading updaters galore. Remember, this was the age of 14.4 and 28.8 kbps modems, and downloading ten megabytes of data could easily take all night.

Apple was dying, according to the pundits; more often than not, if Apple was mentioned in the financial pages, its name was prefaced with the word “beleaguered”. People who might have bought a Mac thought twice, not wanting to be lumbered with the computing equivalent of Betamax. Apple needed a White Knight, and in 1997 it got one, except this one wore black turtleneck sweaters and blue jeans. Enter Steve Jobs, initially as an adviser to the then-CEO, Gil Amelio, but after a difficult few months Amelio moved on and Jobs remained. Jobs had founded Apple with boyhood friend Steve Wozniak in 1976, but he had left the company in 1985 shortly after losing a boardroom tussle with CEO John Sculley over the future of the company. Computer industry historians will argue over the significance of Jobs’ return and the departure of Amelio for as long as geeks care about such things. There’s no question that this was a turning point for Apple. What is open to debate is how much of the success that followed was a product of Amelio’s groundwork on the one hand, or Jobs’ vision on the other. What Jobs did bring to Apple Computer and Mac users everywhere was something that had been missing for years: belief in themselves. A cynic could put this down to the famous Steve Jobs “reality distortion field”, a phrase coined by Apple employees during the early 1980s to describe his preference for passion over practicality, but when you’re in a boat taking on water, having a captain who assures you you’ll get home safely can be a very comforting thing.

The Reality Distortion Field

One of Jobs’ most important allies was a young British designer, Jonathan Ive, whom he promoted to Executive Vice President of the Apple Industrial Design Group (IDG). The IDG had turned out some remarkable products over the years, and garnered more design awards than any other computer manufacturer. Some, like the original compact Macintosh and the “Blackbird” PowerBook 500 series from the early nineties, blended form and function so well that they have become iconic, the benchmarks against which new computer designs are compared. Others, like the Newton, were innovative but flawed, perhaps even ahead of their time, and will be remembered more as failures or costly experiments. A few, like the Twentieth Anniversary Macintosh, were overpriced and underpowered, great-looking toys without an obvious role in the product line-up. But though the IDG reflected the soul of Apple Computer, from the late eighties onwards the company’s CEOs had become more interested in production costs and marketability than design. In particular, Apple had a problem that would seem downright paradoxical given the decline in its market share: it couldn’t always produce computers fast enough to satisfy demand. So a great deal of effort during the early and mid nineties was put into developing common motherboards and components for as many computer designs as possible. The hope was that if Apple could increase the efficiency of the production side of the business, it could deliver a larger volume of computers to consumers, more quickly and more cheaply than ever before.

With Jobs’ return, the reality distortion field changed all of this. Steve Jobs has always had an eye for design and a sense for what doesn’t just feed existing markets but creates new ones. His vision for the IDG involved radical designs to allow Apple Computer to re-define computing away from the beige boxes that dominated the marketplace. The external design, the form factor, of the computer would be just as important as the power of the processor or the speed of the hard drive. In short, computers would become fashionable, as much an expression of the personality of the user as a tool for people to get work done, and the poster boy for this revolution would be the iMac.

Though Ive had a reputation for being one of the most innovative designers of his generation, Jobs’ design brief required not just great design but an engineering tour-de-force as well. The computer he wanted from Ive and his team needed to be easy to use but powerful, beautiful yet cheap to manufacture, and to embrace new, cutting-edge standards in connectivity while remaining compatible with low-cost consumer-level devices like inkjet printers. Above all, it needed to be easy to set up and connect to the Internet. Jobs’ goal was a computer that wasn’t just a way to get connected to the Internet, but a computer designed for the Internet.

The IDG delivered on this design brief with an all-in-one Mac reminiscent of the original Macintosh, but in many ways a departure from all previous Macintosh computers. It eschewed the two interfaces that had been ubiquitous on Macs for the previous ten years: the Apple Desktop Bus (ADB) for low-throughput input devices like mice and keyboards, and the Small Computer System Interface (SCSI) for devices that needed fast data transfer rates, like hard disks and optical drives. Instead, the designers included the versatile but at that time still unfamiliar Universal Serial Bus (USB) for both external drives and input devices. Even more surprising was the absence of a floppy disk drive. Jobs felt that the floppy disk was obsolete both as a medium for installing software and for storing files and documents. It could only hold a small amount of data, less than 1.4 MB, and that quantity of data, he reasoned, could be transferred between computers much more easily using the Internet. The built-in CD drive would be used for installing software, while Zip disks or CD burners, connected via USB, would handle distributing or backing up large amounts of data. Although USB was much slower than SCSI, unlike SCSI it was “hot-swappable” and fundamentally idiot-proof: USB cables either fitted into the correct ports or not at all, and USB devices could be plugged and unplugged with the computer switched on without risk of damage. SCSI might have been a higher-performance interface, but it was troublesome for home users without expert knowledge of the arcane lore of SCSI chains and termination. As well as USB, the iMac came with no fewer than three ways to connect to other machines: a 56k modem, Ethernet, and an infrared port that could be used to beam information to another computer with an infrared port, such as another iMac or a PowerBook.

The iMac was different in subtle ways, too. The earliest Macintosh computers were silent, but as processors got bigger and faster they got hotter, and with more and more high-performance components packed inside them, noisy fans became essential to keep them working properly. Jobs wanted very much to make the iMac silent, and although the original iMac did contain a small fan, from 1999 onwards iMacs lacked fans altogether, relying instead on clever design to channel cooling convection currents of air through the machine. More obvious was the difference in styling. It didn’t use the metallic beige (referred to as “platinum” by Apple) that had graced practically every Macintosh since 1984. Instead, it was blue, Bondi Blue to be precise, after the famous beach in Australia, and as the “Bondi Blue iMac” it sounded much more like something from Home Depot than CompUSA.

The first time I saw an iMac was in the window of Peter Jones, a department store in London, as I walked home from the Natural History Museum where I worked to my apartment in Battersea. It was dark and cold, and I was on the other side of the road, but sitting by itself in a brightly lit corner display was this blob of plastic looking for all the world half computer and half confectionery. I crossed over the King’s Road to get a better look, almost pressing my nose to the glass like the young hero of Jean Shepherd’s “A Christmas Story” staring at the glorious Red Ryder BB Gun. It was compact, and it was much more powerful than the PowerBook 3400 that I owned. But most of all, it looked really, really cool. It was like something from a sci-fi movie set. It was semi-translucent, so you could easily make out the inner workings, but though obviously a computer it was also weirdly organic, like an alien egg. Besides a unique shape and colour, it had texture too, which combined with the see-through-ness gave the iMac a “touch me” quality not normally associated with hi-tech.

The iMac’s Reception

Inevitably, critics were quick to point out the compromises and design flaws. For a start, the “hockey puck” mouse was a triumph of style over substance: pretty to look at, it was awful to use. USB was a fine interface for peripherals that didn’t need fast data transfer, but for those that did it was very limiting. Compared with their SCSI equivalents, USB CD burners and external hard drives were slow, and represented a bottleneck to productivity in offices and graphic design studios. Floppy disks may well have been obsolete in 1998, but even today we still use them, not because they are good but because they are convenient. Not everyone has the Internet, and even those who do can find sending files as e-mail attachments slow and confusing. Power users berated the lack of expansion. With only a slow USB port for external devices and disks, the inability to put a CD writer or second hard disk inside the computer was felt even more keenly. Even by the standards of 1998, a 233 MHz G3 on a 66 MHz bus was hardly a speed demon, and coupled with an anaemic 2 MB graphics card this was not a machine for hardcore gamers, a serious flaw in a machine targeting the lucrative home market in particular. Admittedly this video memory could be upped to 6 MB, and at least one manufacturer produced a graphics accelerator for the mysterious “mezzanine slot” that sat empty in all shipped iMacs. Another manufacturer even managed to design a SCSI card to slip into this slot, despite warnings from Apple that such upgrades would definitely void the warranty. (Quite what Apple had in mind for the mezzanine slot was never known; it was dropped from all the later iMacs, and even the current G4 iMac lacks any sort of expansion slot at all. It was probably used by Apple engineers while developing and testing the iMac, and was never meant to be used by consumers.)

So the iMac had flaws. But it had real strengths, too, and these ultimately so far outweighed its limitations that Apple was able to shift 400,000 iMacs worldwide in the first month of sales. It was cute, it was colourful, and it had enough speed and versatility to make it a useful choice for home users and small offices. Sales of the iMac were driven by its looks and its charm; in short, it was the fashion accessory of the moment. iMacs popped up in the reception areas of trendy firms that otherwise used generic Windows machines, not because they needed a Mac but because the iMac looked good. Art galleries used iMacs to display their catalogues, and film producers brought them on set like character actors.

R.I.P., iMac (1998–2003)

Over the next five years the iMac was continually upgraded, though its form factor remained the same. The mezzanine slot was dropped early on, as was the infrared port, but otherwise the only significant changes were in the hardware specifications. Each new version had a faster processor, a bigger hard drive and more memory. The video cards got better too, and there was a greater variety of optical drives available, including CD writers and combination CD and DVD players. A succession of “Digital Video” editions came with FireWire and iMovie software, making them useful workstations for lightweight video editing, particularly for home moviemakers.

Technically insignificant, but very important for the marketability of the iMac, was the release of each new version in a variety of colours. The Revision D iMacs came in “five fruit flavours”, blueberry, grape, tangerine, lime and strawberry, but shared the same 333 MHz processor and hardware specifications. Some flavours sold better than others, and people even debated what your favourite flavour said about your personality. For the first time a computer was being sold by flavour: Apple had managed to completely change the consumer electronics market, shifting the attention away from the nuts-and-bolts of processor speed and operating system and towards seemingly trivial issues like the shape and colour of the computer. A few PC manufacturers followed suit, but none really captured the public imagination in the same way, and the PC marketplace still largely consists of beige or black boxes, external monitors and generic keyboards and other components. Of course, the danger with fashion is that it is fickle, and with each new year Apple needed to change the iMac enough to keep it desirable. The graphite iMac was a bit more transparent than earlier iMacs, and its smoked-glass appearance was decidedly classy, setting it apart from the brightly coloured models and helping it find its way into offices and boardrooms where the fruitier iMacs would never have made it. The “flower power” iMacs went in the opposite direction; their hippy-trippy patterns were meant to appeal to the young, and perhaps the young at heart with memories of Woodstock.

But by now Apple had moved on. Sales of the iMac had propelled Mac software and hardware sales generally. Apple had turned around; it was now a prosperous computer company making money while many others were losing it. No longer was it beleaguered; instead it was described as innovative, an industry leader. G4 processors and the new OS X operating system had revitalised the company and spurred Apple into a new design mode characterised by a feel of discreet power rather than overt friendliness. The iBook, Apple’s consumer portable, dubbed the “iMac to go”, had been a great success, but by 2001 it had been completely redesigned, away from a curvy, brightly coloured shell to a sleek slab of translucent white plastic. In its new form, the iBook was a hint of what was to come for the iMac. The 2002 iMac was a totally different creature to the original, and owed perhaps more to the ill-fated G4 Cube than to the original iMac. Although completely new in design, it retained much of the original iMac rationale: it was silent, it couldn’t be expanded, and it was easy to set up and connect to the Internet. To quote Wayne Campbell from the movie Wayne’s World: “Ah yes, it’s a lot like ‘Star Trek: The Next Generation’. In many ways it’s superior but will never be as recognized as the original.”

So farewell, then, iMac. You saved Apple, and you saved the Macintosh. Not bad for a blue, bubble-shaped computer that didn’t even have a floppy drive.

Dr. Neale Monks