A brief history of the IBM PC, thirty years on.

Thirty years ago today, IBM officially launched a new office automation device at the Waldorf Astoria hotel in New York. August 12, 1981, marked the launch of the IBM 5150 Personal Computer, the original IBM PC: a device which, whilst not revolutionary itself, started a revolution in the way we manage and consume data.

Today Dr Mark Dean of IBM thinks that the PC is now dead and that we should move on to something else, whilst Frank X. Shaw of Microsoft thinks we're now in the PC-plus era, not the post-PC era.
So how did we get here, and is the PC dead, or just transmuting into something else?

The Genesis of the IBM PC

To my mind, what made the PC great was that, although IBM could have built something more powerful and more proprietary, they instead used the commodity Intel 8088 CPU running at 4.77MHz.

This wasn't the fastest or most capable processor available even then, and it would be considered hopelessly underpowered now. For comparison, my HTC Desire HD phone has a minimum clock speed of 249MHz and normally runs at 1GHz: roughly 52 to 210 times faster, even before we consider IPC (instructions per clock) and other processor efficiency gains.
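The clock-speed comparison above is simple enough to check directly (this looks only at clock rate, exactly as the text does, ignoring IPC):

```python
# Quick check of the speed-up claim: phone clock speed vs. the 8088's.
base_mhz = 4.77          # IBM PC's Intel 8088
low_mhz, high_mhz = 249, 1000  # HTC Desire HD minimum / normal clock

print(round(low_mhz / base_mhz))   # ~52x at the phone's minimum clock
print(round(high_mhz / base_mhz))  # ~210x at the phone's normal clock
```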

Mark Dean led the team that developed and publicly documented the expansion bus. Of all the elements of the PC's design, I think the most important are the five expansion slots, which let the PC adapt from a basic model into a flexible device suited to different needs and requirements. By opening the interface specification, IBM allowed other companies to offer expansion cards that IBM hadn't even thought of.

Flexibility through expansion

One of the slots typically held a Monochrome Display and Printer Adaptor (MDA), providing a text display and a parallel port for connection to a printer. Alternatively, a Color Graphics Adaptor (CGA) card provided a prodigious 320×200 pixels in four colours, or 640×200 in monochrome. A floppy-disk controller supported Single-Sided/Double-Density 5¼″ floppy disk drives of 160KB each. (Later, Double-Sided/Double-Density drives became available, increasing capacity to 320KB, and with an extra sector per track to 360KB.) Serial ports allowed connection to a modem for access to the various communications services of the day.
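Those floppy capacities fall straight out of the drive geometry (sides × tracks × sectors per track × 512-byte sectors), which a few lines of arithmetic make plain:

```python
# Sketch: deriving the 5.25-inch floppy capacities mentioned above
# from the drive geometry. All these formats use 40 tracks per side
# and 512-byte sectors.
def capacity_kb(sides, tracks, sectors_per_track, bytes_per_sector=512):
    """Total formatted capacity in kilobytes (1 KB = 1024 bytes)."""
    return sides * tracks * sectors_per_track * bytes_per_sector // 1024

print(capacity_kb(1, 40, 8))  # 160 KB: single-sided, 8 sectors per track
print(capacity_kb(2, 40, 8))  # 320 KB: double-sided
print(capacity_kb(2, 40, 9))  # 360 KB: the extra (ninth) sector per track
```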

The IBM PC package covered both input and output, each considered as part of the whole. Output was handled by a bi-directional dot-matrix printer, and the input was via a good quality keyboard, drawing upon the heritage of IBM’s golf-ball “Selectric” electric typewriters. The keyboard had a good feel and quite a loud switch action, suitable for touch-typists.

Software drove the hardware

Microsoft might claim that their operating system (MS-DOS) was the key to the PC's success, but I'm convinced it was the availability of critical applications. The VisiCalc spreadsheet, the MultiMate word processor and dBase III were the key applications. This demonstrates yet again that content (and content generation) is more important than the platform itself.

It didn't take long, however, before limitations in the IBM PC hardware led to innovation. This was certainly the case when Lotus (now itself part of IBM) launched the Lotus 1-2-3 spreadsheet in 1983. It took over from VisiCalc as the de facto standard spreadsheet, in part because it offered a larger working area within each sheet, but primarily because it supported graphics. The larger spreadsheets built in Lotus 1-2-3 subsequently drove demand for memory beyond the 640KB limit of MS-DOS. Within two years, users were demanding more than ten times the memory the basic PC started out with.

This demand later drove the Expanded Memory Specification (EMS) and its boards, which could page-switch up to 4MB of RAM through windows in the upper 384KB of the address space that the original PC design had reserved for expansion ROMs and I/O. This allowed the continued growth of spreadsheets and other memory-intensive applications.
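The page-switching idea is easier to see in code. Here is a toy model (an illustration of the concept, not the full LIM EMS specification): a 64KB page frame in upper memory exposes four 16KB windows, each of which can be remapped to any 16KB page of a much larger expanded-memory pool.

```python
# Toy model of EMS bank switching. A small, fixed "page frame" in the
# CPU's address space is a movable window onto a much larger pool of RAM.
PAGE_SIZE = 16 * 1024  # EMS pages are 16 KB

class ExpandedMemory:
    def __init__(self, total_pages):
        self.pages = [bytearray(PAGE_SIZE) for _ in range(total_pages)]
        self.frame = [0, 1, 2, 3]  # which pool page each of 4 windows shows

    def map_page(self, window, page):
        # The equivalent of programming the EMS board's mapping registers.
        self.frame[window] = page

    def read(self, window, offset):
        # A CPU read inside the page frame really hits the mapped pool page.
        return self.pages[self.frame[window]][offset]

    def write(self, window, offset, value):
        self.pages[self.frame[window]][offset] = value

ems = ExpandedMemory(total_pages=256)  # 256 x 16 KB = a 4 MB pool
ems.write(0, 0, 0x42)                  # write via window 0 -> pool page 0
ems.map_page(0, 200)                   # remap window 0 to pool page 200
print(ems.read(0, 0))                  # 0 -- page 200 is still empty
ems.map_page(0, 0)                     # map the original page back in
print(ems.read(0, 0))                  # 66 (0x42) -- the data survived
```

The data never moves; only the mapping changes, which is why a real-mode program confined to a 1MB address space could still work on megabytes of spreadsheet.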

Similarly, the Hercules Graphics Card (HGC) was developed to provide both the text display of the MDA and graphics, using a 720×348 array of pixels, and it drove the same monochrome monitor as the MDA. Programs had to ship their own display drivers to support it: Lotus 1-2-3 could use the HGC for graphics, and could also use a special font to place more data on the screen. Until the advent of graphical operating systems, each program needed to provide individual drivers for each card.


March 1983 saw the launch of the IBM PC/XT (Model 5160), with a single floppy disk drive, an internal Seagate ST-412 10MB hard disk on a Xebec controller using the ST-506 interface, and an increase from 64KB to 256KB of RAM on the main board, partly in response to the demand for extra memory.

Juggling floppy disks to boot the system, load an application, and then load, work on and save a data file before moving on to a different task was a challenge. Installing applications and data files on a hard disk made using a PC for multiple tasks much easier. Databases in particular benefited, and data files could grow considerably: you were no longer limited by the space on a floppy disk, or by its (comparatively) slow access.

The expansion slots on this machine moved closer together, defining the width and spacing of expansion slots on all subsequent devices, still used on motherboards today. This became the ISA expansion bus.

In 1984, IBM improved the PC again with the PC/AT (Model 5170), built around a new processor, the Intel 80286 at 6MHz. Memory could now be expanded to 16MB, and the Enhanced Graphics Adapter (EGA) provided 640×350 pixels in 16 colours. The ability to address more than 1MB of memory over the 24-bit address bus would not be fully exploited until 32-bit operating systems arrived; in the meantime, device drivers such as HIMEM.SYS paged the extra memory into the 1MB address space, giving access in much the same way as the EMS boards before.
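The memory ceilings in this history follow directly from address-bus width: each extra address line doubles the addressable range. A quick calculation ties the numbers together:

```python
# Sketch: why each address-bus width implies its memory ceiling.
def addressable_bytes(bits):
    """Bytes reachable with a given number of address lines."""
    return 2 ** bits

print(addressable_bytes(20) // 1024**2)  # 1  -> the 8088's 1 MB real-mode limit
print(addressable_bytes(24) // 1024**2)  # 16 -> the 80286 / PC-AT's 16 MB
print(addressable_bytes(32) // 1024**3)  # 4  -> 4 GB for 32-bit operating systems
```

(The 640KB MS-DOS limit sits inside the 8088's 1MB: the upper 384KB was reserved for ROMs and I/O, as described earlier.)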

A change of direction with the PS/2

IBM, however, made what I regard as a mistake when they chose to become less compatible with other devices in the transition from the PC to the Personal System/2 (PS/2), moving from the open ISA bus to the proprietary Micro Channel Architecture (MCA) for expansion cards.

This allowed IBM to advance the platform significantly, introduce a host of new technologies, and regain technical leadership, but it cost them the competitive advantage that the openness of the original PC platform had earned. With other options now available, customers often moved to a more compatible machine: an IBM PC-compatible that, ironically, was no longer an IBM PC.

The revolutionary change brought by the PS/2 did deliver some advances that were rapidly adopted by the IBM PC-compatible platform, including 3½″ floppy disk drives, VGA (640×480) graphics, the smaller 6-pin mini-DIN PS/2 keyboard and mouse connectors, and the memory SIMM (rather than banks of SIPPs, or even individual RAM chips).

The rise of the IBM PC-Compatible

But IBM didn't have everything their own way: the choice of commodity hardware and open interface specifications quickly allowed copies to be made. The one thing that wasn't published was the BIOS source. Using clean-room reverse-engineering techniques, a few companies were able to produce a different but compatible BIOS.

Compaq (now part of HP) quickly produced a BIOS that was very closely compatible with the IBM original, and packaged it in the carry-top transportable form factor that the Osborne 1 had pioneered a few years earlier, as the Compaq Portable. Olivetti produced the M24, an 8MHz Intel 8086 PC-compatible I'm intimately familiar with, since it was the first PC I spent a lot of time with. Dell also started in the IBM PC-compatible business, with a configure-to-order service for systems delivered by mail order.

Other companies developed systems, motherboards, and expansion cards of many sorts. Some combined large numbers of discrete components into a handful of integrated chips. Amongst the first was the Chips and Technologies NEAT chipset, which allowed designers to deliver cheaper, more reliable motherboards. It certainly simplified my life when I was building systems at the time, with substantially fewer failures on first install.

Indeed, integration has touched almost every element of the PC marketplace since then. For example, the three cables of a dual-hard-disk ST-506 setup are now a single narrow SATA cable per drive. Integration has also shrunk system sizes, from AT-sized boards to ATX to ITX platforms. The IBM PC has now spawned a multitude of form factors and capabilities, from the desktop to the tower, to rack-mounted servers, laptops, netbooks, home entertainment servers and gaming platforms.

Content creation and consumption

The other major change has been the rise of networking and the Internet itself: moving from a stand-alone device to one connected with many others. Sharing and integrating data and content across an organisation, within a site, across sites and across continents became commonplace. Ideas and data are now shared between individuals, no matter where they are in the world.

Content that isn't shared with a wider audience is effectively worthless, consumed by only a few. The Internet and the World Wide Web allowed created content to be shared globally, and because of its ubiquity the PC was the first real platform to make this possible. Indeed, networking is driving the next innovation: embedding data and content in the cloud, with the PC just one of the surfaces that collects, edits and consumes content.

The PC is not dead; it has simply moved past being the single point of access to content. It has also helped blur the boundary between recreational home use and use in the office. The PC still has a future, as part of a greater ecosystem of devices for the creation and consumption of content, with the data held in the cloud.

The PC lives, long live everything else (and the IBM PC)!

Note: Just to prove a point, I compiled this post on a desktop, notebook and smartphone. Content is now everywhere!
Update: 16 July 2016 – changed stock references for HP to reflect Hewlett Packard Enterprise following the split of HP
Update: 13 July 2017 – the original blog entry no longer exists; here is the original link for reference
John Dixon

John Dixon is the Principal Consultant of thirteen-ten nanometre networks Ltd, based in Wiltshire, United Kingdom. He has a wide range of experience, including (but not limited to) operating, designing and optimizing systems and networks for customers from global to domestic in scale. He has worked with many international brands to implement both data centres and wide-area networks across a range of industries. He is currently supporting a major SD-WAN vendor on the implementation of an environment supporting a major global fast-food chain.
