When IBM designed the original PC in 1981, they gave it an “open architecture”, that is, the specifications were freely available, and manufacturers could make add-on hardware for it without paying a licence fee. IBM also licensed the system software from an external company, Bill Gates’s Microsoft. It remains a mystery why IBM chose to deal with an embryonic outfit operating out of a garage in Albuquerque, rather than an established software corporation. But the open architecture idea and the failure to buy the software outright combined in the long run to make Gates and Microsoft fabulously wealthy.
What IBM had failed to foresee was that while other manufacturers could certainly produce hardware to add value to the IBM PC, they could also produce “IBM compatible” PCs of their own. And because IBM didn’t have an exclusive deal with Microsoft, these new PC manufacturers could buy a licence (ker-ching! for Microsoft) and sell a computer that ran exactly the same software as the IBM one.
Given that the IBM PC cost as much as a new car, there was plenty of scope to undercut the price, and IBM’s market share steadily declined. But they learned their lesson. When the time came to redesign the hardware to cope with more advanced chips, IBM invented a new architecture with the interface “closed”, that is, if you wanted to make compatible hardware, you had to buy a licence from IBM.
Actually, I said they learned their lesson. That isn’t really true. It was more a case of shutting the stable door when the horse was a tiny dot in the distance. IBM’s new architecture was pretty much a flop, while the “IBM compatibles” evolved into the PCs (and Macs) we have today.
The reason I was thinking about this history lesson was that I went to the council recycling and disposal plant today to get rid of some old computer rubbish. Mainly, it was two huge CRT monitors. I’m not sure why the things are bigger and heavier than a contemporary television with the same screen size, but carrying them around does your back no good at all. There was also a very old small monitor, which must have been of eighties vintage.
But I also discarded enough actual computer guts to build three whole computers. The one that seemed to be the oldest had a date code of “91” on the case, and had an Intel “Overdrive” chip on the motherboard, which means that I had bought the chip (at a ridiculous price, probably) to upgrade my computer by a minuscule amount. That computer still had the original IBM hardware design: a real relic. It’s gone.
What struck me was the profusion of RAM modules. Each of the computers used different types, but each was stuffed to maximum capacity. 64 megabytes, I think, or about a twenty-thousandth of what you’d get in a new computer from PC World. I’m sure it was all working, but totally useless today. It’s gone.
It didn’t come easily to me: I’m a hoarder by instinct. But nothing that I threw away had any actual functional value. Some value as scrap, I’m sure, particularly the gold on the connectors, and maybe the tantalum in the capacitors or something. But no use to me. So I’ve only kept the few components which are still compatible with today’s computers. Those will probably stay in my attic until they’re out of date as well, and then I’ll throw them out. Eventually.