Tag: computing

Vacation Reading: The History of Computing

Sitting poolside with Gödel, Escher, Bach
Vacation reading

There are certain sub-genres that appeal to me more than others. Baseball history is one example. The Apollo space program is another. In each of these sub-genres I’ve read more than my fair share of books. Another sub-genre I enjoy that I have recently been revisiting is the history of computing.

Perhaps because I grew up with computers, I find a particular fascination in them and their impact on society. I am particularly fascinated by their evolution from the early time-sharing systems to what we carry in our pockets today. I recently read (and re-read) several books in this sub-genre. I re-read Steven Levy’s great history of computing, Hackers. I read James Gleick’s The Information, which is all about information theory. I re-read Walter Isaacson’s The Innovators.

Several of these books refer to Douglas R. Hofstadter, and in particular, his book Gödel, Escher, Bach. It got me curious about the book, which won the Pulitzer Prize for general nonfiction in 1980. After several friends gave the book high marks, I decided I should give it a try. It is tangentially related to computing in that it discusses artificial intelligence and incompleteness.

It turns out the book is not available as an audiobook or an e-book, so I ordered a paperback copy. It so happens that I am on vacation for the next ten days or so and decided that this book would make good poolside reading. (I enjoy when people come up and ask me what I am reading. I show them the book and they often ask what it is about. It will be interesting to figure out how to explain this one.)

Not long ago I wrote about hard books to understand. Gödel, Escher, Bach came up in the discussion of that post. Last night, I got through the 20-page preface to the 20th anniversary edition of the book. The first half of that introduction tried to explain the book, and I found that I was at the limits of my comprehension. I read some passages over and over, and when I finally thought I understood what Hofstadter was saying, I would encourage myself in the margins, like this:

An annotated page from my copy of Gödel, Escher, Bach
Encouraging my understanding in GEB

It will be interesting to see whether I will be able to make much sense of this book at all.

I am also particularly interested in the history of Unix, and until recently, hadn’t come across a good, succinct history of the operating system. A recent search, however, turned up Unix: A History and a Memoir by none other than Brian Kernighan, who was at Bell Labs when Ken Thompson and Dennis Ritchie created the operating system. When I get bogged down in GEB, I can turn to Kernighan for some relief.

Finally, I always have an audiobook queued up for those times when I am walking, driving, exercising, or not somewhere that I can sit and read. In keeping with the history of computing theme, I’ve got Nicholas Carr’s The Shallows queued up.

Eventually, this sub-genre phase will pass and I’ll move on to other things. I imagine that the butterfly’s wings will flap rapidly around GEB in particular.

Moore’s Law

Moore’s Law states that the transistor density of semiconductor chips doubles roughly every 18 months. Another formulation of this law says that RAM storage capacity increases at about the same rate as processing power.

I mention this because I got some additional RAM for my work laptop today, bringing my total to 2 GB. Now 2 GB is a lot of RAM for a laptop, but I do a good deal of Visual Studio development. In any case, when the RAM was installed this morning, it got me thinking about Moore’s Law and a real-world example.

I’ve been at my job ever since graduating college in 1994. When I got here, the first computer I was given had 16 MB of RAM, and my machines have steadily gained RAM ever since. Today, nearly 12 years later, my computer has 2 GB of RAM. To keep everything in the same units, let’s call 2 GB the equivalent of 2000 MB of RAM.

It is easy to compute that 2000 MB of RAM is 125 times greater than 16 MB. So in 12 years, the amount of RAM I use has increased 125-fold.

Moore’s Law implies a doubling of RAM every 18 months. In 12 years, there are (12 years × 12 months)/18 months = 8 sets of 18 months. This means my RAM should have doubled 8 times since I started here. Starting with 16 MB and doubling 8 times would result in roughly 4000 MB of RAM. In actuality, I now have 2000 MB of RAM, which is the equivalent of about 7 doublings instead of 8.
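For anyone who wants to check the back-of-the-envelope math, here is a quick sketch in Python. The figures (16 MB starting RAM, 2000 MB today, a 12-year span, an 18-month doubling period) are the ones from this post:

```python
import math

start_mb = 16          # RAM in my first work computer (1994)
current_mb = 2000      # RAM today, rounding 2 GB to 2000 MB
years = 12             # time elapsed
doubling_months = 18   # Moore's Law doubling period

# How many 18-month doubling periods fit in 12 years
periods = (years * 12) / doubling_months       # 144 / 18 = 8 periods

# What Moore's Law would predict after that many doublings
predicted_mb = start_mb * 2 ** periods         # 16 MB doubled 8 times

# How many doublings actually occurred
actual_doublings = math.log2(current_mb / start_mb)

print(f"Doubling periods available: {periods:.0f}")
print(f"Predicted RAM: {predicted_mb:.0f} MB")
print(f"Actual doublings: {actual_doublings:.2f}")
```

Running it shows the prediction is 4096 MB after exactly 8 doublings, while the actual 2000 MB works out to just under 7 doublings.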

What this says is that RAM has not quite kept up with Moore’s Law, at least in the case of my computing history at work.

I know this probably doesn’t mean anything to most people, but I find this kind of stuff fascinating.