Computers Never Get Faster

Back in the digital Mesolithic era (i.e. the 1980s), I can remember very frustrating times waiting for the computer to do its thing. Painful things like performing calculations, reading floppies and, most of all, compiling programs. Many times I would hit the return key and go off for a coffee break. Who hasn’t done this?

One of the funniest software advertisements I have ever seen was a picture of a skeleton covered in cobwebs sitting in front of his computer monitor. On the monitor the message read "compiling, please wait ..."

But wait a minute, that was back in the 80s. Those were the times when 8 MHz processors and 10 MB hard drives were top of the line. The PCs of the current generation outperform those old museum exhibits a thousand-fold, don’t they?

My answer is NO!!! I repeat, NO!! In fact, I assert that in the last twenty years the speed of the average PC has changed very little.

To show that I have not flipped my lid, I’d like to explain why this is so. The key is in how one defines “speed”. In the virtual (i.e. non-real) world, computer speeds have been constantly increasing. This trend will presumably hold for the foreseeable future. At the same time, back in the real world, the average user sits and waits just as long for his/her computer to do its thing as he/she did twenty years ago. Back in Mesolithic times I would turn on my PC in the morning and then go get a coffee and some toast. By the time I got back upstairs the computer was finishing up its booting process and I could get to work. Today I DO THE SAME THING. The computer is faster, it can do more, and it does do more, but in terms of real time, that is, actual minutes and seconds, there has been little increase in speed.

Why is this so? The technical answer is that computers nowadays are doing much, much more than their counterparts from the past. Back then ASCII displays were the standard. Printing a line on the monitor was equivalent to moving a string of bytes into a specific area of memory. Nowadays things are much more complicated. The human language has to be determined. A font has to be found. The size and weight of the font must be calculated, the resulting image must be transferred to the graphics display, and so on and so forth. The computer of today has much more to do than the computer of old. Fortunately the computer of today is powerful enough to do the extra work without too much trouble. However, in the end, the net effect is zero. The amount of time the user waits for the computer is roughly the same. Back then I waited for the floppy disk to be read. Now I wait for the Internet to respond. Back then I hit page down and the next screen showed up in the blink of an eye. Now I hit page down and I can see the computer redrawing all the graphical elements on the screen.
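To give a flavor of how simple the old path was, here is a rough sketch, in the spirit of a DOS-era program, of printing text by writing bytes straight into video memory. The 0xB800 segment, the two-byte character/attribute cells and the MK_FP helper are standard 80x25 color text-mode details for 16-bit compilers such as Turbo C; the function names and layout are my own illustration, and the code will not build on a modern operating system.

    /* Sketch: DOS-era text output by writing directly into video RAM.
       In 80x25 color text mode the screen lives at segment 0xB800;
       each cell is two bytes: an ASCII character and a color attribute.
       Assumes a 16-bit DOS compiler (e.g. Turbo C) for far pointers. */
    #include <dos.h>   /* MK_FP */

    #define COLS 80

    void print_at(int row, int col, const char *text, unsigned char attr)
    {
        unsigned char far *cell =
            (unsigned char far *) MK_FP(0xB800, 2 * (row * COLS + col));
        while (*text) {
            *cell++ = (unsigned char) *text++;  /* character byte  */
            *cell++ = attr;                     /* attribute byte  */
        }
    }

    int main(void)
    {
        print_at(0, 0, "Hello", 0x07);  /* light gray on black, top left */
        return 0;
    }

No fonts, no layout, no compositing: the string appears the instant the bytes land in memory, which is why those machines could feel quick despite their modest clock speeds.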

But wait a minute, you are comparing apples with oranges. Am I really? Of course, if the computers of today were restricted to the tasks that computers performed twenty years ago, they would be faster almost beyond comparison. But that’s not my point. The real answer to my assertion lies in the psychology of mankind. Or, to put it another way, “how much slowness can the average user tolerate?” When a computer is “doing its thing”, at what point does the user throw up his hands and say “I’ve had enough”? I assert that this measure of human patience has not changed in the last twenty years. When the user hits the return key or clicks the mouse, the internal human clock starts ticking; at a certain point the human gets restless. This marks the boundary of acceptable computer speeds. If the computer finishes “its thing” beforehand, everything is OK. If it requires more time, then it is slow and people will use it reluctantly.

Software engineers intuitively know where this boundary lies. After all, software engineers are also human [at least most of them ;)]. The development of hardware and software over the last twenty years follows a general pattern. As hardware speeds increase, software engineers recognize that they can now do more things before the user gets frustrated. So they pack more processing into the software. At some point they do too much and the computers are once again too slow. So the hardware people make a faster chip, or add more memory, or build a faster hard drive, and so on. This increases the speed of the computers once again. And once again the software people get greedy and pack more processing into the software. The whole process keeps repeating itself.

The speed of computers is not measured in MHz or GHz but in minutes and seconds. It is measured in the level of human patience. How much time is the average user willing to wait for the computer to do its average thing? Whatever this value may be is not important. The point is that this value has not changed in the last twenty years. And because this value has not changed, the speed of computers performing an average task has also remained stable.
