May 22, 2020
Looking back, I'm surprised that computing has advanced so little in the past 30 years. Given the pervasive nature of the internet and cell phones, this seems like a crazy statement, but I think that plenty of programmers would agree. To a computer user, today's computers are better, but to a computer programmer, they're almost as frustrating as they were in 1990. The issue is not the hardware; it's the software ecosystem.
Personally, I hit my stride as a programmer about 30 years ago. At that point, I had enough experience to have some idea what I was doing. It also happens that the computer I was using in 1990 was a Macintosh IIcx, and of all the computers that I have ever used, it was the one I was most excited by and happy with at the time.
Computers in 1990 were comparable in function to what we have today, but that was not true of the computers available a few years earlier. Yes, the IIcx was large, heavy and darn expensive compared to today's computers, but it had decent color graphics and a mouse, and it used windows, drop-down menus and the basic graphical user-interface framework that we continue to use. Broadly speaking, the various categories of software we use today were available for the IIcx, and the nature of the interaction was similar too – word processors, spreadsheets, databases, drafting software, image editors, IDEs, and (eventually) even a browser for the early internet.
Moore's Law may or may not be dead, but there's no question that it's headed that way. It's no longer reasonable to approach performance problems by simply waiting for the next generation of hardware, and computing in the future will look very different from computing in the past. With that era ending, now is a good time to take stock.
The big thing that has changed is that today's computers are faster and cheaper. The Macintosh IIcx ran the Motorola 68030 at 16MHz, with a 68882 floating-point coprocessor, and memory could be expanded to 8MB. Video cards did not exist in the modern sense, although certain very expensive high-end computers, which were not available to the average consumer, could be outfitted with an "array coprocessor."
The computer on my desk today is nothing special; in inflation-adjusted dollars, it cost roughly one fifth of what the IIcx cost. Even though it is much cheaper, the primary CPU on my current computer is more than 20,000 times faster than the IIcx. For floating-point operations, the difference is even larger – so much larger, particularly once the video card is included, that I hesitate to make an estimate – more than a million times faster? Today, my computer has 1,000 times as much RAM as the IIcx, and something like 10,000 times as much bulk storage (which is also dramatically faster).
On paper, computers seem much better than they were 30 years ago, but are they any better in actual use? Video is one thing we can use computers for today that was difficult or impossible 30 years ago. And computers are better as general-purpose appliances. I can hook up a digital camera or mobile phone to my computer and download hundreds of pictures in just a few minutes. There have been some genuine advances in computer science too: things like natural language processing, speech recognition, neural networks, genetic algorithms, etc.
Then there's the internet. This seems like the big way in which computing is better than it was 30 years ago – in fact, "big" is too weak a word; more like "colossal." But the internet is less about computing than it is about communication. It belongs in the same family tree as the telegraph, telephone, radio and television. The internet allows for quick and convenient access to information and remote computing resources, but computers are being put to roughly the same tasks that they were 30 years ago: databases, text processing and manipulation of graphics. The internet allows and encourages us to execute these tasks in novel contexts, and much more broadly than before, but with little qualitative difference in what is being done.
For the average computer user, there's no doubt that computers are "better" than they were 30 years ago – not 20,000 times better, but certainly more useful and almost ubiquitous. However, for a person whose job it is to make computers "do stuff" by writing software, things look a bit different. These are the people who are interested in "computing" as opposed to "computers."
Why haven't things improved for programmers in the ways I would have expected 30 years ago?
In 1991, I wrote some software on the Macintosh IIcx to allow navigation of volumetric data (from an MRI machine) in 3D. The program operated on what amounts to a 3D picture of a portion of the circulatory system (blood vessels); instead of pixels, it worked with voxels. These voxels were used to generate a triangulation of the surfaces of the blood vessels, which was then rendered, allowing the user to pan, rotate, zoom, etc. – and it worked interactively, in real time. If I were to write the same program today, it would be somewhat better, with sharper and snappier graphics, but it wouldn't be dramatically different.
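To give a sense of what the core of that pipeline involves, here is a small sketch in C. It is not the 1991 code, and it takes a deliberately crude approach (counting the exposed faces of above-threshold voxels rather than computing a proper surface triangulation), with a grid size, threshold and test volume invented purely for illustration:

    /*
     * A rough sketch, not the original program: classify voxels against a
     * threshold, then count the faces where a "vessel" voxel borders an
     * empty one.  A real program would turn each such face into triangles
     * and hand them to the renderer; a proper triangulation (rather than
     * raw voxel faces) gives smoother surfaces.  The grid size, threshold
     * and synthetic test volume here are all invented.
     */
    #include <stdio.h>

    #define NX 32
    #define NY 32
    #define NZ 32
    #define THRESHOLD 128          /* intensities above this are "vessel" */

    static unsigned char vox[NX][NY][NZ];

    /* 1 if (x,y,z) is inside the grid and above the threshold, else 0. */
    static int filled(int x, int y, int z)
    {
        if (x < 0 || y < 0 || z < 0 || x >= NX || y >= NY || z >= NZ)
            return 0;
        return vox[x][y][z] > THRESHOLD;
    }

    int main(void)
    {
        long faces = 0;

        /* Synthetic data: a solid ball of bright voxels in the middle. */
        for (int x = 0; x < NX; x++)
            for (int y = 0; y < NY; y++)
                for (int z = 0; z < NZ; z++) {
                    int dx = x - NX / 2, dy = y - NY / 2, dz = z - NZ / 2;
                    vox[x][y][z] = (dx * dx + dy * dy + dz * dz < 100) ? 200 : 0;
                }

        /* Every face between a filled voxel and an empty neighbor lies on
         * the surface; this is where the triangles would be generated.   */
        for (int x = 0; x < NX; x++)
            for (int y = 0; y < NY; y++)
                for (int z = 0; z < NZ; z++) {
                    if (!filled(x, y, z))
                        continue;
                    faces += !filled(x + 1, y, z);
                    faces += !filled(x - 1, y, z);
                    faces += !filled(x, y + 1, z);
                    faces += !filled(x, y - 1, z);
                    faces += !filled(x, y, z + 1);
                    faces += !filled(x, y, z - 1);
                }

        printf("%ld surface faces (about %ld triangles)\n", faces, faces * 2);
        return 0;
    }

The sketch shows only the shape of the data and the pass over it; everything interesting in the real program (the triangulation, the rendering, the interactive controls) sits on top of a loop like this.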
If I had been asked, 30 years ago, what it would be like to write such a program in the year 2020, I probably would have said that such things would be trivial. It's true that today there is probably a program on the shelf that could do most of what the 1991 version of the program could do. But what if I wanted to actually write the program, rather than using an existing executable? There are packages out there that could help, but if these packages are like most source code packages I've worked with, then the amount of time needed to get up to speed on the idiosyncrasies of the package will be such that I'd be just as far ahead to start with a blank slate.
That is the problem – the work has been done repeatedly, but not in a form that can be conveniently used by others. After 30 years, certain problems, like voxel manipulation, have been addressed so many times, from so many different perspectives, that one would think that a body of essentially fixed work would be available for everyone to work with, but that is not the case. Sure, libraries and packages are "available," sometimes in dozens of forms, but not typically in a form that's finished enough to be broadly useful. Too often, it's genuinely easier to keep reinventing the wheel – and re-re-re-re-inventing it!
Suppose that in 2020 I did write a version of the original program from 1991. How would the experience be different? For one thing, with today's faster machines, I could probably avoid writing it in assembler. For sure, that's a huge relief. Another difference is that IDEs got a lot better in the 1990s; I would take almost any modern IDE over the old Macintosh Programmer's Workshop. And with faster machines, there would be no more thumb-twiddling and coffee-making while waiting for code to compile. Modern operating systems have protected memory, so I wouldn't have to reboot the machine every time my program crashes either. All of these things improve productivity, and make for a less frustrating day, but they aren't game-changers.
As I recall, the bulk of the original program was written over a three-month period in 1991 (I was deeply engaged and working more than 70 hours a week). In 2020, I could probably do it in about half the time (i.e., three months on a more reasonable schedule), but a lot of the difference is due to the fact that I know more now than I did then, and would spend less time floundering around in dead ends. Also, in 1991, I had to write the program twice: once in C to get the ideas right, and again in assembler to make it fast. With today's faster machines, rewriting it in assembler wouldn't be necessary. In almost every other regard, the experience would be similar to writing it in 1991.
In 1991, my utopian vision of what creating this program would be like in 2020 would have been something like the following. In 2020, voxels must be old hat, and nearly everything I might want to do is part of a highly standardized library. If I've been away from this kind of programming for a while, then it might take a week to familiarize myself with the relevant body of work. During the course of this week, I would build various elements that demonstrate certain aspects of the final product. Then it might take another week to bring these elements together into the final result. In practical terms, most of this two-week period would be spent on getting the user interface right rather than stamping out bugs in the code. If I were already up to speed on the current state of the relevant library, then the entire thing might take a few days. If I wanted to implement a new algorithm, say a new method of shading or a means of detecting physiological pathologies, then the library would allow me to do that easily, and I could focus on these new ideas instead of rehashing old ones.
So, why don't we have highly standardized libraries that are full-featured, well-tested, well-documented and widely available? The reasons lie partly in human nature, and partly in the organic history of computer development and common business practices.