WindMills

It's Christmas!
by Ed de Moel

It's Christmas season, and I've just gotten a new present... Last week, I got a new software tool, and, over the past week, I have spent most of my time learning how to use it. Needless to say, I have two feelings at the same time: the happiness of a kid in the proverbial toy store, and the disgust of a grown-up over a tool that does not immediately produce the desired result.

I am certain that we all have seen similar circumstances.

Of course, not all the cursing and swearing that my colleagues might overhear should be understood as addressed to the authors of this new software... I just have to learn to think the way one is expected to think while working in this new environment. Some of the problems I am having are the direct result of being stuck in my own way of reasoning, one that has seemed valid for sooo many years.

So, what is new?

The new software in question is a tool that allows me to define windows that will do exactly the same things that I have been doing for so long. But... doing things from windows is quite different from the familiar "world".

It used to be that a program went through a series of steps, and the outcome of each previous step was a given at the beginning of the next one.

When the sequence of events is driven by a hand moving a mouse erratically across a screen (especially, in my case, a hand trying to figure out what it is expected to do next), this sequence is suddenly overturned completely, and new strategies have to be adopted to make sure that the software does not constantly report problems with unexpected circumstances, or "undefined" answers to questions that have been "skipped".
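To make the contrast concrete, here is a minimal sketch of the new situation. It is written in Python's tkinter, not in the tool this column is about, and the two "questions" and the "Done" button are purely my own invention. Nothing forces the user to answer both questions before pressing the button, so the callback can fire while either answer is still missing:

    import tkinter as tk

    root = tk.Tk()
    answer1 = tk.StringVar()   # "question 1"
    answer2 = tk.StringVar()   # "question 2"

    def on_done():
        # In the old, sequential world, both answers would be givens by now.
        # Here, this callback can run before either field has been visited.
        print("answer1=%r, answer2=%r" % (answer1.get(), answer2.get()))

    tk.Entry(root, textvariable=answer1).pack()
    tk.Entry(root, textvariable=answer2).pack()
    tk.Button(root, text="Done", command=on_done).pack()
    root.mainloop()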

So, what does one do?

Well, nothing more than learn and adapt... After all, we don't want to be accused of being "old-fashioned", do we?

But the most important lesson of this exercise is: don't assume anything! Whatever you do, whenever a piece of "call-back" software is invoked, first figure out what caused "the event", and what your current operating parameters are. Only after you have identified your "current operating world" can your software start going about its business.

If not... you're about to learn completely new meanings of the error message "undefined variable".
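For what it is worth, here is how the "Done" callback from the sketch above might defend itself; again, this is my own illustration in tkinter, not the column's tool, and it simply replaces the earlier on_done. Before doing any real work, it establishes what caused the event and which answers actually exist:

    from tkinter import messagebox

    def on_done(event=None):
        # Step 1: figure out what caused "the event". A button's command
        # callback passes no event; a key binding passes one whose .widget
        # identifies the source.
        source = event.widget if event is not None else "the Done button"
        # Step 2: figure out the current operating parameters. The user may
        # have "skipped" any of the questions.
        missing = [label for label, var in
                   (("question 1", answer1), ("question 2", answer2))
                   if not var.get().strip()]
        if missing:
            messagebox.showwarning("Not so fast",
                                   "Still unanswered: " + ", ".join(missing))
            return   # refuse to go on with "undefined" answers
        # Step 3: only now can the software go about its business.
        print("Triggered by %s: answer1=%r, answer2=%r"
              % (source, answer1.get(), answer2.get()))

Wired to both the button (command=on_done) and, say, a key binding (root.bind("<Return>", on_done)), the same handler now copes with whichever event arrives first.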

So What?

Of course, I have known about this aspect of event-driven environments for at least ten years, and still it surprises me when I start to work with one. Why could this be?

I can think of several reasons, but, surprisingly, it appears that the most important one is that I have ended up being dogmatic about several things that I have always claimed I would never be dogmatic about...

Haven't I always said...

Indeed, you may have seen me in one of my stints of teaching a class about "how to code for good performance". I have stated over and over that "the most expensive code is code that the computer does not really have to execute".

And what's going on here? We're spending nanoseconds and nanoseconds figuring out, again and again, what our "current baseline" is. Isn't that just what I have always taught to be "Wrong"?

Indeed, and you may already have guessed: this is where another thing that I have always taught comes in: "nothing is true forever!"

CPUs are getting faster and faster, and so are disks. The box under my desk at home that I am using to write this column has 100 times the memory and five times the processor speed of the "supercomputer" that I used in the physics lab at the University of Amsterdam in the mid-seventies, and the general consensus is that it is "OK" to waste a CPU cycle every now and then.

After all: while "writing for windows", all we are doing is making the computer prepare for the next interaction with the end-user, and it doesn't really matter very much how much time the computer is idling between the various mouse-clicks. If your program is "really efficient" by the old rules, it will mean that today's computer is idling over 90% of the time, as compared to only 30% if your original code was really lousy. The "rules" about performance that I have always taught simply don't seem to apply anymore. (Of course, it never really hurts to keep performance issues in mind, and to maintain a little bit of surplus capacity...)

This reminds me of the first time I saw a "NeXT" computer, about 7 years ago. The machine was "uncannily" faster than the PCs and Macintoshes that I was working with at the time, but all the additional power was lost in screen management, because the resolution of the monitor was so much higher than that of any "traditional" machine, and the apparent performance of the machine ended up being really poor compared to the more popular machines of that moment.

The same seems to keep happening over and over: we get faster CPUs every month, and it doesn't seem to matter how fast a CPU really is, because as soon as the next level of potential performance is offered, our software providers take advantage of it and pull another rabbit out of their hats, one that requires that we have the additional millions of colors in our video cards, the additional resolution on our monitors, or the additional disk space to make all these beautiful new capabilities visible.

Today's lesson

The important lesson to learn from this all is that, indeed, we are living in an evolving world. For us programmers, every couple of months, the new developments in "hardware" make a new series of resources available, and, immediately, those new resources are gobbled up by attempts to "finally do what we've always wanted".

It is important that we remain aware of the backlog of "what we've always wanted to do" and keep in mind that there is still a whole lot left to be done; it will take another five or six generations before we can stop worrying about all those things for which there still isn't enough capacity in the current generation of hardware.

Of course, by then, we will have thought of a couple of dozen more things that we ought to do when the resources will become available.


Ed de Moel is past chairman of the MDC and works with Jacquard Systems Research. His experience includes developing software for research in medicine and physics. Over the past ten years, Ed has mostly focused on producing tools for data management and analysis, and tools to support the day-to-day operation of medical systems. Ed can be reached by e-mail.