Thursday, July 20, 2006

Extinction level events in software

If you read this site for news, skip this item; it is just speculation.

I sometimes think that the software industry, scientific software included, can behave a bit like evolution. When times are good, lots of diversity occurs: the space of possibilities and choices gets explored. When times get lean, the fit survive, and the user base dwindles for those that are not.

A software company can exist on a small user base, as long as one person has the inclination to answer the phone and use a CD copier. But every so often there is an extinction level event, a stepwise change that can kill off the weak (or just unlucky), and those that survive move into the territory relinquished by the dead. In evolution, think of the asteroid killing off the dinosaurs and leading to the domination of mammals. In software, it is things like the release of a new operating system.

Reasonably healthy software has died out in the past because it was written for an outdated technology and wasn't successful enough to justify the development time to move to a new one. It happens less now than it used to. When I first learned to use a computer in the early 80s, it happened all the time: every new technology was incompatible with the last. These days the platform providers make huge efforts to reduce this. I am sure most Windows XP software will work just fine on Vista, but some won't, just as most OS X software runs on MacTel, but not all.

What got me thinking about this was the last piece about multicore support. Is the coming of multicore computing a slightly slower extinction level event, like, say, global warming? At first it seems pretty harmless. Even more than the coming of Vista and 64-bit platforms, multicore promises complete compatibility.

But like global warming, it fundamentally changes the ecosystem, particularly for scientific computing. A lot of algorithms have been studied and optimized in minute detail. You should expect common algorithms, say ANOVA, FFT, or basic linear algebra, to run at similar speeds, within a few percent, on all of the systems that provide them. But optimizing algorithms for parallel computation is very different, and it is also somewhat less studied. There is a lot of work to be done by all the providers to make the shift. Nothing may be seen on the surface, but lots of money will be spent behind the scenes.
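To make the point concrete, here is a minimal sketch (modern Python with the standard multiprocessing module; the function names and chunking scheme are mine, purely illustrative, not anyone's actual implementation) of what the shift involves. Even a trivial sum has to be split into chunks, farmed out to worker processes, and recombined, and it is exactly the splitting, communication, and load balancing that have no counterpart in the serial version.

```python
# Serial vs. parallel versions of the same trivial computation (sum of squares).
# Both give the same answer; only the parallel one has to worry about how the
# work is divided across cores and how the partial results are combined.
from multiprocessing import Pool

def partial_sum_of_squares(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def serial_sum_of_squares(n):
    return sum(i * i for i in range(n))

def parallel_sum_of_squares(n, workers=4):
    # Split [0, n) into roughly equal chunks, one per worker process.
    step = n // workers
    chunks = [(k * step, (k + 1) * step if k < workers - 1 else n)
              for k in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum_of_squares, chunks))

if __name__ == "__main__":
    n = 10_000_000
    assert serial_sum_of_squares(n) == parallel_sum_of_squares(n)
```

For a toy like this the overhead of starting workers may swamp the gain; for the heavy numerical kernels inside a scientific package, deciding where that trade-off lies is precisely the unglamorous, expensive work the vendors now face.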

It is not just that the weak will fail to keep up with improvements. Because multicore architectures are based around reducing clock speed while adding more cores, software that doesn't make the shift may actually appear to get slower, while software that does will get faster.
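To put rough numbers on that (the clock speeds and parallel fractions below are invented for illustration, not measurements of any real system), an Amdahl's-law style estimate shows both directions at once:

```python
# Back-of-the-envelope estimate: a fast single core is replaced by two slower
# cores. Code that stays serial slows down; code that parallelizes well speeds up.
def relative_speed(clock_ghz, cores, parallel_fraction):
    """Amdahl's-law style throughput estimate, proportional to clock speed."""
    serial = 1 - parallel_fraction
    return clock_ghz / (serial + parallel_fraction / cores)

old = relative_speed(3.6, cores=1, parallel_fraction=0.0)           # old chip
new_serial = relative_speed(2.4, cores=2, parallel_fraction=0.0)    # unported code
new_parallel = relative_speed(2.4, cores=2, parallel_fraction=0.9)  # well-ported code

print(f"unported code: {new_serial / old:.2f}x the old speed")    # ~0.67x: slower
print(f"ported code:   {new_parallel / old:.2f}x the old speed")  # ~1.21x: faster
```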

Software that was once at least adequate may very soon appear unfit. It may be buried in the small print of new features, but keep your eye on whether your preferred system starts claiming improved multicore support over the next couple of years.

2 comments:

JacquesC said...

But what you've missed is that although CS types care about speed, it turns out that most customers of these types of Math software don't, not really. Sure, you can't credibly offer something slower than average, but a boost of speed doesn't translate into sales. So unless one of the players gets a really substantial speed boost and makes a lot of noise about it, this is likely to be a long road.

If people cared about speed, none of the interpreted systems would have survived this long. And Maple, Mathematica, Matlab, etc. all have an interpreted user-level language. The convenience of putting together useful code that much faster than in mainstream languages is a huge advantage. As I have said elsewhere, "user efficiency" counts for a lot more than either time or space efficiency.

Scientific Computing said...

Well, of course, those that do excel in speed will make noise. But I think the real pain will come with the evolution of user expectations.

When I started in technical computing, things I do routinely now would have been impossible. It's not that I even felt held back; they were just things that you knew were not feasible. Now it annoys me that my 32-bit laptop can only allocate a couple of GB to an application, because I have processes I want to run which would work but need more.

It wasn't even a fantasy that I had when I was paying $1000 for a 16MB SIMM!

In a few years you will be outraged if your system won't run that little script you wrote to import the last 24 hours of footage from the 10 HD security cameras in your home and diff the images to tell you where you left your glasses!