Tuesday, January 31, 2006

Gauss 7.0

As if to emphasise my comments on function naming, I read the announcement of the new Gauss 7.0. Amongst the mostly minor tweaks to this numerical system (most notably 64-bit support), I see the following:

Additional new commands:
• convertsatostr
• convertstrtosa

There is no reference on the site to what these new functions do, and to a non-Gauss user it is hard to tell from the cryptic function names.

If you must invent cryptic new words, it is certainly vital to explain them somewhere!

Thursday, January 26, 2006

Minitab offer targeted consulting and training

Minitab Inc have done well in recent years by carefully targeting their statistical software at a particular problem: Six Sigma standard compliance.

They are continuing this strategy with a new training and consulting operation announced recently. Avoiding the usual language of consulting services and training, they focus directly on Six Sigma again, offering a "support service" for organizations that lack the statistical or technical skills needed.

When you strip it down, it is a standard consulting/training service: "Can't use our product? Pay us to use it for you, or to show you how." But this seems like a worthwhile service, well presented and likely to do well. At $200 per hour, Minitab Inc must think so too.

Monday, January 23, 2006

Mac Intel, the race is on

With the first of the Intel-based Macintosh computers now shipping, there will be a race, amongst those who care, to be the first to ship native support.

There are plenty of companies who don't care, because of the small Mac user base and its historic bias toward consumer and publishing users.

Recent high end Macs have been much more interesting for technical work, and Apple has shown some focus on grid/HPC computing. Nevertheless, a quick port to Mac is more a matter of prestige than huge commercial return. Somewhat like being the first to the moon, though rather easier.

In response to the usual vocal Mac users demanding it now, there have been some clues from software companies.

MathWorks' Mac developer, Brian Arnold, seems to be hinting at a time frame: he explained that Matlab would take a month to test in its current version, and that the result would probably be a recommendation not to use it. He wouldn't be drawn further, but one can probably conclude that a native release is at least 2-3 months away.

Meanwhile Paul DeMarco at Maplesoft was willing to be more precise and more ambitious, suggesting that Maple would be "several weeks" away.

No comment has yet been drawn from Wolfram Research. The silence is quite surprising, since Steve Jobs first announced Mac Intel with Wolfram Research's Mac developer, Theodore Gray, on stage next to him, claiming to have managed a prototype port in "a few hours". Presumably release-quality ports and testing take a lot longer, but Wolfram have set their users up with expectations that must now be lived up to.

Place your bets now.

[See Feb 17 update for winner]
[See June 6 update for Maple and Matlab Mactel news]
[See June 13 update for further Maple Mactel comment]

Friday, January 20, 2006

What's in a function name?

When a computer language is created, what guides the choice of command names? One might hope it would be ease of writing and reading, but sadly not in the case of most scientific software.

Most of these systems have roots that pre-date modern desktop computing. What makes modern computers different? Speed and capacity. It's capacity that has had a lasting effect on language design.

Why did Maple invent a new word, "int", to express integration and Matlab invent "quad", while Mathematica used an existing word, "Integrate"? The answer is 3 bytes in Maple, 4 bytes in Matlab and 9 bytes in Mathematica. Such considerations were once important, when PCs came with 640k of RAM, but we suffer these choices decades later.

We rarely abbreviate words in normal life because it doesn't help us express ourselves. Or perhaps I should say "We rar abbrv wds in nml life bec it dnt help us xprs ours". Except in the one place where hardware limitations still control our behaviour: texting on mobile phones.

It's hard to read back and hard to remember when the words we use in programming are so different from the ones we use in real life.

Of the popular systems, surprisingly only Mathematica does not fall into this trap, being just young enough to have been designed in anticipation of megabyte computing.

Consider some examples:
Task: Dimensions of a matrix
Matlab: ndims
Mathematica: Dimensions

Task: Row reduced echelon form of a matrix
Matlab: rref
Mathematica: RowReduce

Task: Change to Cartesian coordinates
Matlab: sph2cart
Mathematica: CoordinatesToCartesian

Task: Comparison test
Matlab: ge
Mathematica: GreaterEqual

Task: Rational approximation
Matlab: rat
Mathematica: Rationalize

Task: Select a column of a matrix
Maple: col
Mathematica: ColumnTake

Task: Polynomial interpolation of data
Maple: interp
Mathematica: InterpolatingPolynomial

Task: Number of occurrences
Maple: numboccur
Mathematica: Count

Task: The covariance of two data sets
MathCAD: cvar
Mathematica: Covariance

Task: Pearson's correlation coefficient
MathCAD: corr
Mathematica: Correlation

Task: Inverse fast Fourier transform
MathCAD: icfft
Mathematica: InverseFourier


I will leave the reader to guess the meaning of the following commands in use in Maple and Matlab: gc, statevalf, PSLQ, cfrac, nops, ne, erfcx, decic, speye, sprank, dF, dpois, rnd, PRNCOLWIDTH, Yn, K1.

And if you can guess, try remembering the correct abbreviation later.

And of the future? Not bright. Even new systems are often developed by people so immersed in this culture that they do not question it. Only Maple has had the courage to address the issue in an existing system: they are gradually making parts of the system obsolete and replacing them with new functionality with clearer naming. The linalg package, with functions like LUdecomp and diag, is being replaced by a LinearAlgebra package with LUDecomposition and DiagonalMatrix. But they have chosen to do this gradually, to keep the existing user base on board. Perhaps in 10 years it will look just like Mathematica!

Apologies to non-English speakers, for whom this analysis may not apply; though if you are reading this article, you probably speak English anyway!

Friday, January 13, 2006

First draft of Numerical Math Consortium "standard"

The Numerical Mathematics Consortium has already published a first draft: a list of the first 250 areas of functionality to be covered by their plan for a standard.

http://www.nmconsortium.com/upload/NMC%20Technical%20Specification%20v1.0%20_DRAFT.pdf

This 'draft' is clearly meant more as PR than as a serious design document. Nevertheless, one can get clues as to their direction:

There is no reference to handling data of higher dimensions than a matrix.

There seems to be a bias towards special cases rather than general ones, eg there is 'log base 2', 'log base 10' and 'natural log', but there is no 'log to base n'. One could construct this from the others, but why have three when one could do? Conversely, for power there is 'real power', which would cover all cases, but then there is also a special case for 'power of 2'.
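The "one could construct this from the others" point is just the change-of-base identity, log_b(x) = ln(x)/ln(b). Here is a minimal sketch in Python (the draft itself is language-neutral, and the function name is my own, purely for illustration):

```python
import math

def log_base(x, base):
    """General logarithm via the change-of-base identity.

    Covers 'log base 2', 'log base 10' and 'natural log' as special
    cases of one general function, rather than three separate entries.
    """
    return math.log(x) / math.log(base)

log_base(8, 2)          # log base 2
log_base(1000, 10)      # log base 10
log_base(20.0, math.e)  # natural log
```

An implementation would remain free to route a base of 2 or 10 to a faster internal routine; the standard only needs to name the one general task.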

There is some confusion about whether the standard will define tasks or algorithms. Most of it appears to define tasks: the Bessel entry, for example, makes no mention of what algorithm to use (that is left to the implementation), and the same goes for most of the 250 functions. However, for numerical integration we find 'Numerical Integration (2D)', 'Numerical Integration (3D)', 'Numerical Integration (Adaptive Lobatto)', 'Numerical Integration (Adaptive Simpson)' and 'Numerical Integration (Trapezoidal Method)'. So here we have specific algorithms listed as well as the general task. To what level of detail will the 'standard' attempt to define the algorithm? Any step down this route will lead to trouble for implementors.

The 'power of 2' example is about algorithms too. It is there because a power of 2 lends itself to a special-case algorithm that is faster and more robust than a general power algorithm. Why not leave the handling of special cases to the implementor to decide? A cheap implementation could ignore special cases; a good implementation could detect and branch on them.
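The division of labour argued for here can be sketched quite simply: the standard names only the general 'real power' task, and an implementation branches internally on special cases such as a base of 2, where an integer exponent can be handled exactly by scaling the float's exponent field. A hypothetical Python sketch; the dispatch and names are mine, not anything from the draft:

```python
import math

def power(base, exponent):
    """General 'real power' with internal special-case dispatch.

    A cheap implementation could skip the branch entirely; a good one
    detects cases like 2**n and uses an exact, faster routine.
    """
    if base == 2 and isinstance(exponent, int):
        # 2**n is exact: just set the exponent field of a float
        return math.ldexp(1.0, exponent)
    return math.pow(base, exponent)  # general fallback

power(2, 10)     # exact special case
power(3.0, 2.5)  # general case
```

Either way, callers see one function and one task; whether the special case exists at all becomes a quality-of-implementation question, not a standards question.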

All this becomes a bit clearer when you look at the 'example' function mappings. While these claim to be illustrative examples only, one discovers that every one of the standard commands is shown as having a direct mapping to a single function in Matlab. This is most strange when you consider that MathWorks are not on the committee. Is the whole standard going to be a Matlab clone in disguise? Is the real purpose to set up Matlab code importing?

It's too early to judge, but the only thing that has impressed me so far is the speed at which they are working.

[See also previous comment "Numerical math consortium - the new openmath?"]

Wednesday, January 11, 2006

Shock Response for DADiSP

DADiSP, the venerable "Engineering Spreadsheet", has had one product announcement this year: a new add-on for Shock Response Spectrum Analysis.
http://www.dadisp.com/srspr.htm

Given that they don't produce much software (the last upgrade to the principal product was in 2002), and all other add-ons are either quite general (eg fuzzy logic) or utilities (like .wav import), it is surprising to see them pick a very narrow and specific topic.

My guess is that this was a spin-off from some consulting rather than the start of a new direction for the company.

DADiSP has a small core of enthusiastic users, but has never made much mass impact. This niche add-on will likely go unnoticed by the world.

Friday, January 06, 2006

What's the point of calculators?

At a conference recently, I met a representative of Texas Instruments who was there to sell calculators. It gave me the chance to ask a question that has been on my mind for a while: "Why would I still use a calculator now that computers are so cheap?"

She explained what a wide range of calculations could be done on the calculator, which I don't doubt, so again I pressed: "But why choose to use such a tiny keyboard and small screen when it would be easier to do on the computer, and easier to save work, annotate work, re-use calculations etc.?"

In the end I got a reason, and I had to agree it was a good reason to buy one. She said "You can take it into an exam, but most exam systems do not allow you to bring a computer in."

I left satisfied that TI has a safe and lucrative future, but shocked by the realization of how back-to-front our education systems can be. If we are preparing our children to be effective in the real world, why do we force them to use tools that lose their central benefit in the real world?

I reflected on the possible reasons to keep computers out of exams. It can only be the ability to store lots of data: textbooks, lecture notes etc. A good reason when you are testing students' knowledge, but not a good reason when you are testing students' ability to apply knowledge. And surely that is what you are testing in a computational task? When I solve problems as a professional, I do look up facts and methods. That's the easy part; understanding them, modifying them to my task, specifying the calculation, interpreting results: these are the hard parts. And they can be plenty hard enough without forcing one to do it on a 3-inch keyboard and 100-pixel screen!

Related news:
26 May 06: Is the new TI calculator good news for TI?
7 Sep 06: TI NSpire calculator delays
23 Nov 06: Derive to be discontinued

Tuesday, January 03, 2006

Partner swapping in Scientific Computing

Since the start of modern scientific computing, relationships have been important. Some, like Wolfram Research, prefer detached friendships, offering links to other software like Labview, Excel, Matlab etc without forming any deeper attachment. But most seem to like to take it much further.

For many years Maple has been the most promiscuous. In bed with Matlab under the pseudonym of the "Matlab Symbolic Toolbox"; in bed with MathCAD, installed as a library with every copy; in bed with Scientific Word to make the computational version, Scientific Workplace; all while happily married at home, with NAG installed with Maple to provide numerics.

Scientific Workplace was the first to break up this happy arrangement. After flirting briefly with Mathematica, it found a new, less demanding lover, MuPad, and for some time they were very happy. Now that MuPad is looking rather sick, they may be wondering if they did the right thing as they prepare for life as a widow.

Maple could be forgiven for feeling smug but, according to comments from Cliff Yapp on the Axiom discussion group, it has its own impending relationship woes. Maple's first love, MathCAD, is considering someone else too. And, even more embarrassing for Maple, MathCAD looks like moving in with the ancient and unattractive Maxima, who is feeling younger than it looks after the excitement of becoming open source.

As with all divorces, it's the children who suffer!