Friday, March 31, 2006
Perhaps I am not very sharp today, but sometimes the purpose of a product escapes me.
Origin have just released a free tool to view Origin Project Files.
Origin Project Files are a file format for storing Origin data, results, graphs and layouts in a single file - so, something like a .zip or other archive. As far as I can tell, the format itself doesn't provide any additional features.
Now the reader lets you browse the directory structure and get data or graphics out of it. Again, just as you can with a .zip file.
I can't see why anyone would use this, except to get at a piece of data or output from an Origin user's project. If you actually cared about the way that the different parts of the project related, you would already be an Origin user.
If Origin projects used standard formats in the first place, then you would be able to browse and retrieve the parts already.
It seems like a solution to a needless problem.
Tuesday, March 28, 2006
Yet another new product launch from Wolfram Research, the 5th in slightly over a month. Again, it is a third-party-developed add-on to Mathematica. Wolfram is proving very successful at attracting developers to the Mathematica platform.
This one is a little more confusing than the others. The press release says it "Provides state-of-the-art solver for large-scale nonlinear optimization". What is odd is that Mathematica already has good built-in nonlinear optimization methods which scale very well.
So the new active-set and interior-point algorithms will only be of interest to the small set of people with quite specialist needs that go beyond the built-in functionality.
The full info is at http://www.wolfram.com/news/knitro.html
Friday, March 24, 2006
My comments on open source software in science generated more feedback last month than any other article.
As well as a thread on sci.math.symbolic, there was an interesting response from Tim Daly, lead developer on the Axiom project.
There were three basic themes:
There was some heated debate on the best form of open source licensing. I have no particular expertise here, so I shall just duck that question.
There were several points that I had argued were a problem for open source, where people rightly asked: why is that any more of a problem for open source than for professional development? Let me run through those now.
1) "The knowledge required for the topic and the interdependent parts of the system reduces the pool of possible contributors." It is quite true that the same issue applies to commercial development. But what I should have pointed out is that the difference is that commercial development has an answer: it can pay more. Following the usual supply-and-demand laws of capitalism, the lack of supply of appropriate programmers forces salaries up until enough are attracted and equilibrium is reached. Or, as Richard Fateman pointed out in the case of Macsyma Inc, a failing company cannot attract the appropriate team and the failure is accelerated. In a contribution-based open source project, there is no salary or similar compensation mechanism. Indeed, economics works in the opposite direction: because the pool of contributors you want to attract is particularly skilled, their time is more valuable - they probably command high salaries in their day jobs - so it requires greater generosity to give that time for free.
The counterpoint to this is PhD students, who are typically highly skilled and expect to work for free in pursuit of their qualification.
2) Tim Daly questioned why complexity is a greater problem for part-time contributors. This is another point that I didn't make clearly enough. When I am deeply involved in a programming project 8 hours a day, I know every line of code that I am working on; the data structures, the function names and the argument orders are all in my head. My 8 hours are all productive (apart from my human weaknesses of coffee breaks, day-dreaming etc.). When I come back to that same code weeks or months later to fix a bug that might otherwise have taken 5 minutes, I spend the first two hours familiarizing myself with the flow of the code, which I have long since forgotten. Take this over-simplified example and extrapolate: one full-time programmer achieves the same output as 24 programmers who each work for one hour. Put another way, a full-time programmer achieves in 2 hours what a part-time programmer manages in a year of working an hour a week. Of course, the reality depends greatly on how much you forget per break for a given complexity of code, but I am sure you get the point.
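One way to make this over-simplified model concrete is to charge every return to the code a fixed re-familiarization cost before any productive work happens. The sketch below is only an illustration of that argument; the function name and all the numbers (including the 2-hour overhead figure mentioned above) are assumptions for the sake of the example, not measurements.

```python
def productive_hours(session_length, sessions, refamiliarize=2.0):
    """Hours of useful output, assuming each return to the code costs a
    fixed re-familiarization overhead (in hours) before any productive
    work happens. All parameters are illustrative assumptions."""
    per_session = max(0.0, session_length - refamiliarize)
    return per_session * sessions

# One full-timer: a single 8-hour stretch pays the overhead once.
full_time = productive_hours(8, 1)   # 6.0 productive hours

# Eight part-timers at one hour each: overhead swallows every session.
part_time = productive_hours(1, 8)   # 0.0 productive hours
```

Under these assumed numbers the part-time sessions produce nothing at all, which is an extreme version of the same point: short, infrequent sessions spend most (or all) of their time re-learning the code.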
As was pointed out, there are many projects that are low on complexity and require widely available skills, e.g. documentation translation, but these are pointless unless the central features are delivered.
The third strand was from Tim Daly, and is the most interesting, being from a very different point of view (and well worth reading). If I can oversimplify his argument: Axiom's purpose is not competition or dominance, but the act of donation, making your work available to others. He talks of a "30 year" time frame, over which companies may well have gone bust but free information lives on.
I was very much drawn to his view, but my cynical free-market thinking took over again. You can't separate yourself from the market just because you are non-commercial. Even ideas need marketing; that is the purpose of publishing in journals and presenting at conferences. If it were just about making the information public, it would be sufficient to put a copy of your research in the library and let it sit there waiting to be discovered. There is a wealth of research out there doing no good to anyone, because it has long been forgotten or because competing ideas were presented in a more compelling way. So Axiom must compete. Not for revenue, perhaps, but at least for some level of mind share, and users, to keep attracting contributors.
The central question that I came back to was: "If I produce some original piece of work that I want to share, should I publish it as a contribution to Axiom, or as a free piece of Mathematica or Matlab code submitted to their share libraries?" Just as with publishing a paper, I want it in the most prestigious place, the one that gives it the widest exposure. Axiom must try to be that place, and I don't think it can take 30 years to get there.
Tuesday, March 21, 2006
I commented last year on the embryonic market for personal supercomputing software.
Today, I happened to hear two related bits of news: a nice new piece of kit from Tyan (a 16-CPU computer with 64GB of RAM), and the apparent demise of Orion Multisystems.
As consumer PCs continue down in price, and approach the point where they deliver all the power their users need, scientists and other users, whose demand for CPU power is unlimited and whose budgets are a little larger, are once again an appealing market.
But business can be a matter of timing. Orion, it seems, was just a bit too early in trying to take HPC from specialist to large-scale use, and ran out of money before the market was ready. Maybe Tyan has arrived at a better time, or perhaps it is tomorrow's startup that will capture the prize.
Thursday, March 16, 2006
Back in February, I speculated about the contents of the rumoured plans for the Maple Blockbuilder for Simulink. The product is now out, and it looks like that analysis wasn't far off the mark.
The product appears to be an S-Function extension to the code generator, but it also contains some control theory functions - transfer function, state-space and zero-pole-gain models etc. - apparently like the Wolfram Research application Control System Professional 2. Whether these functions have similar range or depth is impossible to tell, as no documentation or serious examples are provided.
What is most interesting, though, is not the product but the change to the company's presentation of itself that the release has triggered. What other company displays someone else's product name (Simulink) larger than its own principal product on the front page of its website?
[Update March 22: Within six days, the page has been re-designed again to return Maple to a prominent position, so I was probably not the only person to think this odd!]
There have been a couple of other small products for control engineering coming out of Maplesoft - ICP for system identification, Dynaflex for rigid body control. These had seemed rather fringe for a company that makes its money out of teaching calculus to students. Now, with this web redesign, where they have been elevated to the front page and grouped together, I am starting to suspect that Maplesoft is trying to reposition itself as a provider of add-ons to Simulink and Matlab.
Tuesday, March 14, 2006
Mathworks have released a new version of the Matlab product family.
The release itself is not very exciting; Mathworks has not even listed it on their "news" pages. The top new Matlab features are Windows 64-bit support and a new differential equation solving method.
What is very impressive is the meticulous documentation of the minor enhancements. While this is good marketing, since it makes the release look more significant, it is also to be applauded: an irrelevant change for one person is a compelling feature for another.
An interesting strategic change is the shift to a planned two releases per year. Mathworks is showing some confidence in the plan by stating which months to expect each release in - and software release dates are notoriously hard to predict.
While the virtues for the user in predictable release cycles are clear - user planning for deployment, shorter time to getting bug fixes - the benefits to Mathworks are also clear - a more compelling reason to pay for service rather than upgrades, and less "intellectual stock" tied up in development rather than in the market being sold.
This is the first release on the new schedule, so releases may not turn out to be as predictable as Mathworks hopes. While that might be embarrassing, it loses nothing. Greater pressure to release on time will be a positive driver for R&D, but if too much importance is given to the deadline, there will be pressure to cut corners - to release unfinished features or buggy code.
But my biggest fear about this approach is that it makes big features - long-term projects - harder to implement, when you have to maintain several branches of code for different releases at the same time. We may see many more of these small incremental releases, and fewer "big new technologies" in the core products.
Did scientific computing just get a lot more boring?
Friday, March 10, 2006
A flurry of new software releases since I last wrote about Wolfram Research: GeometricalGeodesy, LensLab and Rayica.
GeometricalGeodesy, which probably falls into the class of "if you don't know what it means, you probably don't need it", concerns itself with geodesy tasks like distances between points on the earth, different mapping coordinate systems and map projections.
Rayica is a ray-tracing and optical design program, and the better-named product, LensLab, is basically a "lite" version of Rayica.
What these all have in common with last month's announcement of statistical inferencing is that they are all so-called "third party" add-on products: extensions to Wolfram Research's Mathematica product, marketed by Wolfram but written by other companies.
This is a growing trend for new technical software: writing it as an extension to an existing scientific package. While there are some benefits in access to the package's existing base of users, and in support from its manufacturer, I believe the central reason is the reduced cost of authoring.
Many years ago, a programmer had to write every part of a program. Then came operating systems, and programmers no longer had to worry about disk operations, or whether the user had an EGA or VGA monitor. Then operating systems built in all kinds of useful libraries: graphics, interface building, networking and so on. All you had to write were the parts of the software that were unique to you.
So it is in science, where many of the technical software packages are pushing themselves as the equivalent of computational operating systems. You don't have to program numerical, sorting, solving, optimizing, statistics, bignum or symbolic libraries - just those parts of the computation that are unique to your discipline.
You don't need a mass market or huge price tag, if you can write the code cheaply enough.
Tuesday, March 07, 2006
Tecplot will shortly be reorganizing its product line-up. The central product, Tecplot 10, will be split to form two diverging offerings: "Tecplot 360", aimed at CFD post-processing and visualization, and "Tecplot Focus", aimed at "XY and 2D engineering plotting".
Upgrading customers will have to choose which branch to follow. Initially, it is claimed, Tecplot Focus customers will notice little difference in the way of new features, but the company does not say whether the existing CFD visualization in Tecplot 10 will be removed from that product. If it is, that is a tough upgrade to sell!
If it is true that their market is split between these two needs, then it is natural to target them separately. If it is not, there are two possible interpretations:
1) Tecplot 360 will simply out-develop Tecplot Focus, which will become the "lite" version at a lower price. This is a long-established principle - see Photoshop vs Photoshop Elements, Outlook vs Outlook Express, etc., and in the science arena Mathematica vs Mathematica CalcCenter or MuPAD Pro vs MuPAD.
2) By partitioning feature sets that do not have exclusive user groups, this is an attempt to force an existing user base to purchase two products where one used to suffice.
Details are expected by the end of March.
Friday, March 03, 2006
There is an interesting press release from Mathsoft today. On first inspection it is a simple piece of cheerleading about how successful they have been, but on closer inspection it is a little less clear.
First I wondered what the central premise meant: "[Mathsoft] has sold its 5,000th managed installation of Mathcad®, the world's most widely used engineering calculation tool. "Managed installation" refers to a team of users in industry, research or academia ranging from small workgroups to large engineering organizations with thousands of seats under management"
I searched their website for information on "managed installation" and there is no reference to any such product or scheme. It could mean "any quantity of licenses greater than one". Perhaps it refers to maintenance contracts, or to volume schemes that need only 5 licenses. Let's suppose maintenance is 25% of a license fee (they don't list this), and that the volume discount is, say, 25%. Then a managed installation might be worth less than $1000 per year, and this press release could represent as little as $5M per year. Not nearly enough to run a business on.
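To spell out that back-of-envelope estimate, here is the arithmetic as a sketch. Every input is an assumption (the press release gives none of these figures): a per-seat list price, a five-seat minimum per "managed installation", and the 25% maintenance and discount rates guessed at above.

```python
# All inputs below are guesses, not figures from Mathsoft.
list_price = 1000          # assumed per-seat license fee, USD
seats = 5                  # assumed minimum seats per "managed installation"
volume_discount = 0.25     # assumed volume discount on the license fee
maintenance_rate = 0.25    # assumed annual maintenance, as a fraction of license

# Annual revenue per installation: discounted license fee x maintenance rate.
annual_per_installation = seats * list_price * (1 - volume_discount) * maintenance_rate
# 5,000 installations is the figure from the press release.
total_annual = 5000 * annual_per_installation

print(annual_per_installation)   # 937.5 -> "less than $1000 per year"
print(total_annual)              # 4687500.0 -> roughly $5M per year
```

The point of the sketch is only that, under plausible low-end assumptions, the headline "5,000th managed installation" is consistent with a fairly modest recurring revenue stream.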
As I was musing on how this release was not as impressive as it first appeared, I noticed the most interesting fact: towards the bottom was a reference to $3M in "follow-on funding" from venture capitalists. So the funding that separated Mathsoft from Insightful five years ago has been spent, and the company is still spending more than it earns. This is normal for a start-up, but for a mature company it isn't healthy.
So how does a company close a deficit? Cut costs? No - Mathsoft is opening new offices, not closing any. Sell more? Of course, if possible, though as we have seen from this release, the customer base isn't that large after 15+ years of trading. Make more out of existing customers? Here is where the excitement of having moved revenues from 25% to 75% "managed licenses" becomes clearer. It looks like the strategy is based around growing the regular revenue from their regular customer base.
The only question left is: will this be by getting them to use more of Mathsoft's products, or by charging them more for the ones that they already have?