When I first posted about Wolfram|Alpha, I was struck by the capabilities offered by the new search engine. Despite its limitations, it looked like the beginnings of something great. Considering it was just in alpha...
Then I realized that the name of the tool itself was Alpha, not an alpha version of something bigger still to come. Second, it was not a search engine at all, but a computation engine. Third, the data for the tool is not assembled by hyper-intelligent programs. Instead, it depends on teams of individuals to 'curate' data that can then be used by Alpha. What that means is that the extent of Alpha's knowledge base depends entirely on the validated data sets someone provides to it.
The coolest part of Alpha, therefore, seemed to be its natural language translation capability: the ability to translate plain-language queries into precise mathematical commands against a defined data set. That is still good, but far less impressive than a tool that was going to search the web, separate signal from noise, and perform massive computation against said signal.
The reason for this post is two-fold. First, I met folks from Wolfram|Alpha at a corporate event recently. When we started to look at the tool for internal use, I was struck by how the conversation sounded just like a sales pitch for yet another Business Intelligence (BI) tool; no magic whatsoever.
Then I saw the video from TED (after the break) by Stephen Wolfram. The talk is, of course, a pitch for Alpha, Mathematica, and A New Kind of Science. But it slowly devolved into a made-up-term-fest punctuated by too many "I"s and highly presumptuous language, like the "co-evolution of users and machine" after the release of Wolfram|Alpha.
I guess I am put off by Alpha for the moment. Take a look and see if you agree.