An interesting, and arguably important, example of a favourite hobby horse of mine emerged at the recent Intel Software Development conference in Chantilly, France. That hobby horse is the way technologists get so carried away with the undoubted cleverness of their developments that they sometimes miss the point that, ultimately, it is technology's application in the real world that matters.
The Intel conference is an annual spring-time event focussing on programming and applications development in the world of parallel computing. To be fair, you can’t get much more technical than that, for parallel processing is largely the domain of supercomputers processing esoteric meteorological, scientific and engineering problems at PetaFLOPS processing speeds (that’s quadrillions of calculations per second).
Except that this is now a 'used to be' scenario. Parallel processing has been creeping into the upper echelons of mainstream computing for a while, but with the advent of cloud computing that creep is accelerating. Now add in the arrival of mainstream big data analytics, and the need for parallel processing is fast changing from 'nice to have' to 'essential'.
The combination of big data analytics and the cloud is also happening: it is only a matter of months, and more likely a couple of weeks, before SAP announces that its in-memory analytics engine, HANA, is available as a SaaS-delivered service. When that happens, parallel processing power in the cloud will become the only sensible option.
So here, then, is Intel, developer of x86, the leading parallel processing architecture. That architecture is also at the heart of most commodity servers used as the basis of cloud infrastructures, where packaged applications and service development environments are commonly found. Despite this congruity, it became clear at the conference that the company sees no need to move towards building automated development tools for parallel processing applications, even while acknowledging that application areas such as big data analytics really do need them.
The argument put forward by the company's Chief Evangelist for software products, James Reinders, is understandable. Basically, it is that automating development leaves open the possibility of building processing and operational inaccuracies into code, errors that could then spawn and go viral in a parallelised cloud environment. On the face of it that is a very good argument, and something certainly to be avoided.
But on the other hand, one of the parallel development tools Intel has produced allows developers to bit-flip: change the state of a single bit in a single byte of code. That would seem to put a good deal of trust in the developer's hands to get it right.
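For readers unfamiliar with the operation, a bit-flip is simply toggling one bit within a byte. The sketch below is a generic illustration of the idea, not a representation of Intel's actual tooling; the function name and checks are my own.

```python
def flip_bit(byte: int, position: int) -> int:
    """Toggle the bit at `position` (0 = least significant) in a byte.

    XOR with a mask that has only that bit set leaves the other
    seven bits untouched and inverts the chosen one.
    """
    if not 0 <= byte <= 0xFF:
        raise ValueError("byte must be in the range 0-255")
    if not 0 <= position <= 7:
        raise ValueError("position must be in the range 0-7")
    return byte ^ (1 << position)

# Flipping bit 0 of 0b00000100 (4) gives 0b00000101 (5);
# flipping the same bit again restores the original value.
```

The point of the illustration is how small and easy the change is, which is exactly why such power sits uneasily alongside an argument against automation: a single wrong bit, replicated across a parallelised cloud environment, is precisely the kind of error Reinders warns about.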
And on another hand, the company has just released exactly such a packaged-up development solution for a specific problem: the new HTML5 Development Environment, for developing apps across a range of different devices. It aims to help with issues such as working with different architectures for cross-platform applications, developing for displays with different aspect ratios, and supporting different user interfaces.
It supports development for iOS, Android, Nook, Amazon and Windows 8, and deployment in the following stores: Apple App Store, Google Play, Nook Store, Amazon Appstore for Android, and Windows Store. It also supports delivery of HTML5 web applications for Facebook and Google, amongst other environments.
Reinders sees the HTML5 Development Environment as a one-off special case. Personally, I see it as the first of many similar tools, which could then be linked together to build richer, more comprehensive applications and services. For example, the HTML5 tool would make the obvious output/delivery component of a big data analytics service development and orchestration suite for cloud environments. That would appeal not only to big enterprise end users but also to every Cloud Service Provider business looking to offer richer levels of customer service and engagement.
I am sure some company will do just this in the not too distant future. But Intel sits here with the skills, the knowledge and the capability to do it right now, yet it seems strangely reluctant. Maybe it is scared of some future anti-trust case for creating and owning the dominant development environment?
If that is the case, I'd just suggest they do it and be damned. The need is about to get much more important than the legal implications. And sometimes the law just has to play catch-up.