I've just read some interesting research from Computing, the UK technology publication, sponsored by MSM Software and based on a survey of 160 IT leaders across organisations of every size in the UK market. Unsurprisingly (to me), it found that "over 60% of UK organisations could be using IT systems more than 20 years old", and this worries Computing. It says: "there's a personal dimension to this, too. In some high-pressure, high-churn sectors, IT leaders may move from job to job swiftly as they build their reputations, meaning that few want to take on the challenge of a legacy upgrade that delivers little immediate payback"; and, "using infrastructure, operating systems and applications that are past their sell-by date is just storing up trouble".
Hold on a second, though. As Mandy Rice-Davies would say, "they would say that, wouldn't they". The commercial IT industry generally, and vendors of licensed software in particular, make their cashflow out of replacing software just to keep up with the latest interfaces and file formats (and to maintain support contracts). If you have a simple SQL application, the SQL written for the 2000 version of your RDBMS almost certainly still works; and if it runs fast enough, why should you upgrade to the 2014 version, which needs more resources, is more complex, and has interesting new bugs (which may affect you) in powerful new features that you don't want to use? Well, because your vendor won't support you if you don't upgrade, because the commercial software licence model (and your vendor's share price) is based on a constant stream of upgrades and new features.
If that suits your business model, fine; but if it doesn't, why should you upgrade just to suit your vendor? Only in IT is 'legacy' a pejorative term; most of us would be delighted to receive a legacy Rolls-Royce from a favourite uncle. Even in the world of commercial licensed software, if you are using a 20-year-old CICS system on zEnterprise that still works reliably 24x7x365 and does the job, then you might consider yourself well-off rather than at risk.
Assuming that the methodology and sampling of this survey hold up (Computing is a reasonable sponsor, but one should always check), it's a useful input to the legacy modernisation debate, of course (I'll mention it on our Legacy Modernisation page once I've checked). But always remember that 'legacy modernisation' isn't necessarily 'legacy replacement'. Legacy replacement should only be considered when and where it makes business sense for the customer, rather than for the vendor (and don't overlook the regression testing overhead involved just in showing that the replacement hasn't changed the way your business works). Of course, one issue with legacy software, even when it is good quality and does the job, is that you may have lost the in-house expertise needed to maintain it.
Interestingly, MSM Software itself appears to understand this. According to Thomas Coles, Chief Executive of MSM Software: "organisations need to work with an expert partner who has a real expertise in legacy technologies and can maintain and enhance the legacy system in a cost-effective way until the organisation is in a position to replace it. Our clients operate in competitive markets and need to be able to innovate to compete. Legacy systems are not only a drain on the IT team's precious time, but they can limit an organisation's ability to bring new products and services to market quickly. By outsourcing the maintenance of legacy systems, IT teams are freed up to focus on the more strategic IT projects that enable organisations to compete effectively in their markets".
These survey results perhaps also go some way towards explaining the increasing popularity of Open Source. If over 60% of operating UK companies really are using software over 20 years old, then presumably 20-year-old software is often 'good enough'. Without a licence vendor as such, the pressure to upgrade Open Source software to a new release for fairly arbitrary reasons to do with vendor cashflow doesn't seem so strong...