For today’s IT decision makers, bombarded with advertising and marketing from all sides about how everything is going to change, it is hard not to get caught up in the hype. At a recent event I attended, a senior IT manager told a number of his peers how his boss – the CEO – had been quizzing him about some of the latest developments. “When can we get some of this cloud?” he was asked – a question that could never have a straight answer.
Storage managers are not immune from pressures caused by such marketing, as well as being faced with much the same realities as their peers elsewhere in the IT department. Storage virtualisation and thin provisioning, iSCSI, deduplication, encryption, data centre convergence through 10-Gigabit Ethernet, storage as a service, and so on are all (we are told) set to rip up the rule book when it comes to how we design, deploy and operate our storage environments. We are heading for a brave, exciting new world, where everything is going to be so much better managed, easier and above all cheaper than in the past.
If you ask exactly when this wondrous vision is going to present itself, however, the answers become vaguer. I have been asking this question for the past decade, and the answer has not really changed much – the general consensus is that we are looking at five to seven years or beyond. In other words, just far enough ahead not to make a difference to what is happening today.
The reality, for the majority of organisations we speak to, is that IT is not going to change that fast or that much. Last week I was speaking to two storage managers representing two large financial institutions, who concurred that life in three years’ time is not going to look particularly different from what we see now, for a number of reasons. First and foremost, storage demands will continue to grow – as reflected in our research, data growth is currently the biggest factor affecting how we architect and procure IT systems in general, and storage in particular.
Another factor is that in organisations of all sizes, IT is simply too complex to be replaced wholesale. As a consequence, vendor case studies tend to dwell on the few exceptional organisations that really did decide to make sweeping changes to their IT – these frequently appear to include healthcare institutes somewhere in middle America, eastern European telcos and banks with names nobody has ever heard of. Nobody would begrudge such organisations their success, but their examples do not always translate to more mainstream IT organisations.
That is assuming, of course, that it would make financial sense to make such sweeping changes in the first place, which leads to the third and perhaps most important factor – the cost of any change can frequently exceed the shorter-term benefit of making it. We know of a number of new technology areas at the moment – desktop virtualisation is one – where it is very difficult to make a business case purely on the basis of shorter-term financial savings, particularly once the costs of storage are taken into account.
An additional cost of any such initiative is that of managing the change itself. IT skill sets and working practices have been developed over many years, and changing the approach requires extensive changes to both skills and mindsets. My colleagues reminded me how many previous ‘big bang’ technologies have under-delivered, in part because these factors were not taken into account. As a counterpoint, such under-exploitation can represent a latent opportunity – for example, should an organisation choose to revisit the potential of tiered storage.
While “big change” only happens by exception, new technologies do still find their place. However – and it stands to reason, given the need to justify value in advance – this will tend to be in specific areas where the need can be clearly defined, rather than in broader deployments. So, “rationalising SAP instances” is more likely to happen than “rationalising applications”. While this may mean opportunities to solve bigger problems are missed, it offers a greater likelihood of short-term return – which is important for any senior IT decision makers having to justify their existence and add to their CVs.
Given these factors, it stands to reason that we should not really expect the world of IT in general, or storage in particular, to look that different in three years’ time. Certain technologies will undoubtedly become more prevalent – server and storage virtualisation, for example – though we are still looking for proof points as to whether there is life for virtualisation beyond consolidating existing servers and storage onto smaller pools of hardware. Meanwhile, deduplication will no doubt gain a footprint, particularly given the amount of unstructured content being duplicated unnecessarily through email, file sharing and the like. Other technologies, such as storage encryption and 10-Gigabit Ethernet, will be more of a slow burn, implemented following traditional data centre refresh cycles rather than sweeping through like a forest fire.
Is IT life really just going to be more of the same, however? Given how data quantities continue to grow, many of the above capabilities are more about keeping up than getting ahead, so will we just be treading water? Thinking of storage in particular, one area that may yet have its day is data classification and categorisation. The storage managers I spoke to mentioned the continuing pressures of compliance, and their desire to get smarter about what information they were storing and how it was managed. Classification technologies have been available for some time, but they remain fragmented – tools for classifying information for records management purposes currently sit separately from tools for content management and data leakage prevention, for example. Should vendors grasp the opportunity to integrate classification information, they may well find a very interested audience when they bring the resulting products to market.
While it is perfectly valid to pay attention to new technology developments, then, it is equally important to consider them against the reality that corporate infrastructure is not going to look that different in a few years’ time from how it looks today. There will always be opportunities to make things better, to save money, to consolidate, integrate and rationalise. However, nobody should be duped into thinking that the latest raft of technologies is anything other than that – new capabilities which can be integrated with what has gone before, so as to keep up with the increasing quantities of data we have to deal with. While this may be disappointing to the evangelists, it should come as quite a relief for the majority of front-line IT decision makers.