By: Bob Tarzey, Service Director, Quocirca
Published: 4th February 2011
Copyright Quocirca © 2011
The rapid increase in the availability of on-demand IT infrastructure (infrastructure as a service/IaaS) gives IT departments the flexibility to cope with the ever-changing demands of the businesses they serve. In the future, the majority of larger businesses will be running hybrid IT platforms that rely on a mix of privately owned infrastructure plus that of service providers, while some small businesses will rely exclusively on on-demand IT services.
Even for privately owned infrastructure, the increasing use of virtualisation makes it easier to use resources efficiently through sharing than has been the case in the past. Quocirca has seen server utilisation rise from around 10% to 70% in some cases where systems have been virtualised. There will of course always be some applications that are allocated dedicated physical resources for reasons of performance and/or security.
Any given IT workload must be run in one of three fundamental computing environments: dedicated physical, private virtualised and shared virtualised (the latter being part of the so-called “public cloud”).
However, the benefits of this flexibility to deploy computing workloads will only be fully realised if the right tools are in place to manage it. In fact, without such tools, costs could start to be driven back up. For example, if the resources of an IaaS provider are used to cope with peak demand and workloads are not de-provisioned as soon as the peak has passed, unnecessary resources will be consumed and paid for.
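The de-provisioning discipline described above can be sketched in code. The sketch below is illustrative only: the `BurstInstance` class, the utilisation threshold and the minimum age are all assumptions, not part of any particular IaaS provider's API, but they show the core check a management tool would make before releasing (and ceasing to pay for) burst capacity.

```python
from datetime import datetime, timedelta

class BurstInstance:
    """Hypothetical record of an instance provisioned to absorb peak demand."""
    def __init__(self, name, hourly_cost, provisioned_at):
        self.name = name
        self.hourly_cost = hourly_cost
        self.provisioned_at = provisioned_at
        self.cpu_utilisation = 0.0  # rolling average, 0.0 - 1.0

def instances_to_deprovision(instances, idle_threshold=0.15,
                             min_age=timedelta(hours=1)):
    """Return burst instances whose load has fallen back below the idle
    threshold and which have run long enough to rule out a momentary dip."""
    now = datetime.utcnow()
    return [i for i in instances
            if i.cpu_utilisation < idle_threshold
            and now - i.provisioned_at > min_age]
```

A real tool would feed this from provider monitoring data and act on the result automatically; the point is that without such a check, peak-time capacity quietly keeps accruing charges.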
A workload can be defined as a discrete computing task to which four basic resources can be allocated: processing power, storage, disk input/output (i/o) and network bandwidth. There are five workload types.
A series of linked workloads interact to drive business processes. Each workload type requires a different mix of resources and this can change with varying demand. For example, a retail web site may see peak demand in the run-up to festivities and require many times the compute power and network bandwidth it needs the rest of the time; a database that relies heavily on fast i/o may need to be run in a dedicated physical environment; virtualised desktop workloads may need plenty of storage allocated to ensure users can always save their work (thin provisioning allows such storage to be allocated, but not dedicated).
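The four basic resources and the retail-site example above can be made concrete with a small sketch. The `ResourceAllocation` structure and the `scale_for_peak` helper are hypothetical illustrations, not any vendor's model: they show how a workload's resource mix can be described, and how only some of its resources (compute and bandwidth, in the retail example) need to scale with demand.

```python
from dataclasses import dataclass

@dataclass
class ResourceAllocation:
    """The four basic resources a workload can be allocated (illustrative units)."""
    cpu_cores: int
    storage_gb: int
    disk_iops: int
    bandwidth_mbps: int

def scale_for_peak(base: ResourceAllocation, factor: int) -> ResourceAllocation:
    """Scale compute and bandwidth for a demand peak (e.g. a retail site in
    the run-up to festivities); storage and i/o stay as provisioned."""
    return ResourceAllocation(
        cpu_cores=base.cpu_cores * factor,
        storage_gb=base.storage_gb,
        disk_iops=base.disk_iops,
        bandwidth_mbps=base.bandwidth_mbps * factor,
    )
```

A database workload, by contrast, might scale `disk_iops` while leaving compute flat, which is why each workload type needs its own resource profile.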
Ensuring the right resources are allocated requires an understanding of likely future requirements at the time the workload is provisioned; this is also the time to ensure appropriate security is in place and that the software used by the workload is fully licensed. Once workloads are deployed, it is necessary to measure their activity and monitor the environment they are running in, sometimes allocating more resources or perhaps moving the workload from one environment to another, ensuring, of course, that security is maintained and that the workload always remains compliant (for example, making sure personal data is only processed and stored in permitted locations).
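The compliance constraint on moving workloads can be sketched as a simple policy check. The region names and the `can_move_workload` function below are hypothetical examples, not drawn from any real regulation or API, but they capture the idea that a workload handling personal data may only be placed in permitted locations.

```python
# Illustrative policy: locations where personal data may be processed/stored
PERMITTED_REGIONS = {"eu-west", "eu-central"}

def can_move_workload(candidate_regions, handles_personal_data):
    """Return the subset of candidate regions a workload may be moved to.
    Workloads without personal data are unconstrained by this policy."""
    if not handles_personal_data:
        return set(candidate_regions)
    return set(candidate_regions) & PERMITTED_REGIONS
```

A workload management tool would apply such checks automatically before migrating a workload between private and public environments, rather than leaving the decision to ad-hoc judgement.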
The intelligent management of workloads is fundamental to achieving best practice in the use of the hybrid public/private infrastructure that is here to stay. Managing workloads in such an environment requires either generic tools from vendors such as Novell, CA or BMC, or virtualisation-platform-specific tools from VMware or Microsoft. Such products of course have a cost, but this is offset by more efficient use of resources, avoidance of security and compliance problems, and the flexibility for IT departments to better serve the ongoing requirements of the businesses they support.
Quocirca’s report, Intelligent workload management, is freely available from Quocirca.