The ever-changing IT landscape
August 19, 2020
Gareth John, Solutions Architect at Q Associates, examines the issues around migrating systems to the cloud, and the growing shift towards a hybrid multi-cloud model.
The IT landscape is changing. It hasn't just evolved from what it was five years ago, or even one year ago; it is in a state of constant flux, largely driven by the cloud component of IT strategy. An organisation's cloud strategy can change from month to month as it adapts to the proliferation of new tools and services on offer.
There is a clear trend of workloads moving from the on-prem data centre to some form of cloud, whether that be IaaS, PaaS, or SaaS; into a hyper-scaler or by consuming a service from a smaller provider. There are many good reasons for this trend, especially in relation to the hyper-scalers: near-infinite and instant elasticity where you can scale up or scale back and only pay for what you use; off-loading of hardware maintenance; access to cloud-based data analytics; use of the substantial and ever-growing catalogue of services; and more.
Cloud adoption, however, should not be hurried. Testament to this are the many organisations that adopted an aggressive cloud-first strategy, saw costs rise as a result, and are now trying to reverse out of the public cloud, incurring yet more expense in the process. Just as there are many potential benefits to public cloud, there are also many valid concerns, including connectivity, security, data sovereignty, lock-in, and of course cost.
Organisations need to assess their existing IT estate carefully to ascertain which workloads are appropriate for cloud transition. There will almost certainly be workloads that are unsuitable, and those that are appropriate will suit different cloud models. In this light, most customers that I talk to are looking to adopt a hybrid multi-cloud model (see diagram).
The first step is usually to move previously on-prem applications to SaaS offerings. Microsoft 365 is a prominent example: organisations can off-load everything (including hardware maintenance, OS and application versioning, resilience, and interoperability) to a full-stack service that includes the application and its data. Note that while the data will reside on resilient infrastructure, it still needs to be backed up to protect against corruption, unintended change, or deletion.
RUNNING HOT AND COLD
Cold data (data that is rarely accessed) is also considered low-hanging fruit for cloud adoption. Organisations are now typically deploying all-flash storage systems in on-prem data centres, and cold data is not a good fit for this medium. Intelligently archiving cold data to a cloud object store ensures that hot data enjoys the high performance of flash while exploiting a low-cost, scalable cloud tier for inactive data. This cloud object tier is also a good location for an off-site copy of backup data, which can then be utilised as part of a cloud-based DR strategy.
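The hot/cold split described above is, at its simplest, an age-based tiering decision. As a minimal sketch, the following assumes illustrative thresholds and tier names; real archiving policies are tuned per workload and per vendor:

```python
from datetime import datetime, timedelta

# Illustrative thresholds -- assumptions for this sketch, not vendor defaults.
COLD_AFTER_DAYS = 90      # not accessed in 90 days -> candidate for object store
ARCHIVE_AFTER_DAYS = 365  # not accessed in a year -> deep-archive tier

def tier_for(last_accessed: datetime, now: datetime) -> str:
    """Return the storage tier an object should live on, by last-access age."""
    age = now - last_accessed
    if age >= timedelta(days=ARCHIVE_AFTER_DAYS):
        return "deep-archive"
    if age >= timedelta(days=COLD_AFTER_DAYS):
        return "cloud-object-store"
    return "on-prem-flash"

now = datetime(2020, 8, 19)
print(tier_for(datetime(2020, 8, 1), now))   # recently used: stays on flash
print(tier_for(datetime(2020, 3, 1), now))   # cold: cloud object tier
print(tier_for(datetime(2019, 1, 1), now))   # very cold: deep archive
```

In practice a storage system or lifecycle policy applies this kind of rule automatically, so hot data never competes with inactive data for flash capacity.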
The way in which public cloud services are consumed is fast becoming the de facto standard: users log on to a portal, select the services they require, and have them instantiated in minutes. This is why organisations should consider transitioning their on-prem infrastructure into a private cloud, so that their resources can be consumed in a far more cloud-like fashion.
The reality is more complicated than this, but it will involve deploying a framework that provides a service catalogue, automated fulfilment, and a billing engine. It will also require mapping SLAs to resource-pool utilisation, organisational changes, and procedural standardisation, amongst other things.
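To make the framework concrete, here is a deliberately minimal sketch of how a service catalogue, automated fulfilment, and billing metering relate to one another. All offerings, rates, and names are illustrative assumptions, not any particular product's API:

```python
# A toy private-cloud framework: catalogue -> fulfilment -> billing record.
# Offerings and rates are invented for illustration only.
CATALOGUE = {
    "small-vm": {"cpus": 2, "ram_gb": 8,  "rate_per_hour": 0.05},
    "large-vm": {"cpus": 8, "ram_gb": 32, "rate_per_hour": 0.20},
}

billing_ledger = []  # the billing engine would meter consumption from here

def fulfil(user: str, offering: str) -> dict:
    """'Provision' a catalogue item and record the order for billing."""
    spec = CATALOGUE[offering]          # validate the request against the catalogue
    order = {"user": user, "offering": offering, **spec}
    billing_ledger.append(order)        # every fulfilled order becomes billable
    return order

order = fulfil("alice", "small-vm")
print(order["offering"], order["cpus"], order["rate_per_hour"])
```

A real implementation layers automation (VM provisioning, networking, quotas) behind `fulfil`, but the flow is the same: self-service request, automated fulfilment, metered billing.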
Whilst public cloud is great for burstable workloads (thanks to its inherent elasticity, where you only pay for what you use), one mistake we regularly see is the lift-and-shift of on-prem applications into public IaaS offerings. Running all of the VMs that would normally reside on on-prem infrastructure in the cloud, 24/7, can lead to a significant cost increase.
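The cost gap between always-on and scheduled VMs is simple arithmetic. The hourly rate and fleet size below are hypothetical, chosen only to show the shape of the saving:

```python
# Illustrative cost comparison for lifted-and-shifted VMs.
# The rate and fleet size are assumptions, not real cloud pricing.
HOURLY_RATE = 0.10   # hypothetical cost per VM-hour
VMS = 50

always_on = VMS * HOURLY_RATE * 24 * 7     # running 24/7
office_hours = VMS * HOURLY_RATE * 12 * 5  # 12 hours/day, 5 days/week

print(f"24/7 weekly cost: ${always_on:.2f}")      # $840.00
print(f"12x5 weekly cost: ${office_hours:.2f}")   # $300.00
print(f"saving: {1 - office_hours / always_on:.0%}")
```

Under these assumptions, simply matching VM uptime to office hours cuts the weekly bill by roughly two thirds; the same pay-for-what-you-use logic is what makes burstable workloads such a good fit for public cloud.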
REARCHITECT FOR SUCCESS
To realise the full value of public cloud, applications really need to be rearchitected to utilise things like database services (rather than running full database VMs) and serverless code services (where you only pay for the compute time you consume). Automatically turning VMs off when they are not being used will also be financially advantageous.
Connectivity is also an important factor: as organisations move workloads to various cloud services, they need to ensure that bandwidth and latency requirements will still be met once a workload has moved. In this arena we are seeing much more interest in software-defined WAN (SD-WAN) initiatives, which aim to simplify and orchestrate routing over an assortment of disparate WAN connections.
Q Associates has been helping customers with all of these schemes for some time, but until recently we have had to rely on partnerships to bring the best specific skills and knowledge to any particular area. With the recently announced acquisition of Apex Group, we now have premium in-house skills in all of these fields and can provide our customers with holistic delivery of infrastructure and services, from design and implementation through to support and management. The acquisition will also help us to evolve at speed, with widespread internal hybrid multi-cloud skills and knowledge, ensuring that we stay relevant to our customers in this rapidly shifting environment.
We are proud of the work that we do and would love to discuss your strategic IT priorities and challenges in this rapidly changing landscape.
Get in touch with our Solution Architects
T: (+44) 1635 248181 E: email@example.com
This article is by Gareth John,
Solutions Architect at Q Associates Ltd.
The article is also published in Storage Magazine UK: The ever-changing IT landscape