Britons of a certain age may well remember Radio Rentals. A fixture of the high street in the 1970s and 80s, it allowed people to rent appliances such as televisions and white goods instead of buying them outright.
For many people these days, the idea of renting technology rather than buying it sounds unusual, given the availability of cheap and replaceable electronics.
For those planning computing projects, however, renting is an increasingly desirable option - and this, in essence, is what cloud computing is.
Cloud is simply the expression of a utility model: renting or leasing computing resources from a third-party provider and accessing them via the internet.
To illustrate the model, consider the resources that a computer needs. There’s the infrastructure (storage, processor and networking), the platform (the operating system) and the software (the applications that run on it).
Now, if you own your own laptop, you also own these resources. But you can also rent them from cloud providers; this is where terms like infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS) and software-as-a-service (SaaS) come from.
So, you can rent processor capacity on machines in an Amazon data centre - to give an IaaS example. Or, you can use an online email service like Gmail, where you’re effectively renting (albeit free of charge) software-as-a-service from Google.
Crucially, with any of these services, you can stop using them at any time and not be left, for example, with a computer that is gathering dust.
Making financial sense
If you’re a small or medium business, the economics of cloud computing work. There is no upfront capital expenditure, you can scale your usage up or down with your requirements, and you needn’t worry about obsolescence as new technology emerges.
Even for larger businesses, it often makes sense to outsource systems management to dedicated providers rather than funding an in-house IT department outside your core competency. Rental agreements can be longer term, compute resources can be isolated, and service-level agreements can be put in place, making the cloud suitable for critical operations.
Business use of the cloud is already widespread and continues to grow. Gartner predicts that by 2025, 80% of enterprises will have shut down their traditional on-premises data centres.
However, there are reasons why businesses continue to choose to keep some computing resources on-premises. For example, particularly sensitive or proprietary data is likely to be kept in-house for the foreseeable future, because accessing it over the network presents too much of a security risk.
Meanwhile, companies that buy fully into cloud computing commonly adopt different third-party providers for different services.
Cloud vendors will innovate and tempt clients with unique value propositions. Vendor and technology preferences change over time, but a lack of portability between providers can hamper movement.
The hybrid model
For the aforementioned reasons, the future of cloud is in a hybrid model - where organisations have a mixture of public cloud services from a range of providers, often co-existing with private, on-premises systems.
This model has benefits - not putting all your eggs in one basket, for example - but it can cause problems: most notably, an architecture that’s difficult to manage and whose components don’t work well together.
Therefore, the key to making hybrid models work is in ensuring that services between providers are properly integrated and portable. Thankfully, these are requirements that the computing industry is already starting to address.
Kubernetes-based platforms allow workloads to be placed in easily manageable containers that can bridge between data centres, regardless of their provider, where in the world they are, or the platform on which they are based. These architectures will be increasingly used as organisations grow their cloud deployments.
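To make this portability concrete, here is a minimal sketch of a Kubernetes Deployment manifest; the names, labels and image are illustrative placeholders, not taken from any particular deployment. Because the manifest declares a desired state rather than anything provider-specific, the same definition can be applied unchanged to a cluster running on any public cloud or on-premises:

```yaml
# Illustrative example only: names, labels and the container image
# are placeholders chosen for this sketch.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-frontend
spec:
  replicas: 3                 # run three identical container instances
  selector:
    matchLabels:
      app: web-frontend
  template:
    metadata:
      labels:
        app: web-frontend
    spec:
      containers:
      - name: web
        image: nginx:1.25     # any container image; nginx is just an example
        ports:
        - containerPort: 80
```

Applying it with `kubectl apply -f deployment.yaml` works the same way against any conformant cluster, which is precisely what makes containerised workloads easy to move between providers.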
But is it possible that cloud computing will just fade away - that we’ll look on cloud as we look on television rental today? After all, the computing industry moved away from centralised, mainframe-based computing in the early 80s with the advent of the IBM PC. The answer is, probably not.
The ubiquity of network infrastructure and the economic advantages of a rental model to businesses mean that the cloud - and particularly hybrid cloud - will likely be with us for the long term.