Top three predictions for the cloud in 2020
Cloud computing is a rapidly growing technology that many organisations are adopting to enable their digital transformation. It is a major force that is transforming the entire IT landscape, from how data centres are built, to how software is deployed, to how upgrades are handled.
During the last decade alone, the cloud has fuelled business success for companies across a range of industries, from financial services to retail, giving them access to the tools and technologies they needed to compete head-on with the industry giants.
As 2020 gets underway, Exasol’s Market Intelligence Lead, Helena Schwenk, and Market Intelligence Analyst, Michael Glenn, predict the cloud technology trends that will gain momentum and impact businesses in 2020 and beyond.
1. On-premises investment by the public cloud providers will cement hybrid cloud’s dominance
In 2020, hybrid cloud deployments will increasingly become the focus for hyperscale public cloud providers.
In the last year alone, the three largest public cloud providers have all invested in on-premises capabilities. AWS, for instance, announced an on-premises version of its IaaS and PaaS services called AWS Outposts, planned to ship in late 2019, and Microsoft announced an expanded catalogue of on-premises Azure Stack services, including software-only and converged hardware packages. Similarly, Google added the ability to centrally maintain and manage Kubernetes clusters as a primary feature of its Anthos microservices platform, while Salesforce acquired Tableau, a predominantly on-premises self-service analytics vendor.
This move to hybrid recognises that the journey to the cloud has not been painless for every organisation, especially as not every workload does, or can, run in the public cloud. Hybrid cloud deployments are a necessity for the majority of enterprises, many of which are taking far longer to make a full cloud transition. With hybrid cloud deployments, companies can transition to the cloud at their own pace, with less risk and at a lower cost, while still realising gains in efficiency and effectiveness.
2. Data warehouse modernisation projects utilising containers will expand rapidly
To date, the cloud has primarily been used to build new apps and rehost infrastructure, but in 2020 we expect enterprises to increasingly use the cloud to modernise existing business apps, processes and data environments.
In 2020, we expect more data warehouse modernisation programmes to be deployed in containerised hybrid/multi-cloud environments, helping organisations become more agile and delivering a more frictionless deployment and management experience. This investment will be driven by the need to speed up data accessibility, improve the timeliness of insights, reduce support and maintenance costs and future-proof existing data environments.
A container-based approach allows organisations to reap the benefits of “cloud-native” as quickly as possible in the enterprise. Containers can help these companies manage data in a hybrid cloud setup in ways that other approaches cannot.
Moving data to the public cloud can be risky and expensive, often because data gravity dictates where data should reside and be processed. Containers help with this, improving agility and increasing portability. By building out services and data stores within containers, businesses can more easily move them to the public cloud, either all at once or a few at a time as part of their migration strategy.
Containers also provide flexibility, letting organisations maintain a consistent architecture across their on-premises and cloud applications while still customising rollouts by geographical region.
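As a simple illustration of that portability, the sketch below shows how one shared container service definition might be reused across on-premises and cloud targets, with small per-region overrides applied at rollout time. All names, images and values are hypothetical, and the sketch is not tied to Exasol or to any particular orchestration tool.

```python
# Minimal sketch: one shared container service definition, rendered for
# different deployment targets. Every name and value here is hypothetical.
from copy import deepcopy

# A single, shared service definition used everywhere.
BASE_SERVICE = {
    "image": "registry.example.com/analytics-db:4.2",  # hypothetical image
    "replicas": 3,
    "resources": {"cpu": "4", "memory": "32Gi"},
    "storage": {"class": "standard", "size": "500Gi"},
}

# Per-target overrides keep the architecture identical while letting each
# environment or region customise its rollout.
TARGET_OVERRIDES = {
    "on_prem": {"storage": {"class": "local-ssd", "size": "500Gi"}},
    "cloud_eu_west": {"replicas": 5, "storage": {"class": "premium", "size": "1Ti"}},
    "cloud_us_east": {"replicas": 2},
}


def render(target: str) -> dict:
    """Merge the shared definition with a target's overrides."""
    spec = deepcopy(BASE_SERVICE)
    for key, value in TARGET_OVERRIDES.get(target, {}).items():
        if isinstance(value, dict):
            spec[key] = {**spec[key], **value}
        else:
            spec[key] = value
    return spec


if __name__ == "__main__":
    for target in TARGET_OVERRIDES:
        print(target, render(target))
```

The same image and service shape run everywhere; only the rollout details change per target, which is what keeps a phased, region-by-region migration manageable.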
3. Cost optimisation for cloud data warehouses becomes a growing priority
As organisations continue their journey to the cloud, CIOs are finding that a growing portion of their tech budgets is going to cloud subscriptions. While the cloud brings organisations many benefits in terms of business agility and on-demand computing services, it also creates new problems, not least the possibility of wasted or inefficient cloud spend. More and more enterprises are discovering significant waste in their cloud budgets, including those for data and analytics, as well as ineffective resource utilisation. They are therefore looking to better plan, budget and forecast spending requirements for cloud consumption.
In the case of cloud data warehouses, costs are often based on resource usage patterns, including how much data is stored, queried and, in some cases, inserted. In cloud terms, storage is relatively cheap, but compute can become expensive as the complexity of analytics workloads increases, with operations such as aggregating data or querying raw fact tables pushing up compute consumption.
Certain providers also offer pricing models that cap costs at a set level and, in return, restrict the resources available to process workloads. This means organisations may need to avoid scanning raw data and limit expensive operations such as joins in order to keep costs within budget.
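To make those trade-offs concrete, here is a rough back-of-the-envelope sketch. The prices, data volumes and spend cap are entirely hypothetical and not drawn from any vendor's rate card; the point is simply that storage charges stay flat while compute charges climb with query complexity, which is what pushes organisations to curb raw scans and expensive joins.

```python
# Back-of-the-envelope cost sketch with entirely hypothetical prices and a
# hypothetical monthly spend cap; real pricing and capped tiers vary by vendor.

STORAGE_PRICE_PER_TB_MONTH = 23.0    # hypothetical $/TB/month
COMPUTE_PRICE_PER_NODE_HOUR = 2.0    # hypothetical $/node-hour
MONTHLY_CAP = 3000.0                 # hypothetical budget ceiling


def monthly_cost(stored_tb: float, compute_node_hours: float) -> float:
    """Storage charge plus compute charge for one month."""
    return (stored_tb * STORAGE_PRICE_PER_TB_MONTH
            + compute_node_hours * COMPUTE_PRICE_PER_NODE_HOUR)


# 50 TB warehouse with light reporting (~200 node-hours of compute):
light = monthly_cost(50, 200)      # 1550.0 -- storage is the larger share

# Same data volume, but heavy aggregations and raw fact-table scans push
# compute to ~2,000 node-hours; storage is unchanged, compute now dominates.
heavy = monthly_cost(50, 2000)     # 5150.0

for label, cost in (("light", light), ("heavy", heavy)):
    status = "within" if cost <= MONTHLY_CAP else "over"
    print(f"{label}: ${cost:,.0f} ({status} the ${MONTHLY_CAP:,.0f} cap)")
```

In this toy example the data volume never changes, yet the bill more than triples once workloads lean on heavy aggregation and raw scans, which is why query patterns, not storage, tend to be the first target of cost optimisation.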
In 2020 we expect cost efficiency to be a critical part of successfully migrating from legacy on-premises data platforms to cloud data warehouses, with organisations looking for cost predictability and the assurance that they will not be penalised unnecessarily for becoming a more data-driven business.
By Helena Schwenk, Market Intelligence Lead at Exasol and Michael Glenn, Market Intelligence Analyst at Exasol