Pandemic Priorities for Every IT Department
The way we live has changed. We are in a world where the word 'connected' has been redefined. Every action we take in the physical world now begins in the digital world, particularly as lockdown habits become entrenched.
For IT leaders, the implications of this are huge. So how will they respond? Will tackling the technical debt they’ve accrued over the pandemic be top of the list? Will it be moving away from legacy IT? Or perhaps investing in tech to help tame complexity, such as Kubernetes? With these questions in mind, here’s a list for every IT team as they look to modernise their technology to stay relevant in this ever more digitally augmented world.
Get cloud costs under control
It’s no secret that companies waste a fortune on unused cloud spend – up to 30 percent of their cloud budgets, according to one estimate. With many budgets in tatters after 2020, it’s time to look into cost management. This typically takes the form of granular usage telemetry, automated usage monitoring, and threshold-based elastic scaling, among other measures. However, these solutions often require the user to run their own VPC, as many ‘aaS’ vendors aren’t set up to work with cost-control measures. As such, keeping cloud costs down starts with choosing solutions that will work within the user’s VPC, giving them more control over size and cost.
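The threshold-based elastic scaling mentioned above can be sketched in a few lines. This is a minimal illustration, not any vendor’s autoscaler: the function name, thresholds, and the assumption of a CPU-utilisation metric feed are all hypothetical.

```python
# Minimal sketch of threshold-based elastic scaling, assuming a metrics
# feed that reports average CPU utilisation per instance (0.0-1.0).
# All names and thresholds here are illustrative, not from a vendor API.

def desired_instances(current: int, avg_cpu: float,
                      scale_up_at: float = 0.75,
                      scale_down_at: float = 0.25,
                      min_instances: int = 1,
                      max_instances: int = 20) -> int:
    """Return the instance count a simple autoscaler would target."""
    if avg_cpu > scale_up_at:
        target = current + 1          # add capacity under load
    elif avg_cpu < scale_down_at:
        target = current - 1          # shed idle capacity to cut cost
    else:
        target = current              # within the band: do nothing
    # Clamp so cost can never run away and capacity never hits zero.
    return max(min_instances, min(max_instances, target))

print(desired_instances(4, 0.9))   # under load: scale out
print(desired_instances(4, 0.1))   # idle: scale in and save money
print(desired_instances(4, 0.5))   # steady state: hold
```

The deadband between the two thresholds is what keeps a scaler like this from oscillating – and from billing you for instances it spins up and tears down every minute.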
Get serious about security
Security is arguably the most visible challenge IT leaders face – it seems that we see a new leak or cyber-attack on a daily basis. Yet outside of a small handful of cases, no organisation is being intentionally careless; it’s just that security is an incredibly complex issue that requires a high level of attention to detail.
While there’s no shortage of security solutions for every area of a business, there are plenty of methods that won’t break the bank. Start by taking a deeper look at your data at rest (databases, files, LDAP) and in motion (networks, WANs, ports), and ensure that sensible rules are being followed without adding too much complexity. In 2021, standard security audits won’t cut it; it’s now far more important to think through the company’s wider security architecture and security culture. Of course, an attack or leak can happen to anyone, but those who balance IT infrastructure simplicity with a modern security posture will reduce their risk.
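As one concrete example of a cheap "data at rest" check, a short script can flag files that are readable by anyone other than their owner. This is a minimal sketch of a single rule, assuming a POSIX filesystem; a real audit would cover databases, LDAP, and data in motion as well, and the directory path and policy here are invented for illustration.

```python
# Sketch of one "data at rest" rule: find files under a data directory
# that grant read access beyond the owning user. Illustrative only.
import os
import stat

def world_readable(path: str) -> bool:
    """True if the file grants read access to group or other."""
    mode = os.stat(path).st_mode
    return bool(mode & (stat.S_IRGRP | stat.S_IROTH))

def audit_dir(root: str) -> list[str]:
    """Return files under root that violate the owner-only read rule."""
    findings = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            if world_readable(full):
                findings.append(full)
    return findings
```

The value of a check like this isn’t the code itself but the habit: each rule is small, testable, and can run on a schedule, which is how an audit becomes a culture rather than an annual event.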
Don’t stick to a single cloud
While standardising on the services offered by AWS, GCP, or Azure may seem like an easy win at first, organisations are beginning to discover that it sacrifices freedom in the long term and may even raise costs, as these services are generally priced at a premium over third-party equivalents. This is where multi-cloud comes in: beyond the cost savings, it should be a given for any global enterprise that needs to run services 24/7. After all, no cloud vendor reaches into every region of the world, and multi-region outages have struck every major vendor.
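At its simplest, the 24/7 argument for multi-cloud is client-side failover: try each provider’s endpoint in priority order and fall back when one is down. The sketch below is a toy version of that idea – the endpoints and the stubbed fetch function are hypothetical, and a production client would use health checks and narrower error handling.

```python
# Minimal sketch of client-side multi-cloud failover. Endpoints and the
# fetch function are placeholders, not a real provider's API.
from typing import Callable

def fetch_with_failover(endpoints: list[str],
                        fetch: Callable[[str], str]) -> str:
    """Return the first successful response, trying endpoints in order."""
    last_error = None
    for endpoint in endpoints:
        try:
            return fetch(endpoint)
        except Exception as err:      # a real client would narrow this
            last_error = err          # remember why this endpoint failed
    raise RuntimeError(f"all endpoints failed: {last_error}")

# Demo with a stubbed fetch in which the first region is suffering an outage.
def demo_fetch(url: str) -> str:
    if "eu-west" in url:
        raise ConnectionError("region outage")
    return f"200 OK from {url}"

print(fetch_with_failover(
    ["https://eu-west.example.com", "https://us-east.example.com"],
    demo_fetch))
```

The hard parts of multi-cloud – data replication, consistency, and egress cost – live behind this loop, but the loop is why the strategy exists: when one vendor’s region goes dark, the request still has somewhere to go.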
Get to grips with Kubernetes complexity
Kubernetes has become ubiquitous among enterprise development teams, and for good reason. Yet its complexity is daunting: it burdens development teams with 3,000-line configuration files and week-long debugging sessions. Kubernetes should support multi-cloud, multi-region configurations with far lower complexity. Every organisation should not only put its best and brightest on solving this for itself, but also work to solve it as a global community – for example, by standardising container environments wherever possible.
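One practical form of that standardisation is a small, shared manifest template that every service starts from, rather than a bespoke 3,000-line configuration per team. The Deployment below is a deliberately minimal sketch of such a template; the service name, image registry, and resource limits are placeholders.

```yaml
# A deliberately small Deployment template a team might standardise on.
# Names, image, and resource figures are illustrative placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-service
  labels:
    app: example-service
spec:
  replicas: 3
  selector:
    matchLabels:
      app: example-service
  template:
    metadata:
      labels:
        app: example-service
    spec:
      containers:
        - name: example-service
          image: registry.example.com/example-service:1.0.0
          ports:
            - containerPort: 8080
          resources:
            requests: { cpu: 100m, memory: 128Mi }
            limits: { cpu: 500m, memory: 256Mi }
```

Keeping labels, ports, and resource requests consistent across services is what makes the rest of the tooling – monitoring, autoscaling, cost reporting – composable instead of bespoke.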
Automate and analyse
Analytics capabilities have been around for years now, yet many organisations have little to show for them. It simply takes too long to get anything done, and modern businesses are too dynamic for the kind of answers that systems like Teradata, Greenplum, or Netezza were built to provide. Analytics should be real-time: ease and speed of deployment matter more than crafting perfect, efficient analytics cubes. As such, cloud-based analytics systems that can operate on production data, or even stream it in real time, are becoming the norm. If your organisation still relies on batch processing, there’s never been a better time to start moving faster as we work to put the pandemic behind us.
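The shift from batch to streaming is a shift in mindset as much as in tooling: keep a running aggregate up to date per event, so the answer is always available, instead of waiting for a nightly job. This toy sketch illustrates the idea; the event shape and keys are invented for the example.

```python
# Minimal sketch of the streaming mindset: maintain a running aggregate
# per key as events arrive, instead of recomputing in a nightly batch.
# The event schema (region, amount) is illustrative.
from collections import defaultdict

class RunningRevenue:
    """Incrementally aggregates order value per region, event by event."""

    def __init__(self):
        self.totals = defaultdict(float)

    def ingest(self, event: dict) -> None:
        # O(1) work per event -- no scan over historical data.
        self.totals[event["region"]] += event["amount"]

    def snapshot(self) -> dict:
        """The current answer, available at any moment -- no batch window."""
        return dict(self.totals)

agg = RunningRevenue()
for e in [{"region": "EMEA", "amount": 120.0},
          {"region": "APAC", "amount": 80.0},
          {"region": "EMEA", "amount": 40.0}]:
    agg.ingest(e)
print(agg.snapshot())   # totals so far, updated on every event
```

Real streaming platforms add durability, windowing, and exactly-once semantics on top, but the core contrast with batch is visible even here: the question is answered continuously, not once a night.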
Taming the database sprawl
Could it be time to migrate? Most data architectures are piecemeal, and the bigger picture is rarely seen. Consolidating your database environment may be a better fit for your organisation’s needs than a wholesale migration. Equally, maintenance costs, capability needs, consistency and lineage concerns may make migration the most feasible option. Ultimately, whichever approach your organisation opts for to tame its database sprawl must be determined by an assessment of where each database fits within the overall environment, and whether it serves its purpose cost-effectively.
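That assessment can start as something very simple: an inventory of databases scored on cost against actual use. The triage below is a hypothetical sketch – the inventory fields, thresholds, and recommendations are all invented to illustrate the shape of the exercise, not a real methodology.

```python
# Hypothetical sketch of a database-sprawl triage: score each inventory
# entry on cost vs. use. Fields and thresholds are invented examples.

def recommend(db: dict) -> str:
    """Crude first-pass recommendation for one inventory entry."""
    if db["monthly_queries"] == 0:
        return "decommission"          # nobody uses it at all
    cost_per_query = db["monthly_cost"] / db["monthly_queries"]
    if cost_per_query > 1.0:
        return "consolidate"           # expensive for what it delivers
    return "keep"                      # earning its keep

inventory = [
    {"name": "orders_db",  "monthly_cost": 900.0, "monthly_queries": 2_000_000},
    {"name": "legacy_rpt", "monthly_cost": 400.0, "monthly_queries": 120},
    {"name": "old_stage",  "monthly_cost": 150.0, "monthly_queries": 0},
]
for db in inventory:
    print(db["name"], "->", recommend(db))
```

A real assessment would weigh lineage, consistency, and capability needs alongside cost, as noted above – but even a crude pass like this usually surfaces a few databases nobody can justify.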
Over the edge
In 2021 there are more connected devices on any given network than ever before, and the number keeps growing. As a result, edge computing has started to hit the mainstream. By moving more compute and storage closer to where it’s actually needed, systems become much more responsive and scalable. Crucially, this builds resilience into networks by reducing the impact of issues that affect the core, such as an outage. If the pandemic has taught us anything, it’s that resilience is among the most valuable assets out there.