IBM and the new data-driven era of computing

Over the years, IBM has produced six Nobel Laureates. Despite being over a century old, the company is still pushing the boundaries when it comes to computing.

IBM is one of the most recognised and admired brands in the world. The business has been building on its brand for more than a century, with a continuous history dating back to the 19th century. Throughout the first half of the 20th century, IBM enjoyed explosive growth, but its future was never completely secure. During this time, the company had to survive two World Wars, the Great Depression, and multiple industry transformations. Then came the 1950s, a period of rapid technological change, with budding computer technologies – electronic computers, magnetic tape, disk drives, programming – creating a slew of new competitors and market uncertainties.

IBM had been a leader of the mechanical age, but success in the electronic age was not a certainty. In fact, some analysts at the time predicted doom for IBM, as its install base of mechanical tabulating machines came under assault from upstarts and well-funded companies offering electronic computers. In 1951, the US Bureau of the Census – an IBM client dating back to 1889 – turned to Remington Rand for the UNIVAC electronic computer.

CEO Thomas Watson Jr was so concerned that IBM was too slow in adopting transistor technology that he issued a corporate policy in 1957 mandating the use of solid-state circuitry in all new machine developments. Of course, history shows that IBM successfully navigated this technology shift and went on to dominate the electronic computing era. Sixty years later, some analysts have declared the end of IBM’s industry leadership, pointing to the threat cloud computing poses to IBM’s traditional computing install base. Just as Watson did in 1957, the new IBM is responding to yet another technology shift, declaring artificial intelligence and cloud computing as strategic imperatives.

“I’ve witnessed, lived through and experienced first-hand the different eras of IT and computing over the past several decades,” observes Eric Schnatterly, Vice President IBM Systems for Cloud Platforms, Asia Pacific. “I’ve been working hard on navigating and transforming my team to be able to capitalise on the new era, which is driven by new consumption models and fuelled by data. We all now fully understand that data is the new natural resource. We all participate in a global economic model, which is based on trust, and therefore, to have a secure set of data is paramount. This natural resource has helped drive IBM’s transformation over the past decade. IBM is dedicated to turning data into insights, transforming the way people work.”

During the early part of Schnatterly’s career, IBM was best known by those outside of the IT industry for the IBM PC, which became an industry standard and spawned many new businesses and industries. While intended for business enterprises, the PC extended its reach into the consumer space, giving retail consumers visibility into this large business machines company.

When IBM decided to exit what had become a commodity offering, some thought that was the end of IBM. “When it was announced that we were divesting our PC business, many folks thought: ‘Well, IBM is out of business, they’ve sold their business,’ and yet PCs were just one small part of the business, and just as IBM did with time clocks, punch card machines, and the Selectric typewriter, it divested the old to transition to the new,” Schnatterly remarks.

“We’re an enterprise company and as such we have transformed into higher value systems, software and services, delivered via the cloud and targeted at extracting insights from data. My team is focused on the infrastructure that supports the full continuum of cloud delivery models and is designed for analysing massive volumes of data.”

Unlocking the potential of data

Data – said to be the oil of the digital age – is something that IBM wants to help its clients exploit. “Our strategy is to help clients transform into cognitive enterprises and that means that they need to make better use of the data that they have,” advises Schnatterly. “There are statistics estimating that 80% of the data out there is what they call ‘dark data’ – it’s not visible to search engines and sits trapped behind firewalls and within the confines of datacentres around the globe. For the most part, this resource goes untapped, which means we are gaining no insight from it.

“What IBM is trying to do with our clients is to help them unlock the potential of the data that they have and to combine this with other data sources to extract insight and differentiation. Historically, clients have done this through business intelligence programs and queries that they’ve created. They would build a data warehouse and they would have algorithms, big elaborate queries and reports that they run against the data, trying to use the results to make better business decisions,” details Schnatterly.

Creating rule-based programs to support decision-making is the old method. Now, companies are converting these rule-based algorithms into machine learning algorithms. With cognitive-based systems, the machine learns as it goes. “The whole idea is to use algorithms that learn from data on the fly, thus allowing the system to find hidden insights without being explicitly programmed where to look,” explains Schnatterly.
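To make the contrast concrete, here is a minimal sketch in Python using scikit-learn; the loan scenario, the threshold and the toy data are invented purely for illustration. The old approach encodes the decision explicitly, while the new approach infers its decision boundary from historical examples.

```python
from sklearn.tree import DecisionTreeClassifier

def approve_loan_rule(income: float, debt: float) -> bool:
    """Old approach: an explicitly programmed, hand-tuned rule."""
    return (debt / income) < 0.4

# New approach: the algorithm learns the decision boundary from
# historical outcomes instead of a hand-coded threshold.
# (Toy [income, debt] data and labels, invented for illustration.)
X = [[60_000, 10_000], [40_000, 30_000],
     [90_000, 20_000], [30_000, 25_000]]
y = [1, 0, 1, 0]  # 1 = repaid, 0 = defaulted

model = DecisionTreeClassifier().fit(X, y)

print(approve_loan_rule(55_000, 15_000))     # decision from the hand-written rule
print(model.predict([[55_000, 15_000]])[0])  # decision learned from data
```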

There is a lot of exciting work going on in this space, and IBM is credited with having reinvigorated the field of artificial intelligence with the popularity of Watson – IBM’s cognitive system that beat Ken Jennings and Brad Rutter on the TV game show Jeopardy! But Watson is just one of many offerings that IBM delivers to help clients inject artificial intelligence into applications, business processes, and procedures.

“We have a software platform called PowerAI, which includes the most popular machine learning frameworks, languages, libraries, tools, and their dependencies, and it is built for easy and rapid deployment. Complementary to PowerAI, IBM also offers a collaboration platform, called Data Science Experience [DSX], where folks can come to learn, create, and collaborate on AI and deep learning,” advises Schnatterly.

DSX supports the complete data science lifecycle, helping data scientists use familiar tools such as Jupyter, RStudio and HDP to curate data, create complex machine learning models, and deploy those models into production. “Hortonworks, which IBM has selected to provide the Hadoop-based data platform, offers DSX to its clients because they see the need and value in marrying big data with the complete data science lifecycle,” Schnatterly adds.
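As a flavour of what a data scientist might build in a DSX notebook, here is a minimal deep-learning sketch in Python using TensorFlow/Keras – one of the popular open-source frameworks of the sort PowerAI bundles. The model and the random stand-in data are invented purely for illustration.

```python
import numpy as np
import tensorflow as tf

# Random stand-in data: 1,000 samples with 20 features, binary labels.
X = np.random.rand(1000, 20).astype("float32")
y = np.random.randint(0, 2, size=(1000,))

# A small feed-forward network for binary classification.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32)
```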

Another recent development is the growing and necessary use of hardware accelerators to mine this vast amount of data and to execute the AI algorithms. “IBM Power Systems offer unique and industry-leading capabilities, especially in the area of acceleration, that are unlocking new use cases for AI. Together with Nvidia, IBM offers GPU acceleration, but with a unique twist. You see, within the system, the GPU and GPU memory appear as a peer to the CPUs and system memory, with system-level speed and bandwidth. Put simply, this means faster access to data, faster machine learning, and better business outcomes,” says Schnatterly.
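To see why that matters, consider a hedged sketch in PyTorch (assumed here only as a representative framework): on a conventional PCIe-attached GPU, data must be explicitly staged from system memory into GPU memory before any computation can run, and it is exactly this staging step that a coherent, NVLink-class CPU–GPU link speeds up.

```python
import torch

# A large tensor that starts life in system (host) memory.
x = torch.rand(8_192, 8_192)

if torch.cuda.is_available():
    x_gpu = x.to("cuda")      # explicit host-to-device copy over the bus;
                              # this transfer is the classic bottleneck
    y = x_gpu @ x_gpu         # the matrix multiply runs at GPU speed once data arrives
    torch.cuda.synchronize()  # wait for the asynchronous GPU work to finish
    print(y.sum().item())
```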

“The need for systems that can handle the demands of AI, larger data models, and distributed deep learning clusters will fuel the growth of my business,” Schnatterly continues. “Of course, we will continue to provide the infrastructure for core banking, telco billing, ERP, and large databases, but we see the growth coming from cognitive systems – and this is our sweet spot. Our systems are designed from the ground up to handle AI better than all other alternatives, which allows us to navigate this latest shift from a position of strength.”

Head in the cloud

In addition to machine learning, IBM is embracing cloud technology. Whilst written usage of the term ‘cloud computing’ dates back to a 1996 business plan written by executives at Compaq, the concept only began gaining momentum a decade later when companies such as Google and Amazon began using ‘cloud computing’ to describe the new paradigm in which people were increasingly accessing software, computer power and files over the internet instead of on their desktops.

The cloud is not a place or destination; rather, it is best characterised as a set of capabilities. “This is not always well understood,” comments Schnatterly. “Largely due to the success and growth of public cloud providers, like Amazon Web Services (AWS), some folks confuse the cloud with a place to which you move your data and workload. However, a public cloud is just one type of delivery model for cloud services. In fact, there are multiple ways and places from which to deliver and consume cloud services.”

As most businesses look to embrace the cloud, IBM is engaging with them around workloads and data to architect the best cloud model and deliver the best business outcomes. “Some of our clients prefer to run some of their applications and workloads from the public cloud, for which the IBM Cloud would be a possible solution. However, it is highly unlikely that any company would choose to move all their workloads and data to a single cloud provider, so it is important for IBM to allow clients the flexibility to support and maintain a multi-cloud platform strategy,” states Schnatterly.

“Since these multiple cloud types may include public cloud, private cloud, hybrid cloud and hosted private cloud, IBM stands apart as one company that can address all cloud types and requirements. IBM’s list of enterprise clients is the envy of the industry and we are working with and helping these clients with their transition to the cloud, no matter the type, architected to suit their unique requirements – and we know these unique requirements better than anyone else.

“When my team engages with a client, we take a workload and data approach, which leads to a recommended architecture. Sometimes, such things as latency requirements, data locality, or even governmental regulations will dictate the need to build and deploy cloud capabilities within the client’s own datacentre. Make no mistake, this is still a cloud, as such solutions still allow variable consumption and costing, self-service and automated provisioning – all the capabilities that one attributes to cloud service providers. But in this case, the cloud provider is the company’s own IT shop,” explains Schnatterly.

“When it comes to cloud services, I have many routes to market. I also have partners and alliances with other firms that provide hosting services. Some of my partners have global delivery centres, from which they provide cloud services to clients using IBM-provided infrastructure – systems, storage, and software. For example, TCS – a global systems integrator – provides cloud services using IBM mainframes, Power Systems and storage to some of the largest clients in the world. Some of our ecosystem partners are local or regional managed service providers that address data sovereignty issues. This provides the trust and localisation that some people are looking for, versus a global, public cloud provider with whom there is no established trust or relationship.

“You see, my client may be a bank, who buys IBM systems and storage to install within their own private cloud, or my client may be the cloud service provider, who sells my infrastructure solutions as-a-service to the bank. I have to cater to all types of buyers – enterprises that choose to buy our systems and storage and install them in their own data centre, and the providers of cloud services.

“Some of the enterprises have us manage it for them in the data centre, while others pick and choose and take a hybrid approach. Some of the stuff they do themselves, some of the stuff they subscribe to as a service, and we work with them to provide the glue that allows all of that to work seamlessly together, be secure and ensure trust is maintained. This transition to the cloud – it’s not a threat to me, it represents a great opportunity,” Schnatterly states.

The transition to the cloud has opened up IBM’s technology to segments of the market for which it was previously economically unviable. “Now that things have transitioned to the cloud, I can provide infrastructure at hyperscale to big data centres, which can then offer that infrastructure as a service to clients – based not on a server or a big storage subsystem, but on just their compute and storage needs, no matter how small,” Schnatterly explains.

“I can get very granular: we can now price to just the compute and storage needs through sub-capacity, which can be carved up smaller than a server, smaller even than a virtual server, right down to a subset of a core. We can sell compute and storage at that granularity and reach a whole new audience – a market I could never get to before. So, this represents a big opportunity.”
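As a back-of-the-envelope illustration of how sub-capacity pricing works (the rate below is hypothetical, not IBM’s actual pricing), charging for a fraction of a core over a month is simple arithmetic:

```python
CORE_HOUR_RATE = 0.12  # hypothetical price per full core-hour, in USD

def sub_capacity_charge(core_fraction: float, hours: float) -> float:
    """Charge for a slice of a core, e.g. 0.25 of a core for 720 hours."""
    return core_fraction * hours * CORE_HOUR_RATE

# A quarter-core running for a 30-day month (720 hours).
print(f"${sub_capacity_charge(0.25, 720):.2f}")  # -> $21.60
```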

New opportunities

Understandably, Schnatterly is optimistic about what the future holds for IBM. “All of this bodes well for me and my team, because those demands will drive new projects and opportunities for us to engage with clients on solutions and infrastructure to help them deliver the performance and keep it up and running in a secure and trusted way,” he concludes.
