The new oil: how the world became driven by Big Data

According to SAS, big data refers to data that is so large, fast or complex that it's difficult or impossible to process using traditional methods. Experts and analysts have asserted that data has become the world's most valuable resource. We look at how the world became driven by Big Data.

According to Bernard Marr, writing in Technology Magazine, big data is a movement with the power and the potential to transform every aspect of business and society.

In 2001, the META Group's Doug Laney famously identified big data as the 'three Vs': Volume, Velocity and Variety. We look at the history of big data, and how it is transforming how organisations do business today.

1928: Storage on tape

Fritz Pfleumer, a German-Austrian engineer, invents a method of storing information magnetically on tape. The principles he develops are still in use today, with the vast majority of digital data being stored magnetically on computer hard disks.

1958: Business intelligence

The term is first attributed to Richard Millar Devens in 1865. Almost a century later, IBM researcher Hans Peter Luhn defines business intelligence as "the ability to apprehend the interrelationships of presented facts in such a way as to guide action towards a desired goal".


1965: The first data centre

The US government builds the first data centre with the intention of storing millions of fingerprint sets and tax returns. The initiative – capable of holding 742 million tax returns and 175 million sets of fingerprints – is generally considered the first effort at large-scale data storage.

1989: Big Data

The World Economic Forum notes this as possibly the first use of the term Big Data in the way it is used today. International best-selling author Erik Larson pens an article for Harper's Magazine, speculating on the origin of the junk mail he receives. He writes: “The keepers of big data say they are doing it for the consumer’s benefit. But data has a way of being used for purposes other than originally intended.”


2008: Data deluge

The publication Wired brings the concept of Big Data to the masses with Chris Anderson's article, The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.

The world's servers process 9.57ZB of information in 2008 – that is, 9.57 trillion GB, or the equivalent of roughly 12GB per worker, per day – according to the How Much Information? 2010 report.
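The zettabyte-to-gigabyte conversion behind that headline figure can be sanity-checked with a few lines of arithmetic (a minimal sketch assuming decimal SI units, i.e. 1 ZB = 10^21 bytes and 1 GB = 10^9 bytes):

```python
# Check that 9.57 zettabytes is 9.57 trillion gigabytes (SI/decimal units).
BYTES_PER_GB = 10**9   # 1 gigabyte, decimal definition
BYTES_PER_ZB = 10**21  # 1 zettabyte, decimal definition

total_zb = 9.57
total_gb = total_zb * BYTES_PER_ZB / BYTES_PER_GB
print(f"{total_zb} ZB = {total_gb:.3e} GB")  # 9.57 ZB = 9.570e+12 GB
```

A zettabyte is 10^12 gigabytes, so the 9.57ZB total works out to 9.57 trillion GB, as the report states.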

Today: Real-time intelligence

IDC's Future of Intelligence predictions for 2023 and beyond find that 90% of the world's most successful companies will use real-time intelligence and event-streaming technologies by 2025. The firm also predicts that, by 2024, organisations with greater enterprise intelligence will have institutional reaction times five times faster than their peers.
