The new oil: how the world became driven by Big Data

Experts and analysts have asserted that data has become the world’s most valuable resource. We look at how the world became driven by Big Data.

According to Bernard Marr, writing for Technology Magazine, big data is a movement with the potential to transform every aspect of business and society.

In 2001, the META Group’s Doug Laney famously defined big data in terms of the 'three Vs': Volume, Velocity and Variety. Here, we look at the history of big data and how it is transforming the way organisations do business today.

1928: Storage on tape

Fritz Pfleumer, a German-Austrian engineer, invents a method of storing information magnetically on tape. The principles he develops are still in use today, with the vast majority of digital data being stored magnetically on computer hard disks.

1958: Business intelligence

First attributed to Richard Millar Devens in 1865, business intelligence is defined almost a century later by IBM researcher Hans Peter Luhn as “the ability to apprehend the interrelationships of presented facts in such a way as to guide action towards a desired goal”.

1965: The first data centre

The US government plans the first data centre, with the intention of storing millions of fingerprint sets and tax returns. The initiative – designed to hold 742 million tax returns and 175 million sets of fingerprints – is generally considered the first effort at large-scale data storage.

1989: Big Data

The World Economic Forum notes this as possibly the first use of the term Big Data in the way it is used today. International best-selling author Erik Larson pens an article for Harper's Magazine, speculating on the origin of the junk mail he receives. He writes: “The keepers of big data say they are doing it for the consumer’s benefit. But data has a way of being used for purposes other than originally intended.”

2008: Data deluge

The publication Wired brings the concept of Big Data to the masses with the article The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.

The world’s servers process 9.57ZB of information – that’s 9.57 trillion GB, or the equivalent of 12GB per person, per day – according to the How Much Information? 2010 report.
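For reference, the zettabyte-to-gigabyte conversion behind that "trillion GB" figure works out as a simple sketch, assuming the decimal SI prefixes implied by the report (1 ZB = 10^21 bytes, 1 GB = 10^9 bytes):

\[
9.57\,\mathrm{ZB} = 9.57 \times 10^{21}\,\mathrm{bytes} = 9.57 \times 10^{12}\,\mathrm{GB} \approx 9.57\ \text{trillion GB}
\]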

Today: Real-time intelligence

IDC’s Future of Intelligence predictions for 2023 and beyond find that 90% of the world’s most successful companies will use real-time intelligence and event-streaming technologies by 2025. The firm also predicts that, by 2024, organisations with greater enterprise intelligence will achieve institutional reaction times that are 5x faster.
