The new oil: how the world became driven by Big Data

Experts and analysts have asserted that data has become the world’s most valuable resource. We look at how the world became driven by Big Data

According to Bernard Marr in an article with Technology Magazine, big data is a movement that has the power and the potential to completely transform every aspect of business and society.

In 2001, the META Group's Doug Laney famously identified big data as the 'three Vs': Volume, Velocity and Variety. We look at the history of big data, and how it is transforming how organisations do business today.

1928: Storage on tape

Fritz Pfleumer, a German-Austrian engineer, invents a method of storing information magnetically on tape. The principles he develops are still in use today, with the vast majority of digital data being stored magnetically on computer hard disks.

1958: Business intelligence

The term is first attributed to Richard Miller Devens in 1865. Almost a century later, IBM researcher Hans Peter Luhn defines Business Intelligence as “the ability to apprehend the interrelationships of presented facts in such a way as to guide action towards a desired goal”.

1965: The first data centre

The US government builds the first data centre with the intention of storing millions of fingerprint sets and tax returns. The initiative – capable of holding 742 million tax returns and 175 million sets of fingerprints – is generally considered the first effort at large-scale data storage.

1989: Big Data

The World Economic Forum notes this as possibly the first use of the term Big Data in the way it is used today. International best-selling author Erik Larson pens an article for Harper's Magazine, speculating on the origin of the junk mail he receives. He writes: “The keepers of big data say they are doing it for the consumer’s benefit. But data has a way of being used for purposes other than originally intended.”

2007: Data deluge

The publication Wired brings the concept of Big Data to the masses with the article, The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.

The world’s servers process 9.57ZB (9.57 trillion GB) of information – equivalent to 12GB per person, per day – according to the How Much Information? 2010 report.

Today: Real-time intelligence

IDC’s Future of Intelligence predictions for 2023 and beyond find that 90% of the world’s most successful companies will use real-time intelligence and event-streaming technologies by 2025. The firm also predicts that, by 2024, organisations with greater enterprise intelligence will achieve institutional reaction times five times faster than their peers.
