The new oil: how the world became driven by Big Data

According to SAS, big data refers to data that is so large, fast or complex that it is difficult or impossible to process using traditional methods.
Experts and analysts have asserted that data has become the world's most valuable resource. We look at how the world became driven by Big Data.

According to Bernard Marr, writing in an article for Technology Magazine, big data is a movement that has the power and the potential to completely transform every aspect of business and society.

In 2001, the META Group's Doug Laney famously identified big data as the 'three Vs': Volume, Velocity and Variety. We look at the history of big data, and how it is transforming how organisations do business today.

1928: Storage on tape

Fritz Pfleumer, a German-Austrian engineer, invents a method of storing information magnetically on tape. The principles he develops are still in use today, with the vast majority of digital data being stored magnetically on computer hard disks.

1958: Business intelligence

The term Business Intelligence is first attributed to Richard Millar Devens in 1865. Almost a century later, IBM researcher Hans Peter Luhn defines it as "the ability to apprehend the interrelationships of presented facts in such a way as to guide action towards a desired goal".


1965: The first data centre

The US government builds the first data centre with the intention of storing millions of fingerprint sets and tax returns. The initiative – capable of holding 742 million tax returns and 175 million sets of fingerprints – is generally considered the first effort at large-scale data storage.

1989: Big Data

The World Economic Forum notes this as possibly the first use of the term Big Data in the way it is used today. International best-selling author Erik Larson pens an article for Harper's Magazine, speculating on the origin of the junk mail he receives. He writes: “The keepers of big data say they are doing it for the consumer’s benefit. But data has a way of being used for purposes other than originally intended.”


2007: Data deluge

The publication Wired brings the concept of Big Data to the masses with the article, The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.

The world's servers process 9.57ZB of information (9.57 trillion GB, equivalent to 12GB per person, per day), according to the How Much Information? 2010 report.

Today: Real-time intelligence

IDC's Future of Intelligence predictions for 2023 and beyond find that 90% of the world's most successful companies will use real-time intelligence and event-streaming technologies by 2025. The firm also predicts that, by 2024, organisations with greater enterprise intelligence will have institutional reaction times five times faster than their peers.
