March 6, 2020 (updated 13 Jul 2022, 9:47am)

Hindsight in 2020: The top Big Data Innovations of the Past Decade

There’s no denying that the 2010s saw some fascinating developments in Big Data.

By CBR Staff Writer

Predictions are hard – it’s not always easy to say what will happen in the next week, let alone the next year or next decade, writes Ravi Mayuram, SVP of Engineering and CTO, Couchbase.


Looking back, it’s difficult to grasp just how few people in 2010 knew what the next ten years would bring. After all, if you had bet $100 in 2010 that Donald Trump would be President of the U.S.; that the U.K. would be preparing to exit the European Union; and that superheroes and Star Wars would dominate the box office, you would be looking forward to a hefty pay-out in the New Year.

In tech, the same is true – in 2009, many predicted that e-discovery would be one of the hottest applications of the decade. With the benefit of hindsight, however, we can see that the 2010s were the decade when Big Data really took off. From the data warehousing, in-memory processing and massively parallel processing of 2009, the use of Big Data has exploded in fascinating, and sometimes terrifying, ways. But what were the biggest Big Data innovations of the last ten years? We asked our brain trust of top Couchbase experts and came up with the following two innovations, and one painful lesson:

1. Pay-as-you-use computing has given Big Data a home in the cloud

In 2010, in-memory processing was the secret to crunching Big Data quickly and intelligently: eliminating the storage pipeline for extra speed, and using massively parallel processing to ensure there was enough memory available for any calculation the business needed. But simply marshalling that much computing power at once could be expensive for any business – especially since the end of Moore’s Law means that processing power will no longer become more affordable year on year. The cloud has provided the answer. Its elastic nature means organisations pay only for the processing power they actually use, instead of first having to make a significant capex investment. True Big Data processing is therefore now within reach of most enterprises, and as a result we are seeing an explosion of Big Data applications that would have been unimaginable for most businesses at the beginning of the century.

2. Machine Learning gave killer tech a killer app

Like any technology, Big Data is only as valuable as the uses it’s put to. The rise of Machine Learning in the 2010s created a demand for immediate insights that Big Data was perfectly suited to meet. After all, you can only learn if you’re given information to learn from. The ability to feed Machine Learning algorithms with a wealth of information, and to give them the processing power to draw conclusions from and adapt to that data, means enterprises are creating ever more sophisticated applications. From online help assistants to sophisticated fraud detection systems – not to mention Deep Learning’s efforts to push the boundaries – Big Data-powered Machine Learning is the success story of the decade.


3. Your technology is only as valuable as the environment allows

The decade wasn’t all successes; there were painful lessons too. Hadoop, for instance, looked like it would take the world by storm near the start of the decade, promising to revolutionize the use of Big Data via distributed file systems running on enterprises’ own hardware. However, increases in disk and network speeds, coupled with the rapid growth of the cloud, pulled Big Data trends away from Hadoop’s vision. While still an excellent tool in the right environment, it hasn’t been the all-conquering force it might have been. Whether developing Big Data tools or looking to take advantage of them, businesses need to be well aware of what those tools can do, how they can best be used, and what direction trends could take, so they are prepared for any eventuality. With IoT yet to reach its full potential, for instance, we still can’t be certain what distorting effect it will have.

There’s no denying that the 2010s saw some fascinating developments in Big Data. Here’s to seeing what the 2020s bring.

