Big Data refers to extremely large data sets that can be analysed to reveal patterns and trends.
The 3Vs of Big Data
Much of the tech industry follows Gartner’s ‘3Vs’ model, which defines Big Data as data that is high in:
Volume
Velocity
Variety
The volume of data organisations handle has grown from megabytes through to terabytes and even petabytes. In terms of velocity, data that was once handled periodically, in batches, must increasingly be processed in real time. The variety of data has also diversified, from simple tables and databases to photo, web, mobile and social data, and the most challenging of all: unstructured data.
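To make the velocity shift concrete, here is a minimal Python sketch of stream-style processing, under stated assumptions: the event source is simulated, and the running mean stands in for whatever real-time metric a system might actually maintain.

```python
# A minimal sketch of real-time (stream) processing: each record is
# analysed as it arrives, rather than buffered into a periodic batch.
# The event source below is simulated; a real system might read from
# a message queue instead.
import random
import time

def simulated_stream(n=10):
    """Yield hypothetical records one at a time, as a live feed would."""
    for _ in range(n):
        yield {"value": random.randint(1, 100)}
        time.sleep(0.01)  # stand-in for arrival latency

count, total = 0, 0
for record in simulated_stream():
    count += 1
    total += record["value"]
    # The aggregate is up to date after every record, with no
    # waiting for a batch window to close.
    print(f"records={count}  running_mean={total / count:.2f}")
```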
How big is ‘Big Data’?
Every day, we create 2.5 quintillion bytes of data – so much that 90% of the data in the world today has been created in the last two years alone.
When data sets grow so large that they can no longer be analysed with traditional data-processing tools, they become known as ‘Big Data’.
Because different companies have different ceilings on how much data they can handle, depending on their database management tools, there is no fixed level at which data becomes ‘big’.
This means that Big Data and analytics tend to go hand in hand: without the ability to analyse it, the data is meaningless.
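As one illustration of working past a single machine’s memory ceiling, here is a minimal sketch that streams a large CSV file in fixed-size chunks rather than loading it whole. The file name events.csv and its value column are hypothetical placeholders.

```python
# A minimal sketch: analysing a CSV file too large to fit in memory by
# reading it in chunks, so only one chunk is resident at a time.
# "events.csv" and its "value" column are hypothetical placeholders.
import pandas as pd

rows = 0
running_sum = 0.0

# With chunksize set, read_csv returns an iterator of DataFrames
# instead of loading the whole file at once.
for chunk in pd.read_csv("events.csv", chunksize=100_000):
    rows += len(chunk)
    running_sum += chunk["value"].sum()

print(f"rows={rows}  mean_value={running_sum / rows:.2f}")
```

The same chunked pattern applies beyond pandas: any tool that can iterate over a data set, rather than hold it all in memory, raises the practical ceiling on how much data it can analyse.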