How big is big data?

Ever since ancient Egypt and the Roman Empire, people have been collecting data and trying to use it to
make better decisions, solve practical problems, or gain a military advantage over an adversary.
However, to understand the world of big data, it is important to recognise that most of the techniques
used today, from predictive algorithms to classification methods, were developed centuries ago, and that
big data builds on the work of some of the greatest minds in history. What has changed is the availability
of data and the ability to process large amounts of it quickly. Until the middle of the 20th century, most
data analysis was done manually and on paper; now, thanks to technology, we can analyse terabytes of data
in a fraction of a second.

Since the beginning of the 21st century, the volume and speed at which data is generated have outgrown
our ability to grasp them. The total amount of data in the world in 2013 was 4.4 zettabytes; by 2020 it
had grown to 44 zettabytes. To give you an idea of the scale involved, 44 zettabytes is equal to
44 trillion gigabytes. It is estimated that by 2025 there will be as much as 163 zettabytes of data in
the world. Even with today’s most advanced technologies, it is impossible to analyse all of this data.
Yet it was precisely the need to process ever larger and more complex data sets that, over the last
twenty years, drove the transformation of existing data analysis technology into big data.