Global Data Volume Grew to 2.8 Zettabytes in 2012

In 2012, the world generated and replicated about 2.8 zettabytes (ZB) of data. The "digital universe" is forecast to reach 40 ZB by 2020, exceeding IDC's previous forecast by 5 ZB.

According to IDC, the data volume in 2020 will be 50 times what it was at the beginning of 2010. One of the most significant drivers of this growth will be machine-generated data, which is projected to increase 15-fold by 2020. By that time, the data created in emerging markets will exceed the data created in the developed world.
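
Taken together, the article's figures are easy to sanity-check. Here is a minimal Python sketch; the 2.8 ZB (2012) and 40 ZB (2020) data points come from the article, while the derived 2010 baseline and growth rate are back-of-the-envelope arithmetic, not IDC figures:

```python
# Back-of-the-envelope check of the article's growth figures.
zb_2012 = 2.8   # ZB generated/replicated in 2012 (from the article)
zb_2020 = 40.0  # ZB forecast for 2020 (from the article)

# 50-fold growth from 2010 implies a ~0.8 ZB baseline at the start of 2010.
implied_2010 = zb_2020 / 50
print(f"Implied 2010 baseline: {implied_2010:.1f} ZB")

# Compound annual growth rate needed to go from 2.8 ZB (2012) to 40 ZB (2020).
years = 2020 - 2012
cagr = (zb_2020 / zb_2012) ** (1 / years) - 1
print(f"Implied annual growth 2012-2020: {cagr:.0%}")  # roughly 39% per year
```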

IDC offered some vivid comparisons to illustrate the sheer quantity of 40 ZB: it equals 57 times the number of grains of sand on all the beaches on Earth. If all 40 ZB were saved onto today's Blu-ray discs, the discs would weigh as much as 424 Nimitz-class aircraft carriers. Put another way, 40 ZB works out to 5,247 GB of data for every person on the planet.
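
The per-capita figure can be reproduced with simple unit arithmetic. In the sketch below, the 40 ZB and 5,247 GB figures come from the article; the ~7.6 billion population is my assumption (a common 2020 projection), since IDC does not state the population it used:

```python
# Reproduce IDC's "GB per person" comparison from the 40 ZB figure.
ZB_IN_GB = 10 ** 12          # 1 ZB = 10^21 bytes = 10^12 GB (decimal units)
total_gb = 40 * ZB_IN_GB     # 40 ZB, from the article

population_2020 = 7.6e9      # assumed ~7.6 billion people in 2020 (not from IDC)
gb_per_person = total_gb / population_2020
print(f"{gb_per_person:,.0f} GB per person")  # ~5,263 GB, close to the quoted 5,247

# Similarly, 40 ZB being 57x all beach sand implies an assumed count of
# roughly 40e21 / 57 = 7.0e20 grains of sand worldwide.
```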


Comments
  • zrobbb
    So. Much. Porn...
  • wavetrex
    The numbers are pulled out of their collective a$$es. It's impossible to estimate how much data exists today...

    For example, one single PRIVATE torrent tracker I have access to tracks around 500 TB of "unique" data, but there are tens of thousands of users who replicate that data in large amounts. And that's just multimedia on ONE such site.
    What about all the crap every company stores in its datacenters? Nobody knows exactly what they have; the data is private. And what about the billions of people using computers of one kind or another? Nobody knows exactly how much data they have stored or what it represents.
  • smuggl3r
    (quoting wavetrex's comment above)
    What they are talking about is internet traffic. That can be measured easily.