In 1865, a twenty-nine-year-old English economist named William Stanley Jevons published a book, “The Coal Question,” in which he argued that coal reserves were rapidly depleting, threatening Britain’s affluence. More importantly, he observed that increasing efficiency in coal usage would not reverse this process and reduce consumption, but would instead further increase coal consumption.
Economists call this the “Jevons Paradox” – the proposition that technological progress that increases the efficiency with which a resource is used tends to increase (rather than decrease) the rate of consumption of that resource.
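The mechanism behind the paradox can be sketched with a toy demand model. The numbers and the constant-elasticity form below are illustrative assumptions, not anything Jevons specified: if demand for a service is elastic enough (elasticity greater than 1), cutting the resources needed per unit of service lowers its price and raises demand so much that total resource use goes up.

```python
# Hypothetical rebound-effect sketch (illustrative numbers, not from the post).
# Assume service demand has constant elasticity: Q = A * p**(-e), where p is
# the cost per unit of service. If efficiency improves so each service unit
# needs fewer resources r, then p = c * r falls and demand Q rises.
# Total resource use R = Q * r grows whenever elasticity e > 1.

def resource_use(r, A=100.0, c=1.0, e=1.5):
    p = c * r            # cost per unit of service
    q = A * p ** (-e)    # service demand at that cost
    return q * r         # total resource consumed

before = resource_use(r=1.0)   # baseline efficiency
after = resource_use(r=0.5)    # twice as efficient per service unit
print(before, after)           # efficiency doubled, yet total use went up
```

With these assumed parameters, doubling efficiency (r from 1.0 to 0.5) raises total resource consumption from 100 to roughly 141 – efficiency gains are more than eaten by induced demand.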
I believe a similar paradox applies to what we call “Big Data” technologies today. Big Data tools like Hadoop, NoSQL, NewSQL and other next-generation data management technologies have been created to address the explosion of data and increase the economic efficiency of data management. Economic efficiency here boils down to the ability to use commodity hardware with lower power consumption, storage compression, and the opportunity cost of gaining insight faster (in real time).
Once organizations become adept at handling terabytes and petabytes in an economically efficient way, then, per the Jevons Paradox, there will be increased consumption of “Big Data” technologies and soon the need to handle exabytes and zettabytes. Or, even Bigger Data.
Update: Empirical evidence of this paradox appears in the graphic (below), sourced from Cloudyn.