
Peak memory

Image: Cory M. Grenier

Central to the myth of the “fourth industrial revolution” is the idea that, unlike the planet, hyperspace is infinite.  The myth is easy enough to get away with when most of the Internet’s unthinking users spend their days staring transfixed at the screen of the latest iPhone.

We’ve all heard the stats.  YouTube alone has 1.5 billion logged-in users who collectively watch around a billion hours of content a day.  That viewing only scratches the surface of total content, which is currently growing at a rate of 400 hours of new uploads every minute.  And that is just the content of a single Google subsidiary.  Now add in the content on Amazon, Facebook, Twitter, Instagram, etc.  And then there are the myriad personal and corporate sites and blogs that billions of people add content to every day.  Soon, we are told, even this will be just a tiny fraction of the “smart grid” in which our household goods begin to interact with a new “internet of things.”
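For a sense of scale, that upload rate is worth converting into daily terms (a back-of-the-envelope calculation from the 400 hours per minute figure above):

\[ 400 \times 60 \times 24 = 576{,}000 \ \text{hours of new video per day} \]

More than half a million hours of new content every day, set against the billion hours being watched.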

The trouble with this imaginary brave new world is that the real action goes on out of sight in a million datacentres and the communications infrastructure that connects them all together.  By some estimates, the electricity required to run and cool all of those hard drives rivals the consumption of Germany and Japan combined.  But even in the unlikely event that our finite planet can supply sufficient additional energy to allow the Internet to continue growing exponentially, we may be about to crash into another physical barrier.

In a 2015 interview with TechRadar, Mark Whitby, Senior Vice President at Seagate Technology (one of the leading providers of hard drives), raised the spectre of a “data capacity gap”:

“We are entering a world where everything is connecting to everything else and the resulting big data is anticipated to solve virtually all our problems. However, by 2016, the hard drives housed in all those connected devices, whirring away in countless data centres, will start to reach their limits.

“The total amount of digital data generated in 2013 was about 3.5 zettabytes (that’s 35 with 20 zeros following). By 2020, we’ll be producing, even at a conservative estimate, 44 zettabytes of data annually.

“A zettabyte might not be a word you’ve heard of – even Word’s spellchecker doesn’t recognise it – but consider it in terms of a more familiar unit. A standard smartphone today will have around 32 gigabytes of memory. To get to one zettabyte you would have to completely fill the storage capacity of 34,359,738,368 smartphones.”
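Whitby’s oddly precise figure checks out if we assume binary prefixes, reading one zettabyte as 2^70 bytes and a 32 gigabyte phone as holding 2^35 bytes:

\[ \frac{2^{70}}{2^{35}} = 2^{35} = 34{,}359{,}738{,}368 \ \text{smartphones} \]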

That prediction appears to have been borne out by the wave of memory shortages that have persisted through 2017.  In January, Trevor Pott at Virtualization Review reported:

“The new year sees us smack in the middle of a storage crisis. This has been building for some time, and several things have conspired to create a perfect storm for storage. Naturally, this will affect everything else in the datacenter.”

The result has been a dramatic increase in the cost of memory around the world, although PCWorld reports that prices may fall back by 2019.  This may not, however, be good news.

Price volatility goes hand in hand with physical resources reaching hard limits.  Just look at what has happened in the oil industry since global conventional crude oil production peaked in 2005.  The anticipated outcome was that prices would shoot up into the stratosphere and then keep on rising.  However, at least in the short term, the economists’ predictions beat the geologists’.  Oil prices spiked above $100 per barrel, helping to trigger the 2008 financial crash and the subsequent depression.  But prices didn’t keep rising.  Instead, the industry found a temporary work-around in the shape of massive debt and hard-to-recover shale oil deposits.  All of that new oil hit the global market just as high debts and low incomes in the west sent western oil demand falling, causing the price of oil to crash below $50 per barrel in 2014, where it has stayed ever since.  Oil, it turns out, cannot be substituted.

In response to memory shortages and higher prices, computer hardware companies have shifted resources away from other components in an attempt to build more memory.  But the result is that everything from batteries to display screens is now increasing in price.  It may be that memory cannot be substituted either.  As Mark Whitby warned:

“Unfortunately, the imminent gulf between storage demand and production is not a problem that can so easily be solved. The fact of the matter is that it’s far harder to manufacture capacity than it is to generate data. Building factory capacity that is capable of meeting such stratospheric demand would take hundreds of billions in investment. It’s simply not a realistic option.

“Another factor is the technology in use by the storage industry today. Even if the investment was there and thousands of new data centres could be commissioned, it’s becoming more difficult on a molecular level to squeeze increasingly dense volumes of information onto the same amount of space…

“So, while the ability to squeeze ever more dense data onto the same amount of space is a real testament to human ingenuity and engineering, it’s starting to reach the point where new technologies will have to take over.”

This also sounds reminiscent of the energy debate, where some new technology is supposedly going to come riding to the rescue.  The problem is that if those new technologies existed anywhere beyond a theoretical physics textbook, we would have started using them the moment shortages began to appear and prices began to rise.  We didn’t because, as with nuclear fusion, biofuels and thorium reactors, nobody knows how to make them work in the real world – at least, not without driving prices so high that they defeat the object.

This is Liebig’s ‘law of the minimum’ in action: a system is constrained by its scarcest essential component, however abundant the others may be.  In computing, it appears, memory storage will be that component.  If so, then the days in which everyone has free (or at least bought with personal data) access to Internet content are coming to an end.  When they do, so too will all of the nonsense about singularities, knowledge economies and fourth industrial revolutions.
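Stated as a formula (a minimal sketch, with r_i standing for the availability of each essential input relative to what the system requires):

\[ \text{output} = \min(r_1, r_2, \ldots, r_n) \]

Increasing any of the plentiful inputs changes nothing; only the scarcest one, in this case storage, sets the limit.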

Of course, on a planet that can no longer sustain the Neoliberal demand for infinite growth, we may run out of energy, rare earths, phosphates or even food and clean drinking water before the Internet goes down.  Nevertheless, our unsustainable lifestyle is going to crash somewhere; and hyperspace is as good a place as any.
