The UK’s internet infrastructure may not be able to handle another huge heatwave

The infrastructure that runs online services may not survive further heatwaves

Adam Smith
Wednesday 27 July 2022 14:50 BST

The United Kingdom recently went through a heatwave the likes of which had never been seen before.

Temperatures peaked at 40 degrees Celsius, resulting in numerous homes being set ablaze and wildfires raging across Europe that pumped out the same amount of carbon dioxide as Estonia’s yearly total.

But the issues created by the heat also had an unexpected victim - the infrastructure of the internet. Google and Oracle both had to take cloud services and servers offline because their cooling systems could not handle the temperature and were at risk of burning out.

Oracle said that it was “identifying service infrastructure that can be safely powered down to prevent additional hardware failures”, while Google Cloud products experienced “elevated error rates, latencies or service unavailability” when trying to reach one of its London servers.

Although the disruptions were brief, the issues will get worse should temperatures continue to increase as they have done under climate change. By 2070, winters could be up to 3.9 degrees Celsius warmer and summers up to 4.9 degrees Celsius hotter - with both users and businesses feeling the effects.

The reason is that data centres are simply large rooms full of computers, and removing heat from those machines has always been one of the major challenges for engineers.

Many of the UK’s data centres were newly constructed in the 1990s, with chilled water circulated throughout the facility alongside air conditioning systems, because computers can only shed heat at a certain rate. But their designers did not foresee such a dramatic rise in temperatures in the country, and the potential for errors, data loss and other unpredictable problems is now growing.
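
To see why a few degrees matter, consider the basic heat balance data centre engineers work with: the air flowing through a server can only carry away heat in proportion to the airflow and the temperature difference between the cold supply air and the hot exhaust. The Python sketch below uses purely illustrative numbers - not figures from any particular facility - to show how a hotter cold aisle shrinks that margin.

```python
# Illustrative heat-balance sketch: Q = m_dot * cp * dT, where Q is the rack's
# IT load, cp the specific heat of air, and dT the temperature rise from the
# cold-aisle intake to the hot-aisle exhaust. All figures are assumptions
# chosen for illustration, not measurements from a real data centre.

CP_AIR = 1005        # J/(kg*K), specific heat capacity of air
AIR_DENSITY = 1.2    # kg/m^3, air density at roughly room conditions

def airflow_needed(rack_load_w: float, intake_c: float, exhaust_limit_c: float) -> float:
    """Volumetric airflow (m^3/s) needed to keep exhaust air below its limit."""
    delta_t = exhaust_limit_c - intake_c
    if delta_t <= 0:
        raise ValueError("intake air is already at or above the exhaust limit")
    mass_flow = rack_load_w / (CP_AIR * delta_t)   # kg/s of air
    return mass_flow / AIR_DENSITY                 # convert to m^3/s

# A hypothetical 10 kW rack with a 45C exhaust limit:
for intake in (22, 27, 32, 37):
    flow = airflow_needed(10_000, intake, 45)
    print(f"intake {intake}C -> {flow:.2f} m^3/s of cooling air required")
```

In this toy example, raising the intake air from 22C to 37C roughly triples the airflow needed to shed the same heat, which is why a hotter cold aisle leaves far less headroom.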

"With the frequency and severity of severe weather events globally, it is entirely likely that future occurrences such as the recent extreme heat are a likely possibility”, predicts Mitch Fonseca, senior vice president for global data center company Cyxtera. “These events can place increased demands on utilities that include higher demand on power, and increased municipal water use to operate data centers”.

The significant rise in the amount of technology we use, and the amount of data that is generated, is also a key issue. “If there’s one thing we can be certain of, it’s that demand for digital services will only increase. Our reliance on apps for everything in our daily lives isn’t likely to retreat now we’ve learnt to expect things on-demand and at relative ease”, Russell Poole, managing director for another global data centre company, Equinix, told The Independent.

Companies have a number of solutions to tackle these problems. Site operators monitor weather forecasts and, in some cases, manually run water across cooling coils until peak heat subsides. Physical barriers are used to contain cold air in supply aisles and hot air in exhaust aisles, where waste heat can be removed quickly; keeping the two from mixing ensures cooling air is distributed as efficiently as possible.
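
Much of that monitoring is automated. The sketch below is a minimal illustration - with hypothetical aisle names, sensor readings and threshold values rather than any real operator’s configuration - of the kind of check that flags when cold-aisle supply temperatures drift towards the point where load must be shed.

```python
# Minimal sketch of a cold-aisle temperature check. Aisle names, readings and
# thresholds are hypothetical examples, not values from a real facility.

from dataclasses import dataclass

@dataclass
class AisleReading:
    aisle: str
    supply_temp_c: float   # cold-aisle (supply) air temperature
    return_temp_c: float   # hot-aisle (return/exhaust) air temperature

WARN_SUPPLY_C = 27.0   # boost cooling and page on-call staff
TRIP_SUPPLY_C = 32.0   # start powering down non-essential hardware

def assess(readings: list[AisleReading]) -> None:
    for r in readings:
        if r.supply_temp_c >= TRIP_SUPPLY_C:
            print(f"{r.aisle}: TRIP - supply at {r.supply_temp_c}C, shed load")
        elif r.supply_temp_c >= WARN_SUPPLY_C:
            print(f"{r.aisle}: WARN - supply at {r.supply_temp_c}C, boost cooling")
        else:
            print(f"{r.aisle}: OK - supply at {r.supply_temp_c}C")

assess([
    AisleReading("cold-aisle-1", 24.5, 38.0),
    AisleReading("cold-aisle-2", 28.1, 41.5),
    AisleReading("cold-aisle-3", 33.4, 46.0),
])
```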

Nevertheless, many data centres will still face challenges. “A data centre, when you build it, typically has a lifetime of 20 to 30 years”, Professor Alan Woodward, a computer security expert from the University of Surrey, says. “But rather like a lot of buildings in central London, they were designed like that but have been there for a lot longer.”

Should the climate problem continue apace, companies could find their servers being shut off in desperate times. That could mean certain niche services becoming unavailable, software running more slowly for users, and storage devices losing data that becomes impossible to retrieve.

“A degree here and a degree there does not sound like much, but you’ve got so much heat being produced in these halls, and you’re running everything so close to the line, that moving up just a few degrees could be what takes it over the limit”, Professor Woodward says.

Big companies like Microsoft have more extreme solutions. In 2018, the company sank a data centre off the coast of Orkney in the North Sea as part of a “moonshot research effort” to make the internet more eco-friendly, and while the company sees it as an “additional offering” rather than a replacement for land-based data centres, it may become a preferable long-term solution.

Yet the rise of giant technology conglomerates like Microsoft, Amazon, Meta and Google, which have consolidated the internet by buying up its infrastructure, presents its own problems. Amazon Web Services currently controls 33 per cent of the internet’s back-end, Microsoft is second at 18 per cent, and Google third at 9 per cent; when those companies run into problems and the infrastructure goes dark, millions of people notice.

An outage in December last year hit many of the world’s biggest apps and services, from Disney Plus and Tinder to Coinbase, not to mention Amazon’s own products such as the Alexa voice assistant, the Kindle, Amazon Music and its Ring security cameras. While many servers can be accessed and rebooted remotely, some need physical access. A Facebook outage in October, which took down the company’s products for six hours in “an error of [its] own making”, meant engineers were forced to physically access “hard to get into” data centres because the internal tools the company normally uses to address such issues were broken.
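
The difference comes down to out-of-band management: most servers have a small controller that can power-cycle the machine even when the operating system is unresponsive. Below is a rough sketch, assuming the widely used ipmitool utility and placeholder addresses and credentials, of how an operator might attempt a remote power cycle before resorting to sending someone to the building.

```python
# Rough sketch: attempt an out-of-band power cycle via IPMI, falling back to an
# on-site visit if it fails. Host, user and password are placeholders; a real
# system would pull credentials from a secrets store and handle errors properly.

import subprocess

def remote_power_cycle(bmc_host: str, user: str, password: str) -> bool:
    """Try to power-cycle a server through its baseboard management controller."""
    cmd = [
        "ipmitool", "-I", "lanplus",
        "-H", bmc_host, "-U", user, "-P", password,
        "chassis", "power", "cycle",
    ]
    try:
        subprocess.run(cmd, check=True, timeout=30)
        return True
    except (subprocess.CalledProcessError, subprocess.TimeoutExpired, FileNotFoundError):
        return False

if __name__ == "__main__":
    if not remote_power_cycle("10.0.0.42", "admin", "example-password"):
        print("Out-of-band access failed - an engineer will need physical access")
```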

These companies “may have more money than God, but you’ve got a CEO who’s looking at the profit line as well, so you might build outside of the UK in countries such as Iceland, or under the ocean”, Professor Woodward says, “and if that means you’ve got a whole thing up from the ocean floor [when things go wrong], then that’s a bit of a drawback.”
