Warming swimming pools with data centers

By Danny Bradbury

Cloud computing has made us smarter and faster than ever before. The hard work happens elsewhere, though, so it’s easy to forget the environmental cost. Every time you ask Siri a question, Apple burns a few joules in a data center somewhere. Checking social media, sending emails and surfing the web all take energy. Even making a credit card payment burns electrons.

The computers that bring us these services reside in vast data centers that require large amounts of electrical power. In 2016, a report from Berkeley Lab estimated that U.S. data centers could consume over 100 terawatt-hours (TWh) of electricity per year by 2020. That is roughly the annual output of ten large nuclear power stations.
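The nuclear-plant comparison can be sanity-checked with quick arithmetic. Assuming a large plant rated at about 1.3 GW running at a 90 percent capacity factor (illustrative figures, not from the report):

```python
# Back-of-envelope check (assumed figures): a large nuclear plant
# rated ~1.3 GW, running ~90% of the hours in a year.
plant_gw = 1.3
capacity_factor = 0.9
hours_per_year = 24 * 365
annual_twh_per_plant = plant_gw * capacity_factor * hours_per_year / 1000
plants_needed = 100 / annual_twh_per_plant
print(round(annual_twh_per_plant, 1))  # ~10.2 TWh per plant per year
print(round(plants_needed, 1))         # ~9.8 plants for 100 TWh
```

With those assumptions, ten large plants indeed lands close to 100 TWh per year.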
As more of our computing moves to the cloud, data center operators must find more efficient ways to crunch numbers for us. One of the biggest savings promises to be in smarter cooling. All those computers generate large amounts of heat. Around 40 percent of the power consumption in the average data center goes towards keeping it cool.

Diagram showing overview of cloud computing, with typical types of applications supported by that computing model (Wikimedia)

In the last few years, data center operators have developed more innovative ways to cool their computers without wasting energy on expensive, power-hungry chillers. One example is in Finland, where Google pumps icy-cold seawater from the Gulf of Finland into its Hamina-based data center to cool its servers efficiently.

Placing data centers beneath mountains and oceans

In Stavanger, Norway, data center company Green Mountain has achieved a zero-carbon footprint by building a computing facility inside a mountain. Aside from the security inherent in an underground facility, the data center also draws water from a nearby deep-water fjord. A gravity-fed piping system brings consistent cold water into the facility, using just 3 kilowatts of power to gain more than 1 megawatt of cooling power.
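Those figures imply remarkable efficiency. Moving more than 1 MW of heat with 3 kW of pumping power is an effective coefficient of performance in the hundreds; a quick sketch using only the numbers above:

```python
# Green Mountain's gravity-fed fjord loop: ~3 kW of pumping power
# delivers more than 1 MW (1,000 kW) of cooling, per the figures above.
pump_kw = 3
cooling_kw = 1000
cop = cooling_kw / pump_kw  # effective coefficient of performance
print(round(cop))  # over 300, versus roughly 3-7 for a typical chiller
```

A conventional mechanical chiller typically manages a COP in the single digits, which is why free cooling from cold water is so attractive.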
Meanwhile, Microsoft is sinking data centers to the bottom of the ocean. The company, which is investigating the benefits of underwater data centers as part of its Project Natick research initiative, has placed a data center beneath the ice-cold waters off Scotland’s Orkney Islands. The facility will be powered by tidal turbines and wave energy converters, the software giant has said. The tricky part will be ensuring that the facility can run entirely unmaintained, because resurfacing it for equipment fixes would be costly and time-consuming.
E-commerce giant Alibaba has decided to use its cooling system to make the local community a little prettier. The facility uses water from a local lake to cool its data center in Hangzhou. The water cools its servers and then flows through the city via a 2.5 kilometer open canal before running back into the lake.

Cooling solution scheme in Stavanger's Green Mountain data center

Reusing heat

Decorating your local community with water from a data center is a smart move, but some data centers go further with a more practical approach. They use their excess heat in innovative ways. One of the earliest examples was the GIB-Services AG data center in Switzerland. Built by IBM, the facility feeds hot air produced by its computing equipment into a heat exchanger where it is transferred to water. The local town of Uitikon then pumps the water into a local swimming pool, saving the town from having to heat the water via conventional means.
If you can heat a swimming pool, then you can heat other buildings, too. One of the most imaginative data center installations we’ve seen is underneath Uspenski Cathedral in Helsinki. The facility is built directly into a former bomb shelter in the bedrock underneath the holy site. It transfers heat from its servers to a nearby district heating network, which is an underground system of water pipes. The heated water brings warmth to 500 nearby homes and saves $563,000 in annual heating costs.
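Spread across the homes it serves, that saving works out to a little over a thousand dollars per household each year (a simple division of the article's figures):

```python
# District heating from the Uspenski Cathedral data center
# (homes served and annual savings taken from the figures above).
homes = 500
annual_savings_usd = 563_000
per_home = annual_savings_usd / homes
print(f"${per_home:,.0f} per home per year")  # $1,126 per home per year
```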
More recently, Nordic data center operator DigiPlex teamed up with Swedish energy firm Stockholm Exergi to heat 10,000 homes with recovered server heat. DigiPlex is retrofitting an existing data center that previously relied on conventional HVAC cooling equipment, equipping it to transfer heat directly to the district heating grid.
Amazon has caught on to this idea with its planned four-block campus in Seattle. The company has been talking with the co-owners of a local data center to buy its excess heat. Across the street, the Westin Data Building Exchange is a “carrier hotel” that houses 250 telecom and internet companies.
The 34-story building had been venting heat from its computing facilities into the open air. Amazon now pipes that heat to its campus in the form of warm water, then concentrates it to double its temperature. Amazon says the waste heat will eventually warm 5 million square feet of office space, saving around 80 million kilowatt-hours of electricity over 25 years.
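To put Amazon's stated figure in perspective, 80 million kWh over 25 years averages out to 3.2 million kWh a year. Against an illustrative household figure (the per-home consumption below is an assumption, not from the article), that is on the order of a few hundred homes' electricity use:

```python
# Amazon's stated saving: ~80 million kWh over 25 years (from the article).
total_kwh = 80_000_000
years = 25
annual_kwh = total_kwh / years
print(f"{annual_kwh:,.0f} kWh per year")  # 3,200,000 kWh per year

# At an assumed ~10,000 kWh/year per household (illustrative figure):
homes_equivalent = annual_kwh / 10_000
print(round(homes_equivalent))  # ~320 households' annual electricity use
```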

From data centers to homes

Instead of taking heat from the data center to the home, some companies are now cutting out the data center altogether. Nerdalize has hit upon the idea of installing servers directly inside people’s homes. The company sells the server capacity to commercial clients for their computing jobs. As they work, the servers generate enough heat to warm water to 55 degrees Celsius. The approach saves 3 tons of CO2 per home, spares the company from building any data centers at all, and cuts costs for its clients.
As companies continue to grapple with the cloud’s energy usage, we will see data centers pop up in more innovative places – on the edge of frozen lakes inside the Arctic Circle, on artificial floating islands and perhaps even in space. One thing is certain: as we continue to embrace new technologies such as artificial intelligence, wearable computing and augmented reality, our computing needs will only continue to grow.


About the author
Danny Bradbury