To keep its search engine (as well as many other products) running around the clock, Google maintains 13 large data centres across the globe. Data centres themselves, as facilities that house computer systems, have existed for decades. They’re commonly used to back up data in large archives or – as has become more common with the rise of cloud computing – to provide access to IT resources. Data centres have grown in proportion with the expansion of the internet over the last 20 years. Today they can occupy the same footprint as an industrial plant, require tight security (both physical and logical) and contain millions of pounds’ worth of computing equipment.
Currently, six of Google’s data centres are located in the US, one in South America, three in Europe and three in Asia. They prioritise efficiency by focusing on power distribution and cooling, with environmental impact in mind. The servers use a proprietary, patented modular design and are housed in their hundreds in individual shipping containers. Rather than relying on a huge, centralised uninterruptible power supply (UPS) for each facility, Google fits a battery to each server, which is much more power efficient. Data centres can run very hot if left unchecked – especially in summer. To combat this, some are strategically located: the Hamina facility in Finland, for example, draws icy seawater from the Gulf of Finland to keep machinery cool.