Microsoft’s Project Natick looks to the ocean to alleviate the overheating issues data centers are prone to, a mission the company believes is very much possible.
Microsoft tests “mission impossible” data center in the ocean
In the latest installment of the Mission: Impossible films, Tom Cruise’s Ethan Hunt was nearly killed in his efforts to steal data from an underwater data center off the coast of Morocco. While fiction, the scenario is closer to reality than you might think.
The inherent problem with data centers is heat, and their biggest operating cost is air conditioning. The world we live in requires massive data centers with thousands of servers to store, stream, and deliver the information we take for granted.
“When you pull out your smartphone you think you’re using this miraculous little computer, but actually you’re using more than 100 computers out in this thing called the cloud,” said Peter Lee, corporate vice president for Microsoft Research and the NExT organization. “And then you multiply that by billions of people, and that’s just a huge amount of computing work.”
The problem is heat, and Microsoft is looking to the ocean to counter it.
If you think that electricity and water are a bad mix, you’re not alone.
“When I first heard about this I thought, ‘Water … electricity, why would you do that?’ ” said Ben Cutler, a Microsoft engineer involved in the Project Natick system. “But as you think more about it, it actually makes a lot of sense.”
In addition to the cold water that would help keep the heat of an underwater data center at bay, Microsoft is studying the feasibility of powering the facility with turbines driven by underwater currents or tidal energy.
Microsoft’s first prototype is back on land
Our lives require serious closet space. Your email provider, Facebook, Netflix, and just about everything else we take for granted requires vast amounts of storage. Microsoft alone operates well over a hundred data centers worldwide, and even those are hardly sufficient, forcing the company to construct new ones each year. The company has spent nearly $20 billion on a data center system that gives the world access to the more than 200 online services Microsoft offers its customers.
The prototype of Microsoft’s first underwater data center was named “Leona Philpot” after a character in the company’s wildly successful Halo series of video games. It grew out of a 2014 research paper written by a number of Microsoft data center engineers, including one with submariner experience.
The “Leona Philpot” is a steel capsule roughly eight feet in diameter. It was launched off the coast of California near San Luis Obispo and placed on the ocean floor 30 feet underwater for its 105-day test. By all accounts, that test was considerably more successful than Microsoft anticipated.
Since it was clearly impossible to man a tiny capsule 30 feet underwater, the data center was operated remotely from a Microsoft campus. To run the center, the engineers behind the capsule needed to observe what was happening inside, so they installed more than a hundred sensors to monitor pressure, motion, humidity, and a litany of other factors during testing.
The 105-day trial was not originally planned to run that long, but based on the prototype’s success, the team extended its undersea experiment and even ran parts of Microsoft’s Azure cloud computing service on the submerged capsule.
Given the project’s success, Microsoft engineers are applying what they knew and what they discovered to build another underwater data center more than three times the size of the prototype.