By Carlie Bacon
Some like it hot, but datacenters don’t. When they get too toasty they crash, making waves in the sea of data storage and access.
Microsoft is making waves of a more useful variety.
The company just launched Project Natick, a research effort that includes underwater datacenters. As cloud computing becomes more prevalent, Microsoft aims to improve the ways we store and access data. The underwater setting provides better cooling, renewable energy, and a more controlled environment than traditional land-based options.
From August to November of 2015, researchers tested the self-contained prototype, Leona Philpot, about one kilometer off the Pacific coast of the United States near San Luis Obispo, California. The prototype can operate several hundred feet below the surface, though in this trial it sat only about 30 feet deep. Outfitted with over 100 different sensors to track performance and conditions, the first-ever Natick datacenter functioned swimmingly.
Because the Natick datacenters are self-contained and mobile, Microsoft envisions a system where they can be deployed to areas in greatest need of data access, such as the sites of natural disasters or large events like the Olympics. Currently, Microsoft has over 100 datacenters around the world, including locations in Amsterdam, Australia, Brazil, China, Ireland, Hong Kong, and Japan, as well as numerous locations throughout the United States. These datacenters contain over 1 million servers. Even this global network could be eclipsed by Natick underwater datacenters, which could be deployed and stationed around the world far more quickly than land-based facilities can be built.
Importantly, this ready access would decrease network latency, the delay in data communication over a network. When a user attempts to retrieve information remotely, the content travels through a network from a datacenter server to the user's device. If a user is far away from the server storing that content, the user may experience degraded service and lagging response times.
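The link between distance and latency can be illustrated with a quick back-of-the-envelope calculation (a simplified sketch, not Microsoft's own model): even ignoring routing and server processing time, a signal in optical fiber travels at roughly two-thirds the speed of light, so sheer distance sets a hard floor on round-trip time.

```python
# Rough illustration (a simplifying assumption, not Microsoft's model):
# light in optical fiber travels at about 200,000 km/s (~2/3 of c),
# so distance alone imposes a minimum round-trip latency before any
# server processing even begins.

FIBER_SPEED_KM_PER_S = 200_000  # approximate speed of light in fiber

def min_round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time, in milliseconds, for a request that
    travels distance_km to a server and back."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# A user 100 km from a coastal Natick-style datacenter vs. a user
# 10,000 km from a distant land-based one (hypothetical distances):
for distance in (100, 10_000):
    print(f"{distance:>6} km -> {min_round_trip_ms(distance):.1f} ms minimum RTT")
```

Real-world latency is higher still, since packets rarely travel in a straight line and each hop adds processing delay, which is exactly why placing datacenters near coastal population centers appeals to Microsoft.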
While this trial’s one-kilometer distance from the coast was not even close to international waters, it is possible that Natick datacenters’ deployment to far-flung areas could involve the high seas, and their mobile nature generally could spark jurisdictional issues. Microsoft is currently embroiled in a Second Circuit case with the United States Department of Justice over whether the U.S. government, under authority of a Stored Communications Act search warrant, can compel the production of data that is stored internationally. It will be very interesting to see how traditional laws evolve to address data being stored in novel ways.
Microsoft also designed the Natick datacenters to help the environment, with renewable energy sources, zero emissions, and no waste products. These perks seem promising, but it remains to be seen whether this new system will be acceptable under environmental legal frameworks. For instance, even though Microsoft reported that the sea life in the area quickly adapted to the Natick center’s presence, existing laws, such as the Marine Mammal Protection Act, may still require testing and approval before any Natick datacenter could be placed underwater long-term.
If Project Natick proceeds successfully, underwater datacenters could be the wave of the future. Let’s just make sure that the Microsoft researchers deploying them know to steer clear of coral reefs.
Image source: Microsoft.