Microsoft experiments with underwater datacenters

The oceans are home to coral reefs, shipwrecks and now – datacenters

Continuing its long tradition of datacenter experimentation in the name of efficiency, Microsoft announced it has been testing an unusual new datacenter concept – placing servers underwater out in the ocean.

Close to half of the world’s population lives near large bodies of water, and since physical distance sets the ultimate speed limit for transferring data, storing it under the sea, close to major population centers, is a logical way to optimize delivery of cloud services, despite some obvious complexities.

“Half of the world’s population lives within 200 km of the ocean, so placing datacenters offshore increases the proximity of the datacenter to the population, dramatically reducing latency and providing better responsiveness,” the company said as it introduced what’s now known as Project Natick.

Microsoft’s underwater datacenter

Microsoft hasn’t shied away from experimenting with novel ideas for datacenter infrastructure in the past. In Wyoming, for example, the company tested a datacenter powered by fuel cells that converted methane from a waste processing plant to electricity. In another experiment, Microsoft researchers tested small fuel cells installed directly into IT racks.

What the company will learn from Project Natick though may lay the groundwork for deploying datacenter capacity underwater at scale, cooled by seawater and potentially even powered by tidal energy.

“While every datacenter on land is different and needs to be tailored to varying environments and terrains, these underwater containers could be mass produced for very similar conditions underwater, which is consistently colder the deeper it is,” the company said.

Another potential benefit is quick deployment. It took 90 days to build and deploy the test system, much faster than the typical process of permitting, designing, and building a brick-and-mortar facility.

Around August of last year, Microsoft researchers deployed the test system off the coast of California. It was a rack of standard servers sitting in a cylindrical steel shell (10 feet by 7 feet). Heat exchangers mounted outside the shell provided the servers with free cooling. Then in December, the 38,000-pound container was pulled out of the water and returned to the company’s campus in Redmond, Washington.

Microsoft has not released any results of the experiment yet, saying only that they were “promising.” At this stage, the project is more about collecting data than developing a specific solution. There are still major hurdles to actually implementing something like this.

“At first I was skeptical, with a lot of questions. What were the costs? How do we power? How do we connect?” Christian Belady, general manager for datacenter strategy at Microsoft, said in a statement.

“However, at the end of the day, I enjoy seeing people push limits,” Belady said.

“The reality is that we always need to be pushing limits and try things out. The learnings we get from this are invaluable and will in some way manifest into future designs.”
