Microsoft Finds Comfort and Reliability in Underwater Datacenters
An underwater retrieval expedition cemented Microsoft's confidence in its underwater datacenter investments.
Microsoft has reached the culmination of years of research with evidence that underwater datacenters are feasible. The conclusion follows its latest deployment, two years of testing and monitoring, and a summer expedition that brought the years-long experiment to a close.
The company is also adopting a "two-phase immersion cooling" method, which involves bathing servers in a non-conductive fluid that carries heat away as it boils and condenses. The push is driven by the slowing of Moore's law, the doubling phenomenon that has long governed chip progress, and the approach takes cues from cryptocurrency miners, who adopted liquid immersion cooling first.
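To see why boiling a fluid is such an effective way to move heat, consider a back-of-the-envelope calculation. The numbers below are illustrative assumptions, not Microsoft's specifications: engineered dielectric coolants typically boil around 50 °C with a latent heat of vaporization on the order of 100 kJ/kg.

```python
# Sketch of two-phase immersion cooling arithmetic. The latent heat
# value is an assumed order-of-magnitude figure for an engineered
# dielectric coolant, not a spec from Microsoft.

LATENT_HEAT_J_PER_KG = 100_000  # assumed heat absorbed per kg of fluid vaporized

def fluid_boiloff_rate_kg_per_s(server_power_watts):
    """Mass of coolant that must vaporize per second to absorb the heat load."""
    return server_power_watts / LATENT_HEAT_J_PER_KG

# A hypothetical 10 kW rack boils off about 0.1 kg of fluid per second;
# in a sealed tank the vapor condenses on a cooled lid and drips back,
# closing the loop with no fluid lost.
print(fluid_boiloff_rate_kg_per_s(10_000))
```

Because all of that heat is absorbed at the fluid's constant boiling point, the chips sit at a steady temperature regardless of load spikes, which is much harder to achieve with air cooling.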
Microsoft pursued a hunch that the ocean could make datacenter operations easier than land, and came away vindicated. Earlier in the summer, a datacenter the size of a shipping container, coated in sea debris, was retrieved from the seafloor off the Orkney Islands in Scotland. The retrieval confirmed that the theory of feasible, financially advantageous, environmentally sustainable underwater datacenters was more than just talk.
In 2018, a Microsoft team operating under the codename Project Natick deployed the Northern Isles datacenter 117 feet down to the seafloor. The team then spent two years testing and monitoring the datacenter's servers. Project Natick's hypothesis was that sealing a container and placing it on the ocean floor could yield more reliable datacenters.
The team highlighted that a sealed underwater datacenter sidesteps corrosion from humidity and oxygen, bumps and temperature fluctuations, the unpredictability of human handling, and most of the other limitations and problems that datacenters encounter on land.
Ben Cutler, a project manager in Microsoft's Special Projects research group who leads Project Natick, highlighted how the project could advance Microsoft's datacenter sustainability strategy in terms of waste, water, and energy. He intends to press the case with a Microsoft Azure team that envisions giving customers better global and tactical reach for datacenters.
Microsoft first conceived the underwater datacenter idea during a ThinkWeek session in 2014. The concept received acclaim as a way to bring cloud services to coastal populations and save energy. Since a sizable share of the world's population lives within 120 miles of a coastline, placing datacenters underwater near coastal populations around the world would broaden geographical access to high-speed internet.
Moore's law, named after Intel co-founder Gordon Moore, describes the doubling phenomenon in transistor counts. Moore observed the trend in 1965 and forecast that it would continue for at least a decade. After holding through the 2010s, it is now slowing: transistor widths have shrunk toward the atomic scale and are nearing physical limits. Meanwhile, demand for faster processors for high-throughput applications like AI has only increased.
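The doubling phenomenon can be written as a simple exponential. A minimal sketch, assuming the commonly cited two-year doubling period and using the 1971 Intel 4004's roughly 2,300 transistors purely as an example starting point:

```python
# Moore's law as a doubling formula. The two-year period and the 4004
# baseline are illustrative assumptions for this sketch, not figures
# from the article.

def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project a transistor count assuming a doubling every `doubling_years`."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# One doubling period doubles the count:
assert projected_transistors(1973) == 2 * 2300
# Ten doublings over twenty years compound to a 1024x increase:
assert projected_transistors(1991) == 1024 * 2300
```

The compounding is the point: a few decades of doublings multiply counts by millions, which is why even a modest slowdown in the period matters so much to the industry.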
Two-phase immersion cooling demands a suitable deployment environment, which ranks high on Microsoft's list: the company's long game is meeting demand for more powerful, faster computers at a time when advances in air-cooled chip technology have slowed.

The computing industry is now focusing on chip infrastructure that can handle higher electrical power to deliver improved performance.
Cryptocurrency miners picked up the initiative first, having long used liquid immersion cooling to protect the hardware that logs digital-currency transactions. Microsoft took a closer look at their practice, considering how liquid immersion could bolster its own servers' performance.