In case you didn’t already know, dumping computer equipment into water is generally a bad idea, which is exactly why Microsoft’s decision to sink data centers into the ocean is so interesting.
Data centers are basically buildings full of computer equipment that handles much of the internet we use. With the growth of cloud-based services and other online offerings, demand for data centers is higher than ever. The problem is that they are expensive to run: they consume a great deal of energy, and a large share of that energy goes into the cooling systems that keep the components from overheating.
Given all that, Microsoft’s idea of putting data centers into the ocean starts to make much more sense. The thinking is that a submerged data center can take advantage of the naturally cold water near the ocean floor for cooling. And because the ocean’s thermal mass is practically limitless compared with the heat a data center gives off, the warming it causes should be negligible.
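To get a feel for just how negligible, here is a rough back-of-the-envelope sketch. The figures below (a 300 kW module, a single cubic kilometre of seawater) are purely illustrative assumptions, not Microsoft’s numbers.

```python
# Rough, illustrative estimate of how little a submerged data center
# would warm the surrounding seawater. All inputs are assumptions.

POWER_KW = 300.0                 # assumed load of one underwater module
SECONDS_PER_DAY = 86_400

WATER_VOLUME_M3 = 1e9            # 1 cubic kilometre of seawater
SEAWATER_DENSITY = 1025.0        # kg per cubic metre
SEAWATER_SPECIFIC_HEAT = 3990.0  # J per kg per kelvin (approximate)

# Heat dumped into the water over one day (joules)
heat_per_day_j = POWER_KW * 1000 * SECONDS_PER_DAY

# Temperature rise of that volume of water if it absorbed all the heat
water_mass_kg = WATER_VOLUME_M3 * SEAWATER_DENSITY
delta_t_per_day = heat_per_day_j / (water_mass_kg * SEAWATER_SPECIFIC_HEAT)

print(f"Heat per day: {heat_per_day_j:.2e} J")
print(f"Temperature rise of 1 km^3 of seawater: {delta_t_per_day:.2e} K per day")
# Works out to a few millionths of a degree per day -- effectively nothing,
# even before ocean currents carry the heat away.
```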
Last year, Microsoft initiated Project Natick, sinking a prototype vessel called Leona Philpot into the Pacific Ocean for 90 days to test its ability to endure the harsh underwater conditions. The prototype is reported to have performed better than expected.
Microsoft sees further advantages: a sealed underwater vessel can run without maintenance for very long stretches, and deploying data centers this way could become much easier. That last point rests on the fact that up to 4.5 billion people live within 125 miles of a shoreline. Placing data centers just offshore would not only bring them closer to the people they serve, but would also free up a lot of space on land for other purposes.