
Sunday, June 19, 2011

Data Centers Look for Lower-Emission Cooling

Green Column


BRUSSELS — Putting computers near water is usually discouraged. But water could become vital for some companies seeking to cool the powerful servers that store and exchange vast amounts of information.

Google, which runs five large data centers, is planning to open one of its most efficient facilities in a former paper mill on the coast of Finland later this year.

“It’s the first time that I know that seawater has been used for data center cooling, but in other industries it’s actually quite common,” said Urs Hoelzle, a senior vice president at Google.

“Over all, there is huge opportunity for improvement” in the way the industry approaches energy efficiency, including cooling, Mr. Hoelzle said.

Data centers account for most of the energy used by Google. The servers inside are key to ever-faster search results and data-rich services like video-conferencing and music downloads. Industries like banking and health care are also creating huge demand for added capacity.

In a study published three years ago, Jonathan Koomey, a consulting professor at Stanford University, found that powering and cooling the equipment in data centers represented about 1 percent of total global electricity consumption in 2005, or about 0.3 percent of global emissions of carbon dioxide.

Mr. Koomey, who is updating those figures, emphasized that the most useful measure of the environmental footprint for the technology industry was not necessarily the amount of emissions created by data centers or digital devices taken on their own. He said it also was important to examine the way technology improved the environmental performance of the broader economy. He said downloading music represented huge savings in greenhouse gases that otherwise would have been emitted in manufacturing, shipping and recycling CDs.

Mr. Koomey said moving more of the operations run “in house” by companies to more efficient data centers would substantially lower the overall environmental footprint of the industry.

He also said there was a need to continue making all data center equipment as efficient as possible. Locating “data centers near cool bodies of water is one technique that works,” he said.

Even so, building more efficient data centers and getting smarter at managing them could “only blunt the underlying growth” of the sector and the “strong growth in the electricity that data centers consume,” said James M. Kaplan, a partner at McKinsey & Co. in New York.

Google already uses water for cooling at a center in Belgium. The facility treats and cleans water from a canal. The water is pumped to the data center and then into coils, over which warmed air from the servers is passed. The water in the coils absorbs the heat before it is pumped to a tower. Some of the water is recycled and some evaporates into the atmosphere.
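The water loop described above obeys a simple heat balance: the heat the water carries away is proportional to its flow rate and how much it warms up. The sketch below illustrates that relationship; the flow rate and temperature rise are illustrative assumptions, not figures disclosed by Google.

```python
# Rough heat-balance sketch for a water-based cooling loop like the one
# described above. All numbers are illustrative assumptions, not Google's.

def heat_removed_kw(flow_kg_per_s: float, delta_t_c: float) -> float:
    """Heat absorbed by the water: Q = m_dot * c_p * dT,
    with c_p of water about 4.186 kJ/(kg*K)."""
    C_P_WATER = 4.186  # specific heat of water, kJ/(kg*K)
    return flow_kg_per_s * C_P_WATER * delta_t_c

# Example: 50 kg/s of canal water warmed by 10 C absorbs roughly 2,100 kW
# of server heat before heading to the cooling tower.
print(round(heat_removed_kw(50.0, 10.0)))
```

The same arithmetic explains why cold source water matters: the colder the intake, the larger the usable temperature rise for a given flow.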

That concept is somewhat similar to efforts by PEER 1 Hosting, which operates 17 server farms in Europe and North America and plans to open a new site at Portsmouth, England, in October.

In Portsmouth, PEER 1 plans to funnel air warmed by the servers to a chamber where it is to be cooled as it passes through metal plates sprayed with water. The water would be recycled, while the cooled air would be blown back through specially sealed aisles, rather than wasted on empty parts of the building. Refrigeration could still be used, but only when weather was particularly hot or humid.
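A wet-plate stage like the one PEER 1 describes is a direct evaporative cooler: it can bring air down toward, but not below, the air's wet-bulb temperature, which is why refrigeration is still needed in hot, humid weather. A minimal sketch of the standard effectiveness model, with illustrative temperatures and effectiveness (not PEER 1's figures):

```python
# Simple effectiveness model of a direct evaporative cooling stage.
# Effectiveness and temperatures are illustrative assumptions.

def evap_outlet_temp_c(t_dry_bulb: float, t_wet_bulb: float,
                       effectiveness: float = 0.9) -> float:
    """T_out = T_db - eff * (T_db - T_wb): air is cooled toward its
    wet-bulb temperature but never past it."""
    return t_dry_bulb - effectiveness * (t_dry_bulb - t_wet_bulb)

# On a 28 C day with an 18 C wet-bulb, a 90%-effective stage delivers 19 C air.
# On a humid day the wet-bulb rises, the achievable cooling shrinks,
# and mechanical refrigeration has to take over.
print(evap_outlet_temp_c(28.0, 18.0))
```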

New cooling methods could help PEER 1 win business and maintain profitability when electricity prices are rising, said Dominic Monkhouse, the managing director for PEER 1 in Europe. Companies like the giant supermarket chain Tesco that were directly or indirectly using PEER 1 services were demanding lower energy use from all parts of their supply chains, including data centers, as part of efforts to reduce their carbon footprint, he said.

At its best-performing facility in Toronto, PEER 1 needs power for cooling, mostly involving fans, amounting to 35 percent beyond what it uses to run the servers.

At Portsmouth, it aims to lower that figure to 10 percent.

At the five centers owned and operated by Google, that figure is 16 percent.
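The overhead figures above map loosely onto the industry's power usage effectiveness (PUE) metric: if cooling were the only overhead on top of the servers' own draw, PUE would be roughly one plus the overhead fraction. A sketch of that arithmetic (real PUE also counts power distribution and other losses, which are not in the article's figures):

```python
# Approximate PUE from a cooling-only overhead fraction, assuming cooling
# is the sole overhead on top of IT load (a simplification of real PUE).

def pue_from_cooling_overhead(overhead_fraction: float) -> float:
    """PUE ~ (IT power + cooling power) / IT power = 1 + overhead."""
    return 1.0 + overhead_fraction

for site, overhead in [("PEER 1 Toronto", 0.35),
                       ("PEER 1 Portsmouth (target)", 0.10),
                       ("Google fleet", 0.16)]:
    print(f"{site}: ~{pue_from_cooling_overhead(overhead):.2f}")
```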

Mr. Hoelzle said the site at Hamina, northeast of Helsinki, should turn out to be somewhat more efficient in terms of water and energy use than the Belgium location.

Google plans to draw raw seawater directly from the Gulf of Finland into a large tunnel that a paper mill used for cooling. The seawater would then be used to cool a separate set of water pipes, running in a closed loop inside the data center, that would absorb heat from the servers. The warmed seawater would then be allowed to cool before it was pumped back into the gulf to minimize effects on the environment.

Google has invested about $400 million in renewable energy projects, and it plans to buy increasing amounts of electricity from those sources for its centers.

But the company was not about to install windmills or solar panels to feed green power directly to its data centers.

On-site renewable energy “looks good” but was not “a rational idea,” Mr. Hoelzle said. Suitable sites for data centers “may not be very sunny because you don’t want it to be too hot, and it may not be very windy,” he said.

But putting new farms near bodies of cold water would not always be practical either, because of factors like the need to locate servers near enough to users to offer the best network speed.

“Data center site selection is sort of the art of compromise,” Mr. Hoelzle said.

Mr. Monkhouse of PEER 1 said the race to lower energy use at data centers had generated an explosion of ideas for cooling servers, including some that appeared far-fetched or impractical, like immersing the machines in metal cases surrounded by oil to drain heat away even faster than water does.

“I must get three or four e-mails a week saying, ‘Have you seen our new technology?’ ” he said. “At every level, people are trying to innovate.”
