Reducing the Massive Energy Appetite of Data Centers

Lead image credit: Tony Webster via Flickr
(Inside Science) -- The data centers of Google, Facebook and other online giants consume vast amounts of electricity. Now scientists are devising new chips and other strategies to help data centers save energy.
According to the U.S. Department of Energy, U.S. data centers, which are giant warehouses packed with computer servers, accounted for roughly one-fiftieth of total U.S. electricity use in 2014.
Roughly 40 percent of the energy that data centers consume goes toward keeping their servers cool.
Another strategy that researchers are investigating to reduce data center energy consumption is to make the computing itself more efficient. For instance, Massachusetts Institute of Technology computer scientist Arvind and his colleagues have found that changes to the “caches” that data centers use to store the results of common queries can slash cache power consumption by a factor of 25. “My guess is that these caches take about 10 percent of the energy resources of data centers,” Arvind said.
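To illustrate the basic idea in rough terms, such a cache works like a simple key-value store: if the answer to a common query is already on hand, the expensive backend lookup is skipped. The sketch below is only an analogy of that principle; the function and key names are invented for illustration and are not taken from any real system.

```python
# Illustrative sketch of a query-result cache, with hypothetical names.
cache = {}

def run_database_query(user_id):
    # Stand-in for an expensive backend lookup.
    return {"id": user_id, "name": f"user-{user_id}"}

def lookup_user(user_id):
    key = f"user:{user_id}"
    if key in cache:                        # cache hit: no backend work
        return cache[key]
    result = run_database_query(user_id)    # cache miss: do the expensive work
    cache[key] = result                     # remember it for repeat queries
    return result

print(lookup_user(42))   # miss: goes to the backend
print(lookup_user(42))   # hit: served straight from the cache
```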
Data center caches generally store data using dynamic random-access memory (DRAM) microchips, which are fast but expensive and energy-hungry. Arvind and his colleagues’ new strategy, known as BlueCache, instead stores cached data in flash memory, the kind found in smartphones and thumb drives.
Flash is much slower than DRAM. Even so, the scientists noted that flash works far faster than human reaction times, so they reasoned that incorporating it could help data centers save energy “while incurring delays that people find acceptable,” Arvind said.
BlueCache employs a number of techniques to make its flash-based caches competitive with the current DRAM workhorses. For instance, it adds a small amount of DRAM to each cache so the cache can quickly spot requests for data it does not yet hold. Also, instead of relying on software to read, write and delete data, each cache uses specialized hardware circuits for these operations, increasing speed and lowering power consumption. Moreover, BlueCache bundles queries to the caches together to make communication more efficient.
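As a loose software analogy of those ideas (not BlueCache’s actual hardware design), the sketch below keeps a small, fast index in front of a larger, slower store and answers lookups in batches; all names and details are hypothetical.

```python
# Hypothetical two-tier cache: a small, fast index (standing in for DRAM)
# in front of a larger, slower key-value store (standing in for flash).
class TwoTierCache:
    def __init__(self):
        self.index = set()   # small, fast membership index
        self.flash = {}      # large, slower backing store

    def put(self, key, value):
        self.flash[key] = value
        self.index.add(key)

    def get_batch(self, keys):
        # Batching: answer many queries in one pass, and let the fast index
        # reject misses without touching the slow store at all.
        hits, misses = {}, []
        for key in keys:
            if key in self.index:
                hits[key] = self.flash[key]   # one slow-store access per hit
            else:
                misses.append(key)            # rejected by the index alone
        return hits, misses

cache = TwoTierCache()
cache.put("user:1", "alice")
hits, misses = cache.get_batch(["user:1", "user:2"])
print(hits, misses)   # {'user:1': 'alice'} ['user:2']
```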
Another strategy, pursued by electrical engineer David Wentzlaff at Princeton University and his colleagues, is based on how data centers often help many users carry out similar tasks, such as checking email or browsing the web. The new microchip architecture they are developing, called Piton, groups these similar tasks together on the same processor cores so the chip can avoid duplicating work.
Each Piton chip also controls the access that competing programs have to memory in a way that can independently yield an 18 percent performance boost. Moreover, placing the cores, or processors, of a chip physically closer to the data they need can independently increase efficiency by 29 percent.
Currently, Wentzlaff and his colleagues have packed 25 cores onto each Piton chip.
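The sketch below is a purely software analogy of that grouping idea, not how the Piton hardware actually works: requests of the same kind are routed to the same worker, so each worker keeps executing the same sort of code. The task names and worker count are made up for illustration.

```python
# Software analogy of grouping similar tasks; names are hypothetical.
import zlib
from collections import defaultdict

NUM_WORKERS = 4

def assign_worker(task_kind):
    # Same kind of task -> same worker, via a stable hash of the task name.
    return zlib.crc32(task_kind.encode()) % NUM_WORKERS

queues = defaultdict(list)
for user, kind in [("ann", "check_email"), ("bob", "browse_web"),
                   ("cat", "check_email"), ("dan", "browse_web")]:
    queues[assign_worker(kind)].append((user, kind))

# Each worker ends up with a queue of one kind of task.
for worker, tasks in sorted(queues.items()):
    print(worker, tasks)
```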
One way data centers might conceivably become more environmentally friendly is to rely on electricity from renewable sources instead of fossil fuels. However, previous research by computer engineers Michael Zink and David Irwin at the University of Massachusetts Amherst and their colleagues found that data centers actually consume more electricity than intermittent renewable sources such as solar and wind can reliably supply on their own, since the output of those sources fluctuates with the weather and the time of day.
Still, there are ways to optimize data center use to account for these fluctuations, Irwin said. “Computation is really flexible -- the workload of one data center can be sent to another data center where the sun is shining,” he said. “We see renewables as a big part of data center energy consumption going forward.”
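A minimal sketch of that “follow the sun” scheduling idea might look like the following; the site names, power figures and job list are invented for illustration, and a real system would weigh many more factors.

```python
# Hypothetical dispatcher: send flexible work to whichever site currently
# has the most renewable generation.
def pick_site(renewable_kw_by_site):
    # Choose the site with the most renewable power right now.
    return max(renewable_kw_by_site, key=renewable_kw_by_site.get)

current_solar_kw = {"virginia": 120.0, "arizona": 950.0, "oregon": 430.0}
flexible_jobs = ["video transcode", "nightly index rebuild", "model training step"]

target = pick_site(current_solar_kw)
for job in flexible_jobs:
    print(f"dispatch '{job}' to {target}")   # the workload follows the sunshine
```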
When it comes to reducing data center energy consumption, “there are many, many strategies that people are pursuing,” Arvind said. “I don’t see any as a silver bullet, but when you accumulate them all over five or 10 years, their results look amazing.”