INSIDE BINGHAMTON UNIVERSITY
New contract supports improvement of data centers
By: Susan E. Barker
Over the next two years, researchers at Binghamton University and partnered institutions will help protect life as we know it by optimizing the work of data centers. A $437,270 two-year project backed by a $247,533 contract from the New York State Energy Research and Development Authority (NYSERDA) will help improve the design, energy efficiency and information processing efficacy of such centers.
Thousands of data centers process information critical to much of what drives our daily lives, from world financial markets, government and military operations, business and industry, and worldwide shipping and transportation to health and human services and entertainment, even organized athletics and religion.
The project, “Optimizing Airflow Management Protocols in New York Data Centers,” will team researchers at the University, Georgia Tech, Lawrence Berkeley Labs and IBM to survey, model and test design improvements to an existing Manhattan data center, in hopes of devising new design strategies to employ the world over.
Peter R. Smith, NYSERDA’s president, said the project is one of several high-tech, but fundamentally important, research projects sponsored by the public benefit corporation. “These data centers are high electric-demand nerve centers whose utility service is large,” Smith said. “They require stable and secure electric power for machine operation and cooling. Considering New York’s prime financial center role, NYSERDA seeks to find ways to serve these centers efficiently and securely, and then replicate those designs at universities and other large computing power centers around New York.”
Just now approaching adolescence, data centers have become the heart and central nervous system of the information age. Without them, the Internet would be reduced to a grown-up, digitized version of the old Campbell’s soup-can network, where end users hunker down and share sketchy information available to few and meaningful to even fewer.
Bahgat Sammakia, BU interim vice president for research and director of the Integrated Electronics Engineering Center, has spent much of his 30-year career working to improve thermal management strategies in electronics packaging, devising ways to keep computers and other electronics cool.
“Imagine the heat generated by a 100-watt light bulb,” Sammakia said. “Now take that same amount of power and double it, so you have the equivalent of a 200-watt light bulb in an area the size of a computer chip. Now you have extremely high heat density. When you turn your machine on, the temperature shoots up to 200, 300 or even 400 degrees Celsius.”
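Sammakia’s comparison comes down to heat flux: the same power concentrated in a far smaller area. The back-of-the-envelope arithmetic can be sketched as follows, assuming a chip footprint of roughly 2 cm by 2 cm (an illustrative assumption, not a figure from the project):

```python
# Heat-flux comparison behind the "200-watt light bulb" analogy.
# The chip dimensions below are assumed for illustration only.
bulb_power_w = 100.0                 # a familiar 100-watt light bulb
chip_power_w = 2 * bulb_power_w      # "double it" -> 200 W
chip_area_cm2 = 2.0 * 2.0            # assumed 2 cm x 2 cm chip footprint

heat_flux_w_per_cm2 = chip_power_w / chip_area_cm2
print(f"Chip heat flux: {heat_flux_w_per_cm2:.0f} W/cm^2")  # prints 50 W/cm^2
```

Dissipating tens of watts per square centimeter, rather than spreading 100 W over a whole bulb, is what drives chip temperatures toward the hundreds of degrees Celsius Sammakia describes.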
Without adequate thermal management, the electricity coursing through your computer would reduce it to a puddle of melted solder and burnt plastic before you could pull the power cord.
The issue becomes even more heated, however, when hundreds of pieces of electronic equipment, drawing thousands of kilowatts of power 24 hours a day, seven days a week, 52 weeks a year, are housed together in a data center, he said. If computers had their way, data center energy costs would be significantly higher. For all the heat they generate, computers actually thrive on cold. Most electronic computing devices run 30 to 40 percent faster at subfreezing temperatures, Sammakia said.
Human interaction and monitoring of data center equipment, however, is a round-the-clock enterprise, so optimum temperatures in the range of 60 to 70 degrees Fahrenheit are maintained for the practical purposes of human comfort and survival. While computers might prefer an Arctic clime, achieving temperatures even as cool as a late spring day is a tall order, Sammakia said.
“We’re talking about rooms the size of basketball courts, where you have row after row of mainframes and servers, all dissipating heat, and the entire room is designed with the sole purpose of sustaining this equipment and maintaining it at the right temperature,” Sammakia said. “These data centers, and there are hundreds of them in New York City alone, consume massive amounts of power. By making computations more cost and energy efficient, by reducing total energy consumption, and by passing on the environmental benefits of those savings, any energy efficiency can make a very significant difference.”
Simply discovering the best location for cold-air delivery vents could easily mean million-dollar savings once incorporated as a standard strategy in data center designs, Sammakia said.
The research team will strive to achieve whatever energy efficiencies it can. Sammakia believes improvements on the order of 20 to 25 percent are “very doable.” To reach that goal, University researchers will build numerical computer models of the Manhattan data center during the project’s first year. Their models will then be used to enhance the ability to predict the efficacy of design changes proposed throughout the remainder of the project.
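To put the 20 to 25 percent target in perspective, a rough annual-savings estimate can be sketched as follows, assuming a 1 MW continuous facility load and an electricity rate of $0.15 per kWh (both illustrative assumptions, not figures from the project):

```python
# Hypothetical annual savings at the project's 20-25% efficiency target.
# The load and rate below are assumptions for illustration only.
load_kw = 1000.0                  # assumed 1 MW continuous IT + cooling load
rate_per_kwh = 0.15               # assumed commercial electricity rate, $/kWh
hours_per_year = 24 * 365         # round-the-clock operation

annual_cost = load_kw * hours_per_year * rate_per_kwh
for savings in (0.20, 0.25):
    print(f"{savings:.0%} improvement saves ${annual_cost * savings:,.0f}/yr")
```

Even under these modest assumptions, a single facility would save on the order of a quarter-million dollars a year, which is why replicating the design guide across hundreds of centers matters.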
Researchers at Georgia Tech will confirm modeling accuracy by building an actual room at scale and taking appropriate measurements. In year two, researchers will take measurements in the actual data center and will write a design guide to help improve energy efficiency in all data centers. IBM, an established leader in energy metrics, will mentor modeling and measurements, while Lawrence Berkeley Labs, a prominent name in data center research and design in California, will help benchmark the Manhattan data center.