Artificial intelligence has become vital in business and financial dealings, medical care, technology development, research, and much more. Without realizing it, consumers rely on AI when they stream a video, do online banking, or perform a web search. Behind these capabilities are more than 10,000 data centers globally, each one a huge warehouse containing thousands of computer servers and other infrastructure for storing, managing, and processing data. There are now over 5,000 data centers in the United States, and new ones are being built every day, both in the U.S. and worldwide. Often dozens are clustered together right near where people live, attracted by policies that provide tax breaks and other incentives, and by what looks like abundant electricity.
And data centers do consume huge amounts of electricity. U.S. data centers consumed more than 4 percent of the country’s total electricity in 2023, and by 2030 that fraction could rise to 9 percent, according to the Electric Power Research Institute. A single large data center can consume as much electricity as 50,000 homes.
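To put the 50,000-home comparison in perspective, a rough back-of-envelope calculation translates it into a continuous power draw. The average household consumption figure below is an assumption for illustration, not a number from the article:

```python
# Rough back-of-envelope estimate: what does "as much electricity as
# 50,000 homes" mean as a continuous power draw?
# Assumption (not from the article): an average U.S. home uses
# roughly 10,500 kWh of electricity per year.

HOMES = 50_000
KWH_PER_HOME_PER_YEAR = 10_500          # assumed average household usage
HOURS_PER_YEAR = 8_760

annual_energy_kwh = HOMES * KWH_PER_HOME_PER_YEAR
average_power_mw = annual_energy_kwh / HOURS_PER_YEAR / 1_000  # kW -> MW

print(f"Annual energy: {annual_energy_kwh / 1e6:.0f} GWh")
print(f"Average continuous draw: {average_power_mw:.0f} MW")
# About 525 GWh per year, or roughly 60 MW of continuous draw --
# on the order of a large data center campus.
```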
The sudden need for so many data centers presents a massive challenge to the technology and energy industries, government policymakers, and everyday consumers. Research scientists and faculty members at the MIT Energy Initiative (MITEI) are exploring multiple facets of this problem, from sourcing power to grid improvements to analytical tools that increase efficiency, and more. Data centers have quickly become the energy issue of our day.
Unexpected demand brings unexpected solutions
Several companies that use data centers to provide cloud computing and data management services are announcing some surprising steps to deliver all that electricity. Proposals include building their own small nuclear plants near their data centers and even restarting one of the undamaged nuclear reactors at Three Mile Island, which has been shuttered since 2019. (A different reactor at that plant partially melted down in 1979, causing the nation’s worst nuclear power accident.) Already the need to power AI is causing delays in the planned shutdown of some coal-fired power plants and raising prices for residential consumers. Meeting the needs of data centers is not only stressing power grids but also setting back the transition to the clean energy needed to stop climate change.
There are many facets to the data center problem from a power perspective. Here are some that MIT researchers are focusing on, and why they’re important.
An unprecedented surge in the demand for electricity
“In the past, computing was not a big user of electricity,” says William H. Green, director of MITEI and the Hoyt C. Hottel Professor in the MIT Department of Chemical Engineering. “Electricity was used for running industrial processes and powering household devices such as air conditioners and lights, and more recently for powering heat pumps and charging electric cars. But now, all of a sudden, electricity used for computing in general, and by data centers in particular, is becoming a huge new demand that no one anticipated.”
Why the lack of foresight? Normally, demand for electric power increases by roughly half a percent per year, and utilities bring in new power generators and make other investments as needed to meet the expected new demand. But the data centers now coming online are creating unprecedented leaps in demand that operators didn’t see coming. In addition, the new demand is constant. It’s critical that a data center provide its services all day, every day. There can be no interruptions in processing large datasets, accessing stored data, and running the cooling equipment needed to keep all the packed-together computers churning away without overheating.
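A minimal sketch, under stated assumptions, of why this catches planners off guard: compare the roughly half-a-percent annual growth utilities normally plan for with the step change from a single large data center. The regional system size below is hypothetical, and the data center figure reuses the ~60 MW estimate from the earlier calculation:

```python
# Illustrative only: contrast the usual ~0.5%/year growth in electricity
# demand with the step change from one large data center coming online.
# The regional-system size is a hypothetical assumption, not from the article.

BASELINE_ANNUAL_DEMAND_GWH = 10_000     # hypothetical regional annual demand
NORMAL_GROWTH_RATE = 0.005              # ~0.5% per year (from the article)
DATA_CENTER_DEMAND_GWH = 525            # ~60 MW running around the clock

usual_yearly_increase = BASELINE_ANNUAL_DEMAND_GWH * NORMAL_GROWTH_RATE
years_of_normal_growth = DATA_CENTER_DEMAND_GWH / usual_yearly_increase

print(f"Typical planned-for increase: {usual_yearly_increase:.0f} GWh/year")
print(f"One large data center adds:   {DATA_CENTER_DEMAND_GWH} GWh/year")
print(f"Equivalent to about {years_of_normal_growth:.1f} years of normal growth")
```

Under these assumptions, one new facility absorbs roughly a decade’s worth of the growth a utility would ordinarily plan for.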
Moreover, even if enough electricity is generated, getting it to where it’s needed may be a problem, explains Deepjyoti Deka, a MITEI research scientist. “A grid is a network-wide operation, and the grid operator may have sufficient generation at another location or even elsewhere in the country, but the wires may not have sufficient capacity to carry the electricity to where it’s wanted.” So transmission capacity must be expanded, and, says Deka, that’s a slow process.
Then there’s the “interconnection queue.” Sometimes, adding either a new user (a “load”) or a new generator to an existing grid can cause instabilities or other problems for everyone else already on the grid. In that situation, bringing a new data center online may be delayed. Enough delays can result in new loads or generators having to stand in line and wait their turn. Right now, much of the interconnection queue is already filled with new solar and wind projects. The delay is now about five years. Meeting the demand from newly installed data centers while ensuring that the quality of service elsewhere is not hampered is a problem that must be addressed.
Finding clean electricity sources
To further complicate the challenge, many companies, including so-called “hyperscalers” such as Google, Microsoft, and Amazon, have made public commitments to net-zero carbon emissions within the next 10 years. Many have been making strides toward achieving their clean-energy goals by buying “power purchase agreements.” They sign a contract to buy electricity from, say, a solar or wind facility, sometimes providing funding for the facility to be built. But that approach to accessing clean energy has its limits when faced with the extreme electricity demand of a data center.
Meanwhile, soaring power consumption is delaying coal plant closures in many states. There are simply not enough sources of renewable energy to serve both the hyperscalers and the existing users, including individual consumers. As a result, conventional plants fired by fossil fuels such as coal are needed more than ever.
As the hyperscalers look for sources of clean energy for their data centers, one option could be to build their own wind and solar installations. But such facilities would generate electricity only intermittently. Given the need for uninterrupted power, the data center would have to maintain energy storage units, which are expensive. They could instead rely on natural gas or diesel generators for backup power, but those devices would need to be coupled with equipment to capture the carbon emissions, plus a nearby site for permanently disposing of the captured carbon.
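To see why dedicated storage gets expensive quickly, here is a minimal sizing sketch. The load, the length of the wind-and-solar lull, and the battery cost are all illustrative assumptions, not figures from the article:

```python
# Minimal sketch of why on-site storage for an always-on load is costly.
# All numbers below are illustrative assumptions, not from the article.

LOAD_MW = 60                 # hypothetical data center load
OUTAGE_HOURS = 12            # lull in wind/solar output to ride through
BATTERY_COST_PER_KWH = 300   # rough installed cost assumption, USD

energy_needed_mwh = LOAD_MW * OUTAGE_HOURS
cost_usd = energy_needed_mwh * 1_000 * BATTERY_COST_PER_KWH

print(f"Storage needed: {energy_needed_mwh:,.0f} MWh")
print(f"Rough battery cost: ${cost_usd / 1e6:,.0f} million")
# 720 MWh of storage at $300/kWh is roughly $216 million -- and that
# covers only a single half-day gap, before any generation is built.
```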
Because of such complications, several of the hyperscalers are turning to nuclear power. As Green notes, “Nuclear energy is well matched to the demand of data centers, because nuclear plants can generate lots of power reliably, without interruption.”
In a much-publicized move in September, Microsoft signed a deal to buy power for 20 years after Constellation Energy reopens one of the undamaged reactors at its now-shuttered nuclear plant at Three Mile Island, the site of the 1979 accident. If approved by regulators, Constellation will bring that reactor online by 2028, with Microsoft buying all of the power it produces. Amazon also reached a deal to buy power produced by another nuclear plant threatened with closure due to financial troubles. And in early December, Meta released a request for proposals to identify nuclear energy developers to help the company meet both its AI needs and its sustainability goals.
Other nuclear news focuses on small modular nuclear reactors (SMRs), factory-built, modular power plants that could be installed near data centers, potentially without the cost overruns and delays often experienced in building large plants. Google recently ordered a fleet of SMRs to generate the power needed by its data centers. The first one is due to be completed by 2030 and the remainder by 2035.
Some hyperscalers are betting on new technologies. For example, Google is pursuing next-generation geothermal projects, and Microsoft has signed a contract to purchase electricity from a startup’s fusion power plant beginning in 2028, even though the fusion technology hasn’t yet been demonstrated.
Reducing electricity demand
Other approaches to providing sufficient clean electricity focus on making the data center and the operations it houses more energy efficient, so as to perform the same computing tasks using less power. Using faster computer chips and optimizing algorithms that use less energy are already helping to reduce the load, and also the heat generated.
Another idea being tried involves shifting computing tasks to times and places where carbon-free energy is available on the grid. Deka explains: “If a task doesn’t have to be completed immediately, but rather by a certain deadline, can it be delayed or moved to a data center elsewhere in the U.S. or overseas where electricity is more abundant, cheaper, and/or cleaner? This approach is known as ‘carbon-aware computing.’” We’re not yet sure whether every task can be moved or delayed easily, says Deka. “If you think about a generative AI-based task, can it easily be divided into small tasks that can be taken to different parts of the country, solved using clean energy, and then be brought back together? What is the cost of doing this kind of division of tasks?”
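As a concrete illustration of the scheduling idea behind carbon-aware computing, here is a minimal sketch that, for a deferrable job, picks the start time with the lowest forecast grid carbon intensity before its deadline. The hourly intensity values are invented for the example; a real system would pull forecasts from a grid operator or cloud provider:

```python
# Minimal sketch of "carbon-aware" scheduling: given a deferrable job and
# an hourly forecast of grid carbon intensity (gCO2 per kWh), choose the
# start hour that minimizes emissions before the deadline.
# The forecast values here are invented for illustration.

def best_start_hour(forecast, job_hours, deadline_hour):
    """Return the start hour whose run window has the lowest total intensity."""
    best_hour, best_total = None, float("inf")
    for start in range(0, deadline_hour - job_hours + 1):
        total = sum(forecast[start:start + job_hours])
        if total < best_total:
            best_hour, best_total = start, total
    return best_hour, best_total

# Hypothetical 24-hour forecast: cleaner midday (solar), dirtier evening.
forecast = [450, 440, 430, 420, 400, 380, 340, 300, 260, 220, 190, 170,
            160, 170, 200, 250, 320, 400, 470, 500, 510, 500, 480, 460]

start, total = best_start_hour(forecast, job_hours=4, deadline_hour=24)
print(f"Run the 4-hour job starting at hour {start} "
      f"(summed intensity {total} over the window)")
```

In this toy forecast the job lands in the midday solar peak; the harder questions Deka raises, about splitting up a generative AI workload and paying the cost of moving data, are exactly what such a scheduler would have to weigh.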
That approach is, of course, limited by the problem of the interconnection queue. It’s difficult to access clean energy in another region or state. But efforts are underway to ease the regulatory framework so that critical interconnections can be developed more quickly and easily.
What about the neighbors?
A major concern running through all the options for powering data centers is the impact on residential energy consumers. When a data center comes into a neighborhood, there are not only aesthetic concerns but also more practical worries. Will the local electricity service become less reliable? Where will the new transmission lines be located? And who will pay for the new generators, upgrades to existing equipment, and so on? When new manufacturing facilities or industrial plants go into a neighborhood, the downsides are generally offset by the availability of new jobs. Not so with a data center, which may require just a couple dozen employees.
There are standard rules about how maintenance and upgrade costs are shared and allocated. But the situation is completely changed by the presence of a new data center. As a result, utilities now need to rethink their traditional rate structures so as not to place an undue burden on residents to pay for the infrastructure changes needed to host data centers.
MIT’s contributions
At MIT, researchers are thinking about and exploring a range of options for tackling the problem of providing clean power to data centers. For example, they are investigating architectural designs that would use natural ventilation to facilitate cooling, equipment layouts that would permit better airflow and power distribution, and highly energy-efficient air conditioning systems based on novel materials. They are creating new analytical tools for evaluating the impact of data center deployments on the U.S. power system and for finding the most efficient ways to provide the facilities with clean energy. Other work looks at how to match the output of small nuclear reactors to the needs of a data center, and how to speed up the construction of such reactors.
MIT teams also focus on determining the best sources of backup power and long-duration storage, and on developing decision support systems for siting proposed new data centers, taking into account the availability of electric power and water, regulatory considerations, and even the potential for using what can be significant waste heat, for example, for heating nearby buildings. Technology development projects include designing faster, more efficient computer chips and more energy-efficient computing algorithms.
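One way to picture the kind of siting decision support mentioned above is a simple weighted scoring of candidate sites on factors such as power availability, water, permitting speed, and opportunities to reuse waste heat. The sites, scores, and weights below are entirely hypothetical and are only a sketch of the idea, not a description of the MIT tools:

```python
# Toy weighted-scoring sketch of a data center siting decision aid.
# Sites, criteria scores (0-10), and weights are all hypothetical.

WEIGHTS = {
    "power_availability": 0.40,
    "water_availability": 0.20,
    "permitting_speed":   0.25,
    "waste_heat_reuse":   0.15,
}

candidate_sites = {
    "Site A": {"power_availability": 8, "water_availability": 6,
               "permitting_speed": 4, "waste_heat_reuse": 7},
    "Site B": {"power_availability": 5, "water_availability": 9,
               "permitting_speed": 8, "waste_heat_reuse": 3},
}

def score(site_scores):
    # Weighted sum across the criteria defined above.
    return sum(WEIGHTS[k] * v for k, v in site_scores.items())

ranked = sorted(candidate_sites.items(), key=lambda kv: score(kv[1]), reverse=True)
for name, s in ranked:
    print(f"{name}: weighted score {score(s):.2f}")
```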
In addition to providing leadership and funding for many research projects, MITEI is acting as a convenor, bringing together companies and stakeholders to address this issue. At MITEI’s 2024 Annual Research Conference, a panel of representatives from two hyperscalers and two companies that design and construct data centers together discussed their challenges, possible solutions, and where MIT research could be most useful.
As data centers continue to be built, and computing continues to create an unprecedented increase in demand for electricity, Green says, scientists and engineers are in a race to deliver the ideas, innovations, and technologies that can meet this need, and at the same time continue to advance the transition to a decarbonized energy system.