News - Transmission and Power Markets
During the Energy Future Forum on May 19, Eolian CEO Aaron Zubaty told the audience that batteries can help address growing energy demand by balancing the transmission system and adding flexibility to data centers without the emissions associated with conventional diesel backup generation. Zubaty pointed to a Duke University analysis that has "been making waves since its release this winter," reports RTO Insider.
Soaring energy demand from AI data centers is prompting many power providers to plan new capacity to make sure they have plenty of “headroom” to meet any spikes in demand. But after analyzing data from power systems across the country, a team of Duke University scholars found that if large AI facilities cut back their electricity use during hours of peak demand, many regional power grids could accommodate those AI customers without adding new generation capacity, reports MIT Technology Review.
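The arithmetic behind that finding can be sketched with a toy planning calculation (not the Duke team's actual model): given an hourly demand profile and a fixed capacity limit, how much constant new load fits if that load may curtail itself during a limited number of peak hours each year? The demand profile, capacity margin, and hour budgets below are invented for illustration.

```python
# Toy estimate (illustrative assumptions only): how much constant new load a
# system could host if the new load may curtail during a limited number of
# peak hours per year. The hourly demand profile and capacity are synthetic.
import numpy as np

rng = np.random.default_rng(0)
hours = 8760
t = np.arange(hours)
# Synthetic hourly demand (GW): seasonal + daily cycles plus noise.
demand = 70 + 15 * np.sin(2 * np.pi * t / hours) + 10 * np.sin(2 * np.pi * t / 24)
demand += rng.normal(0, 3, hours)
capacity = demand.max() * 1.05  # assume ~5% headroom above the historic peak

def max_flexible_load(demand, capacity, max_curtail_hours):
    """Largest constant new load (GW) that fits if it can shed itself fully
    during at most `max_curtail_hours` hours per year."""
    lo, hi = 0.0, float(capacity)
    for _ in range(60):  # binary search on the size of the added load
        mid = (lo + hi) / 2
        overload_hours = int(np.sum(demand + mid > capacity))
        if overload_hours <= max_curtail_hours:
            lo = mid
        else:
            hi = mid
    return lo

for h in (0, 24, 87):  # 0%, ~0.3%, ~1% of annual hours
    print(f"curtailable {h:3d} h/yr -> ~{max_flexible_load(demand, capacity, h):.1f} GW of new load")
```

Even a small annual curtailment budget raises the feasible addition well above the zero-flexibility case, which is the qualitative point the Duke analysis makes with real system data.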
Tyler Norris, a Ph.D. student at the Nicholas School of the Environment, joined The Energy Gang podcast to discuss what the host called "the most talked-about academic paper this year in the world of energy." Norris coauthored Rethinking Load Growth, which offers a new perspective on how to provide power for data centers and other large consumers by embracing load flexibility.
The 2024–2025 Bass Connections program featured 16 interdisciplinary teams in the Energy & Environment theme administered by the Nicholas Institute. Duke students involved in a handful of the teams talked about their projects during the annual Fortin Foundation Bass Connections Showcase last month.
Greg Robinson, cofounder and CEO of Aston, writes for the Forbes Technology Council about two studies that illustrate on-grid and off-grid approaches to the future of U.S. power. The first study, by the Nicholas Institute, focuses on creative ways to add grid flexibility to manage the surge in energy demand driven by the rise of AI data centers and the electrification of everything. The other study focuses on off-grid solar microgrids as a way to meet AI-driven spikes in energy demand.
The International Energy Agency recommended that data centers be built in areas with greater grid capacity and available power to meet their energy needs. A recent Nicholas Institute study suggested that data centers could operate more flexibly, temporarily reducing their energy consumption from the grid during times of peak demand.
The power demands of AI data centers are skyrocketing—but how will we meet them? In this episode of the Build, Repeat podcast, Tyler Norris (Duke University) and Kyle Baranko (Paces) dug into two influential white papers offering different solutions. The Duke paper highlights how existing grid capacity can be better utilized by embracing flexible interconnections and load curtailment.
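As a rough illustration of what a flexible interconnection can mean in practice (a sketch, not a proposal from either white paper): the facility draws its full rating unless the grid operator signals a constraint, in which case it drops to a pre-agreed level, for example by pausing deferrable compute or switching to on-site storage. The class name and megawatt figures below are hypothetical.

```python
# Hypothetical flexible-interconnection rule (illustrative only): draw full
# rated power normally; drop to a pre-agreed cap when the grid is constrained.
from dataclasses import dataclass

@dataclass
class FlexibleInterconnection:
    rated_mw: float      # nameplate grid draw of the data center
    curtailed_mw: float  # pre-agreed draw during grid-constraint events

    def allowed_draw(self, grid_constrained: bool) -> float:
        """Grid draw permitted this interval under the interconnection terms."""
        return self.curtailed_mw if grid_constrained else self.rated_mw

site = FlexibleInterconnection(rated_mw=300.0, curtailed_mw=90.0)
for constrained in (False, True):
    print(f"constrained={constrained}: {site.allowed_draw(constrained):.0f} MW")
```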
John Quigley, a senior fellow at the University of Pennsylvania’s Kleinman Center for Energy Policy, tells Floodlight that plenty can be done to expand the existing grid’s capacity to meet the electricity needs of data centers without immediately installing new transmission lines. He points to a recent Nicholas Institute study that finds load flexibility could be a “key solution to the United States' soaring electrical demand.”
Doctoral candidate Tyler Norris returned as a guest on the Catalyst podcast, discussing how electric load flexibility can help power the AI-focused data center boom.
U.S. power demand has soared in recent years with growing artificial intelligence, construction and electrification needs. The Bloomberg Energy Daily cites new Nicholas Institute research saying the United States can meet this rising consumption and quickly add huge loads to its grids without building new power plants by deploying grid flexibility, strategically cutting consumption by the equivalent of roughly one day's usage per year.
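As a back-of-the-envelope check of that "one day a year" figure (illustrative only, assuming a roughly flat load profile rather than the study's actual method): curtailing the equivalent of one day's consumption corresponds to only a small fraction of the year.

```python
# Rough check (flat-load assumption, not the study's calculation): one day's
# worth of consumption is a small share of annual usage.
hours_per_year = 8760
curtailed_hours = 24  # the equivalent of roughly one day's usage
share = curtailed_hours / hours_per_year
print(f"~{share:.2%} of annual consumption")  # prints ~0.27%
```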