News - Tyler H. Norris
ERCOT’s unique connect-and-manage interconnection process is significantly faster than that of any other US energy market. Tyler Norris, a Ph.D. student at Duke's Nicholas School of the Environment, spoke to Utility Dive about how other markets could use ERCOT's template to speed up interconnection queues and reduce the cost of network upgrades.
Soaring energy demand from AI data centers is prompting many power providers to plan new capacity to make sure they have plenty of “headroom” to meet any spikes in demand. But after analyzing data from power systems across the country, a team of Duke University scholars found that if large AI facilities cut back their electricity use during hours of peak demand, many regional power grids could accommodate those AI customers without adding new generation capacity, reports MIT Technology Review.
Tyler Norris, a Ph.D. student at the Nicholas School of the Environment, joined The Energy Gang podcast to discuss what the host called "the most talked-about academic paper this year in the world of energy." Norris coauthored Rethinking Load Growth, which offers a new perspective on how to provide power for data centers and other large consumers by embracing load flexibility.
Greg Robinson, cofounder and CEO of Aston, writes for the Forbes Technology Council about two studies that illustrate on-grid and off-grid approaches to the future of U.S. power. The first study, by the Nicholas Institute, focuses on creative ways to use the grid more flexibly to manage the spike in energy demand driven by the rise of AI data centers and the electrification of everything. The other study focuses on off-grid solar microgrids as a way to meet AI-driven spikes in energy demand.
The International Energy Agency recommended that data centers be built in areas with greater grid capacity and more available power to meet their energy needs. A recent Nicholas Institute study suggested that data centers could operate more flexibly, temporarily reducing their energy consumption from the grid during times of peak demand.
Tyler Norris, a Ph.D. student at the Nicholas School of the Environment, joined the Energy Capital Podcast to discuss a recent Nicholas Institute study on large flexible loads, the data center boom and how demand-side innovation can ease grid strain and enable load growth.
The power demands of AI data centers are skyrocketing—but how will we meet them? In this episode of the Build, Repeat podcast, Tyler Norris (Duke University) and Kyle Baranko (Paces) dug into two influential white papers offering different solutions. The Duke paper highlights how existing grid capacity can be better utilized by embracing flexible interconnections and load curtailment.
John Quigley, a senior fellow at the University of Pennsylvania’s Kleinman Center for Energy Policy, tells Floodlight that plenty can be done to update the existing grid’s capacity to meet the electricity needs of data centers without immediately installing new transmission lines. He pointed to a recent Nicholas Institute study that finds load flexibility could be a “key solution to the United States' soaring electrical demand.”
Doctoral candidate Tyler Norris returned as a guest on the Catalyst podcast, discussing how electric load flexibility can help power the AI-focused data center boom.
U.S. power demand has soared in recent years with growing artificial intelligence, construction and electrification needs. The Bloomberg Energy Daily cites new Nicholas Institute research saying the United States can meet this rising consumption and quickly add huge loads to its grids without building new power plants by deploying grid flexibility—strategically curtailing consumption for the equivalent of roughly one day of usage per year.