Princeton University's Andlinger Center for Energy and the Environment recently held its 13th annual meeting, themed “Energy for AI and AI for Energy,” which brought together experts from industry, academia, and government to discuss the relationship between artificial intelligence (AI) and the energy transition. While AI offers potential for optimizing energy systems, advancing climate research, and accelerating the deployment of clean technologies, it also introduces significant challenges due to its high energy consumption and carbon-intensive infrastructure requirements.
“AI is influencing nearly every academic discipline—frankly, nearly every human endeavor,” said Jennifer Rexford, Princeton’s provost and the Gordon Y. S. Wu Professor in Engineering.
Keynote speaker Melanie Nakagawa, Microsoft’s chief sustainability officer, stressed the importance of considering AI’s lifecycle impacts holistically. Data centers, the backbone of AI operations, are major energy consumers and contribute significantly to carbon emissions, particularly through the materials used in their construction. Nakagawa highlighted Microsoft's initiatives to mitigate these impacts by collaborating with partners like Google and Nucor to promote cleaner building materials and advanced energy solutions, aiming to set an industrywide standard for sustainable practices.
“Ultimately, this is a systems challenge,” Nakagawa said. “We want to create an impact beyond our company, so we are investing in solutions and advocating for policies that can support a net-zero future for everyone.”
Panelists explored how AI-driven demand intersects with broader energy challenges, including the electrification of vehicles and the rise of hydrogen and other emerging technologies. Regulatory frameworks and energy systems, still designed for traditional centralized energy sources, need to evolve to accommodate new renewable projects and transmission requirements.
Experts including Allison Clements, former commissioner at the US Federal Energy Regulatory Commission (FERC), and Matt DeNichilo, a partner at Energy Capital Partners, emphasized the need for enhanced grid capacity and the integration of long-duration energy storage to sustainably support the increasing load from AI operations.
Speakers also highlighted AI’s transformative role in addressing energy and climate challenges. For example, AI-powered tools are being leveraged for real-time diagnostics in fusion energy, automation in power grid operations, and rapid simulation of environmental models, enabling faster and more effective decision-making. However, challenges remain, including scaling AI solutions, ensuring their safety and robustness, and overcoming data limitations.
Discussions also extended to the energy-intensive nature of the computing technologies that underpin AI. Advances in chip design have dramatically increased computing power, with Tom Gray, Nvidia’s senior director of circuits research, estimating that performance has increased “nearly 1,000-fold over the past 8 years.” That increase, however, has come at the cost of rising energy consumption.
Industry leaders like Gray and AMD senior fellow Stephen Kosonocky emphasized the urgency of developing new computing architectures and energy-efficient technologies to counteract this trend.
“The power demand of computing and data centers isn’t going away, and conventional techniques for boosting efficiency are quickly running out of steam,” Kosonocky said. “What we need are new trajectories and new computing architectures to support greater efficiencies as AI evolves.”
Beyond operational energy use, panelists like Carole-Jean Wu, director of AI research at Meta, pointed to the significant emissions associated with manufacturing AI hardware, urging the industry to consider total lifecycle emissions. Despite these challenges, many saw AI as a vital tool for driving clean energy innovation. Initiatives like the US Department of Energy’s Frontiers in Artificial Intelligence for Science, Security and Technology (FASST) program aim to integrate AI into energy storage, fusion research, and more, accelerating progress in these areas.
Speakers also reflected on the limitations of AI, cautioning against overreliance on it and advocating for combining AI with physics-based models to ensure comprehensive and interpretable solutions.
“I always ask my students to start with simple models or to tell me why we should be using something more complicated,” said panelist Ning Lin, Princeton professor of civil and environmental engineering, who has used AI and machine learning to upscale hurricane hazard models. “Unless the more complicated model produces much better results, the simpler models are easier to understand and interpret.”
The event concluded with a call to action for interdisciplinary collaboration and systemic thinking to balance AI’s benefits against its environmental costs, underscoring the technology’s role in shaping a sustainable and inclusive energy future.