AI Accelerates Modeling of New Materials for Carbon Capture

Artificial intelligence is increasingly being used to assist in the development of materials, including metal-organic frameworks (MOFs), to advance carbon capture technologies. Researchers assembled more than 120,000 new MOF candidates within 30 minutes.

a–f Crystal structure of the top six AI-generated MOFs. The color code used to represent atoms is: carbon in grey, nitrogen in dark blue, fluorine in cyan, zinc in purple, hydrogen in white, and lithium in green. Source: H. Park et al.

Artificial intelligence (AI) is increasingly being employed to assist in the development of materials, including metal-organic frameworks (MOFs), to advance carbon capture technologies.

MOFs are modular materials built from three types of building blocks: inorganic nodes, containing metals such as zinc or copper; organic nodes; and organic linkers made up of carbon, oxygen, and other elements. By varying the relative positions and configurations of these building blocks, countless unique MOFs can be created. The idea is to create a porous carbon dioxide “trap” to capture carbon from the air. The structure formed by the building blocks can be thought of, simplistically, as a scaffolding with joints (the linkers) that absorbs carbon.
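To see why the design space is so large, a toy enumeration helps. This is an illustrative sketch only: the building-block names below are hypothetical placeholders, not the components used in the study, and the list sizes are invented to show how the count grows multiplicatively.

```python
from itertools import product

# Hypothetical building-block choices for illustration; real MOF
# libraries contain hundreds of nodes and thousands of linkers.
inorganic_nodes = ["Zn", "Cu", "Zr", "Mg"]
organic_nodes = ["benzene-core", "triazine-core", "adamantane-core"]
organic_linkers = ["BDC", "BTC", "NDC", "BPDC", "TDC"]
topologies = ["pcu", "fcu", "dia"]  # example framework nets

# Every combination of one choice per category is a distinct candidate.
combinations = list(product(inorganic_nodes, organic_nodes,
                            organic_linkers, topologies))
print(len(combinations))  # 4 * 3 * 5 * 3 = 180 even from tiny lists

# Scaling the lists toward realistic sizes shows why trial and error
# fails: the product quickly reaches the billions.
print(500 * 1_000 * 10_000 * 50)
```

The multiplicative growth is the point: each new category of building block, or each additional option within one, multiplies the total rather than adding to it.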

In a recent paper, researchers from the US Department of Energy’s Argonne National Laboratory, the University of Illinois Chicago (UIC), the University of Illinois, Northwestern University, and TotalEnergies described their use of generative AI to identify “good carbon absorbers” among “billions and billions of possibilities.”

Although early work on developing MOFs began in the 1990s, AI modeling can now generate new candidates with desired properties, such as optimal selectivity and capacity, without the laborious, iterative experimental and computational efforts once required. In a press release, Argonne said, “… The team was able to quickly assemble, building block by building block, over 120,000 new MOF candidates within 30 minutes” on a supercomputer at the Argonne Leadership Computing Facility.

“The race for capturing carbon hinges on finding needles in a haystack, and trial and error is too slow. You have billions and billions of possibilities, and then you must narrow down to candidates that are good carbon absorbers,” said paper coauthor Santanu Chaudhuri, professor of civil, materials, and environmental engineering at UIC and director of manufacturing science and engineering at Argonne, in a news release. “With this project, we have taken the first significant step towards closing that gap by using generative AI.”

Coauthor Eliu Huerta, an Argonne computational scientist who helped lead the study, said, "The traditional methods have typically involved experimental synthesis and computational modeling with molecular dynamics simulations. But trying to survey the vast MOF landscape in this way is just impractical.

“We wanted to add new flavors to the MOFs that we were designing. We needed new ingredients for the AI recipe,” Huerta said.

The AI framework, called GHP-MOFassemble, screened the newly built 120,000 MOFs in 40 minutes to identify nearly 79,000 with valid bonds (joints). Those were screened within 205 minutes to identify 19,000 with valid chemistry. From this group, 364 MOFs with CO2 capacity above a selected threshold were identified in 50 minutes.

From assembly to selection of MOFs, GHP-MOFassemble completed the analysis within 5 hours and 7 minutes.
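The staged screen described above is a classic funnel, and tabulating it makes the narrowing rate visible. The counts and per-stage times below are those reported in the article; the table layout and variable names are our own, and the stage times are stated as upper bounds (“within”), so they need not sum exactly to the reported total.

```python
# Screening funnel for GHP-MOFassemble: (stage, candidates remaining,
# minutes reported). Figures are from the article; structure is ours.
stages = [
    ("assembled candidates",  120_000, 30),
    ("valid bonds (joints)",   79_000, 40),
    ("valid chemistry",        19_000, 205),
    ("high CO2 capacity",         364, 50),
]

for name, count, minutes in stages:
    print(f"{name:>22}: {count:>7,} candidates ({minutes} min)")

# Fraction of candidates surviving each successive stage.
for (name, count, _), (_, kept, _) in zip(stages, stages[1:]):
    print(f"after {name}: kept {kept / count:.1%}")
```

The retention rates show where the filtering bites: roughly two-thirds of assembled candidates have valid bonds, about a quarter of those pass the chemistry check, and only about 2% of the chemically valid set clears the CO2-capacity threshold.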

Powering the AI Future

It is ironic that processing massive volumes of data on supercomputers for emissions-related research, such as AI simulations, could itself contribute to the problem being solved because of the electricity required. Supercomputers harness the computing power of many interconnected processing cores, which require an immense amount of energy and, depending on how the electricity is generated, also contribute to emissions.

In a November 2023 commentary, International Energy Agency (IEA) analysts estimated that AI uses more energy than other forms of computing. Training a single model uses more electricity than 100 US homes consume in an entire year. In 2022, Google reported that machine learning accounted for about 15% of its total energy use over the prior 3 years.

In a January report, IEA estimated that Google could see a tenfold increase in its electricity demand if AI were fully implemented in its search engine. The average electricity demand of a typical search is 0.3 Wh, while OpenAI’s ChatGPT requires 2.9 Wh per request. At 9 billion searches per day, the difference would amount to nearly 10 TWh of additional electricity per year.
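The back-of-envelope arithmetic behind the “nearly 10 TWh” figure can be checked directly. This sketch uses only the per-request figures quoted above; the variable names are ours.

```python
# Per-request electricity figures quoted from the IEA report.
standard_search_wh = 0.3   # average Google search, in watt-hours
chatgpt_request_wh = 2.9   # per ChatGPT request, in watt-hours

searches_per_day = 9e9     # 9 billion searches per day

# Extra energy per request if every search cost as much as a ChatGPT call.
extra_wh_per_request = chatgpt_request_wh - standard_search_wh  # 2.6 Wh

# Annualize and convert watt-hours to terawatt-hours (1 TWh = 1e12 Wh).
extra_twh_per_year = extra_wh_per_request * searches_per_day * 365 / 1e12
print(f"{extra_twh_per_year:.1f} TWh/year")  # ~8.5 TWh, i.e. nearly 10
```

For scale, 8–10 TWh is on the order of the annual electricity consumption of a small country, which is why the report treats search-scale AI deployment as a material demand driver.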

Google and other tech companies are shifting their data operations around the world daily or hourly to tap into excess renewable energy production. Google uses its “carbon-intelligent” platform to analyze day-ahead predictions of how much a given grid will be relying on carbon-intensive energy. It then shifts computing globally in favor of regions where more carbon-free electricity is available.

As AI advances in modeling complexity with a greater number of parameters, there is a growing emphasis on sustainability and optimizing power consumption, including developing energy-efficient algorithms, hardware, and data center management strategies.

For Further Reading

A Generative Artificial Intelligence Framework Based on a Molecular Diffusion Model for the Design of Metal-Organic Frameworks for Carbon Capture by H. Park, X. Yan, R. Zhu, et al. Communications Chemistry.

Why AI and Energy Are the New Power Couple by V. Rozite, J. Miller, and S. Oh, International Energy Agency.

Electricity 2024. International Energy Agency.

We Now Do More Computing Where There’s Cleaner Energy by R. Koningstein, Google Research.