Berkeley Lab Leads Effort to Build AI Assistant for Energy Materials Discovery
Developing the lithium-ion battery took decades of research. A new multi-institutional project led by the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) aims to cut that timeline dramatically, using AI and supercomputers to speed the discovery of materials for batteries, semiconductors, and other energy technologies.
The project — called FORUM-AI (Foundation Models Orchestrating Reasoning Agents to Uncover Materials Advances and Insights) — supports the Genesis Mission, a new national initiative led by the Department of Energy to advance AI and accelerate discovery, providing solutions for challenges in science, energy, and national security.
“FORUM-AI aims to be the first full-stack, agentic AI system for materials science research and discovery,” said principal investigator Anubhav Jain, a staff scientist in Berkeley Lab’s Energy Technologies Area. “It will help scientists at every step of energy materials research, from hypothesis generation and computer simulations to laboratory experiments and analysis.” Jain is also the Associate Director of the Materials Project, an open-access materials database managed by Berkeley Lab, and the materials capability lead for the Department of Energy’s Durable Module Materials (DuraMat) consortium.
The effort is a collaboration among Berkeley Lab, Oak Ridge National Laboratory, Argonne National Laboratory, the Massachusetts Institute of Technology, and The Ohio State University, with the goal of developing an open-source, general-purpose AI platform for research in materials and the physical sciences. The multi-institutional team was selected by the Department of Energy under its Scientific Discovery through Advanced Computing (SciDAC) program to co-lead the four-year, $10 million collaborative project, which will use some of the nation’s most advanced high-performance computers to develop FORUM-AI.
In this Q&A, Jain shares his perspective on the exciting evolution of computational materials science, and AI’s essential role in accelerating materials discoveries.
Q: What inspired FORUM-AI? How will it help materials researchers?
Jain: The advancements in machine learning over the last few years inspired us to develop the FORUM-AI platform. Almost everyone has used ChatGPT to brainstorm ideas, and scientists have been doing this as well, but FORUM-AI aims to push the boundaries of what’s possible by using the Department of Energy’s leadership computing facilities to evaluate hundreds of hypotheses and research plans of action in parallel.
Traditionally, if you have a research problem, you would test one hypothesis at a time. Under this new framework, instead you could use FORUM-AI to run large-scale simulations on the supercomputers at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC), Oak Ridge National Laboratory’s Oak Ridge Leadership Computing Facility (OLCF), or Argonne National Laboratory’s Argonne Leadership Computing Facility (ALCF) and perhaps execute some robotic experiments on your behalf to see which of these hypotheses seem to be the most promising.
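To make that fan-out pattern concrete, here is a minimal Python sketch of evaluating many candidate hypotheses in parallel. The candidate list, the score_candidate function, and the toy scoring rule are hypothetical placeholders; in the workflow Jain describes, each evaluation would instead launch a physics-based simulation on an HPC system at NERSC, OLCF, or ALCF.

```python
from concurrent.futures import ProcessPoolExecutor

# Hypothetical candidate materials; a real campaign would draw these from
# hypothesis-generation agents rather than a hard-coded list.
CANDIDATES = ["LiCoO2", "LiFePO4", "LiMn2O4", "LiNiO2"]

def score_candidate(formula: str) -> float:
    """Stand-in for an expensive physics-based simulation.

    In the real workflow this would submit a DFT or molecular-dynamics job
    to a supercomputer and return a figure of merit for the candidate.
    """
    return float(sum(ord(ch) for ch in formula) % 100)  # toy placeholder score

if __name__ == "__main__":
    # Evaluate all candidates in parallel, then rank them by score.
    with ProcessPoolExecutor() as pool:
        scores = list(pool.map(score_candidate, CANDIDATES))
    ranked = sorted(zip(CANDIDATES, scores), key=lambda pair: pair[1], reverse=True)
    for formula, score in ranked:
        print(f"{formula}: {score:.1f}")
```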
This assistant will use three classes of AI to do this work: generative AI, which makes images or writes text; reasoning models, which work through an internal thought process to recommend how to solve your materials science problems and help you interpret data; and agentic models, which perform actions on your behalf, like running a simulation or controlling experimental facilities.
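As a rough illustration of the agentic class, the loop below shows the general propose-act-observe pattern in Python. The tool functions, the rule-based policy, and the returned values are hypothetical stand-ins for a reasoning model and real simulation or instrument interfaces; this sketches the general pattern, not FORUM-AI’s actual implementation.

```python
# Toy tools the agent can call; real tools would launch simulations or
# drive experimental hardware.
def run_simulation(material: str) -> str:
    return f"simulated property of {material} (placeholder)"

def query_database(material: str) -> str:
    return f"database entry for {material} (placeholder)"

TOOLS = {"query_database": query_database, "run_simulation": run_simulation}

def toy_policy(goal: str, history: list) -> tuple:
    """Stand-in for a reasoning model choosing the next action."""
    if not history:
        return ("query_database", goal)
    if len(history) == 1:
        return ("run_simulation", goal)
    return ("stop", None)

def agent_loop(goal: str) -> list:
    history = []
    while True:
        tool_name, arg = toy_policy(goal, history)
        if tool_name == "stop":
            return history
        observation = TOOLS[tool_name](arg)  # act, then feed the result back
        history.append((tool_name, observation))

print(agent_loop("CdTe"))
```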
This is important because we face complex energy challenges that require more resources to solve. High-performance computers and AI will help us greatly accelerate the pace of discovery at every step of the process, from planning out research studies and running computer simulations to performing experiments in the lab.
Q: How do you know if the FORUM-AI assistant’s information is based on fact?
Jain: That’s a great question, because one of the shortcomings of AI in the sciences is the tendency of these AI models to hallucinate information, meaning they give you back information that’s incorrect or untrue.
There are a few aspects of this project designed to prevent those kinds of issues.
First is a high-quality database of specific materials data, so that when an AI agent is asked, for example, for the band gaps of cadmium telluride and silicon, the agent doesn’t have to rely on its own model weights, its own memory, to give you a response. Rather, it can look up the data in a verified database and return accurate results on that question (see the sketch below).
Second is transparency, so that the researcher can see how the AI plans to tackle a particular problem. Research plans and reasoning traces will be inspectable and visualizable, so the researcher can edit or disregard a plan if it looks incorrect.
And finally, these AI agents will use standard, physics-based simulation tools to predict the properties of a material. Because these tools and methods are well benchmarked, the agents will rely on community-standard approaches, which helps ensure reliability.
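As an illustration of how the first safeguard can work together with the third, here is a minimal Python sketch of a grounded lookup tool: the agent answers from a small curated table standing in for a verified database such as the Materials Project, and a missing entry falls back to a benchmarked simulation rather than a guess. The table, the function name, and the numbers (approximate experimental band gaps) are illustrative placeholders, not part of FORUM-AI.

```python
# Small curated table standing in for a verified materials database.
# Values are approximate experimental band gaps in eV, for illustration only.
VERIFIED_BAND_GAPS_EV = {
    "CdTe": 1.5,
    "Si": 1.1,
}

def lookup_band_gap(formula: str):
    """Return a verified band gap, or None so the agent defers to a
    well-benchmarked physics-based simulation instead of guessing."""
    return VERIFIED_BAND_GAPS_EV.get(formula)

for formula in ("CdTe", "Si", "GaAs"):
    gap = lookup_band_gap(formula)
    if gap is None:
        print(f"{formula}: not in verified database; defer to simulation")
    else:
        print(f"{formula}: {gap} eV (verified database)")
```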
Q: We hear a lot about the energy demands of AI. How could FORUM-AI enable an AI platform that is more energy efficient than current AI technologies?
Jain: We are going to work on what’s called distillation, which is where you take a more expensive, compute-intensive model, and then you find a way to train a smaller model that essentially reproduces the behavior of the larger model. Using this distilled model is less energy intensive.
A distilled model can often fit on a laptop, and can be run on your own computer, which is very useful for researchers. You could, for example, attach it to a device, like an X-ray diffraction machine, whereas a large model that’s only usable on a supercomputer is much more difficult to operate for the general user.
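For readers curious what distillation looks like in code, below is a minimal sketch of the standard soft-target distillation loss in PyTorch: the small student model is trained to reproduce the softened output distribution of the large teacher model. The architectures, data, and temperature value are omitted or illustrative, so this is a sketch of the general technique rather than FORUM-AI’s training recipe.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """Push the student's softened output distribution toward the teacher's."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2

# Toy usage with random logits; in practice these come from the large teacher
# model and the small student model evaluated on the same batch of inputs.
student = torch.randn(8, 10)
teacher = torch.randn(8, 10)
print(distillation_loss(student, teacher))
```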
Q: Why are the national labs essential to AI research?
Jain: The national labs are an ideal place to develop AI-assisted materials research because for the last few decades they’ve been laying the foundation for it.
For example, at Berkeley Lab, we have been creating the Materials Project, which is a large database of materials properties that the agentic AI can use to better inform its decisions. We’ve also been developing software tools that allow you to automate simulations of materials properties, which has traditionally been very difficult to do. In fact, when we started the Materials Project, many people thought it would be impossible to automate materials simulations because they simply required too much physical insight to determine the parameters.