It’s estimated that about 70 percent of the energy generated worldwide ultimately ends up as waste heat.
If scientists could better predict how heat moves through semiconductors and insulators, they could design more efficient power generation systems. However, the thermal properties of materials can be exceedingly difficult to model.
The difficulty comes from phonons, the quasiparticles that carry heat. Some of a material’s thermal properties depend on a measurement called the phonon dispersion relation, which can be incredibly hard to obtain, let alone use in the design of a system.
A team of researchers from MIT and elsewhere tackled this challenge by rethinking the problem from the ground up. The result of their work is a new machine-learning framework that can predict phonon dispersion relations up to 1,000 times faster than other AI-based techniques, with comparable or even better accuracy. Compared to more traditional, non-AI-based approaches, it could be 1 million times faster.
This method could help engineers design energy generation systems that produce more power, more efficiently. It could also be used to develop more efficient microelectronics, since managing heat remains a major bottleneck to speeding up electronics.
“Phonons are the culprit for the thermal loss, yet obtaining their properties is notoriously difficult, either computationally or experimentally,” says Mingda Li, associate professor of nuclear science and engineering and senior author of a paper on this technique.
Li is joined on the paper by co-lead authors Ryotaro Okabe, a chemistry graduate student, and Abhijatmedhi Chotrattanapituk, an electrical engineering and computer science graduate student; Tommi Jaakkola, the Thomas Siebel Professor of Electrical Engineering and Computer Science at MIT; as well as others at MIT, Argonne National Laboratory, Harvard University, the University of South Carolina, Emory University, the University of California at Santa Barbara, and Oak Ridge National Laboratory. The research appears in Nature Computational Science.
Predicting phonons
Heat-carrying phonons are tricky to predict because they have an extremely wide frequency range, and the particles interact and travel at different speeds.
A material’s phonon dispersion relation is the relationship between the energy and momentum of phonons in its crystal structure. For years, researchers have tried to predict phonon dispersion relations using machine learning, but there are so many high-precision calculations involved that models get bogged down.
“If you have 100 CPUs and a few weeks, you could probably calculate the phonon dispersion relation for one material. The whole community really wants a more efficient way to do this,” says Okabe.
The machine-learning models scientists often use for these calculations are known as graph neural networks (GNNs). A GNN converts a material’s atomic structure into a crystal graph comprising multiple nodes, which represent atoms, connected by edges, which represent the interatomic bonding between atoms.
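As a rough illustration of that representation, the sketch below builds a tiny crystal graph by treating atoms as nodes and connecting any pair of atoms closer than a distance cutoff with an edge. It is a minimal toy, not the team’s code; the function name, feature choice, and cutoff value are all hypothetical.

```python
# Minimal sketch of a crystal graph: atoms -> nodes, close atom pairs -> edges.
# Illustrative only; names, features, and the cutoff are assumptions.
import numpy as np

def build_crystal_graph(positions, species, cutoff=3.0):
    """Return node features and a directed edge list for a simple crystal graph.

    positions: (N, 3) array of atomic coordinates in angstroms
    species:   length-N list of atomic numbers (used here as node features)
    cutoff:    distance below which two atoms are considered bonded
    """
    positions = np.asarray(positions, dtype=float)
    num_atoms = len(positions)
    edges = []
    for i in range(num_atoms):
        for j in range(i + 1, num_atoms):
            if np.linalg.norm(positions[i] - positions[j]) < cutoff:
                edges.append((i, j))
                edges.append((j, i))  # one bond becomes two directed edges
    node_features = np.array(species, dtype=float).reshape(-1, 1)
    return node_features, edges

# Toy two-atom cell (e.g., one Ga and one As atom)
nodes, edges = build_crystal_graph(
    positions=[[0.0, 0.0, 0.0], [1.4, 1.4, 1.4]],
    species=[31, 33],
)
print(nodes.shape, edges)  # (2, 1) [(0, 1), (1, 0)]
```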
While GNNs work well for calculating many quantities, like magnetization or electrical polarization, they are not flexible enough to efficiently predict an extremely high-dimensional quantity like the phonon dispersion relation. Because phonons can travel around atoms on X, Y, and Z axes, their momentum space is hard to model with a fixed graph structure.
To gain the flexibility they needed, Li and his collaborators devised virtual nodes.
They create what they call a virtual node graph neural network (VGNN) by adding a series of flexible virtual nodes to the fixed crystal structure to represent phonons. The virtual nodes enable the output of the neural network to vary in size, so it is not restricted by the fixed crystal structure.
Virtual nodes are connected to the graph in such a way that they can only receive messages from real nodes. While virtual nodes will be updated as the model updates real nodes during computation, they do not affect the accuracy of the model.
“The way we do this is very efficient in coding. You just generate a few more nodes in your GNN. The physical location doesn’t matter, and the real nodes don’t even know the virtual nodes are there,” says Chotrattanapituk.
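The toy sketch below illustrates that one-way arrangement under simplified assumptions; it is not the published VGNN architecture. Real nodes exchange messages with each other as usual, while each virtual node only pools information from the real nodes and sends nothing back, so the number of virtual nodes, and therefore the output size, can be chosen freely.

```python
# Toy one-way message passing with virtual nodes (illustrative assumptions only):
# real nodes update from other real nodes; virtual nodes receive pooled messages
# from real nodes but never send messages back.
import numpy as np

rng = np.random.default_rng(0)

def message_passing_step(real_h, virtual_h, edges, W_real, W_virtual):
    """One update step.

    real_h:    (N_real, d) real-node embeddings (atoms)
    virtual_h: (N_virt, d) virtual-node embeddings (e.g., one per phonon band)
    edges:     directed edges (i, j) between real nodes only
    """
    # Real nodes aggregate messages from neighboring real nodes.
    new_real = real_h.copy()
    for i, j in edges:
        new_real[j] += real_h[i] @ W_real
    # Each virtual node receives the pooled state of all real nodes,
    # but contributes nothing back, so real-node updates are unaffected.
    pooled = new_real.mean(axis=0)
    new_virtual = virtual_h + pooled @ W_virtual
    return np.tanh(new_real), np.tanh(new_virtual)

d = 8
real_h = rng.normal(size=(2, d))      # two atoms
virtual_h = rng.normal(size=(6, d))   # six virtual nodes -> six output bands
W_real, W_virtual = rng.normal(size=(d, d)), rng.normal(size=(d, d))
real_h, virtual_h = message_passing_step(
    real_h, virtual_h, [(0, 1), (1, 0)], W_real, W_virtual
)
print(virtual_h.shape)  # (6, 8): output size is set by the number of virtual nodes
```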
Cutting out complexity
Because it has virtual nodes to represent phonons, the VGNN can skip many complex calculations when estimating phonon dispersion relations, which makes the method more efficient than a standard GNN.
The researchers proposed three different versions of VGNNs with increasing complexity. Each can be used to predict phonons directly from a material’s atomic coordinates.
Because their approach has the flexibility to rapidly model high-dimensional properties, they can use it to estimate phonon dispersion relations in alloy systems. These complex mixtures of metals and nonmetals are especially difficult for traditional approaches to model.
The researchers also found that VGNNs offered slightly greater accuracy when predicting a material’s heat capacity. In some instances, prediction errors were two orders of magnitude lower with their technique.
A VGNN could be used to calculate phonon dispersion relations for a few thousand materials in just a few seconds on a laptop, Li says.
This efficiency could enable scientists to search a larger space when seeking materials with certain thermal properties, such as superior thermal storage, energy conversion, or superconductivity.
In addition, the virtual node technique is not exclusive to phonons, and can also be used to predict challenging optical and magnetic properties.
In the future, the researchers want to refine the technique so virtual nodes have greater sensitivity to capture small changes that can affect phonon structure.
“Researchers got too comfortable using graph nodes to represent atoms, but we can rethink that. Graph nodes can be anything. And virtual nodes are a very generic approach you could use to predict a lot of high-dimensional quantities,” Li says.
“The authors’ innovative approach significantly augments the graph neural network description of solids by incorporating key physics-informed elements through virtual nodes, for instance, informing wave-vector dependent band-structures and dynamical matrices,” says Olivier Delaire, associate professor in the Thomas Lord Department of Mechanical Engineering and Materials Science at Duke University, who was not involved with this work. “I find that the level of acceleration in predicting complex phonon properties is amazing, several orders of magnitude faster than a state-of-the-art universal machine-learning interatomic potential. Impressively, the advanced neural net captures fine features and obeys physical rules. There is great potential to expand the model to describe other important material properties: Electronic, optical, and magnetic spectra and band structures come to mind.”
This work is supported by the U.S. Department of Energy, National Science Foundation, a Mathworks Fellowship, a Sow-Hsin Chen Fellowship, the Harvard Quantum Initiative, and the Oak Ridge National Laboratory.