HOUSTON – (April 30, 2021) – Microscopic structures and material properties are intertwined, and customizing them is a challenge. Rice University engineers are determined to simplify the process through machine learning.
To this end, materials scientist Ming Tang’s Rice lab, in collaboration with physicist Fei Zhou of Lawrence Livermore National Laboratory, introduced a technique to predict the evolution of microstructures – structural features between 10 nanometers and 100 microns – in materials.
Their open-access article in the Cell Press journal Patterns shows how neural networks (computer models that mimic neurons in the brain) can be trained to predict how a structure will develop in a certain environment, much like a snowflake forms from moisture in nature.
In fact, snowflake-shaped dendritic crystal structures were one of the examples the lab used in their proof-of-concept study.
“In modern materials science, it is widely accepted that microstructure often plays a critical role in controlling the properties of a material,” Tang said. “You not only want to control how atoms are laid out on lattices, but also what the microstructure looks like, giving you good performance and even new functionality.
“The holy grail of materials design is being able to predict how a microstructure will change under given conditions, whether we heat it up or apply a stress or some other kind of stimulation,” he said.
Tang has worked to refine microstructure prediction throughout his career, but said the traditional equation-based approach faces significant challenges in enabling scientists to meet the demand for new materials.
“The huge advancements in machine learning have encouraged Lawrence Livermore’s Fei and us to see if we can apply it to materials,” he said.
Fortunately, there was a lot of data from the traditional method to help train the team’s neural networks, which take images of a microstructure’s early evolution and predict the next step, and the next, and so on.
“That’s what machine learning is good at, seeing correlations in a very complex way that the human mind is not able to,” Tang said. “We are taking advantage of that.”
The researchers tested their neural networks on four distinct types of microstructure: plane wave propagation, grain growth, spinodal decomposition, and dendritic crystal growth.
In each test, the networks were fed between 1000 and 2000 sets of 20 successive images illustrating the evolution of the microstructure of a material as predicted by the equations. After learning the rules of evolution from this data, the network then received 1 to 10 images to predict the next 50 to 200 images, and usually did so within seconds.
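The rollout procedure described above can be illustrated with a toy sketch. A model trained to map one microstructure frame to the next is applied to its own output repeatedly, so a few seed frames yield a long predicted trajectory. This is an assumption-laden illustration, not the paper's code: a single fitted scalar on synthetic smoothing-and-decay data stands in for the convolutional recurrent network, and all names and dynamics here are invented for demonstration.

```python
import numpy as np

# Illustrative stand-in for the paper's method: the true "evolution" here
# is neighbor-averaging plus decay, and the "model" is one least-squares
# coefficient rather than a neural network. Only the autoregressive
# rollout pattern mirrors what the article describes.

rng = np.random.default_rng(0)

def smooth(frame):
    """Average each pixel with its four neighbors (edge-padded)."""
    p = np.pad(frame, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0

def make_sequence(n_frames=20, size=16, decay=0.9):
    """Toy trajectory: each frame is the previous one smoothed and decayed."""
    frames = [rng.random((size, size))]
    for _ in range(n_frames - 1):
        frames.append(decay * smooth(frames[-1]))
    return np.stack(frames)

# "Training data": many short sequences, analogous to the 1,000-2,000
# 20-frame sets fed to the networks in the study.
seqs = [make_sequence() for _ in range(50)]

# Fit one scalar mapping smooth(frame_t) -> frame_{t+1} in closed form.
xs = np.concatenate([np.stack([smooth(f) for f in s[:-1]]).ravel() for s in seqs])
ys = np.concatenate([s[1:].ravel() for s in seqs])
coef = float(xs @ ys / (xs @ xs))

# Autoregressive rollout: seed with a single frame, then feed each
# prediction back in to extend the trajectory many steps ahead.
pred = make_sequence(1)[0]
for _ in range(50):
    pred = coef * smooth(pred)
```

Because the toy dynamics are exactly linear, the fitted coefficient recovers the decay factor; a real microstructure model must instead learn nonlinear evolution rules, which is what motivates the neural network.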
The advantages of the new technique quickly became evident: Neural networks, powered by graphics processors, accelerated calculations up to 718 times for grain growth compared to the previous algorithm. Even run on a standard central processor, they were still 87 times faster than the old method. Predictions of the other types of microstructure evolution showed similar, though less dramatic, speedups.
Comparisons with images from the traditional simulation method proved that the predictions were largely correct, Tang said. “Based on this, we see how we can update the parameters to make the prediction more and more accurate,” he said. “Then we can use those predictions to help design materials we’ve never seen before.
“Another advantage is that it is able to make predictions even when we don’t know everything about the properties of materials in a system,” Tang said. “We couldn’t do this with the equation-based method, which needs to know all the parameter values in the equations to run simulations.”
Tang said the computational efficiency of neural networks could accelerate the development of new materials. He expects this to be helpful in his lab’s ongoing design for more efficient batteries. “We are thinking about new three-dimensional structures that will help charge and discharge batteries much faster than what we currently have,” Tang said. “It’s an optimization problem that fits perfectly with our new approach.”
Rice graduate student Kaiqi Yang is the lead author of the article. Co-authors are Rice alumnus Yifan Cao, Rice graduate students Youtian Zhang and Shaoxun Fan, and Lawrence Livermore researchers Daniel Aberg and Babak Sadigh. Zhou is a physicist at Lawrence Livermore. Tang is an assistant professor of materials science and nanoengineering at Rice.
The Department of Energy, the National Science Foundation, and the American Chemical Society Petroleum Research Fund supported the research.
Read the article on https: /
This press release can be viewed online at https: /
Follow Rice News and media relations via Twitter @RiceUNews.
Mesoscale Materials Science Group: http: // tanggroup.
Department of Materials Science and NanoEngineering: https: /
George R. Brown School of Engineering: https: /
Image to download:
Engineers at Rice University and the Lawrence Livermore National Laboratory are using neural networks to accelerate the prediction of the evolution of material microstructures. This example predicts the growth of snowflake shaped dendritic crystals. (Credit: Mesoscale Materials Science Group / Rice University)
Located on a 300-acre wooded campus in Houston, Rice University is consistently ranked among the top 20 universities in the country by U.S. News & World Report. Rice has highly respected schools of Architecture, Business, Continuing Studies, Engineering, Humanities, Music, Natural Sciences and Social Sciences, and is home to the Baker Institute for Public Policy. With 3,978 undergraduates and 3,192 graduate students, Rice’s undergraduate student-to-faculty ratio is just under 6-to-1. Its residential college system builds tight-knit communities and lasting friendships, one reason Rice is ranked No. 1 for lots of race/class interaction and No. 1 for quality of life by the Princeton Review. Rice is also rated as the best value among private universities by Kiplinger’s Personal Finance.