Researchers at Pennsylvania State University are harnessing the artificial intelligence technology used to generate deepfakes – synthetic media in which a person in an existing image or video is replaced with someone else’s likeness – to power the next wave of innovations in materials design.
‘We hear a lot about deepfakes in the news today,’ said Wesley Reinhart, assistant professor of materials science and engineering at Penn State. ‘That’s exactly the same technology we used in our research. We’re basically just swapping out this example of images of human faces for elemental compositions of high-performance alloys.’
The researchers trained a so-called generative adversarial network (GAN) to create novel refractory high-entropy alloys – materials that maintain their strength even when exposed to ultra-high temperatures. The alloys have a wide range of applications, from turbine blades to rockets.
‘There are a lot of rules about what makes an image of a human face or what makes an alloy, and it would be really difficult for you to know what all those rules are or to write them down by hand,’ Reinhart said. ‘The whole principle of this GAN is you have two neural networks that basically compete in order to learn what those rules are, and then generate examples that follow the rules.’
The two neural networks in the GAN are a generator that creates new compositions and a critic that attempts to discern whether these novel alloys look realistic compared to a training dataset derived from hundreds of published examples of alloys. As this adversarial game continues over numerous iterations, the generator’s ability to create new alloys improves.
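The adversarial game between generator and critic can be sketched in toy form. The code below is a minimal illustration, not the researchers' model: a linear 'generator' maps random noise through a softmax so its outputs are valid element fractions for a hypothetical three-element alloy, and a logistic-regression 'critic' learns to separate them from a stand-in 'training set' drawn from a Dirichlet distribution. The architecture, the data, and all parameter values are invented for illustration; a real materials GAN would use deeper networks trained on published alloy compositions.

```python
import numpy as np

rng = np.random.default_rng(0)
N_ELEM, N_NOISE, BATCH, STEPS, LR = 3, 2, 64, 3000, 0.1

# Stand-in "training set": compositions from a Dirichlet whose mean
# is (0.5, 0.3, 0.2) -- a toy substitute for published alloy data.
def sample_real(n):
    return rng.dirichlet([10.0, 6.0, 4.0], size=n)

# Generator parameters: noise z -> softmax(A z + b), so every output
# is a vector of element fractions that sums to one.
A = 0.01 * rng.standard_normal((N_ELEM, N_NOISE))
b = np.zeros(N_ELEM)
# Critic parameters: logistic regression sigmoid(w . x + c).
w = np.zeros(N_ELEM)
c = 0.0

def softmax(h):
    e = np.exp(h - h.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def generate(n):
    z = rng.standard_normal((n, N_NOISE))
    return z, softmax(z @ A.T + b)

sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))

for _ in range(STEPS):
    # Critic step: learn to tell real compositions from generated ones.
    xr = sample_real(BATCH)
    _, xf = generate(BATCH)
    pr, pf = sigmoid(xr @ w + c), sigmoid(xf @ w + c)
    gw = (-(1 - pr)[:, None] * xr + pf[:, None] * xf).mean(axis=0)
    gc = (-(1 - pr) + pf).mean()
    w -= LR * gw
    c -= LR * gc
    # Generator step: adjust A, b to fool the critic
    # (non-saturating loss, gradient taken through the softmax).
    z, xf = generate(BATCH)
    pf = sigmoid(xf @ w + c)
    gh = -(1 - pf)[:, None] * xf * (w[None, :] - (xf @ w)[:, None])
    A -= LR * (gh.T @ z) / BATCH
    b -= LR * gh.mean(axis=0)

_, fake = generate(1000)
print(fake.mean(axis=0))  # drifts toward the real mean (0.5, 0.3, 0.2)
```

Neither network is ever told the 'rules' of a valid composition; as in Reinhart's description, the competition alone pushes the generator's outputs toward the statistics of the training set.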
After the training was complete, the scientists asked the generator to focus on creating alloy compositions that had specific properties that would be ideal for use in turbine blades.
‘Our preliminary results show that generative models can learn complex relationships in order to generate novelty on demand,’ said Zi-Kui Liu, Dorothy Pate Enright Professor of materials science and engineering at Penn State. ‘This is phenomenal. It’s really what we are missing in our computational community in materials science in general.’
Traditional, or rational, design relies on human intuition to find patterns and improve materials. However, this has become increasingly challenging as materials chemistry and processing have grown more complex, which is where the ‘inverse design’ carried out by the AI comes into its own, the researchers said.
‘When you are dealing with design problems you often have dozens or even hundreds of variables you can change,’ Reinhart said. ‘Your brain just isn’t wired to think in 100-dimensional space; you can’t even visualise it. So, one thing that this technology does for us is to compress it down and show us patterns we can understand. We need tools like this to be able to even tackle the problem. We simply can’t do it by brute force.
‘With rational design, you have to go through each one of these steps one at a time; do simulations, check tables, consult other experts,’ he continued. ‘Inverse design is basically handled by this statistical model. You can ask for a material with defined properties and get 100 or 1,000 compositions that might be suitable in milliseconds.’
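The query-for-properties workflow Reinhart describes can be sketched as a generate-then-screen loop. Everything below is hypothetical: the 'generator' is a random simplex sampler standing in for a trained model, the element list is a typical refractory set, and the property surrogate is a crude weighted average of pure-element melting points rather than a real predictor such as a CALPHAD calculation or a trained property model.

```python
import numpy as np

rng = np.random.default_rng(1)
ELEMENTS = ["Nb", "Mo", "Ta", "W"]  # a typical refractory-alloy palette

# Stand-in for a trained generator: random compositions on the simplex.
# In a real workflow the GAN's generator would be queried here instead.
def generator(n):
    return rng.dirichlet(np.ones(len(ELEMENTS)), size=n)

# Hypothetical property surrogate: composition-weighted average of the
# pure-element melting points (K), used as a rough high-temperature proxy.
MELTING_K = np.array([2750.0, 2896.0, 3290.0, 3695.0])

def predicted_property(x):
    return x @ MELTING_K

# Inverse design as generate-then-screen: draw many candidates,
# score them, and keep the 100 with the highest predicted property.
candidates = generator(1000)
scores = predicted_property(candidates)
top = candidates[np.argsort(scores)[::-1][:100]]
```

Because each query is just sampling plus a vectorised scoring pass, returning a ranked shortlist of a hundred candidate compositions takes milliseconds, in contrast to the simulate-check-consult loop of rational design.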
The generator model isn’t perfect, however: its suggestions must still be validated with high-fidelity simulations. Even so, the researchers said it removes much of the guesswork and offers a promising new tool for deciding which materials to try.
The findings have been published in the Journal of Materials Informatics.