Using AI, researchers have developed a new method for turning X-ray data into 3D visualizations hundreds of times faster than traditional methods.
A team of scientists from Argonne National Laboratory has developed a new method for turning X-ray data into visible, 3D images with the help of AI. The study, published in Applied Physics Reviews, describes a computational framework capable of taking data from the lab’s Advanced Photon Source (APS) and creating 3D visualizations hundreds of times faster than traditional methods.
“In order to make full use of what the upgraded APS will be capable of, we have to reinvent data analytics. Our current methods are not enough to keep up. Machine learning can make full use and go beyond what is currently possible,” Mathew Cherukara, a computational scientist at Argonne and study coauthor, said in a press release.
The advancement could benefit many areas of study that rely on large amounts of 3D data, ranging from astronomy to nanoscale imaging.
Described as one of the most technologically complex machines in the world, the APS uses extremely bright X-ray beams to help researchers see the structure of materials at the molecular and atomic level. As these beams of light bounce off an object, detectors collect them in the form of data. With time and complex computations, this data is converted into images, revealing the object’s structure.
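The article does not name those computations, but in coherent diffraction imaging the reconstruction is typically done with iterative phase retrieval, which bounces back and forth between real space and Fourier space until the recovered object reproduces the measured diffraction data. The sketch below shows the classic error-reduction variant on a 2D example; the function name, parameters, and constraints are illustrative assumptions, not code from the study.

```python
import numpy as np

def error_reduction(measured_magnitude, support, n_iters=500, seed=0):
    """Minimal 2D error-reduction phase retrieval (illustrative sketch).

    measured_magnitude : |F(object)| as recorded by the detector
    support            : boolean mask marking where the object may be nonzero
    """
    rng = np.random.default_rng(seed)
    # Start from the measured magnitudes with random phases.
    phases = rng.uniform(0.0, 2.0 * np.pi, measured_magnitude.shape)
    field = measured_magnitude * np.exp(1j * phases)

    for _ in range(n_iters):
        # Real-space constraint: zero out everything outside the support
        # and force the object to be real and non-negative.
        obj = np.fft.ifft2(field).real
        obj = np.where(support & (obj > 0), obj, 0.0)

        # Fourier-space constraint: keep the current phases but restore
        # the magnitudes actually measured at the detector.
        field = np.fft.fft2(obj)
        field = measured_magnitude * np.exp(1j * np.angle(field))

    return np.fft.ifft2(field).real
```

Repeating this kind of iteration in 3D, over large volumes of data, is the slow step that the new neural-network approach is designed to shortcut.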
However, detectors are unable to capture all of the beam data, leaving gaps in the information. The researchers fill these gaps with neural networks: computer models trained to recognize objects and reconstruct an image from the raw data they are fed. With 3D images, this reconstruction can be extremely time-consuming because of the sheer amount of information to process.
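At its core, that approach maps a measured diffraction volume directly to a real-space image with a learned model instead of iterating. The toy PyTorch encoder-decoder below illustrates the idea; it is a minimal sketch with assumed layer sizes and names, not the network architecture published by the Argonne team.

```python
import torch
import torch.nn as nn

class DiffractionToDensity(nn.Module):
    """Toy 3D encoder-decoder: diffraction magnitudes in, real-space volume out.

    This is an illustrative stand-in, not the published 3D-CDI-NN architecture.
    """
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                              # 32^3 -> 16^3
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                              # 16^3 -> 8^3
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="trilinear", align_corners=False),
            nn.Conv3d(32, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=2, mode="trilinear", align_corners=False),
            nn.Conv3d(16, 1, kernel_size=3, padding=1), nn.Sigmoid(),
        )

    def forward(self, diffraction):
        return self.decoder(self.encoder(diffraction))

# A single forward pass replaces many phase-retrieval iterations.
model = DiffractionToDensity()
fake_diffraction = torch.rand(4, 1, 32, 32, 32)   # batch of 32^3 diffraction volumes
print(model(fake_diffraction).shape)              # torch.Size([4, 1, 32, 32, 32])
```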
“We used computer simulations to create crystals of different shapes and sizes, and we converted them into images and diffraction patterns for the neural network to learn. The ease of quickly generating many realistic crystals for training is the benefit of simulations,” said Henry Chan, an Argonne postdoctoral researcher and study coauthor.
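A simplified forward model is enough to see how such training pairs can be generated: build a synthetic crystal on a 3D grid and compute its far-field diffraction pattern as the squared magnitude of its Fourier transform. The NumPy sketch below is an illustrative assumption, not the simulation pipeline used in the study.

```python
import numpy as np

def random_ellipsoid(grid=32, rng=None):
    """A toy synthetic 'crystal': a randomly sized ellipsoid on a 3D grid."""
    rng = rng or np.random.default_rng()
    radii = rng.uniform(4.0, 10.0, size=3)
    z, y, x = np.mgrid[:grid, :grid, :grid] - grid // 2
    return (((x / radii[0]) ** 2 + (y / radii[1]) ** 2 + (z / radii[2]) ** 2) <= 1.0).astype(float)

def diffraction_pattern(density):
    """Far-field diffraction intensity: squared magnitude of the 3D Fourier transform."""
    return np.abs(np.fft.fftshift(np.fft.fftn(density))) ** 2

# Pairs of (diffraction pattern, known crystal shape) become training examples.
rng = np.random.default_rng(42)
crystals = [random_ellipsoid(rng=rng) for _ in range(100)]
patterns = [diffraction_pattern(c) for c in crystals]
```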
The new computational framework, known as 3D-CDI-NN, was developed using GPU resources at Argonne’s Joint Laboratory for System Evaluation, consisting of NVIDIA A100 and RTX 8000 GPUs.
“This paper… greatly facilitates the imaging process. We want to know what a material is, and how it changes over time, and this will help us make better pictures of it as we make measurements,” said Stephan Hruszkewycz, study coauthor and physicist with Argonne’s Materials Science Division.
Read the full article in Applied Physics Reviews >>>