AI Helps Scientists Quantify Irradiation Effects
A novel convolutional neural network combined with advanced microscopy offers a path to automated and reliable radiation defect analysis.
The Science
Physicist Sébastien Balibar said of nuclear fusion reactors, “We say that we will put the sun into a box. The idea is pretty. The problem is, we don’t know how to make the box.” For example, the materials used to make this nuclear reaction “box”—mostly metallic alloys—must withstand significant damage from radiation. This damage dictates a reactor’s operational safety and its lifetime. By developing artificial intelligence (AI) computer vision models, nuclear materials scientists have created a powerful new tool to automate the detection of defects in alloys. This tool provides statistically meaningful defect quantification to develop a better understanding of the effects of irradiation damage on materials performance.
The Impact
The novel DefectSegNet tool for defect detection is built on a convolutional neural network (CNN), an AI computer vision model. The tool demonstrates the feasibility of using AI with microscopy data to find defects in structural alloys quickly and accurately. The CNN approach could transform defect analysis and open new opportunities to standardize defect quantification for reactor materials. The result would be faster development of next-generation alloys for future nuclear reactors, which in turn would contribute to nuclear power plants that help stabilize our climate.
Summary
To understand how defects induced by neutron irradiation govern changes in reactor alloy properties, scientists must characterize and quantify those defects. The lack of automated defect analysis has hindered statistically meaningful quantification of radiation-induced defects, creating a growing bottleneck in the design of alloys for nuclear reactors.
The research team drew on their combined expertise in microscopy and computer vision. To pave the way for reliable defect detection with computer vision AI, the team first resolved well-known contrast problems in conventional defect images by establishing an alternative advanced imaging mode that records defect contrast with better clarity and feature homogeneity. Using these high-quality images as training data, the team developed a new hybrid deep CNN architecture called DefectSegNet. Compared to human experts, defect quantification assisted by this AI is more accurate, more reproducible, and at least two orders of magnitude more efficient. These encouraging results demonstrate the potential of deep learning to accelerate scientific understanding of radiation damage and, in turn, the development of next-generation, high-performance structural alloys for nuclear fusion reactors.
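For readers curious about what such a defect-segmentation CNN looks like in practice, the sketch below shows a minimal encoder-decoder network in PyTorch that maps a grayscale micrograph to a per-pixel defect probability map. It is an illustration only, under assumed layer sizes and names (TinySegNet, channel counts, the toy training step); it is not the published DefectSegNet architecture, which the team describes in the Scientific Reports paper listed under Publications.

```python
# Illustrative sketch only: a minimal encoder-decoder CNN for pixel-wise
# (semantic) segmentation of defect images. Layer sizes, names, and the
# toy training step are assumptions for illustration, not the authors'
# published DefectSegNet design.
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Maps a single-channel micrograph to one defect logit per pixel."""
    def __init__(self, channels=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # downsample by 2
            nn.Conv2d(channels, 2 * channels, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(2 * channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 1, 1),            # 1 logit per pixel
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Toy usage: a random 256x256 "image" and a sparse binary defect mask.
model = TinySegNet()
image = torch.rand(1, 1, 256, 256)               # batch, channel, height, width
mask = (torch.rand(1, 1, 256, 256) > 0.9).float()
loss = nn.BCEWithLogitsLoss()(model(image), mask)
loss.backward()                                  # one optimizer step would follow
```

In a real workflow, the random tensors would be replaced by advanced STEM images paired with expert-labeled defect masks, and the trained network would then label defects across new micrographs far faster than manual counting.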
Contact
Yuanyuan Zhu
University of Connecticut
yuanyuan.2.zhu@uconn.edu
860-486-2378
Danny J. Edwards
Pacific Northwest National Laboratory
dan.edwards@pnnl.gov
509-371-7284
Funding
The work was supported by the Department of Energy Office of Science, Fusion Energy Sciences program, and the Office of Nuclear Energy. The University of Connecticut High Performance Computing facility and the Pacific Northwest National Laboratory Research Computing Program provided the computing resources for CNN model training in this research.
Publications
Roberts, G., et al., “Deep Learning for Semantic Segmentation of Defects in Advanced STEM Images of Steels.” Scientific Reports 9, 12744 (2019) [DOI: 10.1038/s41598-019-49105-0]
Zhu, Y., et al., “Towards bend-contour-free dislocation imaging via diffraction contrast STEM.” Ultramicroscopy 193, 12 (2018) [DOI: 10.1016/j.ultramic.2018.06.001]
Related Links
None
Highlight Categories
Program: FES
Performer: University
Additional: NE