Document Type
Article
Publication Date
Winter 1-20-2022
Abstract
This work investigates and applies machine learning paradigms seldom seen in analytical spectroscopy for quantification of gallium in cerium matrices via processing of laser-plasma spectra. Ensemble regressions, support vector machine regressions, Gaussian kernel regressions, and artificial neural network techniques are trained and tested on cerium-gallium pellet spectra. An initial hyperparameter optimization experiment is conducted to determine the best design features for each model. The optimized models are then evaluated for sensitivity and precision using the limit of detection (LoD) and root mean-squared error of prediction (RMSEP) metrics, respectively. Gaussian kernel regression yields the best-performing predictive model, with an RMSEP of 0.33% and an LoD of 0.015% for quantification of Ga in a Ce matrix. This study concludes that these machine learning methods could yield robust prediction models for rapid quality control analysis of plutonium alloys.
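The two figures of merit named in the abstract are standard chemometric metrics. As a rough illustration (not the authors' code, and with entirely hypothetical validation data), RMSEP is the root mean-squared difference between predicted and reference concentrations, and a common 3-sigma form of the LoD divides three times the standard deviation of blank-sample signals by the calibration slope:

```python
import math
import statistics

def rmsep(y_true, y_pred):
    """Root mean-squared error of prediction: precision of the regression model."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def limit_of_detection(blank_signals, calibration_slope):
    """3-sigma limit of detection: 3 * std. dev. of blank signal / calibration slope."""
    return 3 * statistics.stdev(blank_signals) / calibration_slope

# Hypothetical reference vs. model-predicted Ga concentrations (wt. %)
y_true = [0.5, 1.0, 2.0, 3.0]
y_pred = [0.6, 0.9, 2.2, 2.8]
print(round(rmsep(y_true, y_pred), 3))  # → 0.158

# Hypothetical blank-sample signals and calibration slope
blanks = [0.010, 0.012, 0.011, 0.013]
print(round(limit_of_detection(blanks, 1.0), 4))
```

The specific model architectures, spectral preprocessing, and calibration procedure used in the study are detailed in the full article.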
DOI
10.1364/AO.444093
Source Publication
Applied Optics
Recommended Citation
Ashwin P. Rao, Phillip R. Jenkins, John D. Auxier, Michael B. Shattan, and Anil K. Patnaik, "Development of advanced machine learning models for analysis of plutonium surrogate optical emission spectra," Appl. Opt. 61, D30-D38 (2022). https://doi.org/10.1364/AO.444093 https://opg.optica.org/ao/abstract.cfm?URI=ao-61-7-D30
Included in
Atomic, Molecular and Optical Physics Commons, Data Science Commons, Engineering Physics Commons, Nuclear Commons
Comments
The "Download" button on this page provides the CHORUS-furnished accepted manuscript of the article.
The version of record of the article is available to subscribers of Applied Optics at the URL in the citation above.
Copyright statement: © 2022 Optica Publishing Group
An embargo was imposed on the accepted manuscript and lifted in January 2023, in accordance with research funding requirements.