A counting-time optimization method for artificial neural network (ANN) based gamma-ray spectroscopy

With advances in machine learning, artificial neural networks (ANNs) are widely used to improve the performance of gamma-ray spectroscopy based on NaI(Tl) scintillation detectors. Typically, ANN performance is evaluated using test datasets composed of actual measured spectra. However, generating test datasets that encompass a wide range of actual spectra representing various scenarios is often inefficient and time-consuming. Thus, instead of measuring actual spectra, we generated virtual spectra with diverse spectral features by sampling from categorical distribution functions derived from the base spectra of six radioactive isotopes: 54Mn, 57Co, 60Co, 134Cs, 137Cs, and 241Am. For practical applications, we determined the optimum counting time (OCT) as the point at which the change in the Kullback–Leibler divergence value (ΔKLDV) between the synthetic spectra used to train the ANN and the virtual spectra approaches zero. Identification accuracy for actual spectra improved significantly when they were measured up to their respective OCTs. These results demonstrate that the proposed method can effectively determine OCTs for ANN-based gamma-ray spectroscopy without the need to measure actual spectra.
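The following is a minimal sketch of the idea described in the abstract: drawing virtual spectra from a categorical distribution defined by a base spectrum, tracking the Kullback–Leibler divergence against a reference (training) spectrum as the counting time grows, and taking the counting time at which ΔKLD falls below a small threshold as the OCT. The function names, the count-rate parameterization, and the delta_tol threshold are illustrative assumptions, not the authors' implementation.

import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D_KL(p || q) between two spectra (normalized internally)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def simulate_virtual_spectrum(base_spectrum, n_counts, rng):
    """Draw a virtual spectrum of n_counts events from the categorical
    (per-channel) distribution defined by a normalized base spectrum."""
    probs = np.asarray(base_spectrum, dtype=float)
    probs /= probs.sum()
    return rng.multinomial(n_counts, probs)

def find_oct(base_spectrum, reference_spectrum, count_rate_cps,
             times_s, delta_tol=1e-4, rng=None):
    """Scan candidate counting times and return the first time at which the
    change in KL divergence (ΔKLD) between the virtual spectrum and the
    reference spectrum drops below delta_tol."""
    rng = rng or np.random.default_rng(0)
    prev_kld = None
    for t in times_s:
        n_counts = int(count_rate_cps * t)  # expected counts accumulated by time t
        virtual = simulate_virtual_spectrum(base_spectrum, n_counts, rng)
        kld = kl_divergence(virtual, reference_spectrum)
        if prev_kld is not None and abs(prev_kld - kld) < delta_tol:
            return t, kld  # ΔKLD has effectively reached zero
        prev_kld = kld
    return times_s[-1], prev_kld  # fall back to the longest candidate time

In this sketch the base and reference spectra are per-channel count histograms; in practice the reference would be the synthetic spectrum used to train the ANN, and the candidate times and count rate would come from the measurement setup.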
Publisher
Elsevier BV
Issue Date
2024-07
Language
English
Article Type
Article
Citation
Nuclear Engineering and Technology, v.56, no.7, pp.2690-2697
ISSN
1738-5733
DOI
10.1016/j.net.2024.02.029
URI
http://hdl.handle.net/10203/320182
Appears in Collection
NE-Journal Papers (Journal Papers)