Nuclear data

Knowledge about atomic nuclei is important for various applications, such as nuclear medicine, nuclear reactors, and radiometric dating. Estimates of the nuclides' properties, such as cross sections and half-lives, are referred to as nuclear data. These estimates can be obtained by performing experiments. Importantly, the raw data from measurements, such as detector counts, must always be transformed and corrected to reflect the properties of the nuclei, which are independent of the applied experimental technique. In practice, these corrections cannot be done perfectly, and the estimates of nuclear quantities are impacted to varying degrees by the specifics of the employed experimental setup. For this reason, the corrected data from experiments are assessed a second time by evaluators. As a first step of an evaluation, evaluators collect and compare available experimental data. If there is a mismatch between different experiments, detected either by visual inspection of plots or by statistical methods, such as Chauvenet's criterion, an attempt is made to identify the problem by reading the original publications and talking with the experimenters. Depending on the outcome of this investigation, experimental data are then corrected if possible or discarded. A correction may be the adjustment of an estimate or of the associated uncertainty. After this initial assessment of the experimental data, the evaluator applies a statistical procedure, usually the Bayesian version of the Generalized Least Squares method, to combine the information from related experiments and obtain estimates and associated uncertainties. The results of such an evaluation are put into evaluated nuclear data (ENDF) files and integrated into evaluated nuclear data libraries, such as ENDF/B, JEFF, and JENDL. There is also the TENDL library, which is created with the nuclear model code TALYS and features a largely automated library creation process.

Methodological Developments

The generalized least squares (GLS) method is a useful and well-established tool in the field of nuclear data evaluation. In order to use GLS, all relationships between variables, e.g., cross sections and parameters of nuclear models, must be modeled as linear relationships, and the joint distribution must be multivariate normal. These assumptions are sometimes too crude. For instance, some experiments measure the ratio of two cross sections. If the value of the ratio follows a normal distribution, the individual cross sections will not. Also, nuclear models are usually non-linear, and the use of a linear approximation in the GLS method may yield misleading results. For this reason, other evaluation techniques, such as Bayesian Monte Carlo, UMC-B, and BFMC, have been developed to overcome some limitations of the GLS method. Like GLS, they are Bayesian methods. However, unlike GLS, they do not linearize the nuclear model but employ importance sampling to draw samples from the posterior distribution.
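To make the GLS procedure concrete, the following minimal sketch (Python with NumPy) performs a Bayesian GLS update for a linearized model. All numbers are invented placeholders and do not stem from an actual evaluation.

```python
import numpy as np

# Minimal sketch of the Bayesian GLS (Kalman filter) update.
# All numbers are illustrative placeholders, not real nuclear data.

x0 = np.array([1.0, 2.0])            # prior estimate of the model parameters
P0 = np.diag([0.1**2, 0.2**2])       # prior covariance matrix

S = np.array([[1.0, 0.5],            # sensitivity (Jacobian) matrix of the
              [0.3, 1.2]])           # linearized model mapping parameters to observables
y = np.array([2.1, 2.9])             # experimental values
R = np.diag([0.05**2, 0.1**2])       # experimental covariance matrix

# Kalman gain and GLS update
K = P0 @ S.T @ np.linalg.inv(S @ P0 @ S.T + R)
x1 = x0 + K @ (y - S @ x0)           # posterior parameter estimate
P1 = P0 - K @ S @ P0                 # posterior covariance matrix

print("posterior estimate:", x1)
print("posterior covariance:\n", P1)
```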

Even though the mentioned developments are important steps towards more robust and reliable evaluations, there are still many unsolved problems left. For instance, scaling up Monte Carlo methods in terms of the number of isotopes and the number of included experimental data points is difficult because it is not straightforward to efficiently target the posterior distribution. Sometimes the nuclear models cannot describe the experimental data well, even if the model parameters are fitted to the very same data. Therefore, evaluation methods must account for the possibility that models are imperfect. Experimental data can also be wrong for various reasons, and an evaluation procedure should ideally detect such cases and handle them reasonably. It can be foreseen that new research in the fields of uncertainty quantification, Bayesian statistics, and machine learning will help address these issues and eventually change the way we deal with nuclear data.

Objective of this portal

This portal for nuclear data is still in its initial stage and under development, but its aim is clear: it should provide an overview of existing evaluation methods, inform about recent developments concerning evaluation methodology, and provide interactive demonstrations of algorithms. The term evaluation methodology is understood here in a broad sense and also encompasses mathematical methods that help experimenters analyze their experiments. The portal should also give pointers to useful software and other resources relevant for nuclear data. Ideally, it will grow and become more useful in the future thanks to a collaborative effort of the nuclear physics community.

If you are aware of a useful web resource or insightful paper related to nuclear data evaluation that you would like to see linked here, write an email to georg.schnabel@nucleardata.com.

Conferences, workshops and meetings

Sites with announcements of events

Upcoming events

Past events

Interactive demonstrations

Especially when it comes to the study of novel nuclear data evaluation methods, the best way to develop an intuition and understanding of how they work is to play with them using different assumptions and to apply them in various scenarios. Better yet if everyone is empowered to interactively explore the behavior of evaluation algorithms without the need to install additional software. Removing technical obstacles between the user and the application of algorithms facilitates the involvement of a broader expert audience. This in turn makes it more likely that shortcomings and unfavorable features of algorithms are discovered before they are employed in "production" evaluations. Interactive exploration therefore potentially helps in the conception of better algorithms. In this section you find interactive demonstrations of algorithms which run in your browser. There is no need to install any additional software.

Separation of different contributions in a detector output [link]

During data analysis in experimental nuclear physics, one faces the task of splitting the measured signal of a particle detector into several contributing components in order to extract the relevant one. The relevant contribution is given by the particles produced in a controlled way within the experiment, whereas the other contributions are associated with particles entering the detector from the outside (e.g., from space) or particles that are produced in the experiment but whose impact on the detector output is undesired. This interactive demonstration showcases the separation of two bivariate normal peaks from uniform background noise, either by the expectation-maximization algorithm or by Gibbs sampling. A nice feature of the Gibbs sampling approach is that uncertainties of all quantities involved, i.e., component proportions, centers, and covariance matrices, are also available. Gibbs sampling is a specific realization of a Markov Chain Monte Carlo (MCMC) algorithm.
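To give a rough impression of the expectation-maximization part, the following sketch fits a two-component bivariate normal mixture to simulated detector events using scikit-learn. Unlike the full demonstration, the uniform background component is omitted here, and all numbers are invented.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated detector output: two bivariate normal peaks (no uniform background,
# which the full demonstration additionally handles).
rng = np.random.default_rng(0)
peak1 = rng.multivariate_normal([1.0, 1.0], [[0.2, 0.05], [0.05, 0.1]], size=500)
peak2 = rng.multivariate_normal([3.0, 2.5], [[0.1, 0.0], [0.0, 0.3]], size=300)
data = np.vstack([peak1, peak2])

# Expectation-maximization for a two-component Gaussian mixture
gm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(data)

print("mixture proportions:", gm.weights_)
print("component centers:\n", gm.means_)

# responsibilities: probability of each event belonging to each component
resp = gm.predict_proba(data)
```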

Impact of non-linearities in nuclear data evaluation [link]

One common approach to nuclear data evaluation is to use a special case of the Kalman filter, which coincides with the Bayesian version of the Generalized Least Squares (GLS) method. The Kalman filter (in its basic form) assumes a linear mapping between the state and the observation, and therefore nuclear models have to be replaced by a first-order Taylor approximation before the procedure can be applied. This linear approximation of non-linear nuclear models can lead to strongly distorted results. The demonstration enables the interactive exploration of the magnitude of such distortions using the example of the nuclear model code TALYS and the neutron-induced total cross section of 181Ta.
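The kind of distortion can be illustrated with a toy one-parameter model unrelated to TALYS or 181Ta: the sketch below compares the linearized GLS estimate with a posterior mean obtained by simple importance sampling of the non-linear model. All values are invented for illustration.

```python
import numpy as np

# Toy non-linear "model": sigma(p) = exp(p), one parameter, one observable.
model = lambda p: np.exp(p)
p0, var_p = 0.0, 0.5**2      # prior mean and variance of the parameter
y, var_y = 2.0, 0.2**2       # "measured" value and its variance

# GLS update based on a first-order Taylor expansion around the prior mean
S = np.exp(p0)                                # derivative d sigma / d p at p0
K = var_p * S / (S * var_p * S + var_y)       # scalar Kalman gain
p_lin = p0 + K * (y - model(p0))              # linearized posterior mean

# Reference: posterior mean of the non-linear model via importance sampling
samples = np.random.default_rng(1).normal(p0, np.sqrt(var_p), 100000)
w = np.exp(-0.5 * (y - model(samples))**2 / var_y)   # likelihood weights
p_exact = np.sum(w * samples) / np.sum(w)

print("linearized GLS estimate:", p_lin)
print("sampled posterior mean: ", p_exact)
```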

Nuclear data evaluation with multivariate mixture models [link]

The comparison of experimental data collected over decades often reveals discrepancies between experiments. One possible approach to deal with this issue in an automated manner is to employ Bayesian hierarchical modelling, which enables treating the experimental uncertainties and correlations as unknowns (or adjustable parameters). This route has been explored in this paper. The underlying idea is that all experimental estimates are valid and only the associated uncertainties have occasionally been misjudged. However, it is also possible to take the view that some experimental datasets are correct and others are not. This perspective can be modelled using mixture models. As nuclear data evaluation is routinely done using the Generalized Least Squares method and hence based on the assumption of a multivariate normal distribution, the most straightforward extension is to employ a multivariate normal mixture model. The linked interactive demonstration allows exploring this approach with consistent and inconsistent datasets, with either a linear model, a Gaussian process, or a custom model as the prior.
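As a strongly simplified illustration of the mixture idea (one-dimensional and much simpler than the multivariate treatment in the demonstration), the following sketch computes for each dataset the posterior probability of being valid; the true value, the datasets, and the mixture settings are invented numbers.

```python
import numpy as np

# Toy mixture model: the reported mean of each dataset is either "valid"
# (scatters around the true value according to its quoted uncertainty) or
# "invalid" (drawn from a much wider distribution).

def norm_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd)**2) / (sd * np.sqrt(2 * np.pi))

true_value = 1.0                                       # assumed known here for simplicity
datasets = [(1.02, 0.03), (0.98, 0.04), (1.25, 0.03)]  # (reported mean, quoted uncertainty)
p_valid = 0.8                                          # prior probability that a dataset is valid
wide_sd = 0.3                                          # spread assigned to invalid datasets

for mean, unc in datasets:
    like_valid = norm_pdf(mean, true_value, unc)
    like_invalid = norm_pdf(mean, true_value, wide_sd)
    post_valid = p_valid * like_valid / (p_valid * like_valid + (1 - p_valid) * like_invalid)
    print(f"dataset mean {mean}: posterior probability of being valid = {post_valid:.3f}")
```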

Software

exforParser - EXFOR to JSON converter

The JSON format is a lightweight data-interchange format that is easy for humans to read and write and easy for machines to parse and generate. Most programming languages provide facilities to deal with it conveniently. The availability of EXFOR data in the JSON format is therefore helpful for nuclear data evaluators and, more generally, for everyone using EXFOR data in their work. For this reason, the JSON format has also been added to the IAEA web interface to EXFOR as an output option. An EXFOR to JSON converter written in the programming language R is provided here. This EXFOR parser follows the philosophy of minimal structural change: fields in the original EXFOR data become fields in the output JSON object, and modifications of their content are kept to a minimum. The idea is that changes to the structure of the JSON object can easily be made using a high-level language with rich string manipulation facilities, such as Python.

Link to the EXFOR to JSON converter (exforParser)
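As an illustration of the post-processing idea described above, the sketch below restructures a converted entry in Python. The file name and the field names (SUBENTRIES, BIB, REACTION, DATA, TABLE) are hypothetical placeholders and may differ from the actual output of exforParser.

```python
import json

# Load one converted EXFOR entry (file name and field names are hypothetical
# placeholders; the actual structure produced by exforParser may differ).
with open("exfor_entry.json") as f:
    entry = json.load(f)

# Example restructuring: collect the reaction string and the number of data
# rows of every subentry into a compact summary dictionary.
summary = {}
for subent_id, subent in entry.get("SUBENTRIES", {}).items():
    reaction = subent.get("BIB", {}).get("REACTION")
    table = subent.get("DATA", {}).get("TABLE", [])
    summary[subent_id] = {"reaction": reaction, "rows": len(table)}

print(json.dumps(summary, indent=2))
```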

Online repositories

The Nuclear Reaction Data Centres (NRDC) (see also here) are the authority for the systematic collection, compilation and dissemination of nuclear data. They form a worldwide cooperation under the auspices of the International Atomic Energy Agency (IAEA). Online services for the retrieval of nuclear data in different formats and visualizations can be found on the websites of the centres.
EXFOR library

The EXFOR library [16] contains a comprehensive collection of reaction data from nuclear experiments performed throughout the history of nuclear physics. The NRDCs provide online services, such as the EXFOR search at the IAEA, to search for data, visualize them, and download them in various formats. For the development of new evaluation methods, and also for regular evaluations, it is beneficial to have the database stored locally on the computer for fast access. For convenience, the complete EXFOR library can be downloaded here in bundled form. It was retrieved on 7 March 2019 using the IAEA search form. If a fully up-to-date version of the library is needed, please contact the Nuclear Data Services Unit of the IAEA directly. The links in the following table point to twenty compressed tar archives containing the EXFOR entries in text format. The numbers in the link names denote the range of EXFOR accession numbers present in the respective archive. A short extraction sketch in Python follows the table.

10001-10535 10536-11690 11691-13569 13570-13768 13769-14239
14240-20118 20119-21773 21774-22412 22413-22921 22922-23129
23130-23250 23251-23324 23325-23415 23416-A0099 A0100-C1030
C1031-D0487 D0488-E1841 E1842-F1045 F1046-O0678 O0679-V1002
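As referenced above, here is a minimal Python sketch for unpacking the downloaded archives into a local directory; the file name pattern is a placeholder and has to be adjusted to the actual names of the downloaded files.

```python
import glob
import tarfile
from pathlib import Path

# Extract all downloaded EXFOR archives into a local directory.
# The glob pattern is a placeholder; adjust it to the downloaded file names.
target = Path("exfor_entries")
target.mkdir(exist_ok=True)

for archive in glob.glob("exfor_*.tar.gz"):
    with tarfile.open(archive) as tar:
        tar.extractall(target)   # each archive contains EXFOR entries as text files
```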

Docker images

Automation, reproducibility, and transparency are important topics in the ongoing discussions about how to improve the process of nuclear data evaluation. Over the last few years, the Docker technology has been rapidly gaining momentum in the IT industry, and it can also help make nuclear data evaluation more transparent, automated, and reproducible. The Docker application helps to manage and operate so-called Docker images and Docker containers. Loosely speaking, a Docker container can be regarded as a lightweight virtual machine which can run applications isolated from the rest of the computer system. Docker images are templates to create Docker containers. The advantage of using Docker images is that an application can be bundled with all its dependencies. The installation of such a bundle becomes very easy thanks to the Docker application. The setup of complex evaluation pipelines that depend on many components, such as databases, libraries, interpreters for specific languages, nuclear physics codes, etc., becomes trivial. This facilitates the sharing of data and code between researchers, eases collaboration, and consequently accelerates the whole research and evaluation process. This section provides Dockerfiles, which are scripts to create Docker images relevant and helpful for nuclear data evaluation.

Installation instructions for Docker Community Edition on Windows, Linux, and Mac can be found in the official Docker documentation. Here are direct links for Windows, Mac, Ubuntu, Debian, Fedora, and CentOS. At the time of writing, the Docker Community Edition is provided under the Apache License 2.0, which is a permissive license. The up-to-date license information can be found in the official Docker GitHub repository.

Prototype of a pipeline for reproducible nuclear data evaluation

During evaluation work, many choices have to be made concerning the selection of experimental data, their uncertainties, the model parameters, and the statistical algorithms used to adjust model parameters based on the information from experiments. It is very difficult to convey in a technical report or paper all the information that would be required to reproduce an evaluation, yet reproducibility is important in nuclear data evaluation. Therefore, all choices made in an evaluation should ideally be implemented as a sequence of scripts, also referred to as a pipeline. In this way, other people can comparatively easily reproduce the evaluation by rerunning the pipeline, scrutinize assumptions and test their impact, or perform an improved evaluation using an available evaluation pipeline as a starting point. A prototype of an evaluation pipeline is provided here, which contains several innovations in evaluation methodology, such as the automatic correction of experimental uncertainties using marginal likelihood optimization (MLO), Gaussian process priors on energy-dependent model parameters, and the optimization of model parameters using a customized Levenberg-Marquardt algorithm that takes into account prior knowledge and the non-linearity of the physics model. The pipeline can be used in combination with a cluster to perform a full-scale evaluation. In its current form, it implements the evaluation of neutron-induced cross sections of Fe56 and has been successfully employed to adjust about 150 parameters of the nuclear model code TALYS after a sensitivity analysis of about a thousand model parameters.
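To convey the basic idea behind MLO, the following simplified sketch determines a common scaling factor for the quoted uncertainties of several measurements of a single quantity by maximizing the Gaussian marginal likelihood. The actual pipeline applies this idea in a far more general setting; all numbers below are invented.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Simplified illustration of marginal likelihood optimization (MLO):
# find a common scaling factor f of the quoted experimental uncertainties that
# maximizes the Gaussian marginal likelihood of the reported values.

y = np.array([1.00, 1.05, 0.97, 1.30])     # reported values of the same quantity
u = np.array([0.02, 0.03, 0.02, 0.02])     # quoted uncertainties
mu0, s0 = 1.0, 0.2                         # prior mean and standard deviation of the true value

def neg_log_marglike(log_f):
    f = np.exp(log_f)
    # marginal covariance: prior variance of the common true value plus scaled
    # experimental variances
    cov = s0**2 * np.ones((len(y), len(y))) + np.diag((f * u)**2)
    diff = y - mu0
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (logdet + diff @ np.linalg.solve(cov, diff))

res = minimize_scalar(neg_log_marglike, bounds=(-3, 3), method="bounded")
print("optimal uncertainty scaling factor:", np.exp(res.x))
```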

Yet another computational EXFOR database (YacoX4)

The EXFOR library [16] is an essential resource for nuclear data evaluation. Convenient programmatic access from popular high-level languages is therefore one key element for automated and reproducible evaluation pipelines. The Nuclear Data Section of the IAEA and some other NRDCs provide computational formats, such as the C4 format, which are convenient for automated processing. Complementary to these formats, the complete EXFOR library has been converted to a MongoDB database. This database belongs to the class of document-oriented databases, a subclass of so-called NoSQL databases. The MongoDB database software features an expressive query language, and MongoDB databases can be accessed from a variety of high-level programming languages including C, C++, Python, R, Java, and Perl. The following resources give guidance on the installation and use of the MongoDB EXFOR database on your local computer or cluster:

Important disclaimer: This database is a prototype enabling access to the EXFOR library via the MongoDB API. It is not fully up-to-date with the EXFOR library maintained by the Nuclear Reaction Data Centres (NRDC) and may contain other shortcomings. Therefore, neither the creator of this Docker image nor the NRDCs take any responsibility for the completeness or correctness of the information in this database.
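Independently of those resources, the following minimal sketch illustrates how such a database could be queried from Python with pymongo. The database, collection, and field names are placeholders and may differ from the actual schema; consult the linked resources for details.

```python
from pymongo import MongoClient

# Minimal sketch of querying an EXFOR MongoDB database with pymongo.
# Database, collection, and field names are illustrative placeholders.
client = MongoClient("mongodb://localhost:27017")
coll = client["exfor"]["subentries"]

# e.g., find subentries whose reaction string mentions neutrons on Fe-56
cursor = coll.find({"BIB.REACTION": {"$regex": "26-FE-56\\(N,"}}).limit(5)
for subent in cursor:
    print(subent.get("ID"), subent.get("BIB", {}).get("REACTION"))
```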

EXFOR-CouchDB

The EXFOR library has also been made available as a CouchDB database. The reason is that the CouchDB database software comes with a more permissive license than the MongoDB database software, which was originally used to make the EXFOR library accessible as a NoSQL database. Technical details aside, CouchDB provides about the same functionality as MongoDB. It also enables storing the EXFOR subentries in the JSON format.

Important disclaimer: This database is a prototype enabling access to the EXFOR library via the CouchDB API. It is not fully up-to-date with the EXFOR library maintained by the Nuclear Reaction Data Centres (NRDC) and may contain other shortcomings. Therefore, neither the creator of this Docker image nor the NRDCs take any responsibility for the completeness or correctness of the information in this database.
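Analogously to the MongoDB case, a CouchDB instance can be queried over its HTTP API, for example from Python with the requests package. The database name, credentials, and document id below are placeholders.

```python
import requests

# Minimal sketch of retrieving EXFOR subentries from a local CouchDB instance
# via its HTTP API. Database name, credentials, and document id are placeholders.
base = "http://admin:password@localhost:5984"

# list a few document ids via the built-in _all_docs view
rows = requests.get(f"{base}/exfor/_all_docs", params={"limit": 5}).json()["rows"]
print([r["id"] for r in rows])

# fetch a single (hypothetical) subentry document by id
doc = requests.get(f"{base}/exfor/22742002").json()
print(doc.get("BIB", {}).get("REACTION"))
```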

References

  1. G. Schnabel, A computational EXFOR database, arXiv:1908.00209, 2019 [link]

  2. G. Schnabel, H. Sjöstrand, A first sketch: Construction of model defect priors inspired by dynamic time warping, Proc. of Wonder-2018, 2019 [link]

  3. P. Helgesson, H. Sjöstrand, Treating model defects by fitting smoothly varying model parameters: Energy dependence in nuclear data evaluation, Annals of Nuclear Energy 120, 35–47, 2018 [link]

  4. G. Schnabel, Estimating model bias over the complete nuclide chart with sparse Gaussian processes at the example of INCL/ABLA and double-differential neutron spectra, EPJ-N, 2018 [link]

  5. D. Neudecker, B. Hejnal, F. Tovesson, M.C. White, D.L. Smith, D. Vaughan, R. Capote, Template for estimating uncertainties of measured neutron-induced fission cross-sections, EPJ Nuclear Sciences & Technologies 4, 21, 2018 [link]

  6. P. Helgesson, H. Sjöstrand, Fitting a defect non-linear model with or without prior, distinguishing nuclear reaction products as an example, Review of Scientific Instruments 88, 115114, 2017 [link]

  7. G. Schnabel, Fitting and Analysis Technique for Inconsistent Nuclear Data, Proc. of MC2017, 2017 [link] [conf-link]

  8. G. Schnabel, H. Leeb, A modified Generalized Least Squares method for large scale nuclear data evaluation, Nuclear Instruments and Methods A, 2017 [link]

  9. G. Schnabel, Adaptive Monte Carlo for nuclear data evaluation, Proc. of ND2016, 2017 [link]

  10. G. Schnabel, H. Leeb, Differential Cross Sections and the Impact of Model Defects in Nuclear Data Evaluation, Proc. of Wonder-2015, 2016 [link]

  11. A.J. Koning, Bayesian Monte Carlo method for nuclear data evaluation, The European Physical Journal A 51, 184, 2015 [link]

  12. G. Schnabel, Large scale Bayesian nuclear data evaluation with consistent model defects, Dissertation, TU Vienna, 2015 [link]

  13. H. Leeb, G. Schnabel, Th. Srdinko, V. Wildpaner, Bayesian Evaluation Including Covariance Matrices of Neutron-induced Reaction Cross Sections of 181Ta, Nuclear Data Sheets 123, 2015 [link]

  14. G. Schnabel, H. Leeb, A New Module for Large Scale Bayesian Evaluation in the Fast Neutron Energy Region, Nuclear Data Sheets 123, 2015 [link]

  15. A. Hoefer, O. Buss, M. Hennebach, M. Schmid, D. Porsch, MOCABA: A general Monte Carlo–Bayes procedure for improved predictions of integral functions of nuclear data, Annals of Nuclear Energy 77, 514–521, 2015 [link]

  16. N. Otuka, E. Dupont, V. Semkova et al., Towards a More Complete and Accurate Experimental Nuclear Reaction Data Library (EXFOR): International Collaboration Between Nuclear Reaction Data Centres (NRDC), Nuclear Data Sheets 120, 272–276, 2014 [link]

  17. D. Rochman, A.J. Koning, E. Bauge, A.J.M. Plompen, From Flatness to Steepness: Updating TALYS Covariances with Experimental Information, Annals of Nuclear Energy 73, 7–16, 2014 [link]

  18. H. Leeb, G. Schnabel, Th. Srdinko, What is the proper evaluation method, Proc. of NEMEA-7, 2014 [link]

  19. R. Capote, D.L. Smith, A. Trkov, M. Meghzifene, A New Formulation of the Unified Monte Carlo Approach (UMC-B) and Cross-Section Evaluation for the Dosimetry Reaction 55Mn(n,g)56Mn, Journal of ASTM International 9, 179–196, 2012 [link]

  20. H. Leeb, D. Neudecker, T. Srdinko, Consistent Procedure for Nuclear Data Evaluation Based on Modeling, Nuclear Data Sheets 109, 2762–2767, 2008 [link]

  21. E. Bauge, S. Hilaire, P. Dossantos-Uzarralde, Evaluation of the Covariance Matrix of Neutronic Cross Sections with the Backward-Forward Monte Carlo Method, Proc. of the Int. Conf. Nuclear Data for Science and Technology, 2007 [link]

  22. M.T. Pigni, H. Leeb, Uncertainty Estimates of Evaluated 56Fe Cross Sections Based on Extensive Modelling at Energies Beyond 20 MeV, Proc. Int. Workshop on Nuclear Data for the Transmutation of Nuclear Waste, GSI-Darmstadt, Germany, 2003 [pdf]