8. Data Handling

The main purpose of NIF is to produce high-quality experimental data for validating theoretical physics models and numerical simulation tools. NIF software tools have been designed and constructed to manage and integrate data from multiple sources, including machine-state configurations and calibrations, experimental shot data, and pre- and post-shot simulations. During a shot, data from each diagnostic are automatically captured and archived in a database. Arrival of new data triggers the Shot Analysis, Visualization, and Infrastructure (SAVI) engine, the first automated analysis system of its kind, which distributes signal- and image-processing tasks to a Linux cluster and launches an analysis for each diagnostic.18 Results are archived in NIF’s data repository for experimentalist approval and are displayed using a web-based tool, through which scientists can review results remotely or locally, download them, and perform and upload their own analyses. This applies to unclassified data only; classified data are handled separately (see Section 8.5).
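The arrival-triggered dispatch described above can be sketched as an event handler that fans analysis jobs out in parallel and archives each result for reviewer approval. This is a minimal illustration only: the diagnostic names, the `analyze()` stub, and the in-memory archive are invented for the example, a local stand-in for distributing work to a cluster.

```python
import concurrent.futures

def analyze(diagnostic, raw_data):
    """Stand-in for a per-diagnostic signal/image analysis module."""
    return {"diagnostic": diagnostic, "summary": sum(raw_data)}

def on_new_shot_data(shot_data, archive):
    """When new shot data arrive, launch one analysis per diagnostic in
    parallel, then archive each result for later experimentalist review."""
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(analyze, d, raw) for d, raw in shot_data.items()]
        for fut in concurrent.futures.as_completed(futures):
            result = fut.result()
            archive.setdefault(result["diagnostic"], []).append(result)

# Hypothetical diagnostic names and raw traces, for illustration only.
archive = {}
on_new_shot_data({"DANTE": [1, 2, 3], "NTOF": [4, 5]}, archive)
```

The key design point mirrored here is that the handler is triggered by data arrival rather than by an operator, so every diagnostic's analysis starts without a human in the loop.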

NIF’s automated data-analysis systems can require cutting-edge techniques to avoid relying on a human in the loop. This innovation is necessary because of the sheer quantity of NIF experimental data that must be accurately corrected, calibrated, fused, and analyzed. For example, NIF tools track more than 100,000 installed hardware components and their associated calibration data. Analysis of a single diagnostic can require data from hundreds of calibrated parts and analysis parameters. Our SAVI automated analysis portfolio covers 69 physical detector locations, with up to 32 detectors per location, requiring a total capacity of 765 independent analysis modules that may run after every NIF experiment.
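One consequence of analyses depending on hundreds of calibrated parts is that a missing or stale calibration record must stop the analysis rather than be silently defaulted. A minimal sketch of that lookup step, with invented part IDs and an invented calibration table (this is not the actual NIF schema):

```python
# Hypothetical calibration table keyed by installed-part ID.
CALIBRATIONS = {
    "cable-0042": {"attenuation_db": 3.1},
    "scope-0007": {"gain": 0.98},
    "filter-0113": {"attenuation_db": 6.0},
}

def calibration_chain(part_ids):
    """Gather the calibration record for every part in a diagnostic's
    signal chain; a missing record raises rather than defaulting."""
    missing = [p for p in part_ids if p not in CALIBRATIONS]
    if missing:
        raise KeyError(f"uncalibrated parts: {missing}")
    return [CALIBRATIONS[p] for p in part_ids]

chain = calibration_chain(["cable-0042", "filter-0113", "scope-0007"])
# Fuse one property across the chain, e.g. total attenuation (about 9.1 dB).
total_attenuation = sum(c.get("attenuation_db", 0.0) for c in chain)
```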

The NIF Shot Archive stores the raw data and analysis results from more than 4,000 experiments and contains 300,000,000 archive objects in a searchable relational database of over 250 gigabytes. Data in the Shot Archive are immutable but can be versioned. The Shot Archive offers multiple access methods to meet the needs of our experimentalists, including the Archive Viewer, Archive WebDAV, and other programmatic interfaces for retrieving data for local and online processing. The Archive Viewer provides 32 diagnostic dashboards, six working-group dashboards, and four top-level dashboards that allow scientists to view data in near-real time as the data are saved in the archive.
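The immutable-but-versioned property can be made concrete with a small append-only store: writing to an existing key never overwrites the old object, it adds a new version alongside it. The class, method names, and the key layout below are illustrative assumptions, not the Shot Archive's actual API.

```python
class VersionedArchive:
    """Append-only store sketch: objects are never mutated in place;
    re-analysis of the same key produces a new version."""

    def __init__(self):
        self._objects = {}  # key -> list of versions, oldest first

    def put(self, key, obj):
        self._objects.setdefault(key, []).append(obj)
        return len(self._objects[key])  # 1-based version number

    def get(self, key, version=None):
        """Return the latest version by default, or a specific one."""
        versions = self._objects[key]
        return versions[-1] if version is None else versions[version - 1]

arc = VersionedArchive()
arc.put("N210808/DANTE/flux", {"value": 1.30})
arc.put("N210808/DANTE/flux", {"value": 1.35})  # re-analysis: new version
latest = arc.get("N210808/DANTE/flux")
first = arc.get("N210808/DANTE/flux", version=1)
```

Because old versions remain retrievable, an approved result can always be traced back to the exact analysis outputs it superseded.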

Post-shot data analysis and laser performance reporting are provided by the Laser Performance Operations Model (LPOM). To do this, LPOM is directly linked to the Integrated Computer Control System (ICCS) shot database. Upon request, it can quickly (within minutes) provide the NIF shot director and user with a report comparing predicted and measured results and summarizing how well the shot met its requested goals. In addition, the LPOM data reporting system can access and display near-field and far-field images taken at each laser diagnostic location and provide comparisons with predicted images. The results of the post-shot analysis are displayed on the LPOM graphical user interface (see Figure 8-1), while a subset of the analysis is presented to the shot director through a shot supervisor interface.
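A predicted-versus-measured report of this kind reduces to comparing each requested quantity against its prediction and flagging deviations beyond a goal tolerance. The quantities, field names, and 5% tolerance below are invented for illustration; they are not LPOM's actual schema or acceptance criteria.

```python
def shot_report(predicted, measured, tolerance=0.05):
    """Compare measured values against predictions; flag any quantity
    whose relative deviation exceeds the goal tolerance."""
    report = {}
    for name, pred in predicted.items():
        meas = measured[name]
        deviation = abs(meas - pred) / abs(pred)
        report[name] = {
            "predicted": pred,
            "measured": meas,
            "deviation": round(deviation, 4),
            "within_goal": deviation <= tolerance,
        }
    return report

# Hypothetical shot goals and measurements, for illustration only.
report = shot_report(
    predicted={"energy_kJ": 1800.0, "peak_power_TW": 440.0},
    measured={"energy_kJ": 1765.0, "peak_power_TW": 452.0},
)
```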

Figure 8-1. After each shot, users can examine the detailed laser performance for any beamline using a suite of data trending and analysis tools.


18. S. Azevedo et al., “Automated Experimental Data Analysis at the National Ignition Facility,” presented at the International Conference on Accelerators and Large Experimental Physics Control Systems, Kobe, Japan, October 13, 2009; M. S. Hutton et al., “Experiment Archive, Analysis, and Visualization at the National Ignition Facility,” Fusion Engineering and Design 87, 2087–2091 (2012).