IEEE 2017 NSS/MIC/RTSD

Online Program Overview: Session N-14

Software R&D

Session chairs: Paolo Saracco (INFN); Arnold Burger (Fisk University)
 
Shortcut: N-14
Date: Tuesday, October 24, 2017, 13:40
Room: Centennial I
Session type: NSS Session

Current research in software and computing

Contents

1:40 pm N-14-1

Delayed Gamma Ray Spectroscopy Inverse Monte Carlo Analysis Method for Nuclear Safeguards Non-Destructive Assay Applications (#3003)

D. C. Rodriguez1, F. Rossi1, M. Seya1, M. Koizumi1

1 Japan Atomic Energy Agency, ISCN, Tokai-mura, Ibaraki, Japan

Content

The Japan Atomic Energy Agency (JAEA) and the European Commission Joint Research Centre (JRC) are collaborating to develop technology that improves the ability to quantify the uranium and plutonium content of highly radioactive and mixed nuclear material (e.g. spent nuclear fuel, plutonium-nitrate solution, mixed oxide) for nuclear safeguards verification. A comprehensive system using a pulsed neutron source will integrate multiple active neutron interrogation techniques: differential die-away analysis, prompt gamma-ray analysis, neutron resonance transmission analysis, and delayed gamma-ray spectroscopy. Because of the current limitations of delayed gamma-ray spectroscopy, a separate program focuses on improving the capability to measure, analyze, and predict the delayed gamma-ray spectrum used to determine the fissionable-nuclide ratios within a sample. Measurements performed at JRC-Ispra in Italy and at the JAEA Plutonium Conversion Development Facility are used to correlate the observed delayed gamma rays with the sample composition; these correlations are then used to calibrate a delayed gamma-ray Monte Carlo model. The Monte Carlo can predict the expected delayed gamma rays, provide sensitivity analyses to optimize future measurements, and analyze a measured spectrum through an inverse Monte Carlo method. The inverse Monte Carlo analysis also provides a way to determine both systematic and statistical uncertainty when multiple measurements are not feasible. This work describes the effort to develop the delayed gamma-ray Monte Carlo, the associated analysis, and the capability for current and future applications.
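
A minimal sketch of the inverse step, assuming hypothetical Monte Carlo template spectra and placeholder nuclide data (this illustrates the idea only, not the JAEA/JRC code): the measured delayed gamma-ray spectrum is fit as a non-negative combination of per-nuclide templates, and bootstrap resampling supplies a statistical uncertainty when repeated measurements are not feasible.

    # Sketch: estimate fissionable-nuclide weights by fitting a measured
    # delayed gamma-ray spectrum with Monte Carlo template spectra.
    # Templates and the "measurement" are placeholders, not real data.
    import numpy as np
    from scipy.optimize import nnls  # non-negative least squares

    rng = np.random.default_rng(0)
    n_bins = 200

    # Hypothetical MC templates: expected counts per bin per unit mass
    # of each nuclide (in practice produced by the delayed gamma-ray MC).
    templates = {
        "U-235":  rng.gamma(2.0, 1.0, n_bins),
        "Pu-239": rng.gamma(2.5, 0.8, n_bins),
        "Pu-241": rng.gamma(1.5, 1.2, n_bins),
    }
    A = np.column_stack(list(templates.values()))

    # Hypothetical measurement: a known mixture plus Poisson counting noise.
    true_w = np.array([0.6, 0.3, 0.1])
    measured = rng.poisson(A @ true_w * 1e3) / 1e3

    # Inverse step: non-negative least squares for the nuclide weights.
    w_fit, _ = nnls(A, measured)

    # Bootstrap resampling of the spectrum gives a statistical uncertainty
    # even when repeated measurements are not feasible.
    boot = np.array([nnls(A, rng.poisson(measured * 1e3) / 1e3)[0]
                     for _ in range(200)])
    for name, w, s in zip(templates, w_fit, boot.std(axis=0)):
        print(f"{name}: weight = {w:.3f} +/- {s:.3f}")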

Keywords: Delayed gamma rays, Inverse Monte Carlo, nondestructive assay, nuclear safeguards, nuclear measurements
1:58 pm N-14-2

EUDAT: A European Outlook to Data Management (#4221)

S. de Witt1

1 United Kingdom Atomic Energy Authority, Advanced Computing Group, Abingdon, Oxfordshire, United Kingdom of Great Britain and Northern Ireland

On behalf of EUDAT Collaboration

Content

Data management planning – thinking in advance about what will happen to the data produced during the research process – is increasingly required by national research funding agencies, and new guidelines for Horizon 2020 research projects were released by the EU in December 2013 (Guidelines on Data Management in Horizon 2020). Similar guidelines have been issued by the US Department of Energy (Statement on Digital Data Management), by Australia (ANDS Data Management Plans), and in many other countries.

EUDAT exists in part to disseminate and promote best practice in data management for twenty-first-century research, and to provide support for communities adopting basic principles such as PID registration, metadata creation, and replication.
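
A minimal sketch of the PID-registration principle, written against a Handle-style REST interface; the endpoint, prefix, credentials and payload are placeholders, and the real EUDAT B2HANDLE service provides its own client library and authentication scheme.

    # Sketch: register a persistent identifier (PID) for a data object
    # via a Handle-style REST call. All endpoints and credentials below
    # are placeholders for illustration only.
    import json
    import requests

    HANDLE_API = "https://handle.example.org/api/handles"  # placeholder
    PREFIX = "21.T12345"                                   # placeholder prefix

    record = [
        {"index": 1, "type": "URL",
         "data": {"format": "string",
                  "value": "https://repository.example.org/data/obj42"}},
        {"index": 2, "type": "CHECKSUM",
         "data": {"format": "string", "value": "sha256:placeholder"}},
    ]

    resp = requests.put(
        f"{HANDLE_API}/{PREFIX}/obj42",
        data=json.dumps({"values": record}),
        headers={"Content-Type": "application/json"},
        auth=("user", "password"),                         # placeholder auth
        timeout=30,
    )
    resp.raise_for_status()
    print("registered PID:", f"{PREFIX}/obj42")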

As part of its mission to help researchers and research communities manage and preserve their data, EUDAT has begun work with the world-recognised Digital Curation Centre on a version of their widely used DMPonline tool that will capture the H2020 guidelines in a data management planning tool tailored to the emerging needs of European research.

EUDAT is building a Collaborative Data Infrastructure (CDI) as a pan-European solution to the challenge of data proliferation and its management in Europe’s scientific and research communities. The CDI will allow researchers to share data within and between communities and enable them to carry out their research effectively. Our mission is to provide a solution that will be affordable, trustworthy, robust, persistent, open and easy to use.

Keywords: Data Management, EUDAT, Horizon 2020, Data Curation, Data Lifecycle
2:16 pm N-14-3

The INDIGO-DataCloud project: enabling software technologies for efficient and effective use of Cloud computing and storage for science (#4245)

D. Salomoni1, L. Gaido2, I. Campos Plasencia3, J. Marco de Lucas3, P. Solagna4, J. Gomes5, L. Matyska6, P. Fuhrmann7, M. Hardt8, G. Donvito9, L. Dutka10, M. Plociennik11, R. Barbera12, D. Spiga13

1 INFN, CNAF, Bologna, Italy
2 INFN, Torino, Torino, Italy
3 CSIC, Santander, Spain
4 EGI Foundation, Amsterdam, Netherlands
5 LIP, Lisbon, Portugal
6 Masaryk University, Brno, Czech Republic
7 Deutsches Elektronen-Synchrotron (DESY), Hamburg, Germany
8 Karlsruhe Institute of Technology, Karlsruhe, Germany
9 INFN, Bari, Bari, Italy
10 AGH UST, Krakow, Poland
11 Poznań Supercomputing and Networking Center, Poznań, Poland
12 INFN, Catania, Catania, Italy
13 Istituto Nazionale di Fisica Nucleare (INFN), Perugia, Perugia, Italy

On behalf of the INDIGO-DataCloud Project

Content

The INDIGO-DataCloud Project (https://www.indigo-datacloud.eu) is a software development project funded under the EC Horizon 2020 framework.

INDIGO (INtegrating Distributed data Infrastructures for Global ExplOitation) develops modular software components aimed at facilitating the exploitation of distributed, Cloud-based computing and storage resources by scientific communities. The overall goal is to overcome the common gaps and difficulties that often prevent scientists from discovering and using resources in private or public infrastructures.

The INDIGO Consortium includes 26 partners located in 11 countries, among them major European players in distributed software engineering, research and commercial e-infrastructure providers, and many international communities belonging to diverse scientific domains.

This contribution first describes the process and results of collecting requirements from the INDIGO scientific communities, and then presents the INDIGO architecture developed out of that phase.

It then presents the concrete outcome of the project, which took the form of two major software releases. The latter, ElectricIndigo, was released in April 2017 and includes 40 modular software components distributed via 170 packages and 50 ready-to-use Docker containers. It provides turnkey solutions for science in areas such as Data Management and Data Analytics, Application-level Interfaces and Automated Service Composition, Programmable Web Portals and Mobile Interfaces, and Identity and Access Management.
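
To illustrate the distribution model, the sketch below pulls and starts a containerized component with the Docker SDK for Python; the image name and port are placeholders, not a specific ElectricIndigo component.

    # Sketch: consume a containerized service component in the style of
    # the ElectricIndigo distribution. Image name and port mapping are
    # placeholders for illustration only.
    import docker  # Docker SDK for Python: pip install docker

    client = docker.from_env()
    IMAGE = "indigodatacloud/some-component:latest"  # placeholder image

    client.images.pull(IMAGE)
    container = client.containers.run(
        IMAGE,
        detach=True,
        ports={"8080/tcp": 8080},  # assumed service port
        name="indigo-demo",
    )
    print("started container:", container.short_id)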

It then shows how INDIGO technologies are being exploited by scientific communities in many fields to address scientific problems using Grid, Cloud, HPC and local infrastructures in public and/or private environments. 

Finally, perspectives on the evolution of INDIGO software in upcoming international initiatives will be given, including its role in the European Open Science Cloud and the European Data Infrastructure.

Keywords: Cloud computing, cloud storage, data analysis, data processing
2:34 pm N-14-4

Fast simulations in LHCb (#3534)

M. Rama1

1 Istituto Nazionale di Fisica Nucleare, sezione di Pisa, Pisa, Italy

On behalf of the LHCb Collaboration

Content

LHCb is one of the major experiments operating at the Large Hadron Collider at CERN. The richness of the physics program and the increasing precision of the measurements in LHCb lead to the need for ever larger simulated samples. This need will increase further when the upgraded LHCb detector starts collecting data in LHC Run 3. Given the computing resources pledged for the production of Monte Carlo simulated events in the coming years, the use of fast simulation techniques will be mandatory to cope with the expected dataset size. A number of fast simulation options are already available or under development to complement the full, Geant4-based simulation of the LHCb detector. They include simulating only a subset of the generated particles, simplifying the detector geometry, re-using the underlying event, replacing the detailed simulation of the calorimeter with a faster version based on hit libraries, and using a fully parametric simulation of the detector. We present the available options, describe their applications and discuss future developments. We also outline how we intend to make the different options transparently available in the LHCb simulation framework.
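
As a schematic of the fully parametric option, a minimal sketch in which generator-level momenta are smeared with a parametrized resolution and filtered by a parametrized efficiency instead of being transported through Geant4; all numbers are invented and are not LHCb parameters.

    # Sketch of a fully parametric fast simulation: replace detailed
    # Geant4 transport by smearing generator-level momenta with a
    # parametrized resolution and applying a parametrized efficiency.
    # All numbers below are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(42)

    def fast_simulate(p_true_gev):
        """Return 'reconstructed' momenta for an array of true momenta (GeV)."""
        p = np.asarray(p_true_gev, dtype=float)
        sigma_rel = 0.005 + 1e-4 * p          # momentum-dependent resolution
        p_reco = p * (1.0 + rng.normal(0.0, sigma_rel))
        accepted = rng.random(p.size) < 0.95  # flat efficiency, for simplicity
        return p_reco[accepted]

    p_true = rng.exponential(20.0, 100_000)   # toy generator-level spectrum
    p_reco = fast_simulate(p_true)
    print(f"kept {p_reco.size} of {p_true.size} tracks after efficiency")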

Keywords: fast simulation, high energy physics experiment
2:52 pm N-14-5

Measurements and trends of Geant4 software evolution (#1754)

E. Ronchieri1, M. G. Pia2

1 INFN, CNAF, Bologna, Italy
2 INFN, Genoa Department, Genova, Italy

Content

The Geant4 simulation toolkit is a mature, large-scale scientific software system, the result of more than two decades of development. The programming environment has evolved substantially since the beginning of Geant4 R&D in 1994; in parallel, Geant4 functionality has evolved since its first release in 1998 to cope with new requirements from the experimental community and with advances in physics and technology.

We present a quantitative assessment of the evolution of Geant4 software based on established software metrics, which concern various aspects of the code (such as size, complexity and object-oriented features) and various elements (such as classes, functions and files). To the best of our knowledge, it is the most extensive quantitative assessment ever performed on a scientific software system in terms of the set of metrics involved, the number of measurements performed and the software lifetime spanned, not to mention the large scale of the software system itself.
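
A minimal sketch of this kind of per-element measurement, using the radon library on a Python snippet; the study itself measured C++ code with its own toolchain, so this only illustrates the metric categories (size, complexity) named above.

    # Sketch: measure size and McCabe cyclomatic complexity metrics on a
    # source snippet with the radon library. The study measured C++ code
    # with its own toolchain; this only illustrates the metric categories.
    from radon.complexity import cc_visit
    from radon.raw import analyze

    source = ("def track(step):\n"
              "    if step.energy > 0:\n"
              "        return step.advance()\n"
              "    return None\n")

    raw = analyze(source)           # size metrics: LOC, SLOC, comment lines
    print(f"LOC={raw.loc} SLOC={raw.sloc} comments={raw.comments}")

    for block in cc_visit(source):  # cyclomatic complexity per function
        print(f"{block.name}: complexity={block.complexity}")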

The measurements, performed over the whole lifetime of Geant4, were analyzed by means of econometric methods to identify the presence of trends in Geant4 software evolution. The analysis objectively recognizes improvements and degradation of software properties, such as code complexity and object-oriented characteristics, at the desired depth in the package structure.
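
A schematic of trend identification over a release history (the study's econometric methodology is more elaborate): regress a metric such as mean cyclomatic complexity on a release index and test the slope for significance; the metric values below are invented.

    # Sketch: test for a trend in a software metric across releases.
    # Metric values are invented; the study applies more elaborate
    # econometric methods, this only illustrates the idea.
    import numpy as np
    from scipy.stats import linregress

    releases   = np.arange(1, 11)  # hypothetical release sequence
    complexity = np.array([5.2, 5.4, 5.3, 5.7, 5.9,
                           6.0, 6.4, 6.3, 6.8, 7.0])

    fit = linregress(releases, complexity)
    print(f"slope = {fit.slope:.3f} per release, p-value = {fit.pvalue:.4f}")
    if fit.pvalue < 0.05:
        trend = "increasing" if fit.slope > 0 else "decreasing"
        print(f"statistically significant {trend} trend")
    else:
        print("no significant trend detected")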

We illustrate the trends in Geant4 software evolution resulting from the analysis. We discuss how the methodology and tools we developed can be applied in software development environments as a risk mitigation strategy and as concrete feedback on the effects of software improvement plans.

Keywords: Software measurements, Trend analysis, Geant4
3:10 pm N-14-6

Ghost Science (#1031)

T. Basaglia1, Z. W. Bell2, A. Burger3, P. V. Dressendorfer4, M. G. Pia5

1 CERN, Geneva, Switzerland
2 Oak Ridge National Laboratory, Oak Ridge, TN, United States of America
3 Fisk University, Nashville, TN, United States of America
4 IEEE, Piscataway, United States of America
5 INFN Sezione di Genova, Genova, Italy

Content

This scientometric study examines and quantifies some features concerning the publication, use and citation of software tools in experimental domains represented in this conference: nuclear and particle physics, astrophysics and radiation medical science. The data are derived from established databases of scientific literature and are analyzed with suitable statistical methods. Major Monte Carlo simulation and analysis systems are assessed to characterize their use and their citation patterns across different research areas. The results of this investigation highlight some general features: the scarcity of publications in scholarly journals as references for widely used software tools and the frequent omission of their citation in scientific papers where the corresponding software system is mentioned. A comprehensive overview of the outcome of this study is presented, with the intent of promoting a reflection on the role of scientific software in experimental research.
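
The core measurement behind the citation-omission finding can be sketched simply: over a set of paper records, count how often a software system is mentioned versus formally cited. The records below are fabricated placeholders, not data from the study.

    # Sketch: quantify how often papers that mention a software tool
    # also cite its reference publication. Records are placeholders.
    papers = [
        {"mentions_tool": True,  "cites_paper": True},
        {"mentions_tool": True,  "cites_paper": False},
        {"mentions_tool": True,  "cites_paper": False},
        {"mentions_tool": False, "cites_paper": False},
        {"mentions_tool": True,  "cites_paper": True},
    ]

    mentioning = [p for p in papers if p["mentions_tool"]]
    cited = sum(p["cites_paper"] for p in mentioning)
    omission = 1.0 - cited / len(mentioning)
    print(f"{len(mentioning)} mention the tool, {cited} cite it; "
          f"omission rate = {omission:.0%}")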

Keywords: simulation, data analysis, software, scientometrics