The Data Quality Monitoring Software for the CMS experiment at the LHC: past, present and future
Massachusetts Inst. of Technology
3 Vilnius University LT
4 Universitaet Zuerich CH
5 Fermi National Accelerator Lab. US
6 Universidad de Oviedo ES
* E-mail: email@example.com
Published online: 17 September 2019
The Data Quality Monitoring (DQM) software is a central tool in the CMS experiment. It is used in four key environments: (i) online, for real-time detector monitoring; (ii) offline, for prompt feedback and for the final fine-grained data quality analysis and certification; (iii) validation of all reconstruction software production releases; and (iv) validation of Monte Carlo productions. Although the basic structure of the Run1 DQM system was retained for Run2, the system underwent substantial upgrades in many areas between the two periods, both to adapt to changes in the surrounding infrastructure and to meet the growing needs of the collaboration, with an emphasis on more sophisticated methods for evaluating data quality. The system must cope with higher-energy and higher-luminosity proton-proton collision data, as well as data from various special runs, such as Heavy Ion runs. In this contribution, we describe the current DQM software, its structure, and its workflow in the different environments. We then discuss the performance of, and our experience with, the DQM system in Run2. We also cover the main technical challenges encountered during Run2 and the solutions adopted, including the efficient use of memory in multithreaded environments. Finally, we present the prospects for a future DQM upgrade, with emphasis on functionality and long-term robustness for LHC Run3.
© The Authors, published by EDP Sciences, 2019
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.