https://doi.org/10.1051/epjconf/202125103036
Event Classification with Multi-step Machine Learning
1 International Center for Elementary Particle Physics, The University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo, Japan
2 BrainPad Inc., 3-2-10 Shirokanedai, Minato, Tokyo, Japan
3 Institute for AI and Beyond, The University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo, Japan
* e-mail: saito@icepp.s.u-tokyo.ac.jp
** e-mail: jtanaka@icepp.s.u-tokyo.ac.jp
Published online: 23 August 2021
The usefulness and value of Multi-step Machine Learning (ML), where a task is organized into connected sub-tasks with known intermediate inference goals, as opposed to a single large model learned end-to-end without intermediate sub-tasks, is presented. Pre-optimized ML models are connected, and better performance is obtained by re-optimizing the connected system. The selection of an ML model from several small candidate models for each sub-task is performed using ideas based on Neural Architecture Search (NAS). In this paper, Differentiable Architecture Search (DARTS) and Single Path One-Shot NAS (SPOS-NAS) are tested, where the construction of the loss function is improved to keep all ML models learning smoothly. Using DARTS and SPOS-NAS for the optimization, selection, and connection of multi-step machine learning systems, we find that (1) such a system can quickly and successfully select highly performant model combinations, and (2) the selected models are consistent with baseline algorithms, such as grid search, and their outputs are well controlled.
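As a rough illustration of the approach summarized above, the sketch below shows a simplified, single-level DARTS-style selection between small candidate models for each of two connected sub-tasks, with a combined loss that also constrains the intermediate output so that both steps keep learning. It assumes PyTorch, toy models, and random data, all of which are hypothetical; it is not the authors' implementation and omits details such as bi-level optimization and the SPOS-NAS variant.

```python
# Hypothetical sketch (not the authors' code): DARTS-style selection of one
# small candidate model per sub-task in a two-step pipeline, assuming PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedStep(nn.Module):
    """Softmax-weighted mixture over candidate models for one sub-task."""
    def __init__(self, candidates):
        super().__init__()
        self.candidates = nn.ModuleList(candidates)
        # Architecture parameters (alpha): one weight per candidate model.
        self.alpha = nn.Parameter(torch.zeros(len(candidates)))

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        return sum(wi * m(x) for wi, m in zip(w, self.candidates))

# Two sub-tasks: an intermediate step with a known target, then a classifier.
step1 = MixedStep([nn.Sequential(nn.Linear(8, 4), nn.ReLU()),
                   nn.Sequential(nn.Linear(8, 4), nn.Tanh())])
step2 = MixedStep([nn.Linear(4, 1),
                   nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 1))])

x = torch.randn(64, 8)                     # toy inputs
y_mid = torch.randn(64, 4)                 # known intermediate inference goal
y = torch.randint(0, 2, (64, 1)).float()   # final binary label

# Separate optimizers for model weights and architecture parameters (alphas).
weights = [p for m in (step1, step2)
           for n, p in m.named_parameters() if n != "alpha"]
opt_w = torch.optim.Adam(weights, lr=1e-2)
opt_a = torch.optim.Adam([step1.alpha, step2.alpha], lr=1e-2)

for _ in range(100):
    opt_w.zero_grad(); opt_a.zero_grad()
    h = step1(x)
    out = step2(h)
    # Combined loss keeps both steps learning: intermediate target + final label.
    loss = F.mse_loss(h, y_mid) + F.binary_cross_entropy_with_logits(out, y)
    loss.backward()
    opt_w.step(); opt_a.step()

# "Select" one candidate per sub-task from the learned architecture weights.
print([int(s.alpha.argmax()) for s in (step1, step2)])
```

In this toy setting the final architecture is simply the candidate with the largest alpha in each step; a grid search over all candidate combinations would serve as the baseline against which such a selection can be compared.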
© The Authors, published by EDP Sciences, 2021
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.