PoS - Proceedings of Science
Volume 444 - 38th International Cosmic Ray Conference (ICRC2023) - Cosmic-Ray Physics (Indirect, CRI)
Reconstruction procedure of the Fluorescence detector Array of Single-pixel Telescopes
F. Bradfield*, J. Albury, J. Bellido, L. Chytka, J. Farmer, T. Fujii, P. Hamal, P. Horvath, M. Hrabovsky, V. Jilek, J. Kmec, J. Kvita, M. Malacari, D. Mandat, M. Mastrodicasa, J.N. Matthews, S. Michal, H. Nagasawa, H. Namba, L. Nozka, M. Palatka, M. Pech, P. Privitera, S. Sakurai, F. Salamida, P. Schovanek, R. Smida, D. Staník, Z. Svozilikova, A. Taketa, K. Terauchi, S.B. Thomas, P. Travnicek and M. Vacula
Pre-published on: July 25, 2023
Published on: September 27, 2024
Abstract
The Fluorescence detector Array of Single-pixel Telescopes (FAST) is one of several proposed designs for a next-generation cosmic-ray detector. Such detectors will require enormous collecting areas whilst also needing to remain cost-efficient. To meet these demands, the FAST collaboration has designed a simplified, low-cost fluorescence telescope consisting of only four photomultiplier tubes (PMTs). Since standard air shower reconstruction techniques cannot be used with so few PMTs, FAST utilises an alternative two-step approach. In the first step, a neural network is used to provide a first estimate of the true shower parameters. This estimate is then used as the initial guess in a minimisation procedure where the measured PMT traces are compared to simulated ones, and the best-fit shower parameters are found. A detailed explanation of these steps is given, with the expected performance of FAST prototypes at the Telescope Array experiment acting as a demonstration of the technique.
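To make the two-step approach concrete, below is a minimal sketch in Python. Everything in it is an illustrative assumption rather than the collaboration's actual implementation: the Gaussian pulse model, the `simulate_traces` stand-in for the full FAST detector simulation, the hard-coded `initial_guess` standing in for the neural network's output, and the choice of SciPy's Nelder-Mead minimiser.

```python
# Minimal sketch of a two-step trace fit: an initial parameter estimate
# (here hard-coded, standing in for the neural-network prediction) seeds
# a minimisation that compares measured PMT traces to simulated ones.
import numpy as np
from scipy.optimize import minimize

def simulate_traces(params, t):
    """Toy stand-in for the FAST detector simulation: expected signal in
    each of the four PMTs for a given parameter set. Here `params` is
    (amplitude, t0, width) of a single Gaussian pulse split evenly across
    the PMTs -- purely illustrative, not the real shower model."""
    amp, t0, width = params
    pulse = amp * np.exp(-0.5 * ((t - t0) / width) ** 2)
    return np.tile(pulse / 4.0, (4, 1))  # shape: (4 PMTs, n time bins)

def cost(params, t, measured, noise_sigma):
    """Gaussian negative log-likelihood (up to a constant) between the
    measured and simulated PMT traces."""
    expected = simulate_traces(params, t)
    return np.sum((measured - expected) ** 2) / (2.0 * noise_sigma ** 2)

# Toy "measured" traces: true signal plus Gaussian noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 100.0, 200)          # time bins (arbitrary units)
true_params = (1000.0, 45.0, 8.0)
noise_sigma = 2.0
measured = simulate_traces(true_params, t) + rng.normal(0.0, noise_sigma, (4, t.size))

# Step 1: initial estimate (would come from the neural network).
initial_guess = (800.0, 50.0, 10.0)

# Step 2: minimisation seeded by that estimate.
result = minimize(cost, initial_guess, args=(t, measured, noise_sigma),
                  method="Nelder-Mead")
print("best-fit parameters:", result.x)
```

In the actual procedure the simulated traces come from a full detector simulation, and the fit varies the physical shower parameters (e.g. geometry, energy and depth of shower maximum) rather than a toy pulse shape; the neural-network estimate matters because such a fit can otherwise converge to a local minimum.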
DOI: https://doi.org/10.22323/1.444.0303

Open Access
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.