Tracking particles ejected from active asteroid Bennu with event-based vision

Loïc James Azzalini, Dario Izzo

Abstract. Early detection and tracking of ejecta in the vicinity of small solar system bodies is crucial to guarantee spacecraft safety and support scientific observation. During the visit of active asteroid Bennu, the OSIRIS-REx spacecraft relied on the analysis of images captured by onboard navigation cameras to detect particle ejection events, which ultimately became one of the mission’s scientific highlights. To increase the scientific return of similar time-constrained missions, this work proposes an event-based solution dedicated to the detection and tracking of centimetre-sized particles. Unlike a standard frame-based camera, the pixels of an event-based camera independently trigger events indicating whether the scene brightness has increased or decreased at that time and location in the sensor plane. As a result of this sparse and asynchronous spatiotemporal output, event cameras combine very high dynamic range and temporal resolution with low power consumption, which could complement existing onboard imaging techniques. This paper motivates the use of a scientific event camera by reconstructing the particle ejection episodes reported by the OSIRIS-REx mission in a photorealistic scene generator and, in turn, simulating event-based observations. The resulting streams of spatiotemporal data support future work on event-based multi-object tracking.
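The per-pixel event-generation principle described above (a pixel fires when its log-intensity changes by more than a contrast threshold, with positive or negative polarity) can be sketched with a minimal frame-to-event converter. This is an illustrative simplification of the idealised DVS pixel model, not the pipeline used by the authors; the function name `frames_to_events`, the threshold value, and the `(x, y, t, polarity)` tuple layout are assumptions for the sketch.

```python
import numpy as np

def frames_to_events(frames, timestamps, threshold=0.2, eps=1e-6):
    """Convert a sequence of intensity frames to DVS-style events.

    A pixel emits an event (x, y, t, polarity) whenever its
    log-intensity has changed by at least `threshold` relative to the
    value stored at its last event (or at the first frame).
    """
    # Per-pixel reference log-intensity, initialised from the first frame.
    log_ref = np.log(frames[0].astype(np.float64) + eps)
    events = []  # list of (x, y, t, polarity) tuples
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame.astype(np.float64) + eps)
        diff = log_i - log_ref
        for polarity in (+1, -1):
            # Pixels whose brightness rose (+1) or fell (-1) past threshold.
            mask = polarity * diff >= threshold
            ys, xs = np.nonzero(mask)
            events.extend((int(x), int(y), t, polarity)
                          for x, y in zip(xs, ys))
            # Reset the reference at pixels that just fired.
            log_ref[mask] = log_i[mask]
    return events
```

A real sensor triggers asynchronously between frames and suffers noise and latency effects; tools such as v2e [15] interpolate frames in time and model these non-idealities, whereas this sketch only thresholds successive frames.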

Keywords
Neuromorphic Vision, Event-Based Sensing, Multi-Object Tracking

Published online 11/1/2023, 5 pages
Copyright © 2023 by the author(s)
Published under license by Materials Research Forum LLC., Millersville PA, USA

Citation: Loïc James Azzalini, Dario Izzo, Tracking particles ejected from active asteroid Bennu with event-based vision, Materials Research Proceedings, Vol. 37, pp 567-571, 2023

DOI: https://doi.org/10.21741/9781644902813-124

The article was published as article 124 of the book Aeronautics and Astronautics

Content from this work may be used under the terms of the Creative Commons Attribution 3.0 license. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
