26 - 30 April 2026
National Harbor, Maryland, US
Paper 14031-15

Blink and you’ll miss it: event-based data synthesis for fast-moving object detection in constrained platforms

28 April 2026 • 2:10 PM - 2:30 PM EDT | National Harbor 5

Abstract

Detecting fast-moving objects in dynamic scenes is a fundamental challenge in event-based vision, with applications ranging from terrestrial surveillance to space-based tracking of Resident Space Objects (RSOs). Conventional frame-based imaging and convolutional pipelines struggle to meet the latency and power requirements of resource-constrained platforms. In this work, we present a scalable method for generating labeled training data by superimposing real event streams from stationary-camera foreground recordings and ego-motion background recordings, and demonstrate that a lightweight CNN trained on this data generalizes to real, unmodified event streams. This provides a practical foundation for future deployment in autonomous detection systems, including space-based applications where event cameras are well-suited to the constraints of spacecraft payloads.
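The data-synthesis step described above — compositing a stationary-camera foreground event stream onto an ego-motion background recording — can be sketched minimally as a timestamp-ordered merge of two event arrays. This is an illustrative assumption about the representation (events as (x, y, t, polarity) rows; the function name and array layout are hypothetical, not taken from the paper):

```python
import numpy as np

def superimpose_events(foreground, background):
    # Merge two event streams (rows of x, y, t, polarity) into a single
    # stream ordered by timestamp, compositing a stationary-camera
    # foreground recording over an ego-motion background recording.
    merged = np.concatenate([foreground, background], axis=0)
    return merged[np.argsort(merged[:, 2], kind="stable")]

# Tiny synthetic example: each row is (x, y, t, polarity).
fg = np.array([[10.0, 12.0, 0.002,  1.0],
               [11.0, 12.0, 0.005,  1.0]])
bg = np.array([[40.0,  3.0, 0.001, -1.0],
               [41.0,  3.0, 0.004, -1.0]])
events = superimpose_events(fg, bg)
```

In practice the known foreground-object positions would also yield bounding-box labels for each composited sample, which is what makes the synthesis scalable for training.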

Presenter

Nathan Nelson
Utah State University (United States)
Nathan Nelson is a graduate researcher in the Computer Science program at Utah State University, where he works in the DIRECT Lab on applied machine learning and sensing systems. His research interests include event-based vision, neuromorphic sensing, and the development of efficient algorithms for resource-constrained platforms. He contributes to ongoing research at the intersection of novel sensor technologies and real-time detection systems.
Application tracks: Space
Presenter/Author
Nathan Nelson
Utah State University (United States)
Author
Jarrett Parry
Utah State University (United States)
Author
Mario Harper
Utah State University (United States)