26 - 30 April 2026
National Harbor, Maryland, US
Paper 14031-17

Frequency-domain event-based imaging for selective surveillance

28 April 2026 • 2:30 PM - 2:50 PM EDT | National Harbor 5

Abstract

Event-based cameras (EBCs) encode pixel-level radiance changes with microsecond latency and high dynamic range, enabling robust motion extraction while suppressing static background clutter. However, their sparse, asynchronous output requires algorithms that operate directly in event space. We present Frequency Rate Information for Event-Space (FRIES), a neuromorphic framework that detects periodic structure in event streams to discriminate man-made objects, such as drones, via rotor and vibration signatures. FRIES temporally filters events, forms activity-based regions of interest, and applies localized Fourier analysis to extract dominant frequencies, which are then tracked using a Resonant Time Surface. Indoor and outdoor experiments demonstrate recovery of mechanical chopper and drone rotor rates from ~74 Hz to 451 Hz, highlighting frequency-domain event processing as a promising front end for selective surveillance.
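The core step above, recovering a dominant rate from a sparse, asynchronous event stream via Fourier analysis, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function name `dominant_frequency`, the 2 kHz binning rate, and the synthetic 120 Hz event source are all assumptions introduced for the example.

```python
import numpy as np

def dominant_frequency(event_times, duration, bin_rate=2000.0):
    """Bin asynchronous event timestamps into a regular time series and
    return the dominant frequency via an FFT (a stand-in for a localized
    Fourier analysis over one region of interest's event stream)."""
    n_bins = int(duration * bin_rate)
    counts, _ = np.histogram(event_times, bins=n_bins, range=(0.0, duration))
    counts = counts - counts.mean()                 # remove the DC component
    spectrum = np.abs(np.fft.rfft(counts))
    freqs = np.fft.rfftfreq(n_bins, d=1.0 / bin_rate)
    return freqs[np.argmax(spectrum)]

# Synthetic event stream: event density modulated at 120 Hz over 1 s,
# mimicking periodic brightness changes from a spinning rotor blade.
rng = np.random.default_rng(0)
cand = rng.uniform(0.0, 1.0, 20000)
keep = cand[rng.uniform(0.0, 1.0, cand.size)
            < 0.5 * (1.0 + np.cos(2.0 * np.pi * 120.0 * cand))]
f_est = dominant_frequency(keep, duration=1.0)
```

With a 2 kHz binning rate over 1 s, the frequency resolution is 1 Hz, comfortably resolving rates in the paper's ~74–451 Hz range while staying above their Nyquist limits.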

Presenter

Megan Birch
Georgia Tech Research Institute (United States)
Event-based cameras (EBCs) encode pixel-level radiance changes with microsecond resolution and high dynamic range, enabling motion extraction while suppressing static background. Their asynchronous, sparse output calls for algorithms that identify targets directly in event space. We introduce Frequency Rate Information for Event-Space (FRIES), a neuromorphic framework for discriminating and monitoring man-made objects by detecting periodicity in pixel-level events, such as rotor rotation. FRIES performs temporal gating as a low-pass filter to suppress high-frequency noise, then aggregates events into a pixel-wise activity map and clusters them into Regions of Interest (ROIs). A localized Fourier analysis on each ROI extracts dominant frequencies to separate structured object signatures from background. Objects are then tracked using a Resonant Time Surface (RTS), which weights events by phase coherence with the target frequency, enhancing the periodic signal while suppressing clutter.
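The phase-coherence weighting behind the Resonant Time Surface can be sketched with a complex-phasor accumulator. This is a hypothetical minimal version: the function name, the phasor formulation, and the absence of any temporal decay are assumptions for illustration, and the paper's RTS may differ in detail.

```python
import numpy as np

def resonant_time_surface(times, xs, ys, f_target, shape):
    """Accumulate one complex phasor exp(i*2*pi*f_target*t) per event at
    its pixel. Events phase-locked to f_target add coherently (large
    magnitude); aperiodic clutter adds with random phase and mostly
    cancels, so the magnitude map highlights resonant pixels."""
    surface = np.zeros(shape, dtype=complex)
    phasors = np.exp(2j * np.pi * f_target * times)
    np.add.at(surface, (ys, xs), phasors)       # scatter-add per pixel
    return np.abs(surface)

rng = np.random.default_rng(1)
f = 120.0
n = 200
# Pixel (x=2, y=3): events phase-locked to the 120 Hz target frequency.
t_coh = np.arange(n) / f
# Pixel (x=7, y=7): clutter-like events at random times in the same window.
t_rnd = rng.uniform(0.0, n / f, n)
times = np.concatenate([t_coh, t_rnd])
xs = np.concatenate([np.full(n, 2), np.full(n, 7)])
ys = np.concatenate([np.full(n, 3), np.full(n, 7)])
rts = resonant_time_surface(times, xs, ys, f, shape=(10, 10))
```

The phase-locked pixel accumulates a magnitude near the event count (here ~200), while the random-phase pixel grows only on the order of the square root of its count, which is the clutter-suppression effect the abstract describes.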
Application tracks: AI/ML
Presenter/Author
Megan Birch
Georgia Tech Research Institute (United States)
Author
Georgia Tech Research Institute (United States)
Author
Georgia Tech Research Institute (United States)
Author
Georgia Tech Research Institute (United States)
Author
Joseph Greene
Georgia Tech Research Institute (United States)