Master of Science in Electrical Engineering
Department of Electrical and Computer Engineering
Robert C. Leishman, PhD
Event-based cameras are a new type of visual sensor that operates under a unique paradigm. These cameras provide asynchronous data on log-scale changes in light intensity for individual pixels, independent of other pixels' measurements. Through this hardware-level approach to change detection, these cameras achieve microsecond fidelity, millisecond latency, and ultra-wide dynamic range, all with very low power requirements. These advantages make event-based cameras excellent candidates for visual odometry (VO) for unmanned aerial vehicle (UAV) navigation. This document presents the research and implementation of an event-based visual-inertial odometry (EVIO) pipeline, which estimates a vehicle's 6-degrees-of-freedom (DOF) motion and pose using an affixed event-based camera with an integrated Micro-Electro-Mechanical Systems (MEMS) inertial measurement unit (IMU). The front-end of the EVIO pipeline uses the pipeline's current motion estimate to generate motion-compensated frames from the asynchronous event camera data. These frames are fed to the back-end of the pipeline, which uses a Multi-State Constraint Kalman Filter (MSCKF) implemented with Scorpion, a Bayesian state estimation framework developed by the Autonomy and Navigation Technology (ANT) Center at the Air Force Institute of Technology (AFIT). This EVIO pipeline was tested on selections from the benchmark Event Camera Dataset and on a dataset collected, as part of this research, during the ANT Center's first flight test with an event-based camera.
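The motion-compensation step in the front-end can be illustrated with a minimal sketch. This is not the thesis's implementation: it assumes each event is a tuple (x, y, t, polarity) and that the motion estimate reduces to a constant per-pixel optical flow (vx, vy), under which each event is warped back to a common reference time and accumulated into a frame.

```python
import numpy as np

def motion_compensated_frame(events, flow, t_ref, shape=(180, 240)):
    """Accumulate asynchronous events into a motion-compensated frame.

    Illustrative sketch only: warps each event (x, y, t, p) to a common
    reference time t_ref using an assumed constant pixel flow, then sums
    polarities into an image. A real pipeline would derive the warp from
    the full 6-DOF motion estimate rather than a constant flow.

    events : (N, 4) array of [x, y, t, polarity(+1/-1)]
    flow   : (vx, vy) pixel velocity in pixels/second (assumed constant)
    t_ref  : reference time to which all events are warped
    """
    frame = np.zeros(shape, dtype=np.float64)
    vx, vy = flow
    for x, y, t, p in events:
        # Warp the event back along the flow to the reference time.
        xw = int(round(x - vx * (t - t_ref)))
        yw = int(round(y - vy * (t - t_ref)))
        if 0 <= yw < shape[0] and 0 <= xw < shape[1]:
            frame[yw, xw] += p
    return frame
```

With a well-estimated motion, events triggered by the same scene edge collapse onto the same pixels, producing a sharp frame suitable for the feature tracking the MSCKF back-end consumes.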
Nelson, Kaleb J., "Event-Based Visual-Inertial Odometry on a Fixed-Wing Unmanned Aerial Vehicle" (2019). Theses and Dissertations. 2276.