Particle-filter based Vehicle Localization by Detecting Lane Markings

When navigating on a painted road, an autonomous vehicle needs to know which lane it is in and how far it is from the lane markings. GPS and IMUs alone are often too noisy for lane-level localization. We propose using a top-mounted camera to detect lane markings and infer a much more accurate position from them. A particle filter models the vehicle dynamics, with an observation model that compares the extracted lane markings against a lane-level map. This method achieves sub-meter lateral error. (US Patent #9483700, slides)
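
As an illustration, below is a minimal particle-filter sketch of this idea: the state is a 2D pose (x, y, heading), the motion model propagates particles with the vehicle's speed and yaw rate, and the observation model weights each particle by comparing the camera-measured distance to a lane marking against the distance predicted from a lane-level map. The map_dist helper and the noise parameters are hypothetical placeholders, not the patented implementation.

```python
import numpy as np

# Minimal particle-filter sketch: each particle is a pose (x, y, heading).
# `map_dist(x, y)` is a hypothetical helper that returns the lateral distance
# from (x, y) to the nearest lane marking in the lane-level map.

def predict(particles, v, yaw_rate, dt, motion_noise):
    """Propagate each particle with a simple unicycle motion model plus noise."""
    x, y, theta = particles.T
    theta_new = theta + yaw_rate * dt
    x_new = x + v * dt * np.cos(theta_new)
    y_new = y + v * dt * np.sin(theta_new)
    moved = np.stack([x_new, y_new, theta_new], axis=1)
    return moved + np.random.normal(0.0, motion_noise, moved.shape)

def update(particles, measured_dist, map_dist, sigma=0.3):
    """Weight particles by how well the camera-measured distance to the
    lane marking matches the distance predicted from the map."""
    predicted = np.array([map_dist(x, y) for x, y, _ in particles])
    weights = np.exp(-0.5 * ((measured_dist - predicted) / sigma) ** 2)
    weights += 1e-12                      # avoid degenerate all-zero weights
    return weights / weights.sum()

def resample(particles, weights):
    """Systematic resampling to concentrate particles on likely poses."""
    n = len(particles)
    positions = (np.arange(n) + np.random.uniform()) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[np.minimum(idx, n - 1)]
```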

3D Structure-from-Motion in Infrared

Structure-from-motion systems reconstruct a 3D model of an object from images taken from multiple views. While most such systems use RGB images, we developed an algorithm that fuses thermal images taken by an infrared camera into the RGB model. Such multi-modal environment models can help fire-fighting robots locate fire victims and fire seats. Details of the algorithm and the robot can be found in these documents.
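
One simple way to fuse the two modalities, assuming the reconstructed 3D points and the infrared camera's intrinsics and pose relative to the model are already known, is to project each model point into the thermal image and read back a temperature value. The sketch below illustrates that idea with illustrative names; it is not the project's actual code.

```python
import numpy as np

def sample_thermal(points_3d, thermal_img, K_ir, R, t):
    """Project reconstructed 3D points into an infrared image and read
    back a thermal value for each point that is visible in this view."""
    # Transform model points into the IR camera frame.
    cam = (R @ points_3d.T + t.reshape(3, 1)).T          # shape (N, 3)
    in_front = cam[:, 2] > 0
    # Perspective projection with the IR camera intrinsics.
    uv = (K_ir @ cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    h, w = thermal_img.shape[:2]
    visible = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    temps = np.full(len(points_3d), np.nan)
    temps[visible] = thermal_img[v[visible], u[visible]]
    return temps  # NaN where a point falls outside this thermal view
```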

This project won the Best Overall Project prize in the 2013 DRS infrared imaging competition and received broad media coverage (CNET, Gizmodo, IEEE Spectrum, Huffington Post UK, etc.).


Stereo Image-based Stairs Modeling Using Sample Consensus

Modeling stairs is an important task for autonomous robot navigation in multilevel buildings. The proposed algorithm uses stereo images of the stairs to estimate dimensions such as the width and slope, which help robots choose the optimal paths and maneuvers to climb them.

Given stereo images of a staircase, step edges are first detected in the RGB images and projected into 3D space. The best staircase model is then found using a sample consensus approach, as sketched below. The complete algorithm is explained in this poster.
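
The sketch below illustrates the sample consensus step under simplifying assumptions: step edges are summarized by their 3D midpoints and lengths, a candidate staircase is hypothesized as the inclined plane through three sampled edge midpoints, and the slope and width are read off the best-supported plane. Names and thresholds are illustrative; the poster describes the actual model.

```python
import numpy as np

def ransac_staircase(edge_midpoints, edge_lengths, iters=500, tol=0.03):
    """Sample-consensus fit of the inclined plane that the step edges lie on.
    edge_midpoints: (N, 3) 3D midpoints of detected step edges (z up, meters).
    edge_lengths:   (N,)   lengths of those edges (proxy for stair width)."""
    best_inliers, best_normal = np.array([], dtype=int), None
    for _ in range(iters):
        i, j, k = np.random.choice(len(edge_midpoints), 3, replace=False)
        p0, p1, p2 = edge_midpoints[[i, j, k]]
        normal = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(normal) < 1e-9:      # degenerate (collinear) sample
            continue
        normal /= np.linalg.norm(normal)
        # Distance of every edge midpoint to the candidate plane.
        dist = np.abs((edge_midpoints - p0) @ normal)
        inliers = np.flatnonzero(dist < tol)
        if len(inliers) > len(best_inliers):
            best_inliers, best_normal = inliers, normal
    # Slope: angle between the fitted plane and the horizontal ground plane.
    slope_deg = np.degrees(np.arccos(abs(best_normal[2])))
    # Width: a robust summary of the inlier edge lengths.
    width = np.median(edge_lengths[best_inliers])
    return best_normal, best_inliers, slope_deg, width
```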

Lidar-based Simultaneous Localization and Mapping with Mobile Robot iSee

I led the embedded development for iSee, an Arduino-based mobile robot designed by the UCSD Coordinated Robotics Lab. The robot uses an Arduino Uno to control four holonomic wheels, and an Arduino Mega to collect range data from a Hokuyo URG-04LX laser rangefinder and transmit it over WiFi. Below is an occupancy grid map produced by iSee using a binary Bayesian filter. The robot plans its route using the A* algorithm.
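
A minimal log-odds sketch of such a binary Bayesian filter is shown below, assuming each lidar beam has already been ray-traced to the grid cells it passes through and the cell where it terminates. The increment and clamp values are illustrative, not the values used on iSee.

```python
import numpy as np

L_FREE, L_OCC = -0.4, 0.85        # log-odds increments (illustrative values)
L_MIN, L_MAX = -4.0, 4.0          # clamp so cells never saturate completely

def update_grid(log_odds, free_cells, hit_cell):
    """Binary Bayes filter update for one lidar beam.
    `free_cells` are the (row, col) cells the beam passed through;
    `hit_cell` is the (row, col) cell where the beam terminated."""
    for r, c in free_cells:
        log_odds[r, c] = np.clip(log_odds[r, c] + L_FREE, L_MIN, L_MAX)
    r, c = hit_cell
    log_odds[r, c] = np.clip(log_odds[r, c] + L_OCC, L_MIN, L_MAX)
    return log_odds

def occupancy_prob(log_odds):
    """Convert log-odds back to occupancy probabilities in [0, 1]."""
    return 1.0 - 1.0 / (1.0 + np.exp(log_odds))
```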