In this paper, experiments are presented in support of an adaptive color-depth (RGB-D) camera-based visual odometry algorithm. The goal of visual odometry is to estimate the egomotion of a robot using images from a camera attached to the robot. This type of measurement can be extremely useful when position sensor information, such as GPS, is unavailable, and when other motion sensors (e.g., wheel encoders) are inaccurate (e.g., due to wheel slip). In the presented method, visual odometry algorithm parameters are adapted to ensure that odometry measurements are accurate while also considering computational cost. In this paper, live experiments are performed that show the feasibility of implementing the proposed algorithm on small wheeled mobile robots.
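The core idea of adapting visual odometry parameters to balance accuracy against computational cost can be sketched as a simple feedback rule. The snippet below is a hypothetical illustration only, not the paper's method: the parameter being adapted (the number of tracked features), the thresholds, and all function names are assumptions made for the example.

```python
# Hypothetical sketch of the adaptive trade-off described in the abstract:
# tune one visual-odometry parameter (here, the number of tracked features)
# to balance estimation accuracy against per-frame computation time.
# All names and thresholds are illustrative, not taken from the paper.

def adapt_feature_count(n_features, frame_time_s, inlier_ratio,
                        time_budget_s=0.05, min_inlier_ratio=0.6,
                        n_min=100, n_max=1000, step=50):
    """Return an updated feature count for the next frame."""
    if frame_time_s > time_budget_s:
        # Over the compute budget: track fewer features next frame.
        n_features -= step
    elif inlier_ratio < min_inlier_ratio:
        # Pose estimate looks unreliable: track more features.
        n_features += step
    return max(n_min, min(n_max, n_features))

# Example: a slow frame triggers a reduction in tracked features.
print(adapt_feature_count(500, frame_time_s=0.08, inlier_ratio=0.8))  # 450
```

In a real implementation, other parameters (keyframe rate, RANSAC iterations, image resolution) could be adapted by the same pattern; the choice of which parameter to adapt is the design decision the paper's experiments evaluate.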
- Dynamic Systems and Control Division
Adaptive RGB-D Visual Odometry for Mobile Robots: An Experimental Study
Anderson, JW, Fabian, JR, & Clayton, GM. "Adaptive RGB-D Visual Odometry for Mobile Robots: An Experimental Study." Proceedings of the ASME 2015 Dynamic Systems and Control Conference. Volume 3: Multiagent Network Systems; Natural Gas and Heat Exchangers; Path Planning and Motion Control; Powertrain Systems; Rehab Robotics; Robot Manipulators; Rollover Prevention (AVS); Sensors and Actuators; Time Delay Systems; Tracking Control Systems; Uncertain Systems and Robustness; Unmanned, Ground and Surface Robotics; Vehicle Dynamics Control; Vibration and Control of Smart Structures/Mech Systems; Vibration Issues in Mechanical Systems. Columbus, Ohio, USA. October 28–30, 2015. V003T49A005. ASME. https://doi.org/10.1115/DSCC2015-9829