A real-time image understanding system for an autonomous mobile robot


Bibliographic Details
Main Author: Remias, Leonard V.
Other Authors: Kanayama, Yutaka
Language: English
Published: Monterey, California: Naval Postgraduate School, 2012
Online Access: http://hdl.handle.net/10945/8887
Description
Summary: Approved for public release; distribution is unlimited. Yamabico-11 is an autonomous mobile robot used as a research platform, one area of which is image understanding. Previous work focused on edge detection analysis on a Silicon Graphics Iris (SGI) workstation, with no method for implementation on the robot. Yamabico-11 lacks an on-board image processing capability for detecting straight edges in a grayscale image, as well as a means for the user to analyze the resulting data. The approach taken for system development is based in part on the edge extraction and line fitting algorithms of (PET92) and on a 3-D geometric model of the robot's world (STE92). The image grabbing routines of (KIS95) were used to capture images with the robot's digital-output camera, and the images were processed using image understanding routines developed for an SGI workstation; these routines were modified and ported onto the robot. The new method of edge extraction produces less ambient noise and more continuous vertical line segments in the gradient image, which enhances pattern-matching analysis of the image. Yamabico-11's computer system can capture an image with a resolution of 739 x 484 active picture elements. Edge detection analysis is performed on the robot, generating a list structure of edges that is stored in the robot's memory for user analysis.
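The abstract describes the on-board pipeline only at a high level: a gradient-based pass that favors continuous vertical line segments, and a list structure of edges kept in the robot's memory for later user analysis. As an illustration only, a minimal C sketch of that kind of pass is given below; every name, the threshold parameter, and the fixed 739 x 484 frame handling are assumptions for the sketch, not details taken from the thesis.

    #include <stdlib.h>

    #define IMG_W 739   /* active picture elements per line (from the abstract) */
    #define IMG_H 484

    /* One detected vertical edge segment, kept in a singly linked list
     * so results can be stored in memory for later user analysis. */
    typedef struct EdgeSeg {
        int col;              /* image column of the segment         */
        int row_start;        /* first row of the continuous run     */
        int row_end;          /* last row of the continuous run      */
        struct EdgeSeg *next;
    } EdgeSeg;

    /* Prepend a segment to the list (hypothetical helper). */
    static EdgeSeg *push_segment(EdgeSeg *head, int col, int r0, int r1)
    {
        EdgeSeg *s = malloc(sizeof *s);
        if (!s) return head;
        s->col = col;
        s->row_start = r0;
        s->row_end = r1;
        s->next = head;
        return s;
    }

    /* Scan a grayscale frame for vertical edges using a simple horizontal
     * gradient |I(x+1,y) - I(x-1,y)| and collect continuous runs per column.
     * 'thresh' is an assumed tuning parameter, not a value from the thesis. */
    EdgeSeg *extract_vertical_edges(const unsigned char img[IMG_H][IMG_W], int thresh)
    {
        EdgeSeg *list = NULL;
        for (int x = 1; x < IMG_W - 1; x++) {
            int run_start = -1;
            for (int y = 0; y < IMG_H; y++) {
                int g = img[y][x + 1] - img[y][x - 1];
                int strong = (g < 0 ? -g : g) > thresh;
                if (strong && run_start < 0) {
                    run_start = y;                 /* segment begins         */
                } else if (!strong && run_start >= 0) {
                    list = push_segment(list, x, run_start, y - 1);
                    run_start = -1;                /* segment ends           */
                }
            }
            if (run_start >= 0)                    /* run reaches bottom row */
                list = push_segment(list, x, run_start, IMG_H - 1);
        }
        return list;
    }

A fuller implementation along the lines of (PET92) would additionally merge neighboring column runs and fit straight line segments to them before handing the list to the pattern-matching stage.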