The fusion and integration of virtual sensors

There are numerous sensors from which to choose when designing a mobile robot: ultrasonic, infrared, radar, or laser range finders, video, collision detectors, or beacon-based systems such as the Global Positioning System. To meet the need for reliability, accuracy, and fault tolerance, mobile robot designers often place multiple sensors on the same platform, or combine sensor data from multiple platforms. The combination of data from multiple sensors to improve reliability, accuracy, and fault tolerance is termed Sensor Fusion.

The types of robotic sensors are as varied as the properties of the environment that need to be sensed. To reduce the complexity of system software, roboticists have found it highly desirable to adopt a common interface between each type of sensor and the system responsible for fusing the information. The process of abstracting the essential properties of a sensor is called Sensor Virtualization.

Sensor virtualization to date has focused on abstracting the properties shared by sensors of the same type. The approach taken by T. Henderson is simply to expose to the fusion system only the data from the sensor, along with a textual label describing the sensor. We extend Henderson's work in the following manner. First, we encapsulate both the fusion algorithm and the interface layer in the virtual sensor. This allows us to build multi-tiered virtual sensor hierarchies. Second, we show how common fusion algorithms can be encapsulated in the virtual sensor, facilitating the integration and replacement of both physical and virtual sensors. Finally, we provide a physical proof of concept using monostatic sonars, vector sonars, and a laser range finder.

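The abstract's central idea is a virtual sensor that encapsulates both its fusion algorithm and its interface layer, so that fused outputs can themselves be composed into higher tiers. The following minimal Python sketch illustrates that structure under stated assumptions; it is not the dissertation's implementation, and the class names (VirtualSensor, PhysicalSensor, FusedSensor) and the simple averaging fusion rule are illustrative choices only.

"""Illustrative sketch of hierarchical virtual sensors (not the dissertation's code).

A VirtualSensor exposes a single read() interface. PhysicalSensor adapts a raw
measurement source to that interface; FusedSensor wraps other virtual sensors
plus a fusion function, so fused outputs can themselves be composed into
higher tiers of the hierarchy.
"""
from abc import ABC, abstractmethod
from statistics import mean
from typing import Callable, Sequence


class VirtualSensor(ABC):
    """Common interface: every sensor, physical or fused, returns a range in meters."""

    def __init__(self, label: str) -> None:
        self.label = label

    @abstractmethod
    def read(self) -> float:
        ...


class PhysicalSensor(VirtualSensor):
    """Adapts a raw device-reading callable to the common interface."""

    def __init__(self, label: str, source: Callable[[], float]) -> None:
        super().__init__(label)
        self._source = source

    def read(self) -> float:
        return self._source()


class FusedSensor(VirtualSensor):
    """Encapsulates a fusion rule over child sensors; behaves like one sensor."""

    def __init__(self, label: str,
                 children: Sequence[VirtualSensor],
                 fuse: Callable[[Sequence[float]], float] = mean) -> None:
        super().__init__(label)
        self._children = list(children)
        self._fuse = fuse

    def read(self) -> float:
        return self._fuse([child.read() for child in self._children])


if __name__ == "__main__":
    # Two-tier hierarchy: two sonars are fused first, and their fused output is
    # then combined with a laser range finder. Constant readings stand in for
    # real hardware.
    sonar_a = PhysicalSensor("sonar-a", lambda: 2.10)
    sonar_b = PhysicalSensor("sonar-b", lambda: 2.30)
    laser = PhysicalSensor("laser", lambda: 2.18)

    sonar_pair = FusedSensor("sonar-pair", [sonar_a, sonar_b])
    range_estimate = FusedSensor("range", [sonar_pair, laser])

    print(round(range_estimate.read(), 2))  # 2.19

Because FusedSensor presents the same read() interface as PhysicalSensor, either kind can be swapped in or out of a hierarchy without changes to the consuming code, which is the integration-and-replacement property the abstract emphasizes.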

Bibliographic Details
Main Author: Litant, Thomas F.
Format: Others
Language: English
Published: W&M ScholarWorks, 2002
Subjects: Artificial Intelligence and Robotics; Computer Sciences
Online Access: https://scholarworks.wm.edu/etd/1539623397
https://scholarworks.wm.edu/cgi/viewcontent.cgi?article=3188&context=etd