Indoor AR-based Multi-user Navigation for Merchandise Shopping Using Down-looking Omni-cameras

Master's === National Chiao Tung University === Institute of Computer Science and Engineering === 101 === When people enter unfamiliar indoor environments, such as shopping malls, supermarkets, and grocery stores, they generally have to rely on staff members to guide them to the locations of desired merchandise items. In this study, an indoor multi-user navigation...


Bibliographic Details
Main Authors: Yang, Shu-Lin, 楊舒琳
Other Authors: Tsai, Wen-Hsiang
Format: Others
Language: en_US
Published: 2013
Online Access: http://ndltd.ncl.edu.tw/handle/35507810212955920128
id ndltd-TW-101NCTU5394133
record_format oai_dc
spelling ndltd-TW-101NCTU5394133 2016-05-22T04:33:54Z http://ndltd.ncl.edu.tw/handle/35507810212955920128 Indoor AR-based Multi-user Navigation for Merchandise Shopping Using Down-looking Omni-cameras 以俯視式環場攝影機作多人擴增實境式室內商品導覽 Yang, Shu-Lin 楊舒琳 碩士 國立交通大學 資訊科學與工程研究所 101 Tsai, Wen-Hsiang 蔡文祥 2013 學位論文 ; thesis 120 en_US
collection NDLTD
language en_US
format Others
sources NDLTD
description Master's === National Chiao Tung University === Institute of Computer Science and Engineering === 101 === When people enter unfamiliar indoor environments, such as shopping malls, supermarkets, and grocery stores, they generally have to rely on staff members to guide them to the locations of desired merchandise items. In this study, an indoor multi-user navigation system based on augmented reality (AR) and computer vision techniques, running on a mobile device such as an HTC Flyer, is proposed. First, an indoor vision infrastructure is set up by attaching fisheye cameras to the ceiling of the navigation environment. The locations and orientations of multiple users are detected by a remote server-side system from the images acquired by the fisheye cameras, and the analysis results are sent to the client-side system on each user’s mobile device. Meanwhile, the server-side system also analyzes the acquired images to recognize merchandise items and sends information about the surrounding environment and the merchandise items, as well as the navigation path, to the client-side system. The client-side system then displays this information as AR overlays on the mobile device, providing clear guidance for each user during navigation.

For multi-user identification, a method is proposed in which a multicolor edge mark is attached on top of each user’s mobile device; the server-side system locates the mark in each consecutive image frame captured by the closest fisheye camera and classifies it by its color pattern to obtain the user’s identification number. For multi-user localization, a method is proposed to analyze the omni-images captured by the fisheye cameras and detect human activities in the environment: the server-side system separates the foreground from the background and detects each user’s location. Furthermore, three techniques are proposed and integrated to detect user orientation effectively: analysis of user motion across consecutive images, use of the orientation sensor on the user’s mobile device, and estimation of the direction of the multicolor edge mark on top of the device from the omni-image.

For AR-based merchandise guidance, the client-side system sends the image captured by the device camera to the server-side system, which extracts features with the SURF algorithm, matches them against a pre-constructed merchandise image database, and transmits the corresponding merchandise information back to the client-side system for AR display. Also, a path-planning technique generates a collision-free path from the user’s current position to a selected merchandise item using an environment map. Finally, the navigation and merchandise information is overlaid onto the images shown on the mobile devices. In this way, the system accomplishes the AR functions and provides convenient guidance for merchandise shopping and similar activities. Good experimental results show the feasibility of the proposed system and methods.
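The merchandise-recognition step described above, in which SURF features extracted from the client's camera image are matched against a pre-constructed merchandise image database, could be sketched roughly as follows. This is only an illustrative reconstruction, not the thesis's implementation: it assumes OpenCV with the contrib (non-free) modules so that SURF is available, and the database layout, thresholds, and function names are hypothetical.

```python
import cv2

# Illustrative sketch only, not the thesis implementation.
# Assumes opencv-contrib-python built with non-free modules so that
# cv2.xfeatures2d.SURF_create exists; thresholds are hypothetical.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
matcher = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5),  # FLANN KD-tree index
                                dict(checks=50))

def build_database(item_images):
    """Precompute SURF descriptors for each merchandise image.
    item_images: dict mapping item name -> grayscale image (assumed layout)."""
    db = {}
    for name, img in item_images.items():
        _, descriptors = surf.detectAndCompute(img, None)
        if descriptors is not None:
            db[name] = descriptors
    return db

def recognize(query_img, db, ratio=0.7, min_matches=10):
    """Return the database item whose image best matches the query, or None."""
    _, query_des = surf.detectAndCompute(query_img, None)
    if query_des is None:
        return None
    best_name, best_count = None, 0
    for name, item_des in db.items():
        pairs = matcher.knnMatch(query_des, item_des, k=2)
        # Lowe's ratio test keeps only distinctive correspondences.
        good = [m for m, n in (p for p in pairs if len(p) == 2)
                if m.distance < ratio * n.distance]
        if len(good) > best_count:
            best_name, best_count = name, len(good)
    return best_name if best_count >= min_matches else None
```

In such a setup, the server would call recognize on each frame uploaded by a client and use the returned item name to look up the merchandise information sent back for AR display.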
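The path planning mentioned above is described only at this level of detail, so the following is a generic stand-in rather than the technique actually used in the thesis: an A* search over a grid-based environment map in which occupied cells (shelves, counters, walls) are obstacles. The grid representation and function name are assumptions.

```python
import heapq

def plan_path(grid, start, goal):
    """A* search for a collision-free path on a 4-connected occupancy grid.
    grid: 2D list where 0 = free cell and 1 = obstacle; start, goal: (row, col).
    Illustrative stand-in only; the thesis does not specify its planner."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), 0, start, None)]   # entries are (f, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:                  # already expanded with a better cost
            continue
        came_from[cur] = parent
        if cur == goal:                       # reconstruct the path back to the start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0):
                ng = g + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, cur))
    return None  # no collision-free path exists
```

For example, plan_path(grid, user_cell, item_cell) would return the sequence of free cells to be drawn as the navigation path on the user's AR view.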
author2 Tsai, Wen-Hsiang
author_facet Tsai, Wen-Hsiang
Yang, Shu-Lin
楊舒琳
author Yang, Shu-Lin
楊舒琳
spellingShingle Yang, Shu-Lin
楊舒琳
Indoor AR-based Multi-user Navigation for Merchandise Shopping Using Down-looking Omni-cameras
author_sort Yang, Shu-Lin
title Indoor AR-based Multi-user Navigation for Merchandise Shopping Using Down-looking Omni-cameras
title_short Indoor AR-based Multi-user Navigation for Merchandise Shopping Using Down-looking Omni-cameras
title_full Indoor AR-based Multi-user Navigation for Merchandise Shopping Using Down-looking Omni-cameras
title_fullStr Indoor AR-based Multi-user Navigation for Merchandise Shopping Using Down-looking Omni-cameras
title_full_unstemmed Indoor AR-based Multi-user Navigation for Merchandise Shopping Using Down-looking Omni-cameras
title_sort indoor ar-based multi-user navigation for merchandise shopping using down-looking omni-cameras
publishDate 2013
url http://ndltd.ncl.edu.tw/handle/35507810212955920128
work_keys_str_mv AT yangshulin indoorarbasedmultiusernavigationformerchandiseshoppingusingdownlookingomnicameras
AT yángshūlín indoorarbasedmultiusernavigationformerchandiseshoppingusingdownlookingomnicameras
AT yangshulin yǐfǔshìshìhuánchǎngshèyǐngjīzuòduōrénkuòzēngshíjìngshìshìnèishāngpǐndǎolǎn
AT yángshūlín yǐfǔshìshìhuánchǎngshèyǐngjīzuòduōrénkuòzēngshíjìngshìshìnèishāngpǐndǎolǎn
_version_ 1718274777499566080