Synthesizing Context Aware Animation for Street Navigation


Bibliographic Details
Main Authors: JIH, CHAO-TING, 季昭霆
Other Authors: CHEN, LIEU-HEN
Format: Others
Language: zh-TW
Published: 2016
Online Access: http://ndltd.ncl.edu.tw/handle/39482290396254875648
Description
Summary: Master === National Chi Nan University === Department of Computer Science and Information Engineering === 104 === Thanks to advances in science and technology, we can take advantage of a wide variety of mobile devices to obtain the latest information. For map navigation, the research field of route recommendation has produced several results in recent years; however, these results are based merely on observing linear variation on a 2D planar map. On the other hand, Google Maps, a widely used online map service, provides street-view navigation with real-world pictures. In many cases, these images are too complex and contain too much useless noise (e.g., image distortion, overexposure, or ghosting) for users to work with. Furthermore, the scenes are static and unchangeable regardless of time of day, weather, or season. In this research, we propose a novel navigation system that visualizes the results of route recommendation with non-photorealistic rendering (NPR) animation effects. In our previous work, we had to extract street-view images manually because Google Maps does not provide a download service. To solve this problem, in this project we have developed an automatic street-view capture system, which makes our system more convenient and practical to use. Users only need to enter the start and end points, and our system captures all the street-view images automatically. In addition, by integrating real-time information from OpenWeatherMap, our system can show current weather conditions on the street-view images and create more interesting scenes. As shown by our current experimental results, the system creates distinct and engaging user experiences by providing vivid navigation videos with various NPR effects.
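
The thesis's capture pipeline itself is not published in this record. As a minimal sketch only, the following Python snippet illustrates the kind of automated workflow the abstract describes, assuming the public Google Street View Static API and the OpenWeatherMap current-weather endpoint; the waypoints, keys, and helper names are purely illustrative and are not taken from the authors' system.

```python
# Illustrative sketch: fetch street-view frames along route waypoints and the
# current weather condition that an NPR renderer could stylize.
# Endpoints are the public Street View Static API and OpenWeatherMap API;
# all parameters, keys, and coordinates below are assumptions for illustration.
import requests

STREETVIEW_URL = "https://maps.googleapis.com/maps/api/streetview"
WEATHER_URL = "https://api.openweathermap.org/data/2.5/weather"


def capture_streetview(lat, lon, heading, api_key, size="640x640"):
    """Download one street-view frame (JPEG bytes) at a position and heading."""
    params = {"size": size, "location": f"{lat},{lon}",
              "heading": heading, "key": api_key}
    resp = requests.get(STREETVIEW_URL, params=params, timeout=10)
    resp.raise_for_status()
    return resp.content


def current_weather(lat, lon, api_key):
    """Query OpenWeatherMap for the current condition (e.g. 'Rain', 'Clear')."""
    params = {"lat": lat, "lon": lon, "appid": api_key}
    resp = requests.get(WEATHER_URL, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()["weather"][0]["main"]


if __name__ == "__main__":
    # Hypothetical waypoints sampled along a recommended route.
    route = [(23.9511, 120.9284), (23.9520, 120.9300)]
    for lat, lon in route:
        frame = capture_streetview(lat, lon, heading=90, api_key="GOOGLE_KEY")
        condition = current_weather(lat, lon, api_key="OWM_KEY")
        print(len(frame), "bytes,", condition)
```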