An intelligent multi-user office environment using neural network
Master's === Chaoyang University of Technology === Master's Program, Department of Information and Communication === 100 === This study used an artificial neural network to learn users' behaviors, make judgments based on each individual's behavioral characteristics, and predict each individual's next action in order to pre-control the...
Main Authors: Liang-Fu Chen (陳亮甫)
Other Authors: Kuo-An Hwang (黃國安)
Format: Others
Language: zh-TW
Published: 2012
Online Access: http://ndltd.ncl.edu.tw/handle/24985545671342861925
id | ndltd-TW-100CYUT5652007
record_format | oai_dc
spelling | ndltd-TW-100CYUT56520072015-10-13T21:17:24Z http://ndltd.ncl.edu.tw/handle/24985545671342861925 An intelligent multi-user office environment using neural network 利用類神經網路構建多使用者智慧型辦公環境 Liang-Fu Chen 陳亮甫 Master's === Chaoyang University of Technology === Master's Program, Department of Information and Communication === 100 === This study used an artificial neural network to learn users' behaviors, make judgments based on each individual's behavioral characteristics, and predict each individual's next action in order to pre-control the electric appliances in the environment. In a multi-user office environment, sensors must be installed for data mining. Because a large area and a large number of sensors are required, and the sensors cannot identify users, this study divided the sensors into two types, private and global. By comparing their data, the study identified and tracked users, separated the mixed multi-user data into individual records, and recorded each user's behaviors and reactions for learning. User behaviors could then be distinguished by their feature values, achieving intelligent multi-user control. Kuo-An Hwang 黃國安 2012 學位論文 ; thesis 50 zh-TW
collection | NDLTD
language | zh-TW
format | Others
sources | NDLTD
description | Master's === Chaoyang University of Technology === Master's Program, Department of Information and Communication === 100 === This study used an artificial neural network to learn users' behaviors, make judgments based on each individual's behavioral characteristics, and predict each individual's next action in order to pre-control the electric appliances in the environment. In a multi-user office environment, sensors must be installed for data mining. Because a large area and a large number of sensors are required, and the sensors cannot identify users, this study divided the sensors into two types, private and global. By comparing their data, the study identified and tracked users, separated the mixed multi-user data into individual records, and recorded each user's behaviors and reactions for learning. User behaviors could then be distinguished by their feature values, achieving intelligent multi-user control.
author2 | Kuo-An Hwang
author_facet | Kuo-An Hwang Liang-Fu Chen 陳亮甫
author | Liang-Fu Chen 陳亮甫
spellingShingle | Liang-Fu Chen 陳亮甫 An intelligent multi-user office environment using neural network
author_sort | Liang-Fu Chen
title | An intelligent multi-user office environment using neural network
title_short | An intelligent multi-user office environment using neural network
title_full | An intelligent multi-user office environment using neural network
title_fullStr | An intelligent multi-user office environment using neural network
title_full_unstemmed | An intelligent multi-user office environment using neural network
title_sort | intelligent multi-user office environment using neural network
publishDate | 2012
url | http://ndltd.ncl.edu.tw/handle/24985545671342861925
work_keys_str_mv | AT liangfuchen anintelligentmultiuserofficeenvironmentusingneuralnetwork AT chénliàngfǔ anintelligentmultiuserofficeenvironmentusingneuralnetwork AT liangfuchen lìyònglèishénjīngwǎnglùgòujiànduōshǐyòngzhězhìhuìxíngbàngōnghuánjìng AT chénliàngfǔ lìyònglèishénjīngwǎnglùgòujiànduōshǐyòngzhězhìhuìxíngbàngōnghuánjìng AT liangfuchen intelligentmultiuserofficeenvironmentusingneuralnetwork AT chénliàngfǔ intelligentmultiuserofficeenvironmentusingneuralnetwork
_version_ | 1718059425855438848
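The abstract describes splitting sensors into private (bound to one user) and global (shared) types, then attributing mixed multi-user readings to individual users before training a network on each user's log. The thesis gives no implementation details, so the following is only a minimal sketch of that attribution step, with hypothetical sensor names, an assumed owner mapping, and a trivial most-recent-user rule standing in for the thesis's actual compare-and-track method.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    time: float    # timestamp of the event
    sensor: str    # sensor identifier, e.g. "desk_A" or "hall_motion"
    value: float   # measured value

# Assumed mapping: each private sensor belongs to exactly one known user.
PRIVATE_OWNER = {"desk_A": "alice", "desk_B": "bob"}

def split_by_user(readings):
    """Attribute each reading to a user, turning mixed multi-user data
    into per-user behavior logs. Private readings identify their owner
    directly; global readings are credited to the most recently seen
    user (a crude stand-in for the thesis's tracking step)."""
    logs = {}
    last_user = None
    for r in sorted(readings, key=lambda r: r.time):
        if r.sensor in PRIVATE_OWNER:
            last_user = PRIVATE_OWNER[r.sensor]
            logs.setdefault(last_user, []).append(r)
        elif last_user is not None:
            logs.setdefault(last_user, []).append(r)
    return logs
```

In the approach the abstract outlines, per-user logs like these would then supply the feature values from which a neural network learns each user's behavior and predicts the next action to pre-control appliances.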