LEADER 01648 am a22001333u 4500
001 49592
042    |a dc
100 1 0 |a Basori, Ahmad Hoirul |e author
245 0 0 |a Emotion walking for humanoid avatars using brain signals
260    |b Sage Journals, |c 2013.
856    |z Get fulltext |u http://eprints.utm.my/id/eprint/49592/1/AhmadHoirulBasori2013_Emotionwalkingforhumanoid.pdf
520    |a Interaction between humans and humanoid avatar representations is very important in virtual reality and robotics, since the humanoid avatar can represent either a human or a robot in a virtual environment. Many researchers have focused on providing natural interaction for humanoid avatars, or even for robots, using camera tracking, gloves, speech capabilities, brain interfaces and other devices. This paper presents a new multimodal interaction control for avatars that combines brain signals, facial muscle tension recognition and glove tracking to change the facial expression of a humanoid avatar according to the user's emotional condition. The signals from brain activity and muscle movements serve as the emotional stimulus, while the glove acts as the emotion intensity control for the avatar. This multimodal interface can determine when the humanoid avatar needs to change its facial expression or its walking power. The results show that the humanoid avatar follows different timelines of walking and facial expression when the user stimulates it with different emotions. This finding is believed to provide new knowledge on controlling the facial expressions and walking of robots and humanoid avatars.
546    |a en
650 0 4 |a QA76 Computer software