Research on Intelligent Experimental Equipment and Key Algorithms Based on Multimodal Fusion Perception
Applying virtual reality technology to science experiment education is a line of human-computer interaction research with practical significance and value. However, many existing virtual-reality-based education tools offer only a single interaction mode, struggle with the complexity of user intention, and lose the physical character of interaction through virtualization; their teaching ability is therefore limited, and they lack practical value and adoption. To address these problems, we construct a multimodal interaction model that fuses gesture, speech, and pressure information. Specifically, the model: 1) collects user input and time-series information to construct basic data input tuples; 2) uses the basic interaction information to identify the user's basic intention, and uses the correlation between intentions to judge whether the currently identified intention is correct; and 3) allows users to alternate between multi-channel and single-channel interaction. Based on this model, we build a multimodal intelligent interactive virtual experiment platform (MIIVEP) and design and implement a dropper with strong perception ability, which has been verified, tested, evaluated, and applied in the intelligent virtual experiment system. To evaluate this work more rigorously, we also developed a scoring instrument, the Evaluation Scale of Virtual Experiment System (ESVES), and invited middle-school teachers and students to take part in validating the results. User studies of actual use confirm the effectiveness of the proposed model and its implementation.
Main Authors: | Botao Zeng, Zhiquan Feng, Tao Xu, Mengting Xiao, Rui Han |
---|---|
Author Affiliation: | School of Information Science and Engineering, University of Jinan, Jinan, China |
Format: | Article |
Language: | English |
Published: | IEEE, 2020-01-01 |
Series: | IEEE Access |
ISSN: | 2169-3536 |
DOI: | 10.1109/ACCESS.2020.3013903 |
Subjects: | Virtual experiment; intelligent dropper; multimodal fusion; pressure sensors; human-computer interaction |
Online Access: | https://ieeexplore.ieee.org/document/9154693/ |
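The abstract describes a three-step fusion pipeline: building input tuples from the gesture, speech, and pressure channels; identifying the user's intention and checking it against the correlation between successive intentions; and switching between multi-channel and single-channel interaction. The Python sketch below only illustrates that general structure under stated assumptions; the class and function names, the correlation table, the scoring rules, and the threshold are all hypothetical and are not taken from the paper.

```python
"""Illustrative sketch of a gesture/speech/pressure fusion step.

All names, thresholds, and the correlation table are hypothetical;
the paper describes the pipeline only at a high level.
"""
from dataclasses import dataclass
from typing import Dict, Optional, Tuple


@dataclass
class InputTuple:
    """Step 1: basic data input tuple built from the raw channels."""
    timestamp: float
    gesture: Optional[str]      # e.g. "grasp" or None if no gesture detected
    speech: Optional[str]       # recognized command text, or None
    pressure: Optional[float]   # normalized dropper pressure reading, or None


# Hypothetical prior correlation between successive intentions: how plausible
# it is that the second intention follows the first in a dropper experiment.
INTENTION_CORRELATION: Dict[Tuple[str, str], float] = {
    ("pick_up_dropper", "draw_liquid"): 0.9,
    ("draw_liquid", "release_liquid"): 0.9,
    ("pick_up_dropper", "release_liquid"): 0.2,
}


def score_channels(t: InputTuple) -> Dict[str, float]:
    """Step 2a: per-channel evidence for each candidate intention (toy rules)."""
    scores: Dict[str, float] = {}
    if t.gesture == "grasp":
        scores["pick_up_dropper"] = scores.get("pick_up_dropper", 0.0) + 0.5
    if t.pressure is not None and t.pressure > 0.6:
        scores["draw_liquid"] = scores.get("draw_liquid", 0.0) + 0.4
    if t.speech and "release" in t.speech:
        scores["release_liquid"] = scores.get("release_liquid", 0.0) + 0.6
    return scores


def fuse_intention(t: InputTuple, prev_intention: Optional[str],
                   threshold: float = 0.5) -> Optional[str]:
    """Steps 2b and 3: fuse the channel scores, weight each candidate by its
    correlation with the previous intention, and fall back to a single-channel
    cue when the fused decision is inconclusive."""
    scores = score_channels(t)
    if not scores:
        return None
    for intent in scores:
        if prev_intention is not None:
            scores[intent] *= INTENTION_CORRELATION.get(
                (prev_intention, intent), 0.5)
    best, best_score = max(scores.items(), key=lambda kv: kv[1])
    if best_score >= threshold:
        return best                      # multi-channel decision accepted
    # Single-channel fallback: trust an explicit speech command on its own.
    if t.speech and "release" in t.speech:
        return "release_liquid"
    return None


if __name__ == "__main__":
    t = InputTuple(timestamp=0.0, gesture="grasp", speech=None, pressure=0.1)
    print(fuse_intention(t, prev_intention=None))  # -> "pick_up_dropper"
```

In this toy version the correlation table acts as a prior over intention sequences, so an intention that is implausible after the previous one (for example releasing liquid before any was drawn) needs stronger channel evidence to be accepted; the paper's actual fusion and correlation computation may differ.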