Sensory Feedback and Sensorimotor Adaptation in Human-Computer Interface for a Gesture-Based Contactless Musical Instrument
A study is presented of a human-computer interface (HCI) for an expressive contactless musical instrument (using a time-of-flight (ToF) depth camera) that considers sensory feedback and sensorimotor adaptation in comparison with a conventional contact instrument. The design uses an intuitive ‘drum membrane’ paradigm for striking musical notes using simple pressing hand gestures on a notional keyboard in free air. In Experiment 1, 5 subjects were asked to complete a range of musical tasks using two forms of sensory feedback: Auditory-Only, where subjects could only hear the consequences of their pressing gestures, and Visual+Auditory, where subjects could both hear the sounds and receive visual feedback on a computer display. Results showed that Visual+Auditory feedback produced more precise performance (SD = 0.894) than Auditory-Only feedback (SD = 3.507), supporting the importance of a visual feedback element as an aid to natural gesture-based control in HCI. In Experiment 2, a comparison was made between sensorimotor adaptation in the contactless instrument (Visual-Only) and a conventional contact (Visual+Haptic) keyboard instrument. For each instrument, 7 subjects were asked to maintain tones at a perceived constant level (baseline) whilst a parameter (gain) was altered and then later restored. Once the gain was restored, the number of presses required to return to baseline was quantified as the after-effect of the adaptation. Results indicated that while design requirements for a contactless instrument may be very different from one that includes physical contact, similar neural mechanisms mediate a user's dynamic adaptation to both types of instrument.
Main Authors: | Adar Pelah, Philip Greenhalgh |
---|---|
Format: | Article |
Language: | English |
Published: | SAGE Publishing, 2012-05-01 |
Series: | i-Perception |
ISSN: | 2041-6695 |
Online Access: | https://doi.org/10.1068/id253 |
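The article abstract describes the interaction design only at a high level. As a purely illustrative sketch of how the ‘drum membrane’ paradigm might be realised, the following maps depth-camera hand positions to note onsets: a note fires when the hand crosses a virtual membrane plane while moving toward the camera, and the horizontal position selects a key on the notional keyboard in free air. The class name, thresholds, key layout, and velocity mapping are assumptions made for this example and are not taken from the published instrument.

```python
# Hypothetical sketch (not from the paper): a 'drum membrane' press detector
# driven by per-frame hand positions from a depth camera.
from dataclasses import dataclass


@dataclass
class MembraneKeyboard:
    membrane_z_mm: float = 600.0   # assumed depth of the virtual membrane plane
    key_width_mm: float = 60.0     # assumed width of each notional key
    num_keys: int = 8              # assumed number of keys
    _was_behind: bool = True       # hand starts behind the membrane

    def update(self, x_mm: float, z_mm: float, dz_mm: float):
        """Return (key_index, velocity) when a press is detected, else None.

        x_mm: horizontal hand position; z_mm: hand depth from the camera;
        dz_mm: depth change since the previous frame (negative = approaching).
        """
        pressed = self._was_behind and z_mm <= self.membrane_z_mm and dz_mm < 0
        self._was_behind = z_mm > self.membrane_z_mm
        if not pressed:
            return None
        key = int(x_mm // self.key_width_mm)
        key = max(0, min(self.num_keys - 1, key))
        velocity = min(127, int(abs(dz_mm)))  # crude velocity from approach speed
        return key, velocity


if __name__ == "__main__":
    kb = MembraneKeyboard()
    # Simulated frames: the hand approaches and crosses the membrane over key 2.
    frames = [(150.0, 700.0, -20.0), (150.0, 640.0, -60.0), (150.0, 590.0, -50.0)]
    for x, z, dz in frames:
        hit = kb.update(x, z, dz)
        if hit:
            print(f"note on: key {hit[0]}, velocity {hit[1]}")
```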