LEADER |
01979 am a22001933u 4500 |
001 |
121521 |
042 |
|
|
|a dc
|
100 |
1 |
0 |
|a Worgan, Paul
|e author
|
710 |
2 |
0 |
|a Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
|e contributor
|
700 |
1 |
0 |
|a Reuss, Kevin
|e author
|
700 |
1 |
0 |
|a Mueller, Stefanie
|e author
|
245 |
0 |
0 |
|a Integrating electronic components into deformable objects based on user interaction data
|
260 |
|
|
|b Association for Computing Machinery,
|c 2019-07-08T17:12:45Z.
|
856 |
|
|
|z Get fulltext
|u https://hdl.handle.net/1721.1/121521
|
520 |
|
|
|a A key challenge when designing deformable user interfaces is the integration of rigid electronic components with the soft deformable device. In this paper, we propose to place electronic components based on how the user interacts with the device, i.e., in which way the device is deformed when the user performs gestures. To identify optimal locations for placing electronic components, we developed a design tool that takes as input a 3D model of the deformable device and a set of captured user gestures. It then visualizes the stress distribution resulting from the gestures applied to the deformable device and suggests where not to place components because the location is highly deformed when users interact (e.g., a rigid battery that would constrain interaction), or alternatively where to place components to sense deformation more accurately (e.g., a bend sensor to detect a specific gesture) and efficiently (e.g., an energy harvesting component). We evaluated our approach by collecting interaction data from 12 users across three deformable devices (a watch, a camera, and a mouse) and applied the resulting stress distributions to the placement of selected electronic components.
|
546 |
|
|
|a en
|
655 |
7 |
|
|a Article
|
773 |
|
|
|t 10.1145/3294109.3295629
|
773 |
|
|
|t Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction
|