Aladin, Mohamad Yahya Fekri and Ismail, Ajune Wanis (2019) Designing user interaction using gesture and speech for mixed reality interface. International Journal of Innovative Computing, 9 (2). pp. 71-77. ISSN 2180-4370
Official URL: https://dx.doi.org/10.11113/ijic.v9n2.243
Abstract
Mixed Reality (MR) is the next evolution of human-computer interaction, as MR combines the physical and digital environments and makes them coexist with each other [1]. Interaction remains an active research area in MR, and this paper focuses on interaction rather than other research areas such as tracking, calibration, and display [2], because current interaction techniques are still not intuitive enough for users to interact with the computer. This paper explores user interaction using gesture and speech for 3D object manipulation in a mixed reality environment. It explains the design stage, which involves interaction using gesture and speech inputs to enhance the user experience in an MR workspace. After acquiring gesture input and speech commands, an MR prototype is proposed that integrates the gesture and speech interaction technique. The paper concludes with results and discussion.
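The paper does not publish its implementation, but the abstract describes fusing a gesture input with a speech command into a single 3D manipulation action. The following is a minimal sketch of that idea in Python; all names here (GestureEvent, SpeechCommand, fuse) are hypothetical illustrations and are not taken from the paper.

```python
# Minimal sketch of gesture + speech fusion for 3D object manipulation.
# The gesture supplies the "where" (hand pose and position); the speech
# command supplies the "what" (the manipulation verb). Names are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class GestureEvent:
    kind: str                       # e.g. "pinch", "open_palm", "point"
    position: tuple                 # hand position in workspace coordinates

@dataclass
class SpeechCommand:
    verb: str                       # e.g. "select", "rotate", "scale", "move"
    value: Optional[float] = None   # optional parameter, e.g. rotation angle

def fuse(gesture: GestureEvent, speech: SpeechCommand) -> Optional[dict]:
    """Combine a gesture event with a speech command into one action."""
    if speech.verb == "select" and gesture.kind == "pinch":
        return {"action": "select", "at": gesture.position}
    if speech.verb in ("rotate", "scale", "move") and gesture.kind == "pinch":
        return {"action": speech.verb,
                "anchor": gesture.position,
                "amount": speech.value}
    return None  # the inputs do not form a complete manipulation command

# Example: the user pinches near an object and says "rotate forty five".
print(fuse(GestureEvent("pinch", (0.1, 0.2, 0.5)),
           SpeechCommand("rotate", 45.0)))
```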
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | Mixed Reality, Gesture Recognition, Speech Recognition, User Interaction, Multimodal Interaction |
| Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
| Divisions: | Computing |
| ID Code: | 85245 |
| Deposited By: | Fazli Masari |
| Deposited On: | 17 Mar 2020 08:10 |
| Last Modified: | 17 Mar 2020 08:10 |