Universiti Teknologi Malaysia Institutional Repository

An integration framework for haptic feedback to improve facial expression on virtual human

Basori, Ahmad Hoirul and Abdullah, Bade and Sunar, Mohd. Shahrizal and Nadzaari, Saari and Daman, Daut and Salam, Md. Sah (2012) An integration framework for haptic feedback to improve facial expression on virtual human. International Journal of Innovative Computing, Information and Control, 8 (11). pp. 7829-7851. ISSN 1349-4198

Full text not available from this repository.

Abstract

Most of the latest 3D humanoid models currently available can produce emotions only through facial expressions, gestures, and voice. Only a few humanoid models are capable of conveying tactile emotions through vibration. This study proposes a system in which haptic feedback is integrated with visual and acoustic cues. The integrated framework is based on two major techniques: emotion-vibration mapping and facial expression synthesis. In emotion-vibration mapping, the joystick wavelength is first scaled to the visible light spectrum. A linear equation describing the magnitude force is then created from joystick wavelength and magnitude force data. Finally, the wavelength of the visible light spectrum is used as a parameter to compute the joystick wavelength by linear interpolation; emotions are then generated, and a complete classification table is stored for each emotion value. In facial expression synthesis, combinations of Action Units (AUs) are used to generate particular emotional expressions on the 3D humanoid model's face, based on the Facial Action Coding System (FACS). Each action unit is characterized by its specific face region, and each face region is given a special lighting colour to differentiate its appearance. The colour of light is obtained from the emotion classification table generated in the emotion-vibration mapping process. The integration then proceeds by rendering the facial expression, generating an acoustic effect from an emotional sound, and adjusting the loudness level according to the emotion value. Finally, the magnitude force of the haptic device is adjusted simultaneously, after the visual and acoustic cues have been synchronized. In this study, a mind controller and a glove are used to capture user emotions in real time: the mind controller determines the type of emotion from the user's brain activity, whereas the glove controls the intensity of an emotion. The experimental results show that 67% of the participants gave strongly positive responses to the system. In addition, 15 of 21 participants (71%) agreed with the classification of the magnitude force into the emotion representation. Most users remarked that a high magnitude force created a sensation similar to anger, whereas a low magnitude force created a more relaxing sensation.
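
To make the emotion-vibration mapping concrete, the minimal Python sketch below linearly interpolates an emotion's visible-light wavelength into a joystick magnitude force, following the steps the abstract describes. The wavelength bounds, force range, and emotion-to-wavelength table are illustrative assumptions; the paper's actual classification values are not given in the abstract.

    # Hypothetical sketch of the emotion-vibration mapping: an emotion's
    # visible-light wavelength is linearly interpolated into a joystick
    # magnitude force. All ranges and table values are assumptions.

    VISIBLE_MIN_NM, VISIBLE_MAX_NM = 380.0, 750.0  # assumed visible-spectrum bounds
    FORCE_MIN, FORCE_MAX = 0.0, 1.0                # assumed normalised magnitude force

    # Assumed emotion-to-wavelength classification table; the paper builds
    # such a table during mapping, but its values are not in the abstract.
    EMOTION_WAVELENGTH_NM = {
        "relaxed": 470.0,  # cooler colour, low force
        "happy": 570.0,
        "angry": 700.0,    # warmer colour, high force
    }

    def wavelength_to_force(wavelength_nm: float) -> float:
        """Linearly interpolate a wavelength to a magnitude force."""
        t = (wavelength_nm - VISIBLE_MIN_NM) / (VISIBLE_MAX_NM - VISIBLE_MIN_NM)
        t = min(max(t, 0.0), 1.0)  # clamp to the visible range
        return FORCE_MIN + t * (FORCE_MAX - FORCE_MIN)

    if __name__ == "__main__":
        for emotion, wl in EMOTION_WAVELENGTH_NM.items():
            print(f"{emotion}: force {wavelength_to_force(wl):.2f}")

Run as written, this prints a low force for "relaxed" and a high force for "angry", consistent with the participants' reports that high magnitude forces feel like anger and low ones feel relaxing. The facial expression synthesis step can be sketched in the same spirit: each emotion maps to a combination of FACS Action Units, and each AU's face region is tinted with the light colour drawn from the classification table. The AU groupings below follow common FACS conventions for happiness and anger; the region names and the colour parameter are assumptions.

    # Hypothetical sketch of AU-based expression synthesis. The AU numbers
    # are standard FACS codes; region names and colours are illustrative.
    EMOTION_ACTION_UNITS = {
        "happy": [6, 12],        # AU6 cheek raiser + AU12 lip corner puller
        "angry": [4, 5, 7, 23],  # brow lowerer, lid raiser/tightener, lip tightener
    }

    AU_FACE_REGION = {
        4: "brows", 5: "upper_lids", 6: "cheeks",
        7: "lower_lids", 12: "lip_corners", 23: "lips",
    }

    def expression_plan(emotion: str, light_colour: tuple) -> list:
        """Pair each AU's face region with the emotion's light colour."""
        return [(AU_FACE_REGION[au], light_colour)
                for au in EMOTION_ACTION_UNITS[emotion]]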

Item Type: Article
Uncontrolled Keywords: Computing
Subjects: Q Science > QA Mathematics > QA76 Computer software
Divisions: Computer Science and Information System
ID Code: 46600
Deposited By: Haliza Zainal
Deposited On: 22 Jun 2015 05:56
Last Modified: 17 Sep 2017 01:37