Affective Multimodal Feedback
Introduction
Emotional experience plays a central role in social communication, motivation and memory, so it is important to support effective emotional expression in HCI. Human displays of emotion are complex and multifaceted, including facial expressions, vocal elements, body movements, tactile sensations (touch, push, squeeze) and thermal sensations (hug, hold hands). In the absence of physical presence during digital communication, emotion needs to be conveyed through other means. In synchronous communication, facial expressions can be conveyed through video and vocal cues through audio, but these signals are limited, lack any tactile element, and are inaccessible to people with visual or hearing impairments. In asynchronous communication, such as text-based messaging, emotion is frequently conveyed using stylised pictorial expressions such as “emoji”, but these can be difficult to interpret and are inaccessible to people with visual impairments. They also differ from real affective signals, lacking body movement, sound, touch and temperature. We have conducted research systematically mapping the emotions perceived from multimodal feedback, leveraging the inherent social and emotional interpretations of temperature and visual feedback and combining these with vibrotactile and audio feedback.
Multimodal Affective Feedback: Thermal + Visual + Vibrotactile
Each modality has been studied individually (see the research on thermal and visual feedback below), but on its own each can convey only a limited range of emotions within the two-dimensional valence-arousal space. This work is the first in HCI to systematically combine multiple modalities to expand the affective range available to an interface. Three studies were conducted: Study 1 measured the emotionality of vibrotactile feedback by itself; Study 2 measured the perceived emotional content of three bimodal combinations (vibrotactile + thermal, vibrotactile + visual and visual + thermal); Study 3 then combined all three modalities. The results show that combining modalities increases the available range of emotional states, particularly in the problematic top-right (positive valence, high arousal) and bottom-left (negative valence, low arousal) quadrants of the dimensional model. We also provide a novel lookup resource that designers can use to identify stimuli that convey a particular emotion.
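As a rough illustration of how a designer might query such a lookup resource, the sketch below builds a small catalogue of multimodal stimulus combinations with mean valence and arousal ratings and returns the entry nearest to a target emotion point. The stimulus labels, rating values and scale are placeholders for illustration, not figures from the studies.

```python
# Minimal sketch: nearest-neighbour lookup from a target (valence, arousal)
# point to a catalogued multimodal stimulus. All entries are illustrative.
from dataclasses import dataclass
import math

@dataclass
class Stimulus:
    label: str       # human-readable description of the combination
    valence: float   # mean rated valence, assumed -1..1 scale
    arousal: float   # mean rated arousal, assumed -1..1 scale

LOOKUP = [
    Stimulus("warm (+3C) + slow blue pulse", 0.6, -0.3),
    Stimulus("cool (-3C) + fast red pulse + strong vibration", -0.5, 0.7),
    Stimulus("warm (+1C) + large green pulse + gentle vibration", 0.5, 0.4),
]

def closest_stimulus(target_valence: float, target_arousal: float) -> Stimulus:
    """Return the catalogued stimulus nearest to the target emotion point."""
    return min(
        LOOKUP,
        key=lambda s: math.hypot(s.valence - target_valence,
                                 s.arousal - target_arousal),
    )

# e.g. find a stimulus for a calm, positive state (high valence, low arousal)
print(closest_stimulus(0.7, -0.4).label)
```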
Perceived Emotion in Thermal Feedback
Thermal sensation has inherent links to emotion and is a key component of how emotion is conceptualised and experienced: physical warmth increases interpersonal warmth, and the experience of physical temperature helps to ground and process emotional experience. However, there is no systematic mapping of thermal feedback to models of emotion that designers and users could draw on to convey a range of emotions in HCI. A common way of classifying emotions and quantifying emotional experience is through ratings along the valence and arousal dimensions, originating from Russell’s circumplex model. The research in this paper therefore mapped subjective ratings of a range of thermal stimuli onto the circumplex model to understand the range of emotions that might be conveyed through thermal feedback. Because the suitability of the model varies with the type of emotional stimuli, we also compared the goodness of fit of the ratings between the circumplex and vector models of emotion. The results showed that thermal feedback was interpreted as representing a limited range of emotions, concentrated in just two quadrants of the circumplex: high valence/low arousal and low valence/high arousal. Warm stimuli were perceived as more pleasant/positive than cool stimuli, and altering either the rate or the extent of temperature change affected both the valence and arousal axes simultaneously. The ratings also fit the vector model significantly better than the circumplex.
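To make the mapping concrete, the sketch below shows one way subjective ratings could be placed on the circumplex: each stimulus's mean valence and arousal are converted to an angle and intensity around a neutral origin and assigned to a quadrant. The thermal stimulus labels and rating values are illustrative assumptions, not the reported data.

```python
# Minimal sketch: placing mean thermal-stimulus ratings on the valence-arousal
# plane and reporting the circumplex quadrant each falls into. Values are
# illustrative placeholders, not the study's results.
import math

# (rate/extent of temperature change) -> (mean valence, mean arousal)
ratings = {
    "warm, slow, small change":  ( 0.55, -0.35),   # pleasant / calm
    "warm, fast, large change":  ( 0.20,  0.30),
    "cool, slow, small change":  (-0.25, -0.10),
    "cool, fast, large change":  (-0.50,  0.60),   # unpleasant / aroused
}

def quadrant(valence: float, arousal: float) -> str:
    v = "high valence" if valence >= 0 else "low valence"
    a = "high arousal" if arousal >= 0 else "low arousal"
    return f"{v}, {a}"

for label, (v, a) in ratings.items():
    angle = math.degrees(math.atan2(a, v))   # position around the circumplex
    intensity = math.hypot(v, a)             # distance from the neutral origin
    print(f"{label:>24}: {quadrant(v, a)} "
          f"(angle {angle:6.1f} deg, intensity {intensity:.2f})")
```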
Perceived Emotion in Abstract Visual Feedback
Recent HCI research has looked at conveying emotions through non-visual modalities, such as vibrotactile and thermal feedback. However, emotion is primarily conveyed through visual signals, so this research aims to support the design of emotional visual feedback. We adapt and extend the design of the “pulsing amoeba” from the University of Washington and measure the emotion conveyed by the resulting abstract visual designs. This is a first step towards more holistic multimodal affective feedback combining visual, auditory and tactile stimuli. An online survey gathered valence and arousal ratings of 32 stimuli that varied in colour, contour, pulse size and pulse speed. The study both confirmed earlier findings and provided new results: consistent with previous work, blue and green were rated as more pleasant than red and grey, and higher pulse speed led to higher arousal and lower valence. However, we also found strong effects of pulse size, with larger pulses conveying lower valence and higher arousal. Unlike in previous research, contour did not influence valence ratings, possibly because the differences between our contours were less pronounced than those used elsewhere. We present a mapping of all stimulus combinations onto the common two-dimensional valence-arousal model of emotion.
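For reference, one factorial structure consistent with the description above (4 colours x 2 contours x 2 pulse sizes x 2 pulse speeds) yields 32 stimuli; the actual design may differ. The sketch below enumerates such a design: the colour levels follow the text, while the contour, size and speed level names are assumptions for illustration.

```python
# Minimal sketch: enumerating a 32-stimulus factorial set of abstract visual
# designs. Only the colour levels come from the text; other levels are assumed.
from itertools import product

colours      = ["blue", "green", "red", "grey"]
contours     = ["smooth", "spiky"]        # assumed level names
pulse_sizes  = ["small", "large"]         # assumed level names
pulse_speeds = ["slow", "fast"]           # assumed level names

stimuli = [
    {"colour": c, "contour": k, "pulse_size": s, "pulse_speed": p}
    for c, k, s, p in product(colours, contours, pulse_sizes, pulse_speeds)
]

assert len(stimuli) == 32  # full factorial set to be rated for valence/arousal
```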
Thermal Feedback for Web Security Warnings
Today’s web security warnings often rely on visual cues such as colour, e.g., red URL highlighting to indicate a security risk. However, such cues often go unnoticed by users and, even when noticed, are ignored. Our aim is to investigate the potential of other modalities to improve comprehension of, and adherence to, security warnings, starting with thermal feedback. Thermal stimulation has inherent links to emotion and danger, so it may provide unique advantages over current visual cues. However, interpretation of thermal feedback varies between people, so research is needed to measure these associations. We used an online questionnaire (n=45) and a lab study (n=12) to investigate whether people associate particular temperature ranges with different states of web security. Our results indicate that people generally associate cold temperatures with a secure page and warm temperatures with an insecure page, findings we will take forward into future work on the effect of thermal feedback on security-related behaviour.
Emotionally Resonant Vibrotactile Stimuli
Prior research on vibrotactile stimuli has found a narrow emotional range, particularly in terms of valence. Our aim was to explore a novel category of stimuli that may demonstrate a wider range: Emotionally Resonant Vibrotactile Stimuli. These stimuli aim to evoke real-world sensations to which the user already has an emotional response, for example pleasant memories of a cat purring. We conducted a lab study measuring emotional responses to eight potentially emotionally resonant vibrotactile stimuli, during which participants recorded their valence and arousal for each stimulus and discussed them in short qualitative interviews.
We found that stimuli had the potential to be emotionally resonant when labelled, but which stimuli were resonant for which participants was highly personal, as was whether the response to that resonance was neutral, positive or negative. Two stimuli stood out as recognisable and resonant for the most participants: a Heartbeat stimulus and a Cat Purring stimulus. Our future research will explore other possible emotionally resonant stimuli and whether the addition of other modalities, such as temperature, affects their resonance.
Publications
- Macdonald, S., Brewster, S. & Pollick, F. (2020) “Eliciting Emotion with Vibrotactile Stimuli Evocative of Real-World Sensations”, Proceedings of the 2020 International Conference on Multimodal Interaction (ICMI 2020).
- Alotaibi, Y., Williamson, J. & Brewster, S. (2020) “Investigating Electrotactile Feedback on the Hand”, IEEE Haptics Symposium 2020.
- Wilson, G. & Brewster, S. (2017) “Multi-moji: Combining Thermal, Vibrotactile & Visual Stimuli to Expand the Affective Range of Feedback”, Proceedings of CHI 2017, May 6-11, Denver, CO.
- Wilson, G., Maxwell, H. & Just, M. (2017) “Everything’s Cool: Extending Security Warnings with Thermal Feedback”, Proceedings of CHI 2017 LBW, May 6-11, Denver, CO.
- Wilson, G., Freeman, E. & Brewster, S. (2016) “Multimodal Affective Feedback: Combining Thermal, Vibrotactile, Audio & Visual Signals”, Proceedings of ICMI 2016 Demonstrations, Nov 12-16, Tokyo, Japan.
- Wilson, G., Dobrev, D. & Brewster, S. (2016) “Hot Under the Collar: Mapping Thermal Feedback to Dimensional Models of Emotion”, Proceedings of CHI 2016, May 7-12, San Jose, CA.
- Wilson, G., Romeo, P. & Brewster, S. (2016) “Mapping Abstract Visual Feedback to a Dimensional Model of Emotion”, Proceedings of CHI 2016 Extended Abstracts, May 7-12, San Jose, CA.