Philos Trans R Soc Lond B Biol Sci. 2009 December 12; 364(1535): 3585–3595.
PMCID: PMC2781899

Affective loop experiences: designing for interactional embodiment

Abstract

Involving our corporeal bodies in interaction can create strong affective experiences. Systems that both can be influenced by and influence users corporeally exhibit a use quality we name an affective loop experience. In an affective loop experience, (i) emotions are seen as processes, constructed in the interaction, starting from everyday bodily, cognitive or social experiences; (ii) the system responds in ways that pull the user into the interaction, touching upon end users' physical experiences; and (iii) throughout the interaction the user is an active, meaning-making individual choosing how to express themselves—the interpretation responsibility does not lie with the system. We have built several systems that attempt to create affective loop experiences with more or less successful results. For example, eMoto lets users send text messages between mobile phones, but in addition to text, the messages also have colourful and animated shapes in the background chosen through emotion-gestures with a sensor-enabled stylus pen. Affective Diary is a digital diary with which users can scribble their notes, but it also allows for bodily memorabilia to be recorded from body sensors mapping to users' movement and arousal and placed along a timeline. Users can see patterns in their bodily reactions and relate them to various events going on in their lives.

The experiences of building and deploying these systems gave us insights into design requirements for addressing affective loop experiences, such as how to design for turn-taking between user and system, how to create for ‘open’ surfaces in the design that can carry users' own meaning-making processes, how to combine modalities to create for a ‘unity’ of expression, and the importance of mirroring user experience in familiar ways that touch upon their everyday social and corporeal experiences.

But a more important lesson gained from deploying the systems is how emotion processes are co-constructed and experienced inseparable from all other aspects of everyday life. Emotion processes are part of our social ways of being in the world; they dye our dreams, hopes and bodily experiences of the world. If we aim to design for affective interaction experiences, we need to place them into this larger picture.

Keywords: affective interaction, emotion, affective loop, co-construction of emotion

1. Introduction

A premise that underlies our work is that bodily experiences are integral to how we come to interpret and thus make sense of the world. Our work draws heavily on the notion of embodiment. Playing a central role in phenomenology, embodiment offers a way of explaining how we create meaning from our interactions with the everyday world we inhabit (Dourish 2004). Our experience of the world depends on our human bodies, both in a physical, biological way, through our experiential body, and also through our cultural bodies (Fallman 2003), that is, our learnt, culturally determined behaviours and experiences. This has interesting implications for how we can design for emotional expressivity in communication systems, or in systems where our emotions are mirrored back to us.

In considering the design of affective computing applications, there is a growing body of work drawing upon what might be called a socially situated perspective of emotion (Boehner et al. 2005; Höök et al. 2008). Prior work on affective computing systems has typically tried to identify users' emotions as discrete information units, isolated from their context and their grounding in our everyday life. In an effort to counter this trend, we provide a set of requirements for systems that engage users in embodied emotional processes in what we named an interactional approach. Emotion processes are viewed as ‘culturally grounded, dynamically experienced, and to some degree constructed in action and interaction’ (Boehner et al. 2005).

But it is not enough to address the socio-cultural aspects of emotion. In our work, we also want to address directly the everyday, physical, bodily experiences of emotion processes (e.g. Sundström et al. 2007; Ferreira et al. 2008; Höök et al. 2008; Ståhl & Höök 2008; Sundström et al. 2009). Early on, Darwin drew a strong coupling between emotion and bodily movement (Darwin 1872). Since then, researchers in areas ranging from neurology (LeDoux 1996; Davidson et al. 2003) to philosophy and dance (Laban & Lawrence 1974; Sheets-Johnstone 1999) have described the close coupling between readiness for action, muscular activity and the co-occurrence of emotion. Sheets-Johnstone makes the case that:

Without the readiness to act in a certain way, without certain corporeal tonicities, a certain feeling would not, and indeed, could not be felt, and a certain action would not, and indeed, could not be taken, since the postural dynamic of the body are what make the feeling and the action possible.

Our efforts have in essence been to build systems that unite the physical and cultural features of our embodied experiences. The systems mirror some of the aspects of physical and social everyday bodily experiences while, at the same time, leaving room for users to actively interpret them.

Involving our corporeal bodies in the interaction can create powerful experiences that also affect users. Systems that both can be influenced by and influence users corporeally exhibit a use quality,1 which we name an affective loop experience. Let us turn to the systems we have designed and the studies of them before we come back to some of our design insights.

2. Design cases: eMoto, FriendSense, Affective Diary and Affective Health

(a) eMoto: a communication service

The first example deals with personal communication in general, and communication of emotions in particular, in a mobile setting. It is an extended SMS service for the mobile phone named eMoto (Sundström et al. 2007). It was designed from an interactional view on communication between friends where users learn about each other's emotional expressions step by step as their friendships and use of eMoto develop. In short, eMoto lets users send text messages between mobile phones, but in addition to text the messages also have colourful and animated shapes in the background (e.g. figure 1). The user writes the text message and then chooses which expression to have in the background from a big palette of expressions mapped on a circle. The expressions are designed to convey emotional content along two axes: arousal and valence (Russell 1980). For example, aggressive expressions have high arousal and negative valence and are portrayed as sharp, edgy shapes, in strong red colours, with quick sharp animated movements. Calm expressions have low arousal and positive valence and are portrayed as slow, billowing movements of big, connected shapes in calm blue–green colours.

Figure 1.

Interacting with eMoto, using gestures to move around in a palette of colours, shapes and animations.

To move around in the circle, the user has to perform gestures using the stylus pen (which comes with some mobile phones), extended with sensors that pick up on pressure and shaking movements. More pressure gives more negative expressions, and more shaking gives more energy in the animations of the expression. Within these limits of shaking and pressing, the gesture itself can take any form, allowing for users' personal preferences (Sundström et al. 2007).
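To make these two mappings concrete (from sensor readings to a position in the arousal–valence circle, and from that position to colour, shape and animation), the following Python sketch illustrates one plausible way they could be implemented. It is not the actual eMoto code: the sensor ranges, the colour interpolation and all names are assumptions made for illustration.

```python
# Illustrative sketch only: a plausible mapping from stylus sensor input to an
# eMoto-style expression. Sensor ranges, colours and parameter names are assumed,
# not taken from the actual eMoto implementation.
from dataclasses import dataclass

@dataclass
class Expression:
    colour_rgb: tuple       # background colour
    edginess: float         # 0 = round, billowing shapes; 1 = sharp, edgy shapes
    animation_speed: float  # 0 = slow movement; 1 = quick movement

def gesture_to_circle(pressure: float, shaking: float) -> tuple:
    """Map normalized sensor readings (0..1) onto the arousal-valence circle:
    more pressure -> more negative valence; more shaking -> higher arousal."""
    valence = 1.0 - 2.0 * pressure   # +1 positive .. -1 negative
    arousal = 2.0 * shaking - 1.0    # -1 calm .. +1 energetic
    return valence, arousal

def circle_to_expression(valence: float, arousal: float) -> Expression:
    """Render a circle position as colour, shape and animation, following the
    qualitative description: angry = red/sharp/fast, calm = blue-green/round/slow."""
    # Hue shifts from warm red (negative valence) towards blue-green (positive valence).
    red = int(255 * (1 - valence) / 2)
    green = int(180 * (1 + valence) / 2)
    blue = int(200 * (1 + valence) / 2 * (1 - arousal) / 2 + 55)
    edginess = max(0.0, -valence) * (1 + arousal) / 2
    animation_speed = (arousal + 1) / 2
    return Expression((red, green, blue), edginess, animation_speed)

# Example: a hard, shaky gesture yields a sharp, fast, reddish expression.
print(circle_to_expression(*gesture_to_circle(pressure=0.9, shaking=0.8)))
```

Note that only pressure and shaking are interpreted; the shape of the gesture itself remains free, in line with the description above.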

Studies of eMoto (with a group of five friends over several weeks of usage) showed that the circle was not used in a simplistic one-emotion–one-expression manner, mapping emotions directly to what you are experiencing at the time of sending an emoto. Instead, the graphical expressions were appropriated and used innovatively to convey mixed emotions, empathy, irony, expectations of future experiences, impressions of the surrounding environment (such as the darkness of the night) or, in general, a mixture of their total embodied experiences of life and, in particular, their friendship. The ‘language’ of colours, shapes and animations juxtaposed against the text of the message was open-ended enough for our users to appropriate it in ways that made sense to them. The colours, shapes and animations produced by the gestures became an open ‘surface’ that users ascribed meaning to.

Just to provide one example of how eMoto was used, consider the messages in figure 2. In the first message, Agnes expresses her love to her boyfriend. The background she picked for her message comes from a part of the circle that we had intended to be somewhere between angry and happy, but Agnes interprets it in her own way:

This looks almost angry, but it is not, really. It is like, but oh … […] It looks somewhat edgy but at the same time the way it is feels like it could be some kind of warm streams with like love like this.

Figure 2.

Two eMoto messages.

She was trying to express how much passion her love for her boyfriend entails. When Mona communicated her love to her boyfriend (second emoto message in figure 2), she instead used her favourite colour, green, to express herself:

Green is my favourite colour and my boyfriend knows that, so this is why it is green because he knows that I think that green is a lovely colour, just as lovely as he is.

To make it absolutely clear, eMoto does not extract emotional information from users but lets users directly express emotions to the system, a process over which they have total control. They can, for example, express emotions that they are not feeling through shaking and squeezing the sensors of the stylus pen in different ways. While this may seem like lying, it is in fact crucial in any communication situation in order to make human relations work—it is a social responsibility (Aoki & Woodruff 2005). However, the idea of eMoto is to make the gestures reinforce whatever emotion the user expresses by reacting to the expressive gestures performed by the user. As one of the users in the study expressed it:

I leave out things I think are implicit due to the colour … the advantage is that you don't have to write as much, it is like a body language. Like when you meet someone you don't say ‘I'm sulky’ or something like that, because that shows, I don't need to say that. And it's the same here, but here it's colour.

Hence, in the end, users may come to experience the emotion that they are expressing physically through shaking and pressing the extended stylus. Or, as it was expressed by the partner of one of our study participants:

When she was happy she showed that with her whole body. Not only her arm was shaking but her whole body. Meanwhile a huge smile appeared on her lips.

Perhaps more interesting here is whether our study participants could make sense of and enjoy the physical, sensual aspects of the interaction. While being inside the interaction experience, it is hard to reflect on the physical aspects of the interaction, especially in those instances when it works well because then the interaction disappears into the activity. But as we came back to our users and interviewed them afterwards, such a reflection could take place as they could see the video clips2 of themselves using eMoto and also look back at the emotos they had sent. An example is when Isabella constructs an emoto. She is emotionally engaged by the gestures and by the emoto she is in the process of creating, but also by the loud music she is listening to at the same time. In figure 3 we provide a series of snapshots from this clip where one sees her body dancing to the music, singing along at the same time as she is gesturing with the eMoto pen.

Figure 3.

Snapshots from a video clip of Isabella constructing an eMoto while listening to music at the same time.

(b) FriendSense

FriendSense is a system for sensor-based synchronous communication with a whole group of friends who are co-located (Sundström et al. 2009). As a first step in designing the system, we created a technology probe to find out more about the relationships and activities that constitute a group of work colleagues or friends at work. A technology probe is a fairly simple but fully working technical system designed to uncover and learn from real-life practices and experiences (Hutchinson et al. 2003). The idea is to place the system with potential users to be used in their everyday environment, outside the laboratory, away from some of the obstacles of a staged set-up, and to get informed user feedback early in the design process.

The FriendSense probe worked as follows. Each user was given a sensor node (from Freie Universität, Berlin) that picks up on temperature and vibration (figure 4). By manipulating the sensor node, users create a graphical expression consisting of a sphere-shaped object resembling a marble or soap bubble. A vibration sensor determines the animated movement of the marble, and a temperature sensor determines its colour. The expressions were chosen to roughly resemble the bodily experience of manipulating the sensor node. Our intention was also that the possible colours and movements would be varied and expressive enough for users to express their moods, emotions or some other experiences relevant for their friends to see.

Figure 4.

The FriendSense system: sensor node, local client on user's PC and uploading to the big screen.

Users could make their marble be close or far away from their friends' marbles.

Users first create their expression locally on their own PC and then upload it to a public and shared display where a collective expression of all users' expressions is formed. The shared display was placed so that all users could see it from their desk when sitting in an open office landscape.
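As a rough illustration of how the probe works, the sketch below shows one way the sensor readings could be turned into a marble and published to the shared display. It is only a sketch under assumed value ranges; the class names, the temperature range and the in-memory stand-in for the big screen are hypothetical, not the actual FriendSense implementation.

```python
# Illustrative sketch only: how sensor-node readings might become a FriendSense
# marble, created locally and then uploaded to a shared display. The names, value
# ranges and the in-memory "shared display" are assumptions, not the real system.
from dataclasses import dataclass

@dataclass
class Marble:
    owner: str
    colour_rgb: tuple   # temperature -> cool blue to warm red
    bounce: float       # vibration -> amplitude of the animated movement
    position: tuple     # where the user places it relative to friends' marbles

def node_to_marble(owner: str, temperature_c: float, vibration: float,
                   position=(0.5, 0.5)) -> Marble:
    """Map a node reading to a marble: warming the node reddens it,
    shaking it makes it bounce more. A 20-40 C range is assumed."""
    warmth = min(max((temperature_c - 20.0) / 20.0, 0.0), 1.0)
    colour = (int(255 * warmth), 60, int(255 * (1 - warmth)))
    return Marble(owner, colour, bounce=min(vibration, 1.0), position=position)

class SharedDisplay:
    """Stand-in for the big screen: the latest marble per user, visible to all."""
    def __init__(self):
        self.marbles = {}

    def upload(self, marble: Marble):
        self.marbles[marble.owner] = marble

# Two-step flow described in the text: create/preview locally, then upload.
display = SharedDisplay()
mine = node_to_marble("anna", temperature_c=36.0, vibration=0.9, position=(0.8, 0.2))
display.upload(mine)   # now everyone in the office sees the red, jumpy marble
```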

As we deployed the FriendSense system in two different groups of colleagues, consisting of six and nine persons, respectively, sitting in open landscape offices, we learnt a lot about the sensitivities of friendships and the importance of corporeal expressions.

The colleagues were involved in, and very often wanted to be involved in, a kind of companionable awareness of the emotional dramas and the various work and leisure activities of the others. Within the limits of what is admissible in a work context, they expressed themselves, interpreted others and innovatively created meanings from the scraps and bits of information they had about each other. The information they used in this meaning-making puzzle included their physical presence in the open office landscape and their knowledge of ongoing work or leisure activities, together with their expressions in FriendSense.

For example, in one of the work groups, one of the participants was about to defend his thesis. He was over-worked and nervous about the defence. One of his colleagues, his supervisor, felt a strong need to be close to him. She would sometimes roll her office chair closer to his and peek over his shoulder at his dissertation, to show him her support and presence. Friday morning, right before his defence seminar, his supervisor and another colleague both moved their marbles very close to his on the public display to show him their support and empathy (figure 5). They used their sensor nodes to express a kind of empathic nervousness. While this did not make him feel less nervous, he became aware of their support.

Figure 5.

Friday morning before the thesis defence. Everyone is gathering around his vibrating and red marble.

Around lunchtime, the colleagues in one of the work groups would sometimes playfully bounce their sensor nodes on their tables to indicate that it was time to go off and have lunch together. The sound of the bouncing nodes, and the noise created, added to the feeling of urgency to stop working and have lunch.

The display was used not only to express ongoing processes, but also to influence the whole group. One Friday afternoon, one of the colleagues tried to get everyone into a cocktail party mood. She heated and pounded her node so that her marble became red and jumpy. After a while two other colleagues picked up on what she was attempting to do, and joined her in her party mood (figure 5); one moved closer to her, and the other changed her picture to a disco ball, expressing that she was game for whatever the first user had in mind for the evening.

The FriendSense system is currently being redesigned and implemented to reflect some of the lessons learnt from the probe experience. Instead of putting a screen in between the friends, we now aim to build a bracelet that can be used as both input and output (using haptics). By this, we hope to bypass the detour over the screen representation and move closer to users' sensual experience of others' presence.

(c) Affective Diary: a personal logging system

The third example system deals with personal logs in general, and in our case a diary in particular. An ordinary paper-based diary provides a useful means of expressing inner thoughts and recording experiences of past events, and becomes a resource for reflection. In Affective Diary we wanted to explore reflection that goes beyond purely intellectual experiences and aids users in remembering, and reflecting on, aspects of the bodily side of their emotional experiences (Ståhl & Höök 2008). The aim was to provide users with material working as a bridge to their everyday experiences, using sensor data picked up from users' bodies, allowing them to go back in time and see their physical and emotional reactions.

In this digital diary users can scribble their notes on top of bodily memorabilia recorded from body sensors and mobile memorabilia collected from users' mobile phones. In short, during the day, a sensor armband collects sensor data indicating movement and arousal levels. The mobile phone logging system logs text messages (SMS) sent and received, photographs taken and Bluetooth presence of other mobile phones in the vicinity. The logged data are transferred to Affective Diary and placed along a timeline. The logged sensor data are presented as somewhat ambiguously shaped and coloured figures mapped out along the timeline (figure 6). Movement activity, as registered by a pedometer in the sensor armband, is represented by how upright the character is. Arousal is computed from a galvanic skin response (GSR) measurement (GSR measures how much electricity the skin conducts—the more we sweat, the more electricity) and is represented by the colour of the character—going from calm blue, through a whole colour scale, up to bright red.
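The following sketch illustrates, in simplified form, how logged armband data could be mapped onto the timeline characters described above: step counts determine how upright a figure stands and GSR determines its colour, from calm blue towards bright red. The sampling granularity, the GSR normalization against a baseline and all names are assumptions for illustration, not the actual Affective Diary code.

```python
# Illustrative sketch only: placing logged sensor data along a diary timeline and
# mapping it to the ambiguous characters described above. Sampling intervals, the
# GSR normalization and the colour scale are assumed, not the real system's values.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DiaryCharacter:
    time: datetime
    uprightness: float   # pedometer movement -> how upright the figure stands (0..1)
    colour_rgb: tuple    # GSR arousal -> calm blue through the scale to bright red
    scribbles: list      # users' own notes layered on top

def to_character(time: datetime, steps_per_min: float, gsr: float,
                 gsr_baseline: float) -> DiaryCharacter:
    uprightness = min(steps_per_min / 100.0, 1.0)  # assumed ceiling of 100 steps/min
    arousal = min(max((gsr - gsr_baseline) / max(gsr_baseline, 1e-6), 0.0), 1.0)
    colour = (int(255 * arousal), 40, int(255 * (1 - arousal)))
    return DiaryCharacter(time, uprightness, colour, scribbles=[])

# A timeline is then simply the characters in temporal order, with notes added later.
timeline = [
    to_character(datetime(2009, 6, 19, 15, 0), steps_per_min=5, gsr=9.0, gsr_baseline=4.0),
    to_character(datetime(2009, 6, 19, 16, 0), steps_per_min=80, gsr=4.5, gsr_baseline=4.0),
]
timeline[0].scribbles.append("Meeting with my boss about the July vacation")
```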

Figure 6.

One colleague (upper right) tries to work up a Friday party feeling and the others (except for one) join the ‘party’.

The coloured figures provide users with means to remember previous experiences, but their interpretation is not given once and for all. The colours are, again, ‘inscribable’ surfaces onto which users can project their own meaning-making. As they can also scribble their own notes on top of the materials, they can put meaning into the patterns they discover.

An in-depth study with four users over about a month of usage indicated that users were able to make sense of the diary material and relate it to different events in their life (Ståhl & Höök 2008). There was also evidence that they were able to recognize their bodily experiences through seeing the representation in the diary. By recognizing and re-living some experiences (and on occasion and somewhat paradoxically by not recognizing their own bodily reactions), they sometimes even learnt something about themselves that they did not know before. Something we initially had not anticipated seeing so strongly in these reflective processes was the extent to which the Diary influenced learning and changes in behaviour. This occurred especially when our participants had used Affective Diary several times and could look back, consider and compare interpretations of different events.

For example, by using the diary, Erica discovered that certain events affected her mood (e.g. a meeting with her boss that made her very agitated, figure 7). She could see that this mood persisted for a long time after the meeting. She says:

We had a discussion about having vacation in July although I really didn't want to have vacation then, because I had nothing to do then. That made me somewhat annoyed.

Figure 7.

Affective Diary.

When Erica became aware of this, she used Affective Diary to change her own behaviour in stressful situations and even monitor how well she was doing. For instance on midsummer's eve, a holiday that usually made her very stressed, she had decided to take it easy. For that day/night the diary showed blue low-energy shapes, which she interpreted as having succeeded in staying calm and enjoying the day and the midsummer party at her house.

Another example of reflection, learning and change was when Ulrica reflected on her closest relationships using the Affective Diary. Looking through her diary, she came to associate emotionally upsetting situations with figures that were coloured blue and thus calm. She found, however, that a few hours after an event the figures would change colour and indicate a lot of movement. This she associated with her usual coping mechanism, jogging. Her interpretation was that she held back on her emotional reactions in the moment (quarrelling with her boyfriend or boss, or when her son left to live in Paris for several months). Instead, she let off steam when alone, jogging. On one occasion during the interviews, for example, Ulrica reasoned about her calmness as her son was telling her that he was moving to France:

And then I become like this kind of. I am sort of both happy and sad in some way. I like him and therefore it is sad that we see each other so little, I think. Then [at this time] I cannot really show it. Or there is no reason really, since there is nothing wrong about anything, it is just kind of sad.

Reflecting on the diary's content, Ulrica expresses surprise that she was able to see a pattern emerge:

But that it shows me how I work. That I … [In fact] I get quite surprised by that. By the fact that I can see this so clearly here [points to the figures on the screen]. Or that is how I interpret it anyway. That I'm not, that I am not so emotionally engaged in, eh … when I interact with people. That I am [emotionally engaged] only when I am alone, kind of.

She continues to say: ‘I have gone back several times to the earlier days, and when I … read the figures I think that it sort of confirms what I have talked about now, what I said about how I function.’

For Ulrica then, her reflections using the diary provided an explanation of why people sometimes misunderstood her and her emotional reactions. Further, it led her to conclude that she should let more of her inner feelings be expressed in the moment with the people they concerned. In short, Ulrica used the diary to reflect on her past actions and, as a consequence, to decide to change some of her behaviours.

(i) Affective Health

Based on the experience of the Affective Diary system, we are currently building a system named Affective Health that provides real-time feedback on mobile phones. Affective Health allows users to get into a biofeedback loop through mirroring their physical data on their mobile phones in real time (figure 8), but they can also scroll back in the data and discover a pattern in their behaviours and reactions. The data are collected from Bluetooth-enabled sensors worn on the body (Ferreira et al. 2008). The sensors pick up on movement, pulse and skin conductivity (figure 9).
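To illustrate the kind of real-time loop this implies, the sketch below mirrors each incoming sensor sample immediately while keeping a history that can be scrolled back through. The buffer size, the assumed sampling rate and the rendering stand-in are all hypothetical; this is not the Affective Health implementation.

```python
# Illustrative sketch only: a real-time biofeedback loop of the kind Affective
# Health describes, with an immediate mirror plus a scroll-back history.
# The sensor interface, rendering and buffer sizing are stand-ins, not real APIs.
import collections
import time

class AffectiveHealthMirror:
    def __init__(self, history_samples: int = 3600):
        # Ring buffer of (timestamp, movement, pulse, skin_conductivity) samples;
        # 3600 samples corresponds to one hour if we assume one sample per second.
        self.history = collections.deque(maxlen=history_samples)

    def on_sample(self, movement: float, pulse: float, skin_conductivity: float):
        sample = (time.time(), movement, pulse, skin_conductivity)
        self.history.append(sample)
        self.render_now(sample)          # immediate biofeedback on the phone screen

    def render_now(self, sample):
        # Stand-in for drawing the current state on the mobile display.
        _, movement, pulse, scl = sample
        print(f"now: movement={movement:.2f} pulse={pulse:.0f} scl={scl:.2f}")

    def scroll_back(self, seconds: int):
        """Return the stored samples from the last `seconds`, so the user can
        look for patterns in earlier reactions."""
        cutoff = time.time() - seconds
        return [s for s in self.history if s[0] >= cutoff]

mirror = AffectiveHealthMirror()
mirror.on_sample(movement=0.1, pulse=72, skin_conductivity=4.2)
recent = mirror.scroll_back(seconds=600)
```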

Figure 8.

Erica's meeting with her boss. (a) Screendump depicts 15.00–16.00 when the meeting that agitated Erica happened (lilac and red characters). (b) Screendump depicts 16.00–17.00, where Erica is still agitated (red and lilac characters)—until ...

Figure 9.

Affective Health interface showing pulse, skin conductivity and movement.

Our preliminary data from end-user experiences show that users are made aware of processes in their bodies that they might not otherwise notice. The system becomes a ‘crutch’ for listening more intently to bodily signs and signals in situations where we sometimes might be too focused on our cognitive presence and too little attuned to our physical, bodily presence.

We are currently looking deeper into which interface representation best captures an open-ended, ambiguous, organic way of conveying bodily states.

3. Affective loops—design lessons

As discussed in §1, systems that can both be influenced by and influence users corporeally, while still leaving it to users to make meaning out of the representations, exhibit a use quality that we have named an affective loop experience. Let us re-visit the systems and discuss to what extent they create affective loop experiences and to what extent they are embodied in users' social and bodily practices.

(a) Embodiment?

As discussed above, our aim was to create embodiment (Dourish 2004), that is, systems that allow users' social and bodily practices to be created and negotiated, and to unfold between users, and between users and systems. As Dourish expresses it:

Embodiment is not a property of systems, technologies, or artifacts; it is a property of interaction. It is rooted in the ways in which people (and technologies) participate in the world. In contrast to Cartesian approaches that separate mind from body and thought from action, embodied interaction emphasizes their duality. We act in a world that is suffused with social meaning, which both makes our activities meaningful and is itself transformed by them.

(Dourish 2004, p. 189)

In eMoto, the gestures with the sensor-enabled stylus allowed users to express themselves at the same time as the very same gestures influenced their experience of composing messages. Together with the feedback in colours, shapes and animations that resonate with the gestures, the experience becomes strong and touches on their corporeal experience of what they are communicating. As the system does not prescribe which colourful background to use when, but flexibly allows users to create their own expressions, sometimes only making sense to the two friends communicating, it can become part of a social practice arising between the users. The bodily expression of this, the performance of the gestures as such, also allows for quite some individual freedom while at the same time creating for meaning-making in a physical sense: gesturing with the stylus, you feel what you are expressing. However, now and then eMoto failed to involve users. An initial pressure had to be applied to the stylus to start it, the Bluetooth communication was sometimes slow in picking up on the gestures and, sometimes, users could not find an appropriate expression for mixed or bland emotions, or for several emotion expressions in a row. In those cases, users were thrown out of the experience of being one with the system, and became aware of the system as such—the stylus became ‘present at hand’ rather than an embodied part of the communication.

In the FriendSense probe, the (admittedly rough) sensor nodes allowed users to express energy and warmth, and the big display in the room conveyed physical nearness between users in a ‘parallel’ universe to that of the physical co-location in the office. Again, users could influence and become influenced by the interaction in an affective loop. But the roughness of the sensor nodes made them better at affording energetic and negative expression than soft and caressing gestures. And the intermediate step of first creating an expression on your own desktop before uploading it to the big screen sometimes threw users out of their feeling of being inside the parallel universe of colleagues, expressing themselves physically, there and then.

In Affective Diary, we see a slower affective loop where users are encouraged to reflect on and relive their experiences from both a bodily and a social perspective as they engage with the materials entered in the diary. Affective Health, on the other hand, allows for an immediate biofeedback loop. Both systems allow users to inscribe their own meaning and create their own practice for analysing and living with the bio-sensor data provided. We speculate that Affective Diary made use of an overly anthropomorphic representation that sometimes became an obstacle to identifying with it. Affective Health is therefore making use of a more abstract representation. We are struggling to find the right representation: one that feels alive and allows users to identify with it, but is not so strong that it starts living its own life.

In summary, even if these systems sometimes failed to involve users, they frequently succeeded. And all the systems address emotion as a process embodied in the interaction, involving physical, bodily processes. None of the systems tries to represent these emotion processes inside the system or to diagnose users' emotions based on their facial expressions or some other human emotion expression. The mapping from user input, such as GSR or gesture, to emotion is done by the user—not the system. Instead, the systems build upon users' own capabilities as meaning-making, intelligent, active co-constructors of meaning, emotional processes, and bodily and social practices.

(b) Design lessons

To create these designs we had to go through many iterations, and even so, we would not like to claim that they can perfectly support or mirror users' experiences. But there are some design insights that we can share.

Open surfaces. An important lesson from these designs is that they have all left space, or ‘inscribable surfaces’, open for users to fill with content (Höök 2006). The activities of others need to be visible and users should be allowed to shape what can and should be expressed over time.

In eMoto, users could choose from a large palette of expressions, and the colourful, animated expressions were not tagged with specific emotion labels. Instead, the intended meaning arose from the interaction between the two friends.

In FriendSense, colours, movement and the distance between them provided users with an ambiguous but still rich surface that they could project their interpretations onto. But this surface was used together with their understanding of what was simultaneously going on in the open space office.

In Affective Diary, the colourful characters were not labelled as happy or angry, but were an open-ended way of portraying bodily traces that users projected their own experiences onto. By scribbling on top of them, they could modify their meaning to make sense vis-à-vis their own interpretation.

Familiar expressions. If users recognize themselves or others through the activities performed at the interface—if the expressions look familiar through the social or bodily practice they convey—users can more easily be influenced by, as well as appropriate, these open surfaces. The mapping from gesture to colour and animation in eMoto, the mapping from node manipulation to marble movement and colour in FriendSense, and the mapping from movement and arousal to the colourful characters in Affective Diary need to be understandable and clear to the user. Their shape and form need to remind our users of their own bodily and social practices.

In Affective Diary, the blobby character has a body shape reminding users of their own body and their movement. The intensity of the colour of the character reminds users of the intensity of their experience.

In eMoto, the animations resonate with the gestures, that is, the more intense the gestural movements, the more rapid the animation in the interface, and vice versa. This in turn resonates with their experience—the more intense the movement, the more intense the emotional experience. Likewise, the tenser the gesture is, the more negative the expression looks, with darker colours and more edgy shapes. A less tense gesture renders smoother, rounder shapes with clearer colours.

But before arriving at these particular mappings from user input to expression, we went through many design iterations that did not work (Ståhl et al. 2005; Ståhl & Höök 2008). For example, in eMoto, the negative expressions initially had too bright colours. The dark side needed more darkness.

Turn-taking. What might not be so clear from the accounts above is the importance of getting the turn-taking between user input and system output to work smoothly. In eMoto, we had to fine-tune the timing of the interaction to allow the sad movements to take their time, while happy or angry movements had to render the corresponding emotional expression much faster. In the FriendSense probe, we failed to make the connection between the sensor-node manipulation and the screen representation direct enough. Users first created their expression locally before uploading it to the big screen. This layer in between interfered with the feeling of being ‘one’ with their expression.
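As a small illustration of this timing point, the sketch below lets the duration of the system's response depend on the arousal of the gesture, so that sad, low-energy input is answered slowly and happy or angry input almost immediately. The specific durations are invented for illustration and are not the values used in eMoto.

```python
# Illustrative sketch only: one way to let system response timing follow the
# emotional quality of the input, as the eMoto tuning required. The durations
# are invented; the point is that feedback latency is not one fixed value.
def response_duration(arousal: float) -> float:
    """Low-arousal (sad, calm) gestures get slow, drawn-out feedback;
    high-arousal (happy, angry) gestures get near-immediate feedback.
    `arousal` is assumed normalized to 0..1."""
    slow, fast = 2.5, 0.2      # seconds, assumed values
    return slow - (slow - fast) * max(0.0, min(arousal, 1.0))

assert response_duration(0.0) > response_duration(1.0)  # sad slower than angry
```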

‘Unity’ between modalities. In these systems, we used mainly gestures, colour, shapes and animations. But we could have also been using music, sounds, haptics and other modalities. No matter which modalities are used, it is very important that they harmonize (Ståhl 2006). You cannot combine a clear blue colour with angry animations of big shapes—the modalities speak against one another. In an early version of the Affective Diary system we tried using music to portray the body memorabilia. But we could not get the musical expressions to harmonize with the figures. While we could control the emotional expression of the music (deWitt & Bresin 2007), the actual choice of music interfered with users' interpretation of the colours and shapes of the characters. How the different parts come together into an aesthetic experience is key when designing these kinds of systems (Dewey 1934; McCarthy & Wright 2004).

Other systems that exhibit some of these properties are, for example, the VIO system (Kaye 2006) that leaves meaning-making entirely in the hands of its users, or the feather, shaker and scent systems (Strong & Gaver 1996), where communication between the two participants is based on shaking, blowing or sending a scent to one another, or Affector, connecting two offices through a distorted video-stream reflecting the moods of the two office workers (Sengers et al. 2005). The SenToy doll (Paiva et al. 2003), where users manipulated a plush doll to control their avatar in a game, also made use of emotionally related movements as a way of touching users' emotional experiences.

4. Emotion processes—‘in the wild’

Besides learning how to design for affective loop experiences, a more important lesson gained from deploying the systems with real users ‘in the wild’ for several weeks was how emotion processes are co-constructed and experienced inseparable from all other aspects of life. The subtle nuances and uniqueness of everyday emotion experiences make it virtually impossible to generalize over them or interpret them as separate entities. It is not only the social context that affects emotion processes. Our corporeal experiences also shape and interact with our understanding.

(a) Embodiment and the corporeal body

An important point of the systems we have built is how they touch our senses corporeally and sensually. While most researchers in affective computing are aware of how mind, body and emotion are tightly connected (Damasio 1994), it is not always clear how we should be designing for this (Moen 2006; Djajadiningrat et al. 2007; Hummels et al. 2007). The position we take here is that corporeal experiences are also interactively constructed in our everyday practices.

When Merleau-Ponty writes about the body, he begins by stating that the body is not an object (Merleau-Ponty 1962). It is instead the condition and context through which I am in the world. Our bodily experiences are integral to how we come to interpret and thus make sense of the world. This premise draws heavily on the notion of embodiment. He attempts to get away from the perspective of the doctrine that treats

[…] perception as a simple result of the action of external things on our body as well as against those which insist on the autonomy of consciousness. These philosophies commonly forget—in favour of a pure exteriority or of a pure interiority—the insertion of the mind in corporeality, the ambiguous relation with our body and, correlatively, with perceived things.

(Merleau-Ponty 1962, pp. 3–4)

Feminists have attempted to deal with the actual physical body in more concrete terms, highlighting in particular the differences between male and female bodies. Grosz, for example, makes an interesting journey through the various philosophies, such as Freud's psychoanalysis and phenomenology, throughout the last century, showing that most of them speak, in a sense, vaguely about the actual corporeal body (Grosz 1994). As a feminist, she sees very little of the female body, but instead, if anything, a ‘normal’, male body in the theories on e.g. perception. Grosz makes the case that female bodies are different from male bodies—both corporeally and through their ‘cultural completion’:

[…] as an essential internal condition of human bodies, a consequence of perhaps their organic openness to cultural completion, bodies must take the social order as their productive nucleus. Part of their own ‘nature’ is an organic or ontological ‘incompleteness’ or lack of finality, an amenability to social completion, social ordering and organisation.

(Grosz 1994, p. xi)

This perspective rhymes well with Merleau-Ponty's (and Fallman's) experiential and cultural bodies mentioned above, even if he, according to Grosz, never really dealt with the fact that some bodies are different from the male body—both corporeally and also in terms of their cultural completion.

Relevant to our investigation here is Grosz's emphasis on bodily completion by culture or practice. This is where our designs of digital tools come into play. Through new tools, we interfere with users' practices, with the social ordering and organization. Our bodies are shaped by the tools we surround ourselves with—not only in a metaphorical or ‘cultural body’ sense but also in a concrete corporeal sense. The tools we have make us experience the world in certain ways, they make our muscles be used in certain ways, and they stimulate our nervous system in certain ways. Just like dancers, riders or runners shape their bodies into certain forms, making them sensitive to balance, position and rhythm, computer gamers or office workers will shape their bodies to fit gaming, desktop activities or the expressive capacity of affective interactive systems.

By designing for physical interaction with systems like FriendSense, eMoto, Affective Diary or Affective Health, we are interfering with our corporeal experiences. In particular, wearing sensors, and thereby adding to or mediating our ways of being in the world and our experiences of our own bodies, may profoundly affect us (Michael 2000; Troshynski et al. 2008). As designers, we need to consider what and how we mediate, and what we choose not to mediate.

(b) Aesthetic experiences

As mentioned above, systems that create an affective loop can result in aesthetic experiences. Dewey distinguishes aesthetic experiences from other aspects of our everyday life by placing them between two extremes on a scale (Dewey 1934): at one end of that scale are the many experiences in which we just drift and experience an unorganized flow of events; at the other end are experiences that do have a clear beginning and an end but in which the events are only mechanically connected with one another. Aesthetic experiences exist between those extremes. They have a clear beginning and an end; they can be uniquely named afterwards (e.g. ‘when I first heard jazz at the Village Vanguard’; McCarthy & Wright 2004); but in addition, the experience has a unity that goes beyond a mechanical linking of one event to another—there is a single quality that pervades the entire experience:

An experience has a unity that gives it its name, that meal, that storm, that rupture of a friendship. The existence of this unity is constituted by a single quality that pervades the entire experience in spite of the variation of its constituent parts.

(Dewey 1934, pp. 36–57)

In Dewey's perspective, emotion is

the moving and cementing force. It selects what is congruous and dyes what is selected with its color, thereby giving qualitative unity to materials externally disparate and dissimilar. It thus provides unity in and through the varied parts of an experience.

(Dewey 1934, p. 44)

However, emotions are not static but change in time with the experience itself just as a dramatic experience does.

Joy, sorrow, hope, fear, anger, curiosity, are treated as if each in itself were a sort of entity that enters full-made upon the scene, an entity that may last a long time or a short time, but whose duration, whose growth and career, is irrelevant to its nature. In fact emotions are qualities, when they are significant, of a complex experience that moves and changes.

(Dewey 1934, p. 43)

While an emotion process is not enough to create an aesthetic experience, emotions will be part of the experience and inseparable from the intellectual and bodily experiences. In such a holistic perspective, it will not make sense to talk of emotion processes as something separate from our embodied experience of being in the world.

5. Discussion

In the field of affective computing, there is a tendency to try to make emotion communication more organized, unambiguous and less error prone. The prevailing vision is that of an information channel that has become too narrow and needs to be broadened to allow for richer and clearer communication. But what is ‘rich’ communication really? Is it only a matter of providing more data or more modalities? There is, in our view, a very strong belief that adding some emotion interpretation and emotion expressions to computers will result in fewer communication problems.

The studies described here instead show that emotion communication is just like so many other human activities: it is a continuous, creatively produced improvisation. It is not the case that we plan what emotion to convey to whom, and similarly we do not plan exactly which action to do next (Suchman 1997). And emotion experiences are not states out there to be detected but fluid, interactive processes. We are situated in the world, acting, moving, experiencing. And we enjoy being creative in how we express ourselves and how that in turn affects us. We do not want to automatize those creative aspects of modulating emotion expression and being involved with others (cf. Harper and colleagues' argument that while there are some activities where we want to see ourselves as machines, as when driving a car, in most walks of life this is not how we act; Harper et al. 2007).

If we aim to design for affective interaction experiences, we need to place them into this larger picture.

Acknowledgements

I would like to thank Peter Robinson and Rana el Kaliouby who invited me to the seminar ‘Emotion in Man and Machine’ at the Royal Society. I also want to thank the anonymous reviewer who provided detailed and insightful comments on this paper. This work was done in collaboration with Anna Ståhl, Petra Sundström, Jarmo Laaksolahti, Martin Svensson, Alex Taylor, Pedro Ferreira, Tove Jaensson and others.

Endnotes

One contribution of 17 to a Discussion Meeting Issue ‘Computation of emotions in man and machines’.

1. Use qualities are those hard-to-describe experiential qualities that arise in and through interaction (Löwgren & Stolterman 2004; Löwgren 2007).

2. The participant's partner or roommate filmed the videos. The partner or roommate was recruited to help us study our participants' use of the system in their everyday life. We name this method in situ informants (Sundström et al. 2007).

References

  • Aoki P. M., Woodruff A. 2005. Making space for stories: ambiguity in the design of personal communication systems. Proc. SIGCHI Conf. on Human Factors in Computing Systems, pp. 181–190 New York, NY: ACM Press.
  • Boehner K., DePaula R., Dourish P., Sengers P. 2005. Affect: from information to interaction. Critical Computing Conf. Århus, Denmark, 2005
  • Damasio A. R. 1994. Descartes' error: emotion, reason and the human brain New York, NY: Grosset/Putnam
  • Darwin C. 1872. The expression of emotions in man and animals. London, UK: John Murray.
  • Davidson R. J., Pizzagalli D., Nitschke J. B., Kalin N. H. 2003. Parsing the subcomponents of emotion and disorders of emotion: perspectives from affective neuroscience. In Handbook of affective sciences (eds Davidson R. J., Scherer K. R., Goldsmith H. H.), pp. 8–24. New York, NY: Oxford University Press.
  • Dewey J. 1934. Art as experience New York, NY: Perigee.
  • DeWitt A., Bresin R. 2007. Sound design for affective interaction. In Affective computing and intelligent interaction (eds Paiva A., Prada R., Picard R. W.), pp. 523–533. Berlin/Heidelberg, Germany: Springer.
  • Djajadiningrat T., Matthews B., Stienstra M. 2007. Easy doesn't do it: skill and expression in tangible aesthetics. Pers. Ubiquit. Comput. (special issue on movement-based design) 11, 657–676 (doi:10.1007/s00779-006-0137-9).
  • Dourish P. 2004. Where the action is: the foundations of embodied interaction Cambridge, MA: MIT Press.
  • Fallman D. 2003. In romance with the materials of mobile interaction: a phenomenological approach to the design of mobile information technology. Doctoral thesis, Umeå University, Sweden: Larsson & Co:s Tryckeri.
  • Ferreira P., Sanches P., Höök K., Jaensson T. 2008. License to chill! How to empower users to cope with stress. Proc. Nordic Forum for Human–Computer Interaction Research (NordiCHI), Lund, Sweden. ACM Press.
  • Grosz E. 1994. Volatile bodies: toward a corporeal feminism. Bloomington, IN: Indiana University Press.
  • Harper R., Randall D., Smyth N., Evans C., Heledd L., Moore R. 2007. Thanks for the memory. Interact: HCI 2007, September, Lancaster
  • Hummels C., Overbeeke K. C., Klooster S. 2007. Move to get moved: a search for methods, tools and knowledge to design for expressive and rich movement-based interaction. Pers. Ubiquit. Comput. 11, 677–690
  • Hutchinson H., et al. 2003. Technology probes: inspiring design for and with families. Proc. of the SIGCHI Conf. on Human Factors in Computing Systems Ft. Lauderdale, Florida, USA
  • Höök K. 2006. Designing familiar open surfaces. Proc. of NordiCHI 2006, Oslo, Norway, October 2006 ACM Press.
  • Höök K., Ståhl A., Sundström P., Laaksolahti J. 2008. Interactional empowerment. ACM SIGCHI Conf. Computer–Human Interaction (CHI2008) Florence, Italy, ACM Press
  • Kaye J. J. 2006. I just clicked to say i love you: rich evaluations of minimal communication. Alt.chi, Ext. Abs. CHI 2006 New York, NY: ACM Press.
  • Laban R., Lawrence F. C. 1974. Effort, economy of human effort, 2nd edn London, UK: Macdonald & Evans Ltd
  • LeDoux J. E. 1996. The emotional brain: the mysterious underpinnings of emotional life New York, NY: Simon & Schuster.
  • Löwgren J. 2007. Inspirational patterns for embodied interaction. J. Knowl. Technol. Policy 20, 165–177
  • Löwgren J., Stolterman E. 2004. Thoughtful interaction design Cambridge, MA: The MIT Press
  • McCarthy J., Wright P. 2004. Technology as experience Cambridge, MA: The MIT Press.
  • Merleau-Ponty M. 1962. Phenomenology of perception (transl. Smith C.). Routledge & Kegan Paul.
  • Michael M. 2000. These boots are made for walking … : mundane technology, the body and human–environment relations. J. Body Soc. 6, 107–126 (doi:10.1177/1357034X00006003006)
  • Moen J. 2006. KinAesthetic movement interaction. Designing for the pleasure of motion. Unpublished doctoral thesis in HCI, KTH, Sweden
  • Paiva A., Costa M., Chaves R., Piedade M., Mourão D., Sobral D., Höök K., Andersson G., Bullock A. 2003. SenToy: an affective sympathetic interface. Int. J. Hum. Comput. Stud. 59, 227–235 (doi:10.1016/S1071-5819(03)00048-X).
  • Russell J. A. 1980. A circumplex model of affect. J. Pers. Soc. Psychol. 39, 1161–1178 (doi:10.1037/h0077714).
  • Strong R., Gaver W. W. 1996. Feather, scent, and shaker: supporting simple intimacy. Proc. CSCW' 96, November 16–20, Boston
  • Sengers P., Boehner K., Warner S., Jenkins T. 2005. Evaluating affector: co-interpreting what ‘works’. CHI 2005 Workshop on Innovative Approaches to Evaluating Affective Systems
  • Sheets-Johnstone M. 1999. Emotion and movement: a beginning empirical-phenomenological analysis of their relationship. J. Conscious. Stud. 6, 259–277
  • Ståhl A. 2006. Designing for emotional expressivity. Licentiate thesis in design, Umeå Design Institute. Umeå University Sweden.
  • Ståhl A., Höök K. 2008. Reflecting on the design process of the Affective Diary. Proc. ACM NordiCHI, 20–22 October 2008, Lund, Sweden, pp. 559–564. New York, NY: ACM.
  • Ståhl A., Sundström P., Höök K. 2005. A foundation for emotional expressivity. DUX 2005. New York, NY: American Institute of Graphic Arts.
  • Ståhl A., Höök K., Svensson M., Taylor A., Combetto M. 2008. Experiencing the affective diary. J. Pers. Ubiquit. Comput.
  • Suchman L. A. 1997. From interactions to integrations. In Proc. Human–Computer Interaction INTERACT'97 (eds Howard S., Hammond J., Lindegaard G.). London, UK: Chapman & Hall.
  • Sundström P., Ståhl A., Höök K. 2007. In situ informants exploring an emotional mobile messaging system in their everyday practice. Int. J. Hum. Comput. Stud. (special issue on evaluating affective interfaces) 65, 388–403 (doi:10.1016/j.ijhcs.2006.11.013).
  • Sundström P., Jaensson T., Höök K., Pommeranz A. 2009. Probing the potential of non-verbal group communication. Proc. GROUP. ACM Press.
  • Troshynski E., Lee C., Dourish P. 2008. Accountabilities of presence: reframing location-based systems. Proc. ACM CHI 2008 Conf. on Human Factors in Computing Systems, April 5–10, 2008, pp. 487–496.
