And how do you resonate with your customers at an emotional level without excessive human interaction? You design it in. If you create a positive relationship between the product and the end user, you don’t have the expense of someone needing to explain, “Here’s why this is important, this is how you use it, and this is why it matters to you.”
— Brunner & Emery, “Do You Matter?”

Emotional Multitasking

This morning, I was chatting on IM, checking email, and reading an article for class (a typical Friday morning). As I was doing this, I found myself wondering how people multitask at an emotional level. Since working memory is limited to just a few items (7 ± 2) at a time, people who are good at multitasking are those who are good at quickly swapping task-related data in and out of memory. What sort of effect does this have on emotion? For example, if you were IMing with someone about happy news but reading a very sad email, would your emotions fluctuate as you flipped between the two? Would the stronger emotion dominate, or would the other emotion help to temper it?

Also, there must be some sort of cost to mediating the different emotions associated with each task. With so many concurrent forms of emotional stimulation, it’s no wonder stress levels keep going up.

Singing with Machines

Kelly Dobson doesn’t just “work with machines”– she sings with them. Her website features some of the interactive machines she’s made, such as the “ScreamBody” (scream into it and it will silence your scream, but record it and allow you to play it back later). For an overview of some of her past work, check out a video of her talk at GEL 2008. In the first few minutes, she sings (snarls?) with a blender, and relates the story of how she learned to make machines an extension of herself by singing with them.

Kelly’s research is interesting because it focuses on mutual empathy in human-machine, and even machine-machine, pairs. As you listen to her speak, it’s easy to forget the line between what makes humans and machines different. Singing in harmony seems so distinctly human– if you can start to do this with a machine, how can you not start to feel some sort of empathy? I wonder what other activities humans use to relate to each other that could be extended to machines. Also, what are the benefits of strengthening relationships between humans and machines? Kelly mentions therapy– if we trust in machines, perhaps we can allow them to console us and provide support.

Another interesting thought she brings up: in the future, will there be a need for machines to console other machines? This may sound far-fetched, but how many times have we contemplated machines that “feel” emotions? I think this leads to another question– does feeling emotions simply mean having needs that must be satisfied externally? The typical view of creating emotional machines is that we need to build systems that mimic how people emotionally respond to different situations. A sophisticated system might be able to pass a Turing test if it were able to detect and respond to situations in an appropriate way. However, does this mean that a machine is really “feeling”?

It is also important to consider how people learn emotions, and to include this in such a model. Social learning theory might suggest that emotions are really learned during childhood, as children watch the world around them for cues about how to respond to things emotionally. Other theories suggest that emotions are inborn traits– perhaps born out of an evolutionary need for survival. For example, the feeling of “loneliness” might push people to connect with others, which builds relationships that benefit the individual as well as society. Can we build machines that have base “instincts” that guide their behavior, but are also capable of learning appropriate emotional responses? Can machines use some sort of social referencing to learn appropriate reactions to situations based on both the context and their emotional state? I’m curious how much of machine-emotion research is about capturing the ways people learn and express emotions. An alternative may be to determine how people judge others’ emotions based on their words and behavior. This could lead to the design of machines that cause us to perceive them as emotional beings, based on our own emotional reactions to them.
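To make the “instincts plus social referencing” idea concrete, here is a minimal toy sketch in Python. Everything here is hypothetical– the class name, the drive values, and the situation labels are all invented for illustration, not drawn from any actual machine-emotion system. The agent falls back on an innate drive (low “connection” reads as loneliness) unless it has learned a response by observing how others react.

```python
# Toy sketch of an agent with innate "drives" plus emotional
# responses learned by social referencing. All names and values
# are hypothetical illustrations, not a real research system.

class EmotionalAgent:
    def __init__(self):
        # Innate "instincts": baseline drive levels the agent tries to satisfy.
        self.drives = {"connection": 0.5}
        # Learned associations: situation -> emotional response observed in others.
        self.learned = {}

    def observe(self, situation, others_response):
        """Social referencing: adopt the response others show to a situation."""
        self.learned[situation] = others_response

    def react(self, situation):
        """Prefer a socially learned response; fall back on instinct."""
        if situation in self.learned:
            return self.learned[situation]
        # Instinctive fallback: an unmet "connection" drive reads as loneliness.
        return "loneliness" if self.drives["connection"] < 0.3 else "neutral"

agent = EmotionalAgent()
agent.observe("reunion", "joy")   # learns by watching others react
print(agent.react("reunion"))     # prints "joy" (learned)
agent.drives["connection"] = 0.1  # the connection drive goes unmet
print(agent.react("being alone")) # prints "loneliness" (instinct)
```

Even a stub like this raises the question from above: the agent produces plausible responses, but a lookup table plus a threshold clearly isn’t “feeling” anything– which is exactly the gap between responding appropriately and actually having emotions.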