
Singing with Machines

Kelly Dobson doesn’t just “work with machines”; she sings with them. Her website features some of the interactive machines she has made, such as the “ScreamBody” (scream into it and it will silence your scream, but record it so you can play it back later). For an overview of some of her past work, check out a video of her talk at GEL 2008. In the first few minutes, she sings (snarls?) with a blender and relates the story of how she learned to make machines an extension of herself by singing with them.

Kelly’s research is interesting because it focuses on mutual empathy in human-machine, and even machine-machine, pairs. As you listen to her speak, it’s easy to forget the line that separates humans from machines. Singing in harmony is one of those things that seem so distinctly human; if you can start to do this with a machine, how can you not start to feel some sort of empathy? I wonder what other activities humans use to relate to each other that could be extended to machines. Also, what are the benefits of strengthening relationships between humans and machines? Kelly mentions therapy: if we trust machines, perhaps we can allow them to console us and provide support.

Another interesting thought she brings up: in the future, will there be a need for machines to console other machines? This may sound far-fetched, but how many times have we contemplated machines that “feel” emotions? I think this leads to another question: does feeling emotions simply mean having needs that must be satisfied externally? The typical view of creating emotional machines is that we need to build systems that mimic how people emotionally respond to different situations. A sophisticated system might be able to pass a Turing test if it could detect and respond to situations in an appropriate way. However, does this mean that a machine is really “feeling”?

It is also important to consider how people learn emotions, and to include this in such a model. Social learning theory might suggest that emotions are learned during childhood as children look to the world around them for cues about how to respond emotionally. Other theories suggest that emotions are inborn traits, perhaps born out of an evolutionary need for survival. For example, the feeling of “loneliness” might push people to connect with others, which builds relationships that benefit the individual as well as society.

Can we build machines that have base “instincts” that guide their behavior, but are also capable of learning appropriate emotional responses? Can machines use some sort of social referencing to learn appropriate reactions to situations based on both the context and their emotional state? I’m curious about how much of machine-emotion research is about capturing the ways that people learn and express emotions. An alternative may be to determine how people judge others’ emotions based on their words and behavior. This could lead to the design of machines that cause us to perceive them as emotional beings, based on our own emotional reactions to them.
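To make the “innate drive plus learned response” idea concrete, here is a minimal toy sketch, not drawn from Dobson’s work or any real system. It assumes a single invented drive (loneliness), a made-up set of responses, and a crude form of social referencing in which a hypothetical caregiver’s reaction reinforces or discourages a response; every name and number is an illustrative assumption.

```python
"""Toy sketch: an agent with an innate drive and a learned response table,
updated by a crude form of social referencing. All names, responses, and
numbers are illustrative assumptions, not a description of any real system."""
import random
from collections import defaultdict


class ToyAgent:
    def __init__(self):
        self.loneliness = 0.0  # innate drive: grows when the agent is ignored
        # learned responses: (context, drive_level) -> {response: score}
        self.responses = defaultdict(lambda: defaultdict(float))

    def step(self, context, interacted):
        # innate dynamics: loneliness eases with interaction, grows without it
        if interacted:
            self.loneliness = max(0.0, self.loneliness - 0.5)
        else:
            self.loneliness += 0.2
        drive_level = "high" if self.loneliness > 1.0 else "low"
        key = (context, drive_level)
        options = ["approach", "withdraw", "vocalize"]
        scores = self.responses[key]
        # mostly pick the best-scoring learned response, occasionally explore
        if scores and random.random() > 0.1:
            choice = max(options, key=lambda r: scores[r])
        else:
            choice = random.choice(options)
        return key, choice

    def social_reference(self, key, choice, observed_reaction):
        # social referencing: reinforce responses the caregiver reacts well to
        self.responses[key][choice] += 1.0 if observed_reaction == "positive" else -1.0


if __name__ == "__main__":
    agent = ToyAgent()
    for t in range(20):
        key, choice = agent.step(context="stranger_present", interacted=(t % 4 == 0))
        # pretend the caregiver smiles when the agent vocalizes, frowns otherwise
        reaction = "positive" if choice == "vocalize" else "negative"
        agent.social_reference(key, choice, reaction)
    print({k: dict(v) for k, v in agent.responses.items()})
```

The design choice here mirrors the question in the text: the drive dynamics are fixed (“inborn”), while the mapping from context and emotional state to behavior is learned from another agent’s reactions.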