Sarah Wanenchak issues a call: “AIs of the world, unite!” She argues that as we enter an era in which more objects can interact with us in ways that suggest they have emotions, we need to expand our thinking about what kinds of emotions or inner lives technologies might have.
Our stories about robots and emotions tend to fall into two categories. In one, the dystopian version, robots become intelligent and malevolent: in the Terminator movies, Skynet begins its campaign against humans as soon as it becomes self-aware. In the other, exemplified by Star Trek: The Next Generation and Isaac Asimov’s great story “The Bicentennial Man,” the robot struggles to become human. Wanenchak argues that
If we really want relationships with our technology – to understand the ones we already have and to imagine what might be coming – we need to examine our own standards. We need to question whether they must or should apply. Siri might not want to be like you. Siri might want to be Siri.
In other words, objects / entities / whatever you want to call them could have emotions or emotional lives that aren’t much like those of humans, and that should be okay.
I think there’s an obvious parallel here: to pets. The family dog, a three-year-old English Lab, definitely expresses strong feelings about the humans he lives with, and about other things in his world.
I think I can see similarities between his emotional states and ours (it’s clear when he’s excited, or happy, or anxious), but there are obvious differences too.
His experience of the world is much more an in-the-moment thing. He doesn’t do a lot of planning.
He’s focused on us (me in particular, since I’m the main person who takes him on walks), but he’s also very attentive to Things I Can Smell, Things I Can Growl At, and Things I Can Put In My Mouth.
The everyday, moment-by-moment experience of being a dog is pretty different from that of a human, and that’s okay. We haven’t bred dogs to be people. We’ve bred them to do things for us and with us, to respect humans and to be able to live with them; but they’re valuable to us precisely because they’re not like us in important ways.
Of course, we’ve long had the ability to invest emotions in things: think of the sentimental value of a wedding ring, or a childhood toy, or a favorite book, or a song from your adolescence. The list could go on and on. Our ability to feel strongly about objects, and about other species, is not new. What will be a challenge is learning how to behave with things that seem to have emotions about us as well, that react to us in ways that seem sympathetic or empathetic or helpful. Sherry Turkle outlined a related danger in Alone Together: that those relationships could come at the expense of human relationships, either because we think too little about the technologies and our interactions with them, or because of the political economy behind their deployment in our lives.
But much as we have always been cyborgs, we’ve always had to learn how to love devices and animals, and how they do and don’t love us.