One of the interesting and challenging things about coming to Microsoft Research is learning the recent history of HCI, and learning it well enough to do two things: be reasonably credible among the people I'm talking to, and understand how to frame my work so that it more or less fits in the discipline. We often think that really good work is marked by being completely novel and different from what's come before; but in fact, influential scholarship and science almost always clearly build on their predecessors, and speak to interests that lots of people in the field share.

I've spent the last few days reading about emotion in HCI, and through it I'm getting a better understanding of what's happening in the field. At least around here, the field is in the midst of a paradigm shift. HCI started in the 1970s and 1980s, when computers were just emerging from university science centers, and the big questions it asked reflected those origins. But are the questions HCI researchers asked in an era when personal computing was novel, when our interactions with PCs and terminals happened mainly in schools and the workplace, and when the big challenge was to make computers usable by people other than programmers, still that important in an era of iPhones, Facebook, and RFID tags in clothes and credit cards?

The answer, according to a number of researchers, is "no." As Gilbert Cockton put it in a 2002 editorial [sub req], it makes sense for there to be generational change within the field, but "the real pressure is coming from without, as media and communications become pervasively digital, asynchronous and interactive." As Yvonne Rogers puts it in her article on HCI in the age of ubiquitous computing [sub req],

The classic interface horror stories, such as the flashing VCR, are being superseded by more pressing matters that face society in the 21st century, such as how pervasive technologies are intruding and extending our physical bodies, cognitive minds and social lives. These are the concerns that the HCI community needs to wrestle with; explicating what it means to be human in an age of ubiquitous computing. (p. 3)

The growth of mobile devices, embedded computing, and social media is changing the environment in which HCI operates. In the memorable phrasing of Paula Kotzé and colleagues, "feral computers have escaped office desks, and found their way into bedrooms and television sets, mobile phones and smart cards." The kinds of interactions people have with computers, and more generally the world in which people encounter, use, and think about computers, have changed quite dramatically. As a result,

HCI research is also changing apace: from what it looks at, the lenses it uses and what it has to offer. No longer only about being user-centered, it has set its sights on pastures new, embracing a much broader and far-reaching set of interests…. What was originally a confined problem space with a clear focus that adopted a small set of methods to tackle it – that of designing computer systems to make them more easy and efficient to use by a single user – is now turning into a more diffuse problem space with a less clear purpose as to what to study, what to design for and which methods to use. (p. 2)

Finally, as the Being Human report puts it,

Technology is changing, people are changing, and society is changing. All this is happening at a rapid and rather alarming rate. What can the HCI community do to intervene and help?… Specifically, we suggest that HCI needs to extend its methods and approaches so as to focus more clearly on human values. (p. 52)

Future research needs to address a broader, richer concept of what it means to be human in the flux of the transformations taking place. (p. 32)

To me, there's a parallel between this movement and the recent history of futures. In futures, some of us have been arguing that the field's tools and preoccupations (its emphasis on scenarios, forecasting, and the like) strongly reflect its origins in a Cold War world defined by superpower competition, its aim to influence top decision-makers, and its rationalist view of decision-making, and that those origins may render its tools obsolete in an era of multipolar competition, smart mobs, and behavioral economics.