A little while ago I got an invitation to contribute to Medium, and decided to try it out this morning. I posted an essay there on Nigel Thrift’s new piece on attention in the digital age, which I kind of went Morozov on. But Thrift’s piece was a nice, brief example of how not to think about attention, media, and education.

I’ve copied it below the fold, in case Medium goes belly up. But they do lovely work, and I’m curious to see what kind of traffic one gets there.

Nigel Thrift has an essay in the Chronicle of Higher Education (it’s behind a paywall) on “Paying Attention in the Digital Age.” I had high hopes for the piece: I’ve heard Thrift speak, and I find him alternately brilliant and dazzlingly under-organized (let’s just say that the TED style of super-smooth presentation hasn’t conquered the entire world). But this essay is very depressing.

“There is an issue,” he begins, “that is certainly gaining a lot of attention in Britain and the United States—namely, attention itself.” Students who’ve grown up with digital media have “much shorter attention spans, much greater attention to visual modes of understanding, greater modulation of time, more and more reliance on interfaces, and so on.” However, he continues, the task of the academic is not to resist these changes, but to accommodate them, for three reasons.

First, “all of this can be seen as a transition to new modes of literacy,” and in such transitions “something is gained and something is lost.”

Second, because far more information is produced today than anyone could ever make sense of, the rise of Big Data, search engines, the Semantic Web, and the like to deal with it shouldn’t be seen as a big problem. Even if Google makes you stupid, direct access to the unfiltered, unindexed world would make you even stupider.

Finally, “it is possible to derive new modes of interrogation”:

So far as their own practices are concerned there are several efforts to produce search engines which can trawl material in much more specific and useful ways and code for depth of content rather than quasi-commercial imperatives. Then there is the fact that academics are among those who have some of the best practical and theoretical grasp of the new visual grammar that is now unfolding, and have the ability to do things about it. Finally, many academics have real influence—and responsibility—in this new world and can use it accordingly. For example, the British scientist and TV personality Brian Cox gets significantly more searches on some days than the (large) university in which he is based. [Ed: This conflation of “influence—and responsibility” with fame is not surprising, but in a world in which Niall Ferguson and David Starkey are considered better models of the vita contemplativa than Christopher Hill or Michael Oakeshott, it is still depressing.]

So why does this leave me wishing I’d put some whiskey in my coffee rather than cream (and I put a LOT of cream in my coffee)?

First of all, the whole “media have always been changing” argument is, after all these years, at best incomplete and at worst lazy. Yes, media have always been changing, and at no time in history have intelligent people been passive agents in that process. Someone is always pushing those changes (even if they have unexpected effects), whether it’s Gutenberg or royal presses or Google, and smart people have always asked how they can preserve their attention and intellectual ability in the face of those changes.

The good things don’t happen when you simply accept the transformation; they happen when you ask why these new media are emerging, whose interests are being served by them, and how you can synthesize, adapt, hack, and reuse them for your own purposes.

Assuming that it’s all historically contingent, and that as educators we have no responsibility to make students aware of a long tradition of cognitive training, is as lazy as believing that the history of music became irrelevant when Nicki Minaj released her first CD.

Second, somewhere along the way the essay conflated human attention and machine attention, and never noticed. Its solution to the problem of attention is more technology, not making students more self-aware, more conscious of their own minds, more thoughtful about how they could develop their attention, or smarter about how to extend their minds through skillful use (rather than passive consumption) of technologies.

Don’t get me wrong. I’m as big a fan of the extended mind thesis as you’ll ever meet. But in my reading, the fact that we have always been cyborgs, that it’s perfectly natural for us to want to offload cognitive activities onto technologies and our environments, does not lead to the conclusion that we want to let whatever technologies we happen to be born with guide that process.

Rather, the extended mind thesis places on us the burden of making sense of ourselves and our technologies; understanding how we use them to make worlds; and being more conscious about how we choose to use our devices.

I also think the idea that attention is infinitely variable, and hence that no definition of it can guide either our evaluation of new media or our responses to them, is just wrong. Attention has a history, and it’s more stable than we think. We can find evidence of multitasking in the Stone Age; Pierre Hadot made a persuasive case for contemplative practices being at the heart of Greek philosophy; and Buddhist contemplative practices remain popular among hundreds of millions of people for a reason. All of this suggests that we can understand attention as a fundamental human activity, and its cultivation as akin to a human right.

Finally, and most depressing for the academy, the essay suggests that the university has little to teach students about the value of attention and the pleasures of contemplation, and that we can’t help them develop those capacities. There is, I worry, an underlying sense that kids today are just incomprehensible and we academics are obsolete: they think so differently than we do, and their interactions with media are so different, that we can’t truly reach them. This is part of a bigger retreat from the university’s long but under-recognized mission of serving as a space that supports serious thinking and in which contemplative work can be pursued. One reason students have short attention spans is that they’re rarely exposed to the value of thinking long-term: if they’re in business, their heroes are entrepreneurs barely older than they are who’ve won the billion-dollar exit after 18 months, not people who’ve spent decades building careers and businesses (with the possible exception of Warren Buffett, who’s easy both to admire and to treat as an oddity).

Rather than assume that the university needs to become even more like the world, less concerned with building and sustaining attention and long-term thinking, I think we need to recognize how unique a space the university is; how we can play a critical role not just in training students to “think” in some abstract way, but in helping them learn how to think long and hard; and how no other institution in modern life comes close to being able to do this. But instead, we get Big Data and Brian Cox. Now where’s that Jameson…