A few years ago I wrote an online column for Red Herring. The gig was interesting, but after a change of editorial regime, they decided to stop the experiment. The pieces all kind of disappeared after a while, and I realized that some of them were actually pretty good. Heaven knows I spent plenty of time on them.

So if for no other reason than to have easily accessible copies of them, I'm going to start reposting them here. Most were from 2004, so they might seem a bit dated; but I think some of the ideas are still worth playing with.

Knowledge is power. For a long time we thought it was something immaterial, cerebral, almost otherworldly. No less a figure than Plato argued that the world of things and appearances was but a dim reflection of another world of ideal types, more real than reality itself. But Plato's theory is too good for this world. Knowledge is also things, and actions.

One of the key events in twentieth-century philosophy was the discovery that the Platonic model of knowledge was incomplete. In mathematics, Kurt Gödel demonstrated that mathematics could never be a perfectly self-contained, exhaustively proven system. For decades, philosophers and mathematicians had worked to put mathematics on secure foundations; Gödel's incompleteness theorem showed that the search was fruitless.

The critique continued in philosophy. Cambridge University's Ludwig Wittgenstein, arguably the twentieth century's most influential philosophical mind, argued that the meaning of language arises from its use, rather than from its logical properties. A few years later, British philosopher Michael Polanyi coined the term "tacit knowledge" to describe things that we can know but can't effectively communicate. Tacit knowledge, Polanyi argued, is an important component of skilled work, and even shapes activities that we have traditionally thought of as entirely logical (like science).

Historian Thomas Kuhn's Structure of Scientific Revolutions took Polanyi one step further, and opened up a whole new front in the assault on traditional notions of knowledge. Structure reconceived science as a puzzle-solving activity guided by a mix of formal methods and cultural norms, and punctuated by dizzying revolutions and paradigm shifts. Sociologists of science, cultural anthropologists, and literary and gender theorists all used Kuhn as inspiration for their critiques of objectivity.

You would think that after all this, the Platonic model of knowledge would be dead and gone. But it lives on in information technologies.

The words "Platonic" and "information technology" don't normally appear together. But the experience of using computers—and the way we think about that experience—has breathed new life into the Platonic idea that information exists separate from the world. Using desk-bound computers gave many users a sense that information and knowledge reside in an alternate dimension, a separate, sometimes mysterious or forbidden, "cyberspace." IBM acknowledged this sense of computer-as-portal by naming one line of its monitors the "InfoWindow."

Information technologies are great at handling formal knowledge. They go gangbusters on anything that can be reduced to quantitative form, expressed as logical rules or instructions, and the like. Indeed, they're as good at such tasks as we are bad at them; it's one of the things that makes computers seem not just useful, but authoritative and even infallible. But computers are also hopeless at dealing with information that is deeply connected to things, to places, or to practices.

This hasn't led to a new appreciation of the relative strengths and weaknesses of computers and people. Instead, for most people it has had the unintended and unfortunate effect of encouraging us to think of "information" strictly as stuff that computers can handle. If it can be poured into a database of best practices, it's information; if it can't, it's not. The problem is that this kind of information is like the visible part of the iceberg. The other seven-eighths is underwater and invisible. But that doesn't mean that your ship can't be sunk by it.

The great challenge of the future will be to turn information technologies from things that encourage us to narrow our conception of what knowledge is, to things that can more fully reflect the diverse ways that knowledge is produced and performed. You might say that we have to learn to give information technologies social lives.

What does it mean to say that information is also things? It means that the physical properties of objects can encourage users to build information storage and management practices around them.

Consider something as low-tech as the refrigerator door. Its main purpose is to help keep refrigerators cold. But add little magnets, and the door becomes a display space and holder for all kinds of documents and data. So far as anyone can tell, once refrigerators stopped being designed like Airstream trailers and became big metal boxes (in the late 1950s and 1960s), it took almost no time for them to acquire a second life as the family bulletin board.

So ubiquitous is this practice today that some appliance makers have experimented with creating refrigerators with Internet access and color screens built into the door. 3Com's ill-fated family information device, the Audrey, was designed to be attached to the refrigerator—another example of an attempt to build on a well-established practice.

But why did the refrigerator ever develop a second life as an information appliance? First, physical location—which means social location—matters a lot. Every family member interacts with the refrigerator; in most homes, the kitchen is an especially high-traffic area; and the kitchen is also a common space, unlike bathrooms or bedrooms.

Second, affordances are critical. Refrigerator doors are big and flat, so there's lots of room to put stuff. They're also pretty tall, so you can have the bills and coupons on the freezer door, and the fingerpaints and photos below.

Finally, social practices will shape what's accessible to whom. The refrigerator is accessible to most of the family, unlike other appliances: most children are allowed to open the refrigerator long before they can touch other appliances.

Books are a more familiar information technology, but even they communicate information on several levels because of their design and physical properties. This is one reason that they haven't become obsolete, despite the predictions of many futurists.

Books—and paper more generally—have been the whipping-boy of innovators and technologists since the late nineteenth century. For a few, predicting the death of the book at the hands of the Memex, microfilm, tape, or CD was merely the most efficient way to highlight the disruptiveness of the new technology; it was a side-effect, not the main point. For others, however, the book is a confining, constraining technology. Information wants to be free, but it's stuck on the page. Readers want to follow their own interests, but books force them into linear reading patterns. For these critics, the death of the book can't come too soon.

So why have books survived? It's not just because of cultural inertia or the stubbornness of users, and it's not simply that the book is an efficient carrier of information compared to the computer. It's because books carry—or even create—kinds of information that computers don't.

First, there's what literary theorists call paratexts, the information surrounding and interwoven with a book's content: everything from the paragraph breaks to page numbers and chapter titles. We use these cues to organize our reading, but they also affect what we take from a book, for better or for worse. British philosopher John Locke argued that numbering the verses of the Bible—an innovation that was intended to make reading and citing passages easier—was a bad idea because it encouraged readers to focus on the individual verses rather than the whole. Too much attention to a few easily-remembered lines and too little to the grand sweep of the book, Locke worried, could lead readers to heretical views.

Other things provide context for readers, even though they have no formal connection to the information stored in a book. A book's publisher tells you a lot about what you should expect from a book, and how seriously you should take it. A Harlequin book on geopolitics isn't any more credible than a romance published by Harvard University Press.

Books acquire another patina of information in the course of being handled and read. Dog-eared pages and underlining can mark important or favorite passages. Handwritten annotations provide clues to how a reader responds to an argument or turn of phrase; a book handled by several readers can carry a virtual conversation in its margins. These can become useful social navigation cues for later readers (though having to read a book too much through someone else's eyes can be distracting); after centuries, such notes become invaluable grist for the historians' mill, material for reconstructing reading patterns and practices.

So how do you solve the problem of creating technologies that are more responsive to informal knowledge? You'd pursue different strategies depending on whether you're designing tools to be used just by individuals, or by groups.

For individually-oriented tools, the first thing to do is to copy metaphors. Both the Macintosh and Windows user interfaces made themselves easier to use by drawing upon some familiar metaphors: the screen was a desktop, documents were files, files were organized into folders, and so on. One notable quality of emerging technologies like electronic paper is that they allow us to copy the affordances that make conventional information technologies useful. E-paper shares some of the qualities of a computer display (most notably its ability to be instantly erased and rewritten), but it can be folded, rolled, carried around, and read like paper. This will push it off the desktop and into places that computers couldn't dream of going.

Another approach is to be sociable. Recommendation agents, the Alexa toolbar, and the good old hit counter attach a little social context to a page, giving you a sense of how many other people have accessed it, and what they thought of it. Researchers have experimented with more graphical ways of bringing social context to a page, for example by having a Web page change color with its age and the number of times it has been read, much as real books do.
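To make the idea concrete, here's a toy sketch—my own illustration, not code from any of those research systems—of how such "digital wear" might be computed. The function name, thresholds, and color scheme are all invented for the example: it simply tints a page from white toward yellowed paper as the page ages and accumulates reads.

```python
# A hypothetical "digital wear" function: newer, rarely-read pages stay
# white; old, well-thumbed pages yellow like aging paper.

def page_tint(age_days: int, read_count: int) -> str:
    """Return an RGB hex color reflecting a page's age and readership."""
    # Normalize each signal to [0, 1]; the cutoffs are arbitrary choices.
    age_factor = min(age_days / 3650, 1.0)     # fully "aged" after ~10 years
    wear_factor = min(read_count / 1000, 1.0)  # fully "worn" after 1000 reads
    fade = (age_factor + wear_factor) / 2
    # Yellowing: keep red high, pull green down slightly and blue down more.
    r = 255
    g = 255 - int(25 * fade)
    b = 255 - int(80 * fade)
    return f"#{r:02x}{g:02x}{b:02x}"

print(page_tint(0, 0))        # a brand-new, unread page: pure white
print(page_tint(3650, 1000))  # an old, heavily read page: yellowed
```

A browser extension or server-side template could then set the page background to this color, giving later readers the same at-a-glance cue that a cracked spine and yellowed pages give in a library.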

An even subtler strategy is not to make computers more like people, but to create tighter connections between people and information technologies. You don't have to imitate people if you can embed them in the system; you solve the tacit knowledge problem by letting humans deal with it.

There's an interesting parallel here to nanotechnology, in which scientists confronting the challenge of building nanoscale motors and manufacturing processes are learning to use proteins to create tiny wires, or harnessing flagellated bacteria (which propel themselves with long, whiplike tails) as pumps in fluidic systems. They've realized that evolution has been working at the nanoscale a lot longer than humans have; the results are nanoscale chimeras—part technology, part organism—but they work.

Likewise, some promising health-monitoring systems have a limited capacity to infer health from behavioral data, but leave to humans (adult children of elderly parents, for example) the job of interpreting the patterns. Grandma may be watching a lot of TV, but if there's a Cary Grant retrospective or she just got a DVD player, that's not a problem. If she's never liked TV but suddenly has it on for hours on end, that's cause for worry. Humans are much better equipped than computers to make such fine judgments.
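That division of labor can be sketched in a few lines. This is a hypothetical illustration, not any real product's algorithm—the function name, the seven-day baseline, and the deviation threshold are all assumptions made for the example. The system only flags the statistical oddity; deciding whether it's a retrospective, a new DVD player, or a real problem stays with the family.

```python
# A toy anomaly flag: compare today's TV hours against a recent baseline
# and flag sharp deviations. Interpretation is deliberately left to humans.

from statistics import mean, stdev

def flag_unusual(history_hours: list[float], today: float,
                 threshold: float = 3.0) -> bool:
    """Return True if today's viewing deviates sharply from the baseline."""
    if len(history_hours) < 7:
        return False  # too little baseline data to judge at all
    mu = mean(history_hours)
    sigma = max(stdev(history_hours), 0.5)  # floor avoids hair-trigger alerts
    return abs(today - mu) > threshold * sigma

week = [1.0, 1.5, 1.0, 2.0, 1.0, 1.5, 1.0]
print(flag_unusual(week, 8.0))  # sudden all-day viewing: flagged
print(flag_unusual(week, 1.5))  # an ordinary evening: not flagged
```

Note what the code doesn't do: it makes no claim about Grandma's health. It just surfaces a pattern worth a phone call, which is exactly the "embed the humans in the system" strategy described above.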

Such systems have the added virtue of reinforcing social connections and family ties, rather than undermining or replacing them. Further, they recognize that information has always had a physical dimension, and has always been a social thing. We're about to reach a point where computers can work with those facts, not against them. Information technologies, in other words, are going to be more like information itself.