But that is not where I'm going with this. The situation I describe, here, could be bad, but doesn't have to be bad and isn't necessarily bad now: It simply is the case that we are way too busy, nowadays, to comprehend everything in detail. And it's better to comprehend it dimly, through an interface, than not at all. Better for ten million Eloi to go on the Kilimanjaro Safari at Disney World than for a thousand cardiovascular surgeons and mutual fund managers to go on "real" ones in Kenya.

The boundary between these two classes is more porous than I've made it sound. I'm always running into regular dudes--construction workers, auto mechanics, taxi drivers, galoots in general--who were largely aliterate until something made it necessary for them to become readers and start actually thinking about things. Perhaps they had to come to grips with alcoholism, perhaps they got sent to jail, or came down with a disease, or suffered a crisis in religious faith, or simply got bored. Such people can get up to speed on particular subjects quite rapidly. Sometimes their lack of a broad education makes them over-apt to go off on intellectual wild goose chases, but, hey, at least a wild goose chase gives you some exercise.

The spectre of a polity controlled by the fads and whims of voters who actually believe that there are significant differences between Bud Lite and Miller Lite, and who think that professional wrestling is for real, is naturally alarming to people who don't. But then countries controlled via the command-line interface, as it were, by double-domed intellectuals, be they religious or secular, are generally miserable places to live. Sophisticated people deride Disneyesque entertainments as pat and saccharine, but, hey, if the result of that is to instill basically warm and sympathetic reflexes, at a preverbal level, into hundreds of millions of unlettered media-steepers, then how bad can it be? We killed a lobster in our kitchen last night and my daughter cried for an hour. The Japanese, who used to be just about the fiercest people on earth, have become infatuated with cuddly adorable cartoon characters. My own family--the people I know best--is divided about evenly between people who will probably read this essay and people who almost certainly won't, and I can't say for sure that one group is necessarily warmer, happier, or better-adjusted than the other.

MORLOCKS AND ELOI AT THE KEYBOARD

Back in the days of the command-line interface, users were all Morlocks who had to convert their thoughts into alphanumeric symbols and type them in, a grindingly tedious process that stripped away all ambiguity, laid bare all hidden assumptions, and cruelly punished laziness and imprecision. Then the interface-makers went to work on their GUIs, and introduced a new semiotic layer between people and machines. People who use such systems have abdicated the responsibility, and surrendered the power, of sending bits directly to the chip that's doing the arithmetic, and handed that responsibility and power over to the OS. This is tempting because giving clear instructions, to anyone or anything, is difficult. We cannot do it without thinking, and depending on the complexity of the situation, we may have to think hard about abstract things, and consider any number of ramifications, in order to do a good job of it. For most of us, this is hard work. We want things to be easier. How badly we want it can be measured by the size of Bill Gates's fortune.

The OS has (therefore) become a sort of intellectual labor-saving device that tries to translate humans' vaguely expressed intentions into bits. In effect we are asking our computers to shoulder responsibilities that have always been considered the province of human beings--we want them to understand our desires, to anticipate our needs, to foresee consequences, to make connections, to handle routine chores without being asked, to remind us of what we ought to be reminded of while filtering out noise.

At the upper (which is to say, closer to the user) levels, this is done through a set of conventions--menus, buttons, and so on. These work in the sense that analogies work: they help Eloi understand abstract or unfamiliar concepts by likening them to something known. But the loftier word "metaphor" is used.

The overarching concept of the MacOS was the "desktop metaphor" and it subsumed any number of lesser (and frequently conflicting, or at least mixed) metaphors. Under a GUI, a file (frequently called "document") is metaphrased as a window on the screen (which is called a "desktop"). The window is almost always too small to contain the document and so you "move around," or, more pretentiously, "navigate" in the document by "clicking and dragging" the "thumb" on the "scroll bar." When you "type" (using a keyboard) or "draw" (using a "mouse") into the "window" or use pull-down "menus" and "dialog boxes" to manipulate its contents, the results of your labors get stored (at least in theory) in a "file," and later you can pull the same information back up into another "window." When you don't want it anymore, you "drag" it into the "trash."
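(Editorial aside: if you want to see that pile of metaphors spelled out literally, here is a minimal sketch in Python's standard tkinter toolkit--not anything from the essay itself, and every name in it, from the window title to whatever filename the dialog hands back, is purely illustrative.)

    import tkinter as tk
    from tkinter import filedialog

    root = tk.Tk()
    root.title("Untitled document")        # the "window" standing in for the "document"

    text = tk.Text(root, wrap="word")      # where you "type" into the "window"
    scroll = tk.Scrollbar(root, command=text.yview)   # the "scroll bar" and its "thumb"
    text.configure(yscrollcommand=scroll.set)
    scroll.pack(side="right", fill="y")
    text.pack(side="left", fill="both", expand=True)

    def save():
        # "Save": whatever happens to be in the window right now gets written
        # into a "file" under whatever name the "dialog box" produces.
        path = filedialog.asksaveasfilename(defaultextension=".txt")
        if path:
            with open(path, "w") as f:
                f.write(text.get("1.0", "end-1c"))

    menubar = tk.Menu(root)                # the pull-down "menus"
    filemenu = tk.Menu(menubar, tearoff=0)
    filemenu.add_command(label="Save", command=save)
    menubar.add_cascade(label="File", menu=filemenu)
    root.config(menu=menubar)

    root.mainloop()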

There is massively promiscuous metaphor-mixing going on here, and I could deconstruct it 'til the cows come home, but I won't. Consider only one word: "document." When we document something in the real world, we make fixed, permanent, immutable records of it. But computer documents are volatile, ephemeral constellations of data. Sometimes (as when you've just opened or saved them) the document as portrayed in the window is identical to what is stored, under the same name, in a file on the disk, but other times (as when you have made changes without saving them) it is completely different. In any case, every time you hit "Save" you annihilate the previous version of the "document" and replace it with whatever happens to be in the window at the moment. So even the word "save" is being used in a sense that is grotesquely misleading--"destroy one version, save another" would be more accurate.
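(Editorial aside: the "destroy one version, save another" point can be seen in its barest form in a few lines of Python--the filename and contents below are invented for illustration. Opening a file for writing truncates whatever was stored under that name before a single new byte goes in.)

    path = "chapter1.txt"                  # an invented filename

    with open(path, "w") as f:             # the first "Save"
        f.write("The original version: hours of work.\n")

    with open(path, "w") as f:             # "Save" again: mode "w" truncates the file,
        f.write("Whatever is in the window now.\n")   # annihilating the previous version

    with open(path) as f:
        print(f.read())                    # only the second version survives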

Anyone who uses a word processor for very long inevitably has the experience of putting hours of work into a long document and then losing it because the computer crashes or the power goes out. Until the moment that it disappears from the screen, the document seems every bit as solid and real as if it had been typed out in ink on paper. But in the next moment, without warning, it is completely and irretrievably gone, as if it had never existed. The user is left with a feeling of disorientation (to say nothing of annoyance) stemming from a kind of metaphor shear--you realize that you've been living and thinking inside of a metaphor that is essentially bogus.

So GUIs use metaphors to make computing easier, but they are bad metaphors. Learning to use them is essentially a word game, a process of learning new definitions of words like "window" and "document" and "save" that are different from, and in many cases almost diametrically opposed to, the old. Somewhat improbably, this has worked very well, at least from a commercial standpoint, which is to say that Apple/Microsoft have made a lot of money off of it. All of the other modern operating systems have learned that in order to be accepted by users they must conceal their underlying gutwork beneath the same sort of spackle. This has some advantages: if you know how to use one GUI operating system, you can probably work out how to use any other in a few minutes. Everything works a little differently, like European plumbing--but with some fiddling around, you can type a memo or surf the web.

Most people who shop for OSes (if they bother to shop at all) are comparing not the underlying functions but the superficial look and feel. The average buyer of an OS is not really paying for, and is not especially interested in, the low-level code that allocates memory or writes bytes onto the disk. What we're really buying is a system of metaphors. And--much more important--what we're buying into is the underlying assumption that metaphors are a good way to deal with the world.

