Reprinted from Personal Computing, issue 4/90, pp. 37-38.
In computing today, the interface is ever with us.
You’ll read my colleague John Blackford’s assessment of the
significance of an emergent, dominant Unix interface for personal computers in
this issue’s Technology column. In the Special Report, you’ll
also read about certain “executive
interface” issues that govern a class of systems used to keep top
management’s finger on the pulse of far-flung enterprises. Among them
is the insistence by executive users that personal computers must present information
in graphic form, to the specific extent that “problem areas should
glow or blink, in red.”
No problem. Software companies with the bucks have no trouble hiring brilliant
people to design any kind of screen you want. But in a way, the ability of
developers to accommodate this kind of demand and then to celebrate the fact that
they did it doesn’t represent a progressive triumph of the computing arts –
it’s a kind of stagnation.
You need to know where I’m coming from here: I like graphical environments;
I’m probably in Microsoft Windows half the time my personal computer is
running. From time to time, however, there’s this little nagging feeling
that somehow I have actually moved further away from what it was that pulled me
into a working symbiosis with this box of circuits in the first place. The other
day, I realized this occasional empty feeling was exactly like the delayed letdown
I experienced when I was a kid watching television one day and realized I missed radio.
Radio, in its full spectrum heyday of variety, drama, and world events, was
like the DOS prompt in that it asked you to come in and meet it halfway – the
understanding was that the really important stuff was going to happen right in your
head. Television, whose computing analog is the graphical interface, configures your head
into more of a dumb terminal. Whether you’re watching TV, doing Windows, or
communing with your Macintosh, the subliminal refrain, in the lexicon of Oz,
is: “Pay no attention to the man behind the curtain...”
Permit one further stretch of multimedia analogy: The first time I used a cassette
recorder in an interview was the worst job of reporting I ever did. I asked no
questions; there was no need to draw anything out, I reckoned, since through this
small miracle of technology I was getting it all. On playback, however, it
was all gross double-speak. Conclusion: The act of writing down notes had
become, through training and habit, the self-induced graphical interface through
which whatever analysis I did of a speaker’s veracity, content, and
intentions occurred. So to effectively use a tape recorder, paradoxically, you
have to listen harder.
What I have been postulating, then, while staring through the Window, is that
five years, a hundred seminars, and thousands of vapor trails later, this
intense collective obsession with The Interface may have made computing easier,
but it has not necessarily made it better.
In fact it hasn’t always made it as easy as it could: I look at –
play with – a lot of software programs, maybe four or five new ones a
month. There’s only one (a scheduling program) I’ve seen (and now
use) that lets me type “NTH” and “4D” and has
no trouble interpreting those, respectively, in the context of its application,
as “next Thursday” and “four days.” This is, of
course, a text-based DOS program, written by somebody so out of it
he or she doesn’t realize the QWERTY keyboard tops the 20th Century’s
junkpile.
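
A minimal sketch of that sort of shorthand interpretation, purely hypothetical since the program goes unnamed here, might look something like this in Python:

    # Hypothetical illustration only: the scheduling program described above is
    # not named, so the abbreviations and rules here are assumptions, not its
    # actual logic.
    import datetime

    WEEKDAYS = {"M": 0, "TU": 1, "W": 2, "TH": 3, "F": 4, "SA": 5, "SU": 6}

    def parse_shorthand(token, today=None):
        """Interpret tokens like 'NTH' (next Thursday) or '4D' (four days out)."""
        today = today or datetime.date.today()
        token = token.upper()
        if token.startswith("N") and token[1:] in WEEKDAYS:
            # "Next <weekday>": the coming occurrence of that weekday,
            # at least one day ahead of today.
            days_ahead = (WEEKDAYS[token[1:]] - today.weekday() - 1) % 7 + 1
            return today + datetime.timedelta(days=days_ahead)
        if token.endswith("D") and token[:-1].isdigit():
            # "<n>D": that many days from today.
            return today + datetime.timedelta(days=int(token[:-1]))
        raise ValueError("unrecognized shorthand: " + token)

    # On a Monday, parse_shorthand("NTH") returns that week's Thursday,
    # and parse_shorthand("4D") returns the Friday four days out.
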
The fastest, most efficient way to commune with a machine never had a
chance in the Age of The Interface. Try this fictional scenario: Instead of
waiting until the 8th Century to decide that language has to be written, and
borrowing Chinese characters in the most grotesque misapplication of graphic
interfaces in history, the Japanese pick up the Roman alphabet from Rome. No
question they beat us to the moon after, most probably, MacArthur agrees to
terms on the deck of the Yamato. In that kind of a Japan, of course, it would
be a simple matter of decree that all students spend a few months in typing
class to sustain the population’s national keyboard average of 45 wpm. Just
as it would be a simple matter in U.S. schools today. Simple, but impossible.
But if it took personal computing only a decade to junk the QWERTY interface,
perhaps others will be even more transitory. David Liddle, chairman of Metaphor
Computer Systems, a company that has certainly upheld its end of the interface
obsession, observes: “One should not confuse product identity
with usability. For example, there are companies right now fighting in court over
features that are demonstrably bad, in usability terms, but important from a
product identity point of view.”
This past fall, Liddle told the following story at a conference, and with it,
admittedly out of his context, I rest my case:
“The real danger in confusing computing abstractions with the
abstractions of users’ jobs is illustrated in the story of the Caribou
Eskimo. [They] were tremendously efficient bow and arrow hunters; then came the
Hudson’s Bay Company. It gave them rifles, which made them much better
hunters. Eventually, the Hudson’s Bay Company left. The Caribou Eskimo were
no longer good at listening, smelling, tracking, and getting close enough
to something to hit it with a bow and arrow.
“They lost all the abstractions of their jobs as Eskimo hunters, immersed
as they were in the user interface abstractions associated with pointing rifles... there
was no one living among [them] who knew how to hunt with a bow and arrow. Luckily,
some anthropologists had filmed and interviewed some tribesmen 50 years
earlier; so, a team went up and retrained the Caribou Eskimos in their old ways.”
by Robin Nelson