A GRANDER GOAL: A THOUSAND-FOLD INCREASE IN HUMAN CAPABILITIES

Ben Shneiderman Director, Human-Computer Interaction Lab
Professor, Computer Science
University of Maryland, College Park, MD 20742
301-405-2680 ben@cs.umd.edu http://www.cs.umd.edu/projects/hcil
Educom Review, 32, 6 (Nov/Dec 1997), 4-10.
Copyright 1997 EDUCOM, Reprinted with permission.
 
 

The public furor over chessmaster Garry Kasparov's loss to IBM's Deep Blue computer revealed the shallow perception that some people have of the differences between people and computers. Their false assumptions would be merely amusing if it were not for the prevalence and influence of such notions. The astounding and mind-opening reality is simple: people are not computers and computers are not people.

The 1946 press conference presenting the ENIAC computer launched the metaphor of the giant electronic brain, and many people have maintained that simplistic concept. While it may be useful as inspiration for some new ideas, it is startling to see how long some people have held on to the childish Mimicry Game, when there are grander goals to pursue.

The Mimicry Game -- construction of computers to carry out human tasks or behave like humans -- was long ago shown to be misleading, except for crash-test dummies and Disney's audio-animatronic figures. Lewis Mumford, in his classic 1934 book *Technics and Civilization*, described the "obstacle of animism" and pointed out that no mature technology resembles human form. Cars do not run with feet, tape recorders do not speak with an artificial larynx, and chess programs do not think with a brain. Deep Blue's programmers deserve credit for their accomplishment, but the computer is merely a tool with no more intelligence than a wooden pencil.

The circus-like match between Kasparov and Deep Blue was great entertainment for those who like contests. It has as much scientific importance as P. T. Barnum's bearded lady. The chess-playing hardware extensions and software are so specialized that they seem unlikely to influence other applications. In fact, IBM's Web site avoids mentioning such spin-offs and deflects the reader by pointing out that Deep Blue's standard hardware is also employed for other useful tasks. No spin-off, just spin.

Some scientists, journalists and other industry experts have viewed the Mimicry Game as the ultimate goal of computer science, the Holy Grail. This initially compelling dream draws people in, like a bright spotlight so alluring that they miss grander goals. We now smile at the quaint but failed designs: five-fingered robot arms that swiveled only 270 degrees, talking cars or soda machines, and more recent disasters such as Postal Buddy (a $1 billion anthropomorphic postal machine) or Microsoft's BOB (a "social interface" with 14 cute characters to guide users).

The reality is that users don't want an electronic buddy or chatty bank machine. On the first encounter such a machine seems cute, the second time silly, and the third an annoying distraction from the task. Users want powerful tools that amplify their capabilities and enable them to do their tasks. A grander goal for computer scientists and technology designers is to give users a thousand-fold increase in their capabilities.

Beyond the Mimicry Game

Bulldozers amplify our abilities to do physical work, and airplanes enable us to travel much faster and higher than the birds. Calculators are useful because they enable us to find square roots or to do other chores a thousand times faster and more accurately than on our own. The World Wide Web, digital libraries, and newsgroups are attractive because they enable users to find information and communicate with people a thousand times faster than other media. Desktop publishing, medical imaging, and computer-assisted design software are used because they give users almost supernatural powers.

Computer science's goal should not be to create medical diagnosis programs that perform as well as the best doctor -- that is the Mimicry Game. A grander goal is to enable the average physician to perform diagnoses far better than even the best unaided physician. This can be accomplished by creating comprehensive clinical databases of patient histories, validated simulation models of disease patterns, and groupware to allow easy consultation. If you are brought to an emergency room anywhere in the world, your medical records should be available within 15 seconds.

Races between horses and iron horses or between people and computers are fine entertainment, but the thrill of victory and the satisfaction of accomplishing meaningful tasks are uniquely human.

Designing for the Future

Many designers are so entranced by the Mimicry Game that they find it hard to break the spell. They cannot see past the alluring *Star Trek* scenarios or the compelling fantasy of HAL in Arthur C. Clarke's *2001*. While designers may accept notions of empowering users, they need more than encouragement to apply tool-like strategies. Designers need a fresh philosophical foundation, novel design patterns, and specific metaphors to help them create the interfaces and visualizations that will be compelling to users.

A philosophical foundation might be fashioned around the cognitive objectives of comprehensible, predictable and controllable interfaces and the affective goals of mastery, satisfaction and responsibility. I believe that users want comprehensible systems in which they understand the features and can readily take action to complete their tasks. The interfaces should be predictable, so that users know what to expect when they press Return, and controllable so that they can realize their intentions, monitor their progress, and recover from errors. These cognitive attributes lead to a sense of mastery over the system and pride in accomplishment. Then users can feel responsible for their actions. Such responsibility is necessary in life-critical applications such as air traffic control, and highly valued in common office or educational systems.

The design patterns that arise from this philosophical foundation include the direct manipulation style and visual information seeking. The 15-year-old description of direct manipulation is still valid (a brief code sketch after the list illustrates the reversible-actions idea):

Visual presentation of the world of action: show users the available objects and actions

Rapid, incremental, and reversible actions with 100-millisecond updates

Selection by pointing, not typing

Continuous visual display of status
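
To make these principles concrete, here is a minimal sketch in Python of the incremental, reversible-actions idea: each user action is a small object that knows how to undo itself, and the visible status is refreshed after every step. The Document, InsertText, and Editor classes are invented for this illustration and do not come from any particular system.

    # Minimal sketch of rapid, incremental, reversible actions (hypothetical classes).
    class Document:
        def __init__(self):
            self.text = ""

    class InsertText:
        """One small, reversible user action."""
        def __init__(self, position, fragment):
            self.position = position
            self.fragment = fragment

        def apply(self, doc):
            doc.text = doc.text[:self.position] + self.fragment + doc.text[self.position:]

        def undo(self, doc):
            end = self.position + len(self.fragment)
            doc.text = doc.text[:self.position] + doc.text[end:]

    class Editor:
        def __init__(self):
            self.doc = Document()
            self.history = []            # completed actions, newest last

        def do(self, action):
            action.apply(self.doc)       # rapid, incremental change
            self.history.append(action)
            self.refresh()               # continuous visual display of status

        def undo(self):
            if self.history:             # every action is reversible
                self.history.pop().undo(self.doc)
                self.refresh()

        def refresh(self):
            print(f"[{len(self.history)} actions] {self.doc.text!r}")

    editor = Editor()
    editor.do(InsertText(0, "Hello"))
    editor.do(InsertText(5, ", world"))
    editor.undo()                        # back to 'Hello'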

These principles are embodied in the Macintosh and later graphical user interfaces, air traffic control systems, video games, and other successful products. The Visual Information Seeking mantra extends these principles to database browsing:

Overview first, zoom and filter, then details-on-demand

In several projects over the past six years (www.cs.umd.edu/projects/hcil) we had to relearn this principle, so I wrote it down to solidify it in my mind. Users should be able to see the full database, the table of contents, or the course outline to orient themselves first. Then they can zoom in on what they want, filter out what they don't want, and navigate to get the specific details.
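
As a small illustration of the mantra, the following Python sketch walks a toy list of film records through the three stages: an overview of the whole collection, range and category filters under the user's control, and full details only on request. The records, field names, and the filter_films function are hypothetical, not taken from FilmFinder itself.

    # Toy film records, invented for illustration.
    films = [
        {"title": "Film A", "year": 1984, "length": 96,  "director": "Director X"},
        {"title": "Film B", "year": 1991, "length": 142, "director": "Director Y"},
        {"title": "Film C", "year": 1995, "length": 88,  "director": "Director X"},
    ]

    # Overview first: show the whole collection at a glance.
    years = [f["year"] for f in films]
    print(f"{len(films)} films, {min(years)}-{max(years)}")

    # Zoom and filter: narrow the view with ranges and categories the user controls.
    def filter_films(records, year_range=(0, 9999), max_length=9999, director=None):
        return [f for f in records
                if year_range[0] <= f["year"] <= year_range[1]
                and f["length"] <= max_length
                and (director is None or f["director"] == director)]

    subset = filter_films(films, year_range=(1990, 1999), max_length=120)

    # Details-on-demand: show the full record only when the user asks for it.
    for film in subset:
        print(film)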

New Metaphors

New metaphors are emerging, especially from information visualization research. The starfield display is one successful approach: a two-dimensional scattergram showing thousands of color-, size-, and shape-coded data points, with zooming navigation and sliders to filter out unwanted data points. It originated in the Human-Computer Interaction Lab's HomeFinder and FilmFinder (Figure 1) projects and has been commercialized as Spotfire (www.ivee.com). For tree-structured data, traditional node-and-link diagrams can now be shown with zooming and filtering, or the novel treemap with nested color- and size-coded rectangles can reveal patterns or outliers. For temporal data, Xerox PARC's perspective wall, Yale computer science professor David Gelernter's LifeStreams, or our LifeLines can be used to tell life histories or show medical records.
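
For readers who want to see the treemap idea in code, here is a minimal Python sketch of a slice-and-dice layout, under the assumption that the hierarchy is given as nested lists of leaf sizes: rectangle areas are proportional to leaf sizes, and the split direction alternates at each level. The toy tree and helper functions are invented for illustration; real treemaps add color and size coding, labels, and interaction.

    def leaf_sum(node):
        """Total size of a leaf (a number) or a subtree (a list)."""
        if isinstance(node, (int, float)):
            return node
        return sum(leaf_sum(child) for child in node)

    def treemap(node, x, y, w, h, depth=0, rects=None):
        """Lay out a nested-list hierarchy as rectangles whose areas track leaf sizes."""
        if rects is None:
            rects = []
        if isinstance(node, (int, float)):          # a leaf: emit its rectangle
            rects.append((x, y, w, h, node))
            return rects
        total = leaf_sum(node)
        offset = 0.0
        for child in node:
            share = leaf_sum(child) / total
            if depth % 2 == 0:                      # even depth: place children side by side (split the width)
                treemap(child, x + offset * w, y, w * share, h, depth + 1, rects)
            else:                                   # odd depth: stack children (split the height)
                treemap(child, x, y + offset * h, w, h * share, depth + 1, rects)
            offset += share
        return rects

    # A toy hierarchy: one branch with leaves of size 6 and 2, another with a single leaf of size 4.
    tree = [[6, 2], [4]]
    for rect in treemap(tree, 0, 0, 100, 100):
        print(rect)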

The excitement of discovery is substantial in many research labs, but there is turbulence as designers break free from older metaphors. Paradigm shifts have strong undertows that generate resistance to change. However, the goal of improving human capabilities a thousand-fold will bring a tidal wave of creative technologies. Since these innovations can be dangerous in many ways, we will also need to teach and interpret Lewis Mumford's wisdom, reminding these new creators that the purpose of technology is to "serve human needs."

Figure 1: The FilmFinder prototype enabled users to get an overview first, and then focus on what they wanted using the zoombars on the left and bottom. They could filter using the alphasliders to select actors or directors, or the double-box slider to limit the length of films. Category buttons on the bottom allowed further filtering of the color-coded data points. Finally, users could click on any data point to get a pop-up information box that offered details-on-demand. (C. Ahlberg & B. Shneiderman, ACM CHI '94 Conference Proceedings)

ftp://ftp.cs.umd.edu/pub/hcil/Screen-dumps/Film/film-alldots.gif

ftp://ftp.cs.umd.edu/pub/hcil/Screen-dumps/Film/film-michele.gif

------------------

Ben Shneiderman is professor of computer science and director of the Human-Computer Interaction Laboratory at the University of Maryland, College Park. He is the author of the recently published third edition of *Designing the User Interface: Strategies for Effective Human-Computer Interaction* (Addison Wesley Longman Publishers, 1998). ben@cs.umd.edu