Tuesday, November 28, 2006

The Future of Human-Computer Interaction

Personal computing launched with the IBM PC. But popular computing - computing for the masses - launched with the modern WIMP (windows, icons, menus, pointer) interface, which made computers usable by ordinary people. As popular computing has grown, so has the role of HCI (human-computer interaction). Most software today is interactive, and interface-related code accounts for more than half of all code. HCI also plays a key role in application design. In a consumer market, a product's success depends on each user's experience with it. Unfortunately, great engineering on the back end can be undone by a poor interface, while a good UI can carry a product in spite of weaknesses inside.

More importantly, however, it's not a good idea to separate "the interface" from the rest of the product, since the customer sees the product as one system. Designing "from the interface in" is the state of the art today. So HCI has expanded to encompass "user-centered design," which includes everything from needs analysis, concept development, prototyping, and design evolution to support and field evaluation after the product ships. That's not to say that HCI swallows up all of software engineering. But the methods of user-centered design - contextual inquiry, ethnography, qualitative and quantitative evaluation of user behavior - are quite different from those for the rest of computer engineering. So it's important to have someone with those skills involved in all phases of a product's development.

In spite of their unfamiliar content and methods, HCI courses are in strong demand in university programs and should be part of the core curriculum. At a recent industry advisory board meeting for U.C. Berkeley's computer science division, our industry experts unanimously cited HCI as the most important priority for future research and teaching. Ease of use remains a barrier to growth and success in IT, even in today's business markets. And it is surely the major challenge for emerging markets such as smart phones, home media appliances, medical devices, and automotive interfaces.

Before we explore the future of HCI, it's important to review some key lessons from the past. Many core ideas in HCI trace back to Vannevar Bush's "memex" paper ("As We May Think," Atlantic Monthly, July 1945), the vision of networked IT that J. C. R. Licklider pursued as director of ARPA's Information Processing Techniques Office in the 1960s, and Douglas Engelbart's amazing NLS (oN-Line System) demonstration at the Fall Joint Computer Conference in San Francisco in December 1968. While acknowledging these pioneers, we're going to jump straight to the "modern era" of HCI, which led directly to popular computing. The incubator for this was, not surprisingly, Xerox PARC (Palo Alto Research Center).

In 1970, Alan Kay arrived at the just-formed Xerox PARC inspired by his vision of a laptop computer for ordinary users. Back then, the personal computer was a dream shared by a few wild souls. There were a handful of minicomputers (e.g., the PDP-11 appeared in 1970), but those machines were, of course, for engineers and scientists. Kay and other PARC engineers (including Butler Lampson and Chuck Thacker) started developing computers with the extraordinary idea of giving them to ordinary people. Kay was also working on Smalltalk (a language for kids), which led to Smalltalk-72 soon after. His laptop-style Dynabook was infeasible in the 1970s, but the group did produce the Xerox Alto desktop computer in 1973. The Alto had a mouse, Ethernet, and an overlapping-window display. It was a technical marvel, but not necessarily easy to use. It had mouse functionality, but it was mostly a "text-oriented" machine. It also lacked a killer app (lesson 1). And while the Alto was developed for ordinary users, it was not clear at the time what that market really looked like (lesson 2). Most Altos appear to have been sold or given away to engineering labs.

In 1976, Don Massaro of Xerox's office products division pushed ahead with a personal computer for office environments called the Star. A separate development division, headed by David Liddle, was created for the Star; it worked closely with PARC but was not part of PARC. The Star is rightfully cited as the first "modern" WIMP computer. It's impossible to look at screenshots, or to actually use a machine (which I was able to do at a retrospective event at Interval Research), without being struck by how good it is compared with what came after. Liddle quipped that the Star was "a huge improvement over its successors." It's not just its execution of the WIMP interface and desktop metaphor, but its remarkably clean and consistent "object-orientedness" - today's right-button menus, controls, and embeddable objects are a rather clumsy echo of the Star's design.

The most remarkable aspect of the Star, however, is the process its designers used to develop it, a process that has been widely imitated and that made good interface design reproducible. Liddle's first step was to review existing development processes with the help of PARC researchers and produce a best-practices document that the Star team would follow. It included task analysis, scenario development, rapid prototyping, and users' conceptual models. Much of the design evolution happened before any code was written. Code development itself proceeded in many small steps with frequent user testing. It was a textbook example of user-centered design (it appears, in fact, in Terry Winograd's landmark 1996 book, Bringing Design to Software).

The Alto, by contrast, had followed a much more classical design process. That was enough to put the Alto in the right ballpark, but the machine feels like it's from a completely different era. The Star knew what it was trying to be, and it included a good suite of office software. For reasons that almost surely had nothing to do with its interface or application design, it failed in the marketplace; its close reincarnation, the Macintosh, was a huge success. So (lesson 3) good mass-market design requires a user-centered design process, and it often involves real social scientists or usability experts as well as engineers.

The Star design was so good that HCI researchers regularly bear the brunt of "Star backlash." It goes something like this: "HCI hasn't produced major innovations in the last 20 years; the WIMP interface today is almost identical to what it was in the 1980s." In many of the "technical arts," that would be a compliment. In computing, we put 20-year-old artifacts in museums and call them "dinosaurs." But it's wrong to apply that thinking to HCI. Humans are the key element in human-computer interaction. As a species, we don't evolve that fast, and we often take years to learn things well. We have interface conventions in automobiles as well (clockwise means turn right; you drive on the right, and so will I). It's just not good to "innovate" with those. For the time being, we can't "reflash" people with an upgrade, so let's not go there. The amazing thing (lesson 4) is that when you execute the human-centered design process well (in a real usage context, as the Star designers did), you get a design that endures for decades. Multiple generations can learn it and become computer-empowered without worrying about losing that skill later.

For the same reason, when you design something new, it's much better to copy every well-known convention you can find than to make up a new one. As Picasso said, "Good artists borrow from the work of others, great artists steal." So (lesson 5) good HCI design is evolutionary rather than revolutionary.

Finally, there is an overall lesson (number 6) to take away from these two systems. The modern popular computer required two kinds of innovation: freewheeling, vision-driven engineering, often technology-centered but ideally informed by high-level principles of human behavior (the Alto); and careful, context-driven, human-centered design evolution (the Star). That's a critical point. You need truly creative design and engineering to conceive and execute a radically new idea, but innovation also requires validation, and in HCI validation means that the design works well with real users. That, in turn, demands human-centered design evolution. Innovation in a product is a nice virtue, but as far as marketability goes it's optional. Usability is not.