129  Cultural frames: See Roger Schank and Robert B. Abelson's Scripts, Plans, Goals, and Understanding (1977) or Erving Goffman's classic and extremely influential books The Presentation of Self in Everyday Life (1959) and Frame Analysis (1974). I recommend Presentation as the most relevant (and easiest to read) of his works.
129  Violating social conventions: “Try violating cultural norms and see how uncomfortable that makes you and the other people.” Jan Chipchase and Simon Steinhardt's Hidden in Plain Sight provides many examples of how design researchers can deliberately violate social conventions so as to understand how a culture works. Chipchase reports an experiment in which able-bodied young people request that seated subway passengers give up their seat to them. The experimenters were surprised by two things. First, a large proportion of people obeyed. Second, the people most affected were the experimenters themselves: they had to force themselves to make the requests and then felt bad about it for a long time afterward. A deliberate violation of social constraints can be uncomfortable for both the violator and the violated (Chipchase & Steinhardt, 2013).
137  Light switch panel: For the construction of my home light switch panel, I relied heavily on the electrical and mechanical ingenuity of Dave Wargo, who actually did the design, construction, and installation of the switches.
156  Natural sounds: Bill Gaver, now a prominent design researcher at Goldsmiths College, University of London (UK), first alerted me to the importance of natural sounds in his PhD dissertation and later publications (Gaver, W., 1997; Gaver, W. W., 1989). There has been considerable research on sound since those early days: see, for example, Gygi & Shafiro (2010).
160  Electric vehicles: The quotation from the US government rule on sounds for electric vehicles can be found on the Department of Transportation's website (2013).
CHAPTER FIVE: HUMAN ERROR? NO, BAD DESIGN
There has been a lot of work on the study of error, human reliability, and resilience. A good source, besides the items cited below, is the Wiki of Science article on human error (Wiki of Science, 2013). Also see the book Behind Human Error (Woods, Dekker, Cook, Johannesen, & Sarter, 2010).
Two of the most important workers in human error are British psychologist James Reason and Danish engineer Jens Rasmussen. Also see the books by the Swedish investigator Sidney Dekker, and MIT professor Nancy Leveson (Dekker, 2011, 2012, 2013; Leveson, N., 2012; Leveson, N. G., 1995; Rasmussen, Duncan, & Leplat, 1987; Rasmussen, Pejtersen, & Goodstein, 1994; Reason, J. T., 1990, 2008).
Unless otherwise noted, all the examples of slips in this chapter were collected by me, primarily from the errors of myself, my research associates, my colleagues, and my students. Everyone diligently recorded his or her slips, with the requirement that only the ones that had been immediately recorded would be added to the collection. Many were first published in Norman (1981).
165  F-22 crash: The analysis of the Air Force F-22 crash comes from a government report (Inspector General, United States Department of Defense, 2013). (This report also contains the original Air Force report as Appendix C.)
170  Slips and mistakes: The descriptions of skill-based, rule-based, and knowledge-based behavior are taken from Jens Rasmussen's paper on the topic (1983), which still stands as one of the best introductions. The classification of errors into slips and mistakes was done jointly by me and Reason. The classification of mistakes into rule-based and knowledge-based follows the work of Rasmussen (Rasmussen, Goodstein, Andersen, & Olsen, 1988; Rasmussen, Pejtersen, & Goodstein, 1994; Reason, J. T., 1990, 1997, 2008). Memory-lapse errors (both slips and mistakes) were not originally distinguished from other errors: they were put into separate categories later, but not in quite the same way I have done here.
172  “Gimli Glider”: The so-called Gimli Glider accident involved an Air Canada Boeing 767 that ran out of fuel and had to glide to a landing at Gimli, a decommissioned Canadian Air Force base. There were numerous mistakes: search for “Gimli Glider accident.” (I recommend the Wikipedia treatment.)
174  Capture error: The category “capture error” was invented by James Reason (1979).
178  Airbus: The difficulties with the Airbus and its modes are described by the Aviation Safety Network (1992) and Wikipedia contributors (2013a). For a disturbing description of another design problem with the Airbus, namely that the two pilots (the captain and the first officer) can both control the joysticks, but there is no feedback, so one pilot does not know what the other pilot is doing, see the article in the British newspaper The Telegraph (Ross & Tweedie, 2012).
181  The Kiss nightclub fire in Santa Maria, Brazil: It is described in numerous Brazilian and American newspapers (search the web for “Kiss nightclub fire”). I first learned about it from the New York Times (Romero, 2013).
186  Tenerife crash: My source for information about the Tenerife crash is a report by Roitsch, Babcock, and Edmunds issued by the American Airline Pilots Association (Roitsch, Babcock, & Edmunds, undated). It is perhaps not too surprising that it differs in interpretation from the Spanish government's report (Spanish Ministry of Transport and Communications, 1978), which in turn differs from the report by the Dutch Aircraft Accident Inquiry Board. A nice review of the 1977 Tenerife accident that shows its long-lasting importance was written in 2007 by Patrick Smith for the website Salon.com (Smith, 2007, Friday, April 6, 04:00 AM PDT).
188  Air Florida crash: The information and quotations about the Air Florida crash are from the report of the National Transportation Safety Board (1982). See also the two books entitled Pilot Error (Hurst, 1976; Hurst, R. & Hurst, L. R., 1982). The two books are quite different. The second is better than the first, in part because at the time the first book was written, not much scientific evidence was available.
190  Checklists in medicine: Duke University's examples of knowledge-based mistakes can be found at Duke University Medical Center (2013). An excellent summary of the use of checklists in medicine, and of the many social pressures that have slowed their adoption, is provided by Atul Gawande (2009).
192  Jidoka: The quotation from Toyota about Jidoka and the Toyota Production System comes from the automaker's website (Toyota Motor Europe Corporate Site, 2013). Poka-yoke is described in many books and websites. I found the two books written by or with the assistance of the originator, Shigeo Shingo, to provide a valuable perspective (Nikkan Kogyo Shimbun, 1988; Shingo, 1986).
193  Aviation safety: The website for NASA's Aviation Safety Reporting System provides details of the system, along with a history of its reports (NASA, 2013).
197  Hindsight: Baruch Fischhoff's study is called “Hindsight ≠ Foresight: The Effect of Outcome Knowledge on Judgment Under Uncertainty” (1975). And while you are at it, see his more recent work (Fischhoff, 2012; Fischhoff & Kadvany, 2011).
198  Designing for error: I discuss the idea of designing for error in a paper in Communications of the ACM, in which I analyze a number of the slips people make in using computer systems and suggest system design principles that might minimize those errors (Norman, 1983). This philosophy also pervades the book that our research team put together: User Centered System Design (Norman & Draper, 1986); two chapters are especially relevant to the discussions here: my “Cognitive Engineering” and the one I wrote with Clayton Lewis, “Designing for Error.”
200  Multitasking: There are many studies of the dangers and inefficiencies of multitasking. A partial review is given by Spink, Cole, & Waller (2008). David L. Strayer and his colleagues at the University of Utah have done numerous studies demonstrating rather severe impairment in driving behavior while using cell phones (Strayer & Drews, 2007; Strayer, Drews, & Crouch, 2006). Even pedestrians are distracted by cell phone usage, as demonstrated by a team of researchers from Western Washington University (Hyman, Boss, Wise, McKenzie, & Caggiano, 2010).
200  Unicycling clown: The clever study of the invisible clown riding a unicycle, “Did you see the unicycling clown? Inattentional blindness while walking and talking on a cell phone,” was done by Hyman, Boss, Wise, McKenzie, & Caggiano (2010).
208  Swiss cheese model: James Reason introduced his extremely influential Swiss cheese model in 1990 (Reason, J., 1990; Reason, J. T., 1997).
210  Hersman: Deborah Hersman's description of the design philosophy for aircraft comes from her talk on February 7, 2013, discussing the NTSB's attempts to understand the cause of the fires in the battery compartments of Boeing 787 aircraft. Although the fires caused airplanes to make emergency landings, no passengers or crew were injured: the multiple layers of redundant protection maintained safety. Nonetheless, the fires and resulting damage were unexpected and serious enough that all Boeing 787 aircraft were grounded until all parties involved had completed a thorough investigation of the causes of the incident and then gone through a new certification process with the Federal Aviation Administration (for the United States; through the corresponding agencies in other countries). Although this was expensive and a great inconvenience, it is an example of good proactive practice: take measures before accidents lead to injury and death (National Transportation Safety Board, 2013).
212  Resilience engineering: The excerpt from “Prologue: Resilience Engineering Concepts,” in the book Resilience Engineering, is reprinted by permission of the publishers (Hollnagel, Woods, & Leveson, 2006).
213  Automation: Much of my research and writing has addressed issues of automation. An early paper, “Coffee Cups in the Cockpit,” addresses this problem as well as the fact that when talking about incidents across a large country, or that occur worldwide, a “one-in-a-million chance” is not good enough odds (Norman, 1992). My book The Design of Future Things deals extensively with this issue (Norman, 2007).
214  Royal Majesty accident: An excellent analysis of the mode error accident with the cruise ship Royal Majesty is contained in Asaf Degani's book on automation, Taming HAL: Designing Interfaces Beyond 2001 (Degani, 2004), as well as in the analyses by Lützhöft and Dekker and the official NTSB report (Lützhöft & Dekker, 2002; National Transportation Safety Board, 1997).
CHAPTER SIX: DESIGN THINKING
As pointed out in the “General Readings” section, a good introduction to design thinking is Change by Design by Tim Brown and Barry Katz (2009). Brown is CEO of IDEO; Katz is a professor at the California College of the Arts, a visiting professor at Stanford's d.school, and an IDEO Fellow. There are multiple Internet sources; I like designthinkingforeducators.com.
220  Double diverge-converge pattern: The double diverge-converge pattern was first introduced in 2005 by the British Design Council, which called it the “Double-Diamond Design Process Model” (Design Council, 2005).
221  HCD process: The human-centered design process has many variants, each similar in spirit but different in the details. A nice summary of the method I describe is provided by the HCD book and toolkit from the design firm IDEO (IDEO, 2013).
227  Prototyping: For prototyping, see Buxton's book and handbook on sketching (Buxton, 2007; Greenberg, Carpendale, Marquardt, & Buxton, 2012). There are multiple methods used by designers to understand the nature of the problem and come to a potential solution; Vijay Kumar's 101 Design Methods (2013) doesn't even cover them all. Kumar's book is an excellent treatment of design research methods, but its focus is on innovation, not the production of products, so it does not cover the actual development cycle. Physical prototypes, their tests, and iterations are outside its domain, as are the practical concerns of the marketplace, the topic of the last part of this chapter and all of Chapter 7.
227  Wizard of Oz technique: The Wizard of Oz technique is named after L. Frank Baum's book The Wonderful Wizard of Oz (Baum & Denslow, 1900). My use of the technique is described in the resulting paper from the group headed by artificial intelligence researcher Danny Bobrow at what was then called the Xerox Palo Alto Research Center (Bobrow et al., 1977). The “graduate student” sitting in the other room was Allen Munro, who then went on to a distinguished research career.