
NASA Technical Reports Server (NTRS) 20130014064: Evidence Report: Risk of Inadequate Human-Computer Interaction


Evidence Report: Risk of Inadequate Human-Computer Interaction

Kritina Holden, Ph.D., Lockheed Martin, NASA Johnson Space Center
Neta Ezer, Ph.D., Futron Corporation, NASA Johnson Space Center
Gordon Vos, Ph.D., Wyle Life Sciences, NASA Johnson Space Center

Human Research Program
Space Human Factors and Habitability

Approved for Public Release: Month DD, YYYY

National Aeronautics and Space Administration
Lyndon B. Johnson Space Center
Houston, Texas

TABLE OF CONTENTS

I. RISK OF INADEQUATE HUMAN-COMPUTER INTERACTION
II. EXECUTIVE SUMMARY
III. INTRODUCTION
    A. Risk Statement
    B. Risk Overview
    C. Dependencies & Interrelationships with other Risks
    D. Levels of Evidence
IV. EVIDENCE
    A. Contributing Factor 1: Requirements, Policies, and Design Processes
    B. Contributing Factor 2: Informational Resources/Support
    C. Contributing Factor 3: Allocation of Attention
    D. Contributing Factor 4: Cognitive Overload
    E. Contributing Factor 5: Environmentally Induced Perceptual Changes
    F. Contributing Factor 6: Misperception/Misinterpretation of Displayed Information
    G. Contributing Factor 7: Spatial Disorientation
    H. Contributing Factor 8: Design of Displays and Controls
V. COMPUTER BASED MODELING AND SIMULATION
VI. RISK IN CONTEXT OF EXPLORATION MISSION OPERATIONAL SCENARIOS
VII. RESEARCH GAPS
VIII. CONCLUSION
IX. REFERENCES
X. TEAM
XI. LIST OF ACRONYMS

I. RISK OF INADEQUATE HUMAN-COMPUTER INTERACTION

The Risk of Inadequate Human-Computer Interaction is identified by the National Aeronautics and Space Administration (NASA) Human Research Program (HRP) as a recognized risk to human health and performance in space. The HRP Program Requirements Document defines these risks. This Evidence Report provides a summary of the evidence that has been used to identify and characterize this risk.

II. EXECUTIVE SUMMARY

Human-computer interaction (HCI) encompasses all the methods by which humans and computer-based systems communicate, share information, and accomplish tasks.
When HCI is poorly designed, crews have difficulty entering, navigating, accessing, and understanding information. HCI has rarely been studied in an operational spaceflight context, and detailed performance data that would support evaluation of HCI have not been collected; thus, we draw much of our evidence from post-spaceflight crew comments and from other safety-critical domains such as ground-based power plants and aviation. Additionally, there is a concern that any potential or real issues to date may have been masked by the fact that crews have had near-constant access to ground controllers, who monitor for errors, correct mistakes, and provide additional information needed to complete tasks. We do not know what types of HCI issues might arise without this “safety net”. Exploration missions will test this concern, as crews may be operating autonomously due to communication delays and blackouts. Crew survival will be heavily dependent on available electronic information for just-in-time training, procedure execution, and vehicle or system maintenance; hence the criticality of the Risk of Inadequate HCI.

Future work must focus on identifying the most important contributing risk factors, evaluating their contribution to the overall risk, and developing appropriate mitigations. The Risk of Inadequate HCI includes eight core contributing factors based on the Human Factors Analysis and Classification System (HFACS): 1) Requirements, policies, and design processes; 2) Information resources and support; 3) Allocation of attention; 4) Cognitive overload; 5) Environmentally induced perceptual changes; 6) Misperception and misinterpretation of displayed information; 7) Spatial disorientation; and 8) Displays and controls.

III. INTRODUCTION
A. Risk Statement

Given that HCI and information architecture (IA) designs must support crew tasks, and given the greater dependence on HCI in the context of long-duration spaceflight operations, there is a risk that critical information systems will not support crew tasks effectively, resulting in flight and ground crew errors and inefficiencies, failed mission and program objectives, and an increase in crew injuries.

B. Risk Overview

HCI is a discipline that studies and describes how humans and computer-based systems communicate, share information, and accomplish tasks. IA is the categorization of information into a coherent, intuitive, usable structure. When HCI or IA is poorly designed, crews have difficulty entering, navigating, accessing, and understanding information. Information is presented most effectively when the user’s interests, needs, and knowledge are considered in design. If information displays are not designed with a fully developed operations concept, a fine-grained task analysis, and knowledge of human information-processing capabilities and limitations, the format, mode, and layout of the information may not optimally support task performance. Users may then misinterpret, overlook, or ignore the original intent of the information, leading to task completion times that impact the mission timeline, necessitating costly replanning and rescheduling, and/or to task execution errors that endanger mission goals, crew safety, and mission success.

The communication delays expected on long-duration missions will likely result in much greater crew dependence on computer-provided information. Crews will have to rely solely on available electronic information for just-in-time training, task procedures, and maintenance more than ever before. The “safety net” of calling ground control for questions, workarounds, and forgotten procedural steps will no longer be as feasible, and in certain circumstances may not be available at all.
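The scale of these delays is set by light-travel time. The following small calculation, which is illustrative only and uses approximate minimum and maximum Earth-Mars distances, shows why real-time ground support cannot be assumed on a Mars mission:

```python
# Illustrative only: one-way and round-trip light-time delay for
# Earth-Mars communication at approximate nearest/farthest distances.
C_KM_S = 299_792.458  # speed of light, km/s

def light_delay_minutes(distance_km: float) -> float:
    """One-way signal travel time in minutes."""
    return distance_km / C_KM_S / 60.0

# Approximate Earth-Mars distances; actual values vary by opposition cycle.
NEAREST_KM = 54.6e6
FARTHEST_KM = 401.0e6

for label, d in (("nearest", NEAREST_KM), ("farthest", FARTHEST_KM)):
    one_way = light_delay_minutes(d)
    print(f"{label}: one-way ~{one_way:.1f} min, round trip ~{2 * one_way:.1f} min")
```

Even at closest approach, a question to ground control takes roughly six minutes to answer; near maximum separation, a single round trip approaches three quarters of an hour.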
Although much is known about designing systems that provide adequate HCI, exploration missions bring new challenges and risks. Whereas the space shuttle had hundreds of hard switches and buttons, exploration vehicles will feature primarily glass-based interfaces, requiring crew to rely on an input device to interact with software displays and controls (Ezer, 2011). Due to mass restrictions, the real estate for displayed information is likely to be limited, but the amount of information available for display will be greatly increased, posing challenges for information design and navigation schemes. Future vehicles will also fly many new technologies that must be usable with pressurized gloves, in microgravity, and under vibration.

Inadequate HCI can lead to a wide range of potential consequences. There is a significant risk of errors or failure of mission objectives when the crew cannot perform a task because they can neither see nor hear needed information, when wrong information is displayed, when data are unavailable, or when the presented information is confusing. These information-related impacts may be due to IA issues (the logical organization of information, including what data to display, that allows users to assimilate data quickly and easily) or to information presentation (the format in which information is displayed to the user). There are also risks associated with the design of human interfaces when crewmembers cannot reach controls or have difficulty manipulating them, or when controls have unexpected behaviors, are poorly labeled or confusing, or are not available when needed. Additional problems arise when there is improper function allocation between the human and the system, or when the means of interacting with the system is confusing, inefficient, or difficult to learn. These problems are exacerbated when procedures are poor, timelines are challenging, or environments are unpredictable or dynamic (e.g., lighting, vibration).
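Interaction inefficiency of this kind can be roughly quantified with the Keystroke-Level Model (KLM) of Card, Moran, and Newell, which predicts expert execution time by summing standard per-operator times. The sketch below is illustrative only: the operator sequences are hypothetical and are not drawn from any flight display.

```python
# A rough Keystroke-Level Model (KLM) sketch for estimating how interface
# "overhead" operators inflate task time. Operator times (seconds) are the
# commonly cited KLM averages.
KLM_SECONDS = {
    "K": 0.28,  # press a key or button
    "P": 1.10,  # point with a pointing device
    "H": 0.40,  # home hands between keyboard and pointing device
    "M": 1.35,  # mental preparation
}

def klm_time(sequence: str) -> float:
    """Total predicted execution time for a string of KLM operators."""
    return sum(KLM_SECONDS[op] for op in sequence)

# Hypothetical procedure step: think, type a 4-digit value, press Enter.
direct = "MKKKKK"
# Same step buried behind two navigation menus: point-and-click overhead.
with_overhead = "MHPKMHPK" + "MKKKKK"

print(f"direct: {klm_time(direct):.2f} s")
print(f"with navigation overhead: {klm_time(with_overhead):.2f} s")
```

In this toy example, placing one data-entry step behind two menu traversals more than triples its predicted time, which is precisely the kind of overhead discussed next.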
Poor HCI can also reduce efficiency and undermine the added value of computer functionality by imposing overhead tasks on the user. Overhead tasks arise when poor HCI design requires the user to expend cognitive resources on something other than the task at hand (e.g., navigating or managing the user interface, or reorganizing information before proceeding with the task). Overhead can also occur when the information presentation aspects of the interface are dissonant with the proper cognitive strategy for executing the task, when information must be integrated from multiple sources, or when controlling the interface demands significant attentional resources. Unfortunately, these sources of overhead are not easy to detect or control, and when left uncontrolled they pose a risk to users as well as to mission objectives. Usability is therefore inversely related to HCI task overhead, and designers should try to minimize overhead when possible (Zhang & Butler, 2007).

Consequences become potentially far more serious in dynamic flight phases such as launch, docking, and landing, when there is very little time available for correcting mistakes. As mission length increases and ground support decreases, the availability of ground-assisted workarounds will decrease and consequences will increase, up to a possible failure to achieve mission objectives or loss of mission/loss of crew.

C. Dependencies & Interrelationships with other Risks

The Risk of Inadequate HCI is highly related to several other Space Human Factors Engineering (SHFE) risks, including the Risk of Inadequate Critical Task Design (TASK), the Risk of Inadequate Design of Human and Automation / Robotic Integration (HARI), and the Risk of Performance Errors Due to Training Deficiencies (TRAIN).
In the HCI risk, emphasis is primarily on the structure of information, how it is presented to the user, and the methods by which the user interacts with the information. Allocation of attention, cognitive overload, environmentally induced perceptual changes, misperception or misinterpretation of displayed data, and spatial disorientation all fall under this risk. It is particularly characterized by its emphasis on the design of displays and controls.

The HARI risk focuses on issues specific to semi-autonomous systems: robotics and automation. Since interfaces with such systems are instances of HCI, they rely on HCI to address broad issues that are not specific to robotics and automation. HARI specifically addresses the assignment of human and automation resources (function allocation) and designs for automation, with emphasis on providing adequate system state information and on over-confidence in, or lack of trust of, automation. Its overall focus is on coordination of humans with robots and automation: the design, function, reliability, and use of robotic or automated systems.

The TASK risk is concerned with tasks, schedules, and procedures. Most tasks are performed using human-computer interfaces, so there is heavy interaction between the TASK and HCI risks. The emphasis in TASK is on factors related to the flow of the work: operational tempo and workload; procedural guidance; training for specific procedural knowledge; and reduction of task overhead. Because of their inherent dependence on task design, situation awareness (SA) and usability are both considered under this risk, though they are also part of HCI. An integral component of task design is the concept of efficiency. Tasks that are efficient minimize the number of steps required to accomplish their goals, while reducing overhead work and the need for tapping limited resources.
Counterbalancing this is the need to provide enough information for accomplishing the task, and to do so without limiting the user’s authority to execute the task. Efficiency is therefore a critical component of task design, one that is closely linked to HCI and usability, including the design of displays and controls, IA, and information presentation (Wesson & Greunen, 2002). Another key factor in task design is the effective management of information and technology from the standpoint of complexity. Some tasks have an inherent complexity due to multidisciplinary interactions, valuation of information, and knowledge management, an issue that has been dealt with extensively in the field of health informatics. Spaceflight, including space medicine, entails many tasks and activities that share this inherent complexity, and careful consideration of these concepts will be key not only in developing mitigation strategies for the TASK risk, but also in its interaction with the IA aspects of HCI design (Norris, 2002).

The TRAIN risk interacts with HCI in two important ways: 1) systems with poor HCI design may be non-intuitive and require more training, and 2) the adequacy of computer-based training systems depends heavily on the design of the HCI. The research under TRAIN addresses the best methods of training for different purposes, including individual and team activities, for skills and knowledge.

SHFE risks also interact with risks from other Elements. For example, within Behavioral Health and Performance, there are risk contributing factors of sleep loss, work overload, cognitive impairment due to medical conditions, operational/task-related stressors, and communication. These contributing factors impact crews’ ability to communicate, share information, and interact effectively with computer-based systems, and thus have a potential impact on the HCI risk.
SHFE also shares the contributing risk factor of Impaired Manual Control with the Human Health Countermeasures Element. Microgravity, vibration, and deconditioning can affect crews’ ability to perform fine motor control tasks. Fine motor control impairment will impact crews’ ability to interact with computer-based devices, such as cursor control devices.

D. Levels of Evidence

HRP has established four categories to describe levels of evidence, as shown below:

Evidence Category I: At least one randomized, controlled trial.
Evidence Category II: At least one controlled study without randomization, including cohort, case-control, or subject-operating-as-own-control studies.
Evidence Category III: Non-experimental observations or comparative, correlation, and case or case-series studies.
Evidence Category IV: Expert committee reports or opinions of respected authorities based on clinical experiences, bench research, or “first principles.”

Evidence for the Risk of Inadequate HCI encompasses lessons learned from 50 years of spaceflight experience, aviation, and ground-based research. A large majority of the evidence comes from crew reports and accident investigation reports. As these include summaries of subjective experience, expert opinions, and non-experimental observations, they are classified as Evidence Categories III and IV. Much of the evidence comes from aviation research and accident reports because the number of commercial, military, and private flights each year far exceeds the number of spaceflights. It should be noted that some evidence in this chapter is derived from the Flight Crew Integration (FCI) International Space Station (ISS) Life Sciences Crew Comments Database and Shuttle External Crew Reports. Although summaries of ISS and Shuttle crew comments are presented as evidence, the FCI ISS Life Sciences Crew Comments Database is protected and not publicly available, due to the sensitive nature of the raw crew data it contains.
Data are also presented from the Crew Office-approved Space Shuttle Crew Reports. These reports are not publicly available.

IV. EVIDENCE

The primary focus of integrated human-system design is the integration of human considerations into systems design to reduce costs and optimize system performance, leading to improved safety, efficiency, and mission success. This chapter focuses on identifying the causes of risk associated with error due to inadequate HCI, and on information presentation standards for reducing operator errors in spaceflight through adequate assessment of those causes. Evidence relevant to the risk of error due to inadequate HCI illustrates that effective information presentation and interaction are critical to mission success.

The purpose of the Space Human Factors discipline is to create and maintain a safe and productive environment for spaceflight crewmembers. One method to achieve this is through adequate provision and presentation of the information necessary for task execution. Spaceflight crew performance is heavily influenced by the ways in which crews are able to obtain SA and safely and effectively perform tasks. Current and future missions will require crews to perform a wide variety of tasks under dramatically different conditions: 1-g, hypergravity, microgravity, unsuited, suited, and pressurized. Mission success will require a more complete understanding of the information essential for successful task performance and of how this information is best presented, acquired, and processed. As such, the risk of inadequate HCI must be thoroughly assessed so that mitigation strategies can be developed and implemented. This evidence is the basis for analysis of the risk likelihood and consequence, and may provide information needed to eventually develop standards for reducing operator errors in spaceflight through adequate understanding of the causes and mitigations of operator errors due to inadequate HCI.
The risk of inadequate HCI includes eight core contributing factors: 1) Requirements, policies, and design processes; 2) Information resources/support; 3) Allocation of attention; 4) Cognitive overload; 5) Environmentally induced perceptual changes; 6) Misperception/misinterpretation of displayed information; 7) Spatial disorientation; and 8) Design of displays and controls. The contributing factors were derived from the Department of Defense (DoD) Human Factors Analysis and Classification System, the industry standard for human error categorization (DoD, 2005; Shappell & Wiegmann, 2000). All of these contributing factors can prevent successful accomplishment of tasks or task objectives by impacting the user’s ability to properly utilize information to make correct decisions regarding the human-computer interface.

A. Contributing Factor 1: Requirements, Policies, and Design Processes

Requirements, policies, and design processes are a factor when the processes through which vehicle, equipment, or logistical support are acquired allow inadequacies, or when design deficiencies create an unsafe situation. In the SHFE domain, the key process is the Human-Centered Design (HCD) lifecycle. The HCD lifecycle is characterized by three primary phases of activity: Understanding the User and their Domain, Visualizing the Design Solution, and Evaluating the Design (Holden, Malin, & Thronesbery, 1998; ISO TR 18529, 2000). Understanding the User and their Domain involves activities such as task analysis to ensure products are effective and meet user needs; Visualizing the Design Solution involves iterative concept prototyping to mature design alternatives; and Evaluating the Design involves formal usability testing of the designs to ensure usability, efficiency, and acceptance by the users.
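Formal usability testing of this kind typically yields objective data alongside standardized subjective instruments. As a hypothetical illustration (the participant responses below are invented), the following minimal scorer applies the published scoring rule of the widely used System Usability Scale (SUS), which maps ten 1-5 responses to a 0-100 score:

```python
# Minimal System Usability Scale (SUS) scorer. Standard rule: odd-numbered
# items score as (response - 1), even-numbered items as (5 - response);
# the 0-40 raw total is scaled by 2.5 to give a 0-100 score.
def sus_score(responses: list[int]) -> float:
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses in the range 1-5")
    total = 0
    for i, r in enumerate(responses):
        # Index 0, 2, ... are the positively worded (odd-numbered) items;
        # index 1, 3, ... are the negatively worded (even-numbered) items.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

# Hypothetical participant: agrees with positive items, disagrees with negative.
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # → 85.0
```

A single SUS score is not a diagnosis, but tracked across design iterations it gives exactly the kind of objective trend data an HCD evaluation phase relies on.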
For information-based products, all phases of the HCD lifecycle focus on ensuring quality IA: the correct information presented intuitively, in the proper format, within a logical organization, easily accessible to users. When human-centered processes and policies are not in place, the likelihood of inadequate HCI can be significant. The use of some form of the HCD process is standard and widely accepted throughout industry and the DoD. Lifecycle costs for products developed with an HCD process are significantly lower than costs for products developed without such a process. Good process reduces the need for expensive redesign and re-certification late in the lifecycle, since major design issues are identified and corrected early rather than late. It also reduces the need for training, since a good process helps ensure that learnability and usability are given consideration early.

HCD has only recently been gaining attention and acceptance at NASA. While in the past development may have been centered around the human (in this case, the crew), the process was often relatively unstructured, relying on crew acceptance comments rather than the more formal HCD methods that involve task-based evaluations and objective data. The result has been a mixture of well-designed and not-so-well-designed products. Many of the negative consequences of this lack of HCD process have been masked in missions to date because of easy access to the ground for questions and workarounds. This approach will not be feasible for long-duration missions. At NASA, many HCI-related issues are informally discussed or complained about, but not formally investigated or pursued, perhaps partially due to the crew culture of “I can figure it out; it’s workable.” Accepting error-prone products deemed workable by highly confident users is taking unnecessary risk. It is difficult to find documented investigations that consider process-related causes.
One exception is described below. A NASA report in 2000 (NASA Office of Inspector General, 2000) clearly identifies lack of proper process as a causal factor for poorly designed ISS portable computer system (PCS) displays. The importance of this finding is evident in the report: “The PCS and the display development process is a recognized area of concern for the ISS program since PCS displays are the primary crew interface or window into ISS systems.” Also, from the report: “There are numerous usability issues that affect the cost and schedule of the display development process and may have a safety impact. These issues affect cost and schedule because additional training and software releases could be required.” Some of the specific concerns cited were: no formal display requirements, weak software engineering practices, lack of prototyping, and lack of human factors engineering. Detailed findings identify display design issues such as missing indicators, erroneous information, inconsistencies, and cumbersome navigation. Although some modifications were made based on this report, poor usability has resulted in many of the displays being controlled by ground personnel instead of onboard crew.

Lack of good process has also been cited as causal in several space-related accidents. In 1967, one of the solar panels on the Russian Soyuz 1 space vehicle failed to deploy. A series of maneuvering failures followed, ending in a decision to bring the craft home. Re-entry was successful, but the main parachute then failed to deploy prior to landing, and the resulting crash killed the crewmember. A photo of the crash site and the remains of the vehicle after the accident are shown in Figure 1. The post-accident investigation revealed that many of the failures were due to lack of proper consideration and planning during design (inadequate process).
Failure of the solar power supply and environmental susceptibilities of the sensors were never considered during design, and redundancies were not built into the system. These issues might have been addressed had HCD methods such as task analysis, task-based evaluations, and simulations been completed during development (Shayler, 2000).
