HFE RESEARCH AND DEVELOPMENT IN DEFENSE APPLICATIONS
| Title | How to Embed Human Factors and Ergonomics (HF&E) in the Acquisition Process |
| Speaker | Dr. Robert Bridger |
| Date/Time | Wednesday, 03 December 2014 / 13:20 |
Effective acquisition of new systems depends on the ability of an organisation to:
- Gather requirements
- Evaluate supplier capabilities
- Specify outcomes
- Develop effective customer-supplier relationships
Requirements specify what needs to be done, not how it should be done. One of the first steps needed to embed HF&E into the acquisition process is to specify generic human factors requirements that apply to all systems to be acquired. For example:
- Equipment that is operable and safe
- Tasks that are compatible with user expectations, limitations and training
- An environment that is appropriate for the task
- A system of work organisation that accommodates higher-level needs
- Optimisation of whole-life costs arising from the use of people
A second step is to evaluate supplier capabilities in HF&E. This can be done by scrutinising the qualifications and training of personnel in the supplier organisation and by requiring all tender documents to contain the supplier’s Human Factors Integration Plan for the project. A third step is to specify the outcomes of the project so that whatever is produced can be tested to determine whether it meets the user’s needs as expressed in the requirements. Explicit and detailed HF&E requirements are tested using an Integrated Test, Evaluation and Acceptance Plan (ITEAP). Acceptance tests are conducted by the project team. For example, there may be a user requirement to ‘minimise the training burden on a new system’. Tests with operators would need to be conducted at an early stage to ensure that the requirement is being met.
In practice, there are many intervening stages in which generic requirements are decomposed into increasingly specific requirements as the system concept develops. Organisations wishing to embed HF&E into the acquisition process should employ their own ergonomists to generate specific requirements for HF&E, develop the ITEAP and conduct acceptance tests. They should not rely on suppliers to do this.
Dr RS Bridger is Head of the Human Factors Department at the Institute of Naval Medicine, UK. Prior to this, he was responsible for the postgraduate programme in ergonomics at the University of Cape Town. He is author of the textbook ‘Introduction to Ergonomics’, published by CRC Press (3rd edition in English and 2nd edition in Chinese). He is a fellow and council member of the Institute of Ergonomics and Human Factors and is the Institute’s honorary general treasurer. He also acts as an expert witness in civil litigation. He is author of over 70 articles published in peer-reviewed journals, 80 official reports, over 30 conference papers and articles in non-peer-reviewed journals. His main interests are in HFI and in systems-focussed applied research in HF&E. He is also interested in the application of psychometric techniques and concepts in HF&E and, more generally, in HF&E education.
| Title | Human Factors Issues with UAS |
| Speaker | Dr. Valerie Gawron |
| Date/Time | Wednesday, 03 December 2014 / 13:40 |
One of the earliest studies to identify issues with UAS ground stations was the UAV Technologies and Combat Operations, Air Force Scientific Advisory Board, 1996 Summer Study. One of the major findings of the study was: “Insufficient emphasis has been given to human systems issues. Particularly deficient are applications of systematic approaches to allocating functions between humans and automation, and the application of human factors principles in system design.” A subsequent DARPA panel identified that some of the issues were due to:
1) A variety of missions for the UAVs
2) A variety of UAV platforms
3) A variety of operational concepts
4) A variety of configurations characterized by number of operators and number and types of UAVs
- One operator - one UAV
- One operator - multiple UAVs of same type
- One operator - multiple UAVs of different type
- Multiple operators - one UAV
- Multiple operators - multiple UAVs of same type
- Multiple operators - multiple UAVs of different type (Workshop on Human Machine Issues in Unmanned Aerial Vehicles, George Mason University, 1997)
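The six configurations above are simply the cross product of two operator arrangements and three UAV arrangements. A minimal, illustrative Python sketch (not from the workshop itself) makes that structure explicit:

```python
from itertools import product

# Illustrative only: the six ground-station configurations arise from
# crossing two operator arrangements with three UAV arrangements.
operators = ["One operator", "Multiple operators"]
uavs = [
    "one UAV",
    "multiple UAVs of same type",
    "multiple UAVs of different type",
]

# Cross product reproduces the workshop's six configurations, in order.
configurations = [f"{o} - {u}" for o, u in product(operators, uavs)]
for c in configurations:
    print(c)
```

Framing the taxonomy as a cross product makes clear that adding a new dimension (e.g. UAV autonomy level) multiplies, rather than adds to, the number of configurations a ground station must support.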
Attendees also organized the human-machine issues into six areas. The first was automation, with the transition between manual and automatic control of critical importance. How should the transition occur? What were the rules for override? The second issue was decision making, with time delays, vigilance decrements, fatigue, and the lack of mutual models of engagement (human and UAV) as concerns. The third issue was situational awareness (SA). How can SA be maintained in teleoperation with missing perceptual cues, and what frame of reference is needed? The fourth issue was the design of interfaces. How should these differ between manual control and supervisory control? What skills would operators need? How should system failures be managed? The fifth issue was training of individuals as well as crews. When should training occur, and when should the operator be aided? How is a crisis recognized and avoided, or, if it cannot be avoided, managed? The final issue was team interaction and performance. How should tasks be allocated between team members and automation? How should shift changes be managed? Most of these issues have yet to be resolved almost 20 years later.
Valerie Gawron has a PhD in Engineering Psychology from the University of Illinois, and an MS in Industrial Engineering and an MBA from the State University of New York at Buffalo. She completed postdoctoral work at New Mexico State University and worked for Calspan for 26 years. She is presently a human systems integrator at the MITRE Corporation. Dr. Gawron has served on the Air Force Scientific Advisory Board, Army Science Board, Naval Research Advisory Committee, and National Research Council. She is an associate fellow of AIAA, a fellow of HFES, and a fellow of the International Ergonomics Association. Dr. Gawron is the author of six books, including the Human Performance, Workload, and Situation Awareness Measures Handbook (second edition) and 2001 Hearts: The Jane Gawron Story. Both of these are being used in graduate classes, the former in human factors and the latter in patient safety.
| Title | Visual Representation of Complex Systems |
| Speaker | Professor William Wong |
| Date/Time | Wednesday, 03 December 2014 / 14:00 |
What is the role of human factors in the design and development of intelligence analysis systems? At its simplest, human factors informs us about what information to present and how the information could be organised and rendered to enhance sense-making and decision making.
One important difference between intelligence analysis systems and other complex systems is that the key functional relationships used to represent those complex systems (e.g. a power plant) are stable (although the values may not be), and the performance of such a system can be predicted or modelled from the laws of nature. In the CSE (cognitive systems engineering) domain, we refer to such systems as causal systems: their outcomes are predictable from the laws of nature. We can compute their optimum performance as well as identify their boundaries of failure.
Intelligence analysis systems, on the other hand, are systems where the interesting or important outcomes are the result of the analyst assembling and constructing relationships among the data derived from analysis, in combination with knowledge or understanding in the head of the analyst.
Intelligence analysis systems are not transaction processing systems, and neither are they command and control systems, although the outputs from an IAS can and will be used to direct the tasking of collection systems and perhaps force deployment systems. An IAS has two key roles: (i) to support the collation of required information, and (ii) to support the manipulation and assembly of the information for making sense of the situation and providing explanations for various observed phenomena.
In collating information, intelligence analysis typically deals with large data sets, in mixed formats, from multiple sources. The data arrive out of sequence, may be missing or ambiguous, and may even contain deliberately misleading or deceptive information. HUMINT, ELINT, SIGINT, RADINT, and MASINT represent a wide variety of data types whose significance needs to be established and combined to explain a situation. This is different from representing a typical process control system.
Many of these automated computational and analytical processes tend to take place as black boxes (at least to the analyst). The results are often presented to analysts in ways that make it difficult or impossible to question their validity: How are the networks created? What assumptions are made in recommending that Person A is more likely to be the perpetrator than Person B? On what basis are a set of crime reports “similar” (bearing in mind that crime reports are largely unstructured text descriptions and contain a lot of ambiguity)? What are the weightings, and are the weightings justifiable in the circumstances?
From a Human Factors perspective, one design goal for the development of an IAS is to make the black-box processes “visible” to the analyst, so that analysts are able to judge for themselves the validity of the system-recommended outcome, and whether it makes sense in the context of the situation.
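To make the idea of a "visible" black box concrete, here is a minimal, hypothetical sketch (not any real IAS, and the features, reports, and weights are invented for illustration): a similarity score between two crime reports that exposes each feature's weight and contribution, so an analyst can inspect and challenge how "similar" was computed.

```python
# Hypothetical sketch: a transparent similarity score whose per-feature
# weights and contributions are exposed to the analyst, rather than
# returning a single opaque number.

def transparent_similarity(report_a, report_b, weights):
    """Return (score, breakdown), where breakdown lists each feature's
    weight and its contribution to the final score."""
    breakdown = []
    score = 0.0
    for feature, weight in weights.items():
        # Crude exact-match feature comparison, for illustration only.
        match = 1.0 if report_a.get(feature) == report_b.get(feature) else 0.0
        contribution = weight * match
        breakdown.append((feature, weight, contribution))
        score += contribution
    return score, breakdown

# Invented example reports and weightings.
a = {"location": "dockyard", "method": "forced entry", "time_of_day": "night"}
b = {"location": "dockyard", "method": "deception", "time_of_day": "night"}
weights = {"location": 0.5, "method": 0.3, "time_of_day": 0.2}

score, breakdown = transparent_similarity(a, b, weights)
for feature, weight, contribution in breakdown:
    print(f"{feature}: weight={weight}, contribution={contribution}")
print(f"total similarity = {score}")
```

The design point is not the scoring scheme itself but the second return value: by surfacing the breakdown alongside the score, the interface lets the analyst see that, here, "similarity" rests entirely on location and time of day, and judge whether those weightings are justifiable in the circumstances.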
Another aspect of this is the notion of conclusion pathways: the reasoning steps, supported by underlying analytics, and the combined outcomes at each step that led to the conclusion. With such visual representations, it is possible to back-track to see where one may have gone wrong, or, when interrupted by another task, to re-trace one’s analytic and reasoning steps.
When reasoning through a problem, analysts go through a cycle of converge-diverge-converge-diverge as they assess data, understand, and choose paths of investigation. Depending upon the data available, the situation, and their goals, analysts invoke a process of inference making that interactively combines abduction, induction, and deduction.
This is where the design of an IAS differs significantly from designing for complex process control systems. The user interface needs to fluidly support the playful assembly of information to encourage imagination, insight, and inference making, whilst ensuring that conclusions are based on rigorous analysis. We characterise this as the need for fluidity and rigour in the user interface design for intelligence analysis systems.
Dr William Wong is Professor of Human-Computer Interaction and Head, Interaction Design Centre, at Middlesex University's School of Science and Technology, London, UK. Prior to academia, Professor Wong worked in the Republic of Singapore Air Force in a number of roles, including as an Air Defence Controller directing fighters and missile systems, and then as Head, Systems and Communications Operations Branch, HQ RSAF. His research interest is in the representation design of information to support decision making in naturalistic environments. From a Cognitive Work Analysis perspective, his research has included air traffic control, hydro-electricity dispatch control, emergency ambulance command and control, intelligence analysis, and visual analytics, with the view of developing user interfaces that enhance information uptake and support decision making and situation awareness in real-time dynamic environments.
| Title | Simulation Applications for Training |
| Speaker | Dr. Valerie Rice |
| Date/Time | Wednesday, 03 December 2014 / 14:20 |
Medical simulation is designed so that health care procedures and processes can be practised to proficiency before bringing the patient into the arena, thus reducing errors, enhancing safety, and improving team performance. Yet, as attractive as the highly technical systems and their advanced methods are, they come at considerable financial cost, while saving a life is considered priceless. Military medical simulation uses both commercial (purchased) products and products developed in-house, and is driven by the immediate needs of the military services as well as by global threats. During this panel presentation, Dr. Rice will address military medical simulation in health care, in support of military medical training, and the research objectives necessary to move from identified user needs through the transition of technology for military use.
Dr. Rice has graduate degrees in Human Factors Engineering (Ph.D.), Health Care Administration (MHA) and Occupational Therapy (MOT). She currently works as the Chief of the Army Research Laboratory (ARL) – Army Medical Department Field Element, which specializes in conducting applied, medical human factors research. Her current research is on cognitive attention, focus and concentration including human factors investigations of technologies and intervention techniques in terms of effectiveness, ease-of-use, and feasibility. This research has included investigations focused on increasing resiliency among soldiers, evaluating Mindfulness-based Stress Reduction on performance improvements, improving weapon firing performance, identifying traumatic brain injury, improving learning among military health care professionals and paraprofessionals, and development of ergonomics safety instruction programs for all military ranks. Additional focuses have included injury prevention, safety, physical training, physically demanding tasks, and training methods to increase attention and concentration.
| Title | Future of HFE in the Military |
| Speakers | Dr. Robert Bridger, Dr. Valerie Gawron, Dr. Valerie Rice, Professor William Wong |
| Date/Time | Wednesday, 03 December 2014 / 14:40 |
For the final item, each panelist will speak for five minutes on where they think R&D funds should be placed over the next 10-15 years. Attendees of the panel discussion will receive a printed copy of the points raised during the session.