Every effective railway operation has a strong, well-managed control center. Often called the “nerve centers” of the rail network, control centers serve as hubs for the newest rail technology, helping to ensure the smooth operation of train services, keep passengers informed, and resolve safety incidents.
At the heart of every control center sits the interface between human operators and technology. The public may believe that technology alone improves efficiency, quality, and safety while reducing costs. Few people consider, however, that these same technologies may also introduce errors and adverse events.
In the earliest years of the railway industry, train drivers relied heavily on their own perceptions and judgment to perform their jobs safely. Over time, those methods were augmented and replaced by a variety of automated driver-support systems, and today’s train drivers encounter more automated technologies within the cab than ever before.
We still don’t fully understand the impacts of technology and how decisions are best made between human operators and increasingly autonomous technologies to carry out safety-critical tasks, but Kostas Triantis, a professor in the Grado Department of Industrial and Systems Engineering in the College of Engineering, will use a National Science Foundation award to find out.
For the project, researchers from multiple universities spanning the disciplines of decision theory, organizational theory, and systems and human factors engineering will collaborate with a network of European infrastructure providers — Infrabel, ProRail, Network Rail, and the European Union Agency for Railways — as well as Union Pacific in the U.S. and federal research centers such as the Volpe Center.
As Infrabel’s representative, principal engineer Brad Roets provides new data for the research team to analyze and serves as an intermediary between Infrabel and the other European infrastructure providers.
“This research should deliver highly relevant and scientifically based insights and tools that will further improve on safety levels, staff well-being, and the smart use of automation in control rooms,” said Roets. “It will also help in preparing — and anticipating — for a future where automation plays an even more important role than it does today. I also expect new and innovative types of control room datasets and metrics to emerge from the interactions between the researchers, our data engineers, and other project collaborators.”
With this four-year, $2 million grant funded under the Leading Engineering for America’s Prosperity, Health, and Infrastructure program (LEAP-HI), Triantis will work with a transdisciplinary team of engineers and social scientists to explore how cognitive biases influence trust in automation and decisions to delegate tasks to automated technologies.
In the airline industry, air traffic flow management and collaborative decision-making drive safe aviation. Many of those decisions hinge on the ability of pilots to interact effectively with automated systems.
When circumstances dictate, pilots — and operators in general — must decide when to intervene and operate systems manually. When they rely on automated systems, their monitoring workload grows, which in turn increases the probability of error. Boeing, for example, came under intense scrutiny after its 737 Max jet was involved in two fatal accidents in 2018 and 2019. Pilots had not been trained to handle the aircraft’s new automated system, known as MCAS, and could not override it, resulting in crashes and loss of life.
Based on interviews with engineers, operators, and managers, we know that during peak periods of activity, controllers often turn off automated systems because regulating the transportation network becomes too complicated. Even though controllers are trained to deal with a variety of scenarios, different controllers respond to the same scenario in different ways, and younger controllers tend to use the automated systems more than their senior colleagues. The companies, in pursuit of cost efficiency, would like to use technology more. Those decisions, however, must balance multiple factors, including economic feasibility, safety, and the workload demands placed on operators.
Similar situations abound in the health care sector, where managing human-technology interaction is critical. In operating rooms, for example, doctors, nurses, and other stakeholders need situation awareness and an acceptable level of mental workload to effectively manage automated systems. Such systems have the potential to prevent many harmful incidents, but poorly designed automation may cause more incidents than it prevents, especially when it reduces the human’s role to a primarily supervisory one. Building on the LEAP-HI grant, Triantis is pursuing seed-grant funding to further explore these issues in collaboration with MedStar in Northern Virginia and Carilion in Roanoke.
In the four decades since his dissertation, Triantis has sought to marry different disciplines for a more holistic approach to societal problems.
“Our focus has been more and more multidisciplinary,” said Triantis. “Not just with evaluating the impacts of technological investments, but also with the well-known challenge of the integration of the social networks and their behaviors with technological change. We are trying to see the relevance of specific theoretical paradigms and different modeling approaches to address complex societal problems. As a Ph.D. student, there was something that I couldn’t really articulate at the time, this idea of doing multidisciplinary research, how different approaches pertain to different disciplines. How some of the theory actually relates to reality.”
A goal of this project is to reach a diverse group of students and stimulate interest in LEAP-HI-related STEM disciplines. Triantis will work directly with Virginia Tech’s Center for the Enhancement of Engineering Diversity on outreach activities that engage students in grades 7-12 through simulator-based decision-making challenges and explorations of how distributed situational awareness and fatigue influence decision-making.
The insights obtained in this work will inform design and policy choices of both infrastructure providers — such as railroads, airlines, trucking, maritime, and pipeline controllers — and regulators with authority over infrastructure systems — such as the Federal Railroad Administration, the Federal Aviation Administration, the U.S. Coast Guard, and the Federal Transit Administration in the United States. The framework provides a critical input for the engineering community, specifically for safety regulators working with automation-driven infrastructure systems.
– Suzanne Miller