Dr. Yemao Man, Senior User Experience (UX) Researcher at ABB's Corporate Research Center in Västerås, Sweden, challenges the prevailing narrative that human error is the primary cause of accidents in the technical world. Seeing human error as a symptom of deeper systemic issues, he emphasizes the importance of human-centered design and offers a fresh perspective on the integration of UX into business in an increasingly automated world.
Tell us about your work. How did you become interested in UX?
Basically, our laboratory is the users' own world. We conduct extensive research via field studies, observing and interviewing operators and other key stakeholders in their environments. We explore how they interact with equipment, software systems, and even each other, aiming to uncover unarticulated needs and identify pain points. We then generate actionable insights to improve existing designs or to create novel concepts that envision the future.
My interest in UX began when I was studying Computer Science and Engineering. UX and human-centered design form an intriguing field at the intersection of cognitive science and technology. I completed my PhD in Human Factors (HF), a socio-technical discipline that studies the interactions between humans and other elements in a system.
UX is a continually evolving continuum of those interactions, influenced by design and closely linked to emotion and engagement. In the marine domain, the time I have spent on ethnographic studies aboard vessels has significantly deepened my understanding of how design affects user cognition, satisfaction, safety, and efficiency in industrial settings.
What are some non-technical aspects of Human Factors?
A core part of our research is identifying barriers to task execution. Is it due to lack of knowledge of the system in question, miscommunication, or a badly designed user interface?
The role of human interactions within organizations is a critical part of HF. Our focus is on enhancing communication to minimize misunderstanding, and we achieve this by closely examining how individuals collaborate and coordinate their efforts.
Training is very important. As technology advances, the need for appropriate training becomes more apparent. Relying on extensive manuals for user education is often ineffective. Can we use other ways to encourage more active learning? I think this is an important area of research. One promising direction is the application of augmented reality (AR) technologies. By making training more interactive and engaging, AR can facilitate experiential learning, allowing operators to assimilate knowledge more effectively through augmented hands-on experience.
How is UX different for automation technology?
The integration of automation technology into technical systems represents a significant shift in user-system interaction dynamics. Unlike many consumer-oriented technical applications, automation assumes a greater role in decision-making. This transforms the user’s role into a more supervisory or managerial one, which can lead to significant cognitive challenges and user engagement issues.
A critical aspect of UX research is understanding how users respond to automation failures or unexpected system behavior, and how design can help them cope with such situations resiliently.
Trust plays a fundamental role in our approach to understanding the human-automation relationship, and it is influenced by many factors, including individual differences, information visualization, and interaction design. Additional considerations for automation technology include the necessity of proper training and adherence to ethical design principles. Ethical design is especially important in ensuring that technology does not introduce biases or unfairness into the system. For example, a ranking system that displays an operator's performance alongside that of their peers might raise ethical and psychological concerns.
Do you think people fear automation?
I have conducted many interviews in which people expressed concerns about high-level automation, particularly regarding job security. Some management teams seem more interested in using technology as a substitute for human labor than as a tool to enhance and support human performance.
While there is speculation about a jobless future, such a reality seems distant to me. Many activities that once had high knowledge barriers, such as writing, can now be performed by AI. These systems are flexible and efficient, but unlike human beings they lack inherent goals or values. These are the things that make us fundamentally different from machines.
I really don’t believe machines will take over our world. Our focus shouldn’t be solely on creating powerful technology in isolation. Instead, the future should be envisioned as a synergistic partnership between humans and machines, where we complement each other’s capabilities. Technology should support human endeavors without replacing the human element – I think this is fundamental to our society and values.
What are common misconceptions about Human Factors?
One of the most frequently heard misconceptions is that humans are unreliable, slow, forgetful, and easily stressed, and thus form the weakest link in systems. This often leads to the argument for replacing humans with advanced technology. However, this view overlooks the fundamental and unique strengths that humans bring to complex technical systems.
Researchers in the HF subfields of joint cognitive systems and resilience engineering advocate viewing humans not as potential liabilities but as essential contributors who enhance a system's adaptability and resilience. We humans possess unique abilities to anticipate, adapt, and respond effectively to unexpected situations. A good example is the Apollo 13 incident back in 1970: after a ruptured oxygen tank put the mission in jeopardy, NASA astronauts and ground crew used their ingenuity and adaptability to craft a life-saving solution from the available resources and bring the crew safely back to Earth.
Human adaptability does come at a cost, which is the potential for errors. However, these errors should not be viewed as "bad apples" but rather as an intrinsic part of an evolving socio-technical system. The key is not to replace human operators, but to create systems that enhance and support human capabilities. We should not strive for rigid designs that eliminate human error at all costs, but for designs that allow the system to absorb errors and still function safely.
How should HF/UX be considered in a business scenario?
It is important to consider the benefits that HF and UX research bring to a business. For a company trying to develop a user-friendly product, clear and transparent communication with end-users is crucial.
Current automation technology often confronts users with complex pipeline diagrams and dense user interfaces. People frequently struggle to understand how the outputs are generated, and they are unsure whether to trust the technology because they do not understand the underlying process. Making these processes transparent to end-users is a key focus of HF studies. We need to think about how to effectively explain results to end-users who are experts in their own domains, not in machine learning, statistics, or mathematics. When presenting AI-based recommendations, it is essential that operators comprehend the rationale behind the calculated results.
This goes beyond simple interface design, which is often mistakenly equated with HF and UX. We must have a holistic system perspective to understand what is going on in the field, and then translate those findings into design insights. This approach can significantly influence business, particularly with emerging trends like generative AI and the industrial metaverse, which have the potential to disrupt conventional business models. Understanding their impact on HF is crucial for devising better design strategies.
There is a saying in HF that “good ergonomics makes good economics”. While this might not be immediately visible in the short term, its long-term value is undeniable.
Do businesses generally invest enough resources in HF?
There is definitely room for improvement in how we view the impact of HF and UX research. The focus is often on tangible business outcomes such as top and bottom lines, and managers frequently ask UX researchers about the revenue potential of their work. Yet UX is a complex socio-technical construct that cannot simply be quantified. Much like trust in design, its direct impact on revenue is challenging to measure in quantifiable terms. I believe there should be a shift towards cultivating a human-centered mindset in business, because putting customers' needs at the center of business operations eventually leads to higher customer satisfaction and long-term success. Over time, the profound impact of UX will become evident.
Human-centeredness focuses on innovating around users’ needs and challenges. If we embrace a technology-centered approach, which is prevalent in many industries, then we prioritize what can be sold to users based on available technology. I think this will backfire in the long run. You have to start with the user experience and work back towards the technology – not the other way around.
There is still a lot that needs to be explored in HF, and that requires engaging with real end-users in real-world scenarios in a scientific manner. Our synthesized insights stem from deep engagement with end-users through rigorous user research methods, including observation, interviews, and even shadowing. This enables us to empathize effectively. We identify stress-inducing situations and critical information needs, and gather feedback on interface sketches. After fieldwork, we spend hours reviewing videos, analyzing notes, and working out how to convey these invaluable insights to product designers.
Is attributing incidents to human error without further explanation an old-fashioned approach? Are we now looking beyond human error?
The narrative that human error is a primary cause of accidents is still prevalent in the technical world. It gets us nowhere, because it implies that human beings are inherently error-prone and supports the argument for replacing them with technology. We should not overlook the broader context in which errors occur. Human error should be seen not as a cause, but as a symptom of deeper issues within a socio-technical system.
Human errors are often the result of systemic problems such as design flaws or inadequate training, poor communication, or unrealistic workloads. An organizational culture and leadership style that views errors as learning opportunities and catalysts for systemic improvement can significantly increase the chances of proper support during mishaps and help to mitigate risks. Conversely, organizational issues, such as the pressure to outpace competitors by rushing production without prioritizing high safety standards, can silently contribute to system failures during the risk incubation period.
I think we should dig into how organizations can learn from errors and become more resilient. The focus should not be on the unrealistic goal of eliminating human error entirely; as long as there are humans in a system, there will be errors. Instead, the emphasis should be on designing systems that can mitigate the impact of those errors. Leadership should foster a safety-conscious and human-centered culture in which individuals feel empowered to contribute to the improvement process. This approach encourages a more reflective and growth-oriented organizational ethos.