Internationally, health information systems (HIS) safety has emerged as a significant concern for governments. A growing body of research documents how HIS can be implicated in patient harm and death. Researchers have attempted to develop methods to prevent or reduce technology-induced errors, including methods that can be employed before a system is released, such as safety heuristics and clinical simulations. In this paper, we outline our methodology for developing safety heuristics specific to identifying the features or functions of a HIS user interface design that may lead to technology-induced errors. We then describe a methodological approach to validating these heuristics using clinical simulations.
Health information system (HIS) safety has emerged as a growing concern worldwide for clinicians, regional health authorities, and governments who are modernizing health care through information technology. Internationally, we have seen the number of studies documenting the existence of technology-induced errors grow [
Technology-induced errors have arisen as a significant international issue [
Today, we have seen an increase in the number of software companies that are designing information systems for healthcare [
HIS safety, or “activities that seek to minimize or to eliminate hazardous conditions that can cause bodily injury” arising from the use of information systems and technologies, has emerged as a new area of research [
In some communities, this has emerged as a barrier to HIS adoption and has initiated discussions and publications regarding software developer and vendor blame from a legal perspective [
Researchers have recognized the need to design and develop HIS user interfaces for safety (i.e., user interface designs that prevent technology-induced errors). Researchers are using different approaches to address this issue from a software design, development, and testing perspective [
In this section of the paper, we discuss our methodology for developing evidence-based heuristics and validating them using clinical simulations (see Figure
Phases of the methodology for developing and validating safety heuristics.
A set of evidence-based heuristics for software application safety was developed in a series of stages. Initially, our work on HIS safety consisted of a systematic review of the medical literature on technology-induced errors in healthcare. This was followed by expert panel development of evidence-based heuristics and initial clinical simulation testing to determine whether the heuristics could reduce the number of technology-induced “near misses” (i.e., slips) and “errors” (i.e., mistakes) made by health professional users [
What are the specific user interface design features or functions that lead to user “near misses” in HIS such as medication administration and physician order entry systems? What are the specific user interface design features or functions that lead to user “errors”? What user interface design features or functions can lead to “severe” errors (i.e., those resulting in human death, disability, or injury)? What is the nature of the situational context in which the software application was used and led to a user “near miss” or “technology-induced error”?
Data were extracted from relevant studies (see Section
If there was agreement among expert panel members, then each finding was discussed and a corresponding heuristic was developed. As each study finding was considered, previously developed heuristics were evaluated to determine if they could be effectively used to detect the technology-induced error that was identified in the study. If a previously developed heuristic applied, then no new heuristic was developed. If no heuristic applied, then a new heuristic was developed. This process continued until the review of the studies was completed. All differences of opinion regarding heuristic development and wording were resolved through discussion until unanimous agreement was reached. When the panel completed their work, 38 heuristics emerged.
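The panel's iterative procedure can be sketched as a simple loop. The sketch below is illustrative only: the `covers` predicate stands in for a panel judgment (whether an existing heuristic would detect the error described by a study finding), and the string representation of heuristics is a placeholder, not the wording the panel produced.

```python
def develop_heuristics(study_findings, covers):
    """Sketch of the panel's loop: for each study finding, keep the
    existing heuristic set if one already detects the error; otherwise
    draft a new heuristic.

    covers(heuristic, finding) -> bool models the panel's judgment that
    an existing heuristic would detect the error in the finding.
    """
    heuristics = []
    for finding in study_findings:
        if not any(covers(h, finding) for h in heuristics):
            # No existing heuristic detects this error: draft a new one.
            heuristics.append(f"Heuristic addressing: {finding}")
    return heuristics
```

With a toy `covers` that matches a finding's text against heuristic wording, repeated findings collapse onto one heuristic, mirroring how the panel reused heuristics across studies.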
The heuristics were then analyzed using a content analysis approach. Content analysis was employed as it provides a method for obtaining an objective and qualitative description of the content of text [
Usability themes and corresponding number of heuristics developed in each theme.
Usability theme | Definition | Number of heuristics developed |
---|---|---|
Workflow | Workflow issues deal with process issues arising from HIS use | 10 |
Content | Content issues arise from poor-quality information in the HIS | 14 |
Safeguards | Safeguard issues are specific to the presence or absence of decision supports that prevent medical errors | 8 |
Function | Functional issues deal with concerns arising from how a HIS functions | 6 |
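As a quick consistency check, the per-theme counts in the table reconcile with the 38 heuristics the expert panel produced:

```python
# Per-theme heuristic counts from the table above.
theme_counts = {"Workflow": 10, "Content": 14, "Safeguards": 8, "Function": 6}

# The themes sum to the panel's final count of 38 heuristics.
total = sum(theme_counts.values())  # 38
```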
Examples of developed safety heuristics, organized by usability theme.
Usability theme | Example of developed heuristic |
---|---|
Content | “System clearly displays the date and time the medication was updated” [ |
Workflow | “System accommodates clinician physical activities” [ |
Function | “System allows for linkages between medication ordering, administration, and discontinuation procedures” [
Safeguards | “System checks for duplicated medications” [ |
In Phase 2 of our research, we assessed the effectiveness of the evidence-based safety heuristics. In this phase, we tested the heuristics by conducting a heuristic evaluation in which the user interface of a HIS (i.e., an electronic health record system) was inspected. This involved a human factors analyst inspecting the user interface of the HIS to identify and predict potential user error [
Our findings suggested that not all heuristics could be applied using a traditional heuristic evaluation approach. During our testing of the heuristics, two workflow, six content, two functional, and three safeguard heuristics could be applied in the heuristic evaluation of an electronic patient record. The remaining 25 of the 38 heuristics could not be applied by an analyst conducting a traditional usability inspection. Instead, we determined that these heuristics needed to be tested in the context of a clinical simulation, where scenarios representative of real-world environments could drive the safety testing [
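The coverage figures reported above can be tallied directly; the per-theme numbers come from the inspection results, and the remainder is what had to be deferred to clinical simulation:

```python
# Heuristics that a human factors analyst could apply in a
# traditional usability inspection, by theme (from the results above).
applied_by_theme = {"workflow": 2, "content": 6, "function": 2, "safeguards": 3}

applicable = sum(applied_by_theme.values())  # 13 usable via inspection
needs_simulation = 38 - applicable           # 25 required clinical simulation
```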
Analysis of clinical simulation data involves fine-grained analysis of video, audio, and computer screen data so that technology-induced errors are captured [
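The coding step of such an analysis might be sketched as follows. The event representation (time-stamped observations tagged as "slip" for near misses or "mistake" for errors, following the paper's terminology) and the `summarize_events` helper are illustrative assumptions, not part of the authors' tooling.

```python
from collections import Counter

def summarize_events(events):
    """Tally coded observations from simulation recordings.

    events: iterable of (timestamp_seconds, category) pairs, where
    category is 'slip' (near miss) or 'mistake' (error).
    """
    return Counter(category for _, category in events)

# Example: three observations coded from synchronized video, audio,
# and screen-capture data of one simulation session.
coded_log = [(12.4, "slip"), (75.0, "mistake"), (118.2, "slip")]
```

Counts per category (and per heuristic, in a fuller coding scheme) could then be compared across interface designs to quantify whether a heuristic predicted the errors observed in simulation.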
There are two aspects of the methodology described in this paper that make it significant. First, the paper describes a new method for developing a set of evidence-based safety heuristics that can be used to evaluate the safety of HIS interface designs used in complex, dynamic, and uncertain work settings. Few heuristics are specifically designed around the safety of HIS user interfaces. These heuristics can be used by software developers to create safe HIS interface designs for complex and dynamic work settings such as those found in healthcare. They can also be used by human factors experts working in the field to identify potential sources of technology-induced error in systems that are currently in use (in order to prevent future errors that could lead to human death, disability, and injury). Second, from a methodological perspective, the paper describes an approach that links several different but complementary methodologies in a novel way to develop and empirically test HIS interface design heuristics (i.e., the use of a systematic review to inform heuristic development by a panel of human factors experts, the inspection of several HIS interface designs by human factors experts, and subsequent clinical simulation testing to determine the ability of the heuristics to predict HIS interface design safety issues).
The authors are currently conducting tests of heuristics on the static (e.g., fixed user interface features) and dynamic features (e.g., aspects of the user-system dialogue and interaction in carrying out real tasks) of user interface designs [
Our work indicates that the evaluation of such heuristics requires testing under realistic conditions, which can be provided using simulation methods. Such an approach can also be used in the empirical validation of HIS that are currently in use. From a theoretical perspective, this work extends human factors research by contributing to the development and validation of safety heuristics using clinical simulations. In summary, in this paper we outline a new methodology for developing evidence-based safety heuristics for HIS interface design and for testing such heuristics using clinical simulations. The full methodology (i.e., systematic review, expert panel development, inspections, and clinical simulation as applied to the validation of safety heuristics) can be used to improve system safety.