Recording patient data on six observation charts: An experimental comparison

Horswill, Mark S., Preece, Megan H. W., Hill, Andrew, Christofidis, Melany J. and Watson, Marcus O. (2010) Recording patient data on six observation charts: An experimental comparison. Sydney, Australia: Australian Commission on Safety and Quality in Health Care.


Author Horswill, Mark S.
Preece, Megan H. W.
Hill, Andrew
Christofidis, Melany J.
Watson, Marcus O.
Title of report Recording patient data on six observation charts: An experimental comparison
Publication date 2010-05
Open Access Status File (Publisher version)
Publisher Australian Commission on Safety and Quality in Health Care
Place of publication Sydney, Australia
Total pages 64
Language eng
Subjects 920299 Health and Support Services not elsewhere classified
111711 Health Information Systems (incl. Surveillance)
170112 Sensory Processes, Perception and Performance
Formatted abstract
Paper-based observation charts are the principal means of monitoring changes to patients’ vital signs. There is considerable variation in the design of observation charts and a lack of empirical research on their performance. This report describes the results of one of a series of studies carried out as part of a project funded by the Australian Commission on Safety and Quality in Health Care and Queensland Health to investigate the design and use of observation charts in recognising and managing patient deterioration, including the design and evaluation of a new adult observation chart that incorporated human factors principles. The first phase of this project involved using a procedure known as heuristic analysis to review 25 charts from Australia and New Zealand. This analysis identified 1,189 usability problems that could lead to errors in recording data and identifying patient deterioration. The results from the heuristic analysis were used to design a new chart (the ADDS chart) based on human factors principles and current best practice.
In order to assess how the ADDS chart compared with a range of existing patient charts, we previously conducted a study evaluating the performance of both novices and health professionals using two versions of the ADDS chart (with and without a systolic blood pressure table to control for a patient’s usual blood pressure) against four existing charts. This study measured the errors individuals made when judging whether vital signs presented on the six charts were normal or abnormal. The results indicated that the charts considered better designed in the initial stage of the project yielded considerably fewer errors together with shorter decision times. The two versions of the ADDS chart outperformed the other charts, which yielded between 2.5 and 3.3 times as many errors as the ADDS charts. The absolute error rates were considerable, ranging from 9.8% for one of the ADDS charts to 32.6% for the worst-performing chart, where 50% would be chance performance.
Another stage at which error is likely to be important is when users are recording data (in the first study, the data was pre-recorded onto the charts). This report describes a second study focused on data-recording errors rather than decision errors. Participants recorded real patient data onto each of the six charts over an extended period in a simulated hospital ward, where they were given the task of monitoring six simulated patients (using a different chart for each patient). Each patient’s vital signs were shown on a computer display by the patient’s bed. The simulation was carried out in as realistic an environment as possible, including low lighting and background noise distraction. Results demonstrated that, in contrast to the first study, the simplest charts yielded the fewest errors, presumably because these charts involved simply transcribing numbers from the display rather than, for example, converting the numbers into a graph. The more complex charts yielded the most errors, with the two versions of the ADDS chart generating the fourth- and fifth-highest error counts. However, the magnitude of the error rates was much smaller than in the first study: the worst-performing chart yielded 2.3% errors, while the best-performing chart yielded 0.2% errors. That is, the process of recording data appears to be overall far less prone to error than the process of detecting abnormal vital signs.
We aggregated data from the two studies to determine the overall error rate for each chart, taking into account both errors in recording data and errors in detecting abnormal vital signs among recorded data. Overall, the rank order found in the first experiment was maintained, because of the proportionally much higher error rates found in that study. That is, the two versions of the ADDS charts were associated with the fewest errors overall, followed by the two existing charts judged to have good designs in the heuristic analysis. The charts judged as average and poor in the heuristic analysis yielded the highest overall error rates.
These results suggest that the main errors involved in detecting deteriorating patients are likely to occur at the level of making clinical judgements rather than at the level of recording data. Chart design affects both types of error, and there appears to be a trade-off between the ease of recording data and the ease of detecting deterioration, given that the charts that yielded fewer errors in detecting abnormal vital signs tended to yield more errors when data was being recorded. However, the error rates associated with recording data were much smaller than those associated with detecting abnormal vital signs, and hence chart design ought to focus on minimising the latter rather than the former.
Overall, the results suggest that the ADDS charts represent an improvement over the existing charts, despite being associated with more recording errors than some of them.
Keyword Observation charts
Medical charts
Usability
Vital signs
Human factors
Patient safety
Q-Index Code AX
Q-Index Status Provisional Code

Document type: Research Report
Collection: School of Psychology Publications
Created: Tue, 16 Nov 2010, 21:32:17 EST by Miss Megan Preece on behalf of School of Psychology