Informing the design of Integrating Data Software in Critical Care: A Systematic Review of the Impact on Clinician Performance and Meta-Analysis
CCCF ePoster library. Lin Y. Oct 28, 2015; 117366; P109
Disclosure(s): Natural Sciences and Engineering Research Council (NSERC) of Canada; CREATE Academic Rehabilitation Engineering (CARE) training program; Barbara and Frank Milligan's support of biomedical engineering research through their graduate fellowship; and the Canadian Institutes of Health Research's Healthcare, Technology and Place (HCTP) strategic initiative.
Ying Ling Lin

Abstract
Topic: Systematic Review/Meta-analysis





Ying Ling Lin, L. Kolodzey, C. Nickel, P. Trbovich, A. Guerguerian

Interdepartmental Division of Critical Care Medicine, Hospital for Sick Children, Toronto, Canada | Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Canada | Learning Institute, Hospital for Sick Children, Toronto, Canada

Introduction:

In the modern technology-driven intensive care unit (ICU), a single complex, critically ill patient can generate over a thousand individual data points per day. In addition to clinical notes, medical images, and laboratory test results, ICU clinicians must interpret immense arrays of continuous monitoring data generated by devices throughout the clinical environment, potentially resulting in information overload. Software that integrates these continuous monitoring data and displays them on a single screen may address this problem. We propose a global review of studies related to the different steps of the user-centred design cycle, divided into two parts to answer two distinct questions.



Objectives:

In Part 2 of the systematic review, we focus on quantitative studies that measure changes in clinician performance resulting from the use of displays that integrate data from continuous monitoring technologies. This part addresses the following research question: in critical care, does technology for the integration and visualization of continuous monitoring data support clinicians in their decision-making?



Methods:

In the present systematic review, guided by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement, we report current human factors studies that are user-driven, rather than technology-driven, to inform the context of use and interface design of integrating data displays, while also evaluating the impact of these displays on clinician performance.

The systematic search of the literature was conducted in five databases: three medical (Ovid, Embase, CCRCT), one engineering (Web of Science), and one psychology (PsycINFO), covering January 2000 to May 2014.

Two tools adapted from health informatics and human factors research were applied to the included studies. The first tool assessed study completeness, and the second, a modified Quality Assessment Informatics Instrument (QUASII), assessed study quality.



Results:

The systematic search returned a total of 6111 articles, of which 20 matched the relevance criteria. A second screening of the references of these 20 articles yielded a further four relevant articles. Of the resulting 24 articles, 17 studies with quantitative components were found to inform this question; these were assessed for completeness and quality using the two tools described above, and agreement of 90% or above was reached. A meta-analysis technique combined cognitive load measurements from the studies that used the validated NASA Task Load Index (NASA-TLX) instrument.
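The pooling of cognitive load measurements described above can be sketched as a standard DerSimonian-Laird random-effects meta-analysis of mean NASA-TLX differences. The study values below are purely hypothetical, for illustration only; they are not data from this review, and `random_effects` is an illustrative helper, not part of any meta-analysis package.

```python
import math

# Hypothetical per-study data: mean NASA-TLX difference (integrated display
# minus conventional display; negative = lower workload) and its variance.
studies = [
    {"d": -8.0, "var": 4.0},
    {"d": -5.0, "var": 9.0},
    {"d": -2.0, "var": 6.25},
]

def random_effects(studies):
    """DerSimonian-Laird random-effects pooled estimate."""
    w = [1.0 / s["var"] for s in studies]          # fixed-effect weights
    d = [s["d"] for s in studies]
    sw = sum(w)
    fixed = sum(wi * di for wi, di in zip(w, d)) / sw
    # Cochran's Q and between-study variance tau^2
    q = sum(wi * (di - fixed) ** 2 for wi, di in zip(w, d))
    df = len(studies) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)
    # Random-effects weights add tau^2 to each study's variance.
    w_re = [1.0 / (s["var"] + tau2) for s in studies]
    pooled = sum(wi * di for wi, di in zip(w_re, d)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

pooled, se, tau2 = random_effects(studies)
print(f"pooled difference = {pooled:.2f} "
      f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f}), "
      f"tau^2 = {tau2:.2f}")
```

A negative pooled difference with a confidence interval excluding zero would suggest the integrated display lowers measured workload; in practice such a pooling is only meaningful when, as in this review, the raw scores are obtained from the original study authors.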



Conclusion:

This systematic review of human factors studies, both qualitative and quantitative, is the first of its kind. It highlights the variety of integrating technologies, comparators, study designs, clinician-participants, and settings. In addition, as a proof of concept, meta-analysis was successfully applied to the studies for which raw data were provided by the study authors. This demonstrates the need for validated tools to draw conclusions from this body of knowledge, and it advances human factors studies toward informing the design of clinical information systems as a whole.



