LEADER |
04310nam a2200313Ia 4500 |
001 |
10.2196-35032 |
008 |
220630s2022 CNT 000 0 und d |
022 |
|
|
|a 2292-9495 (ISSN)
|
245 |
1 |
0 |
|a An Electronic Data Capture Tool for Data Collection During Public Health Emergencies: Development and Usability Study
|
260 |
|
0 |
|b JMIR Publications Inc.
|c 2022
|
520 |
3 |
|
|a Background: The Discovery Critical Care Research Network Program for Resilience and Emergency Preparedness (Discovery PREP) partnered with a third-party technology vendor to design and implement an electronic data capture tool that addressed multisite data collection challenges during public health emergencies (PHEs) in the United States. The aim of the work was to design an electronic data capture tool and to prospectively gather usability data from bedside clinicians during national health system stress queries and influenza observational studies. Objective: The aim of this paper is to describe the lessons learned in the design and implementation of a novel electronic data capture tool, with the goal of significantly increasing the nation’s capability to manage real-time data collection and analysis during PHEs. Methods: A multiyear, multiphase design approach was taken to create an electronic data capture tool, which was used to pilot rapid data capture during a simulated PHE. Following the pilot, the study team retrospectively assessed the feasibility of automating the data captured by the electronic data capture tool directly from the electronic health record. In addition to user feedback gathered during semistructured interviews, the System Usability Scale (SUS) questionnaire was used to evaluate the usability and performance of the electronic data capture tool. Results: Participants included Discovery PREP physicians, their local administrators, and data collectors from tertiary-level academic medical centers at 5 institutions. User feedback indicated that the designed system had an intuitive user interface and could be used to automate study communication tasks, making for more efficient management of multisite studies. SUS questionnaire results classified the system as highly usable (SUS score 82.5/100). 
Automation of 17 (61%) of the 28 variables in the influenza observational study was deemed feasible during the exploration of automated versus manual data abstraction. The creation and use of the Project Meridian electronic data capture tool identified 6 key design requirements for multisite data collection: (1) scalability irrespective of the type of participant; (2) a common data set across sites; (3) automated back-end administrative capability (eg, reminders and a self-service status board); (4) multimedia communication pathways (eg, email and SMS text messaging); (5) interoperability and integration with local site information technology infrastructure; and (6) natural language processing to extract nondiscrete data elements. Conclusions: The use of the electronic data capture tool in multiple multisite Discovery PREP clinical studies proved the feasibility of using the novel, cloud-based platform in practice. The lessons learned from this effort can inform ongoing global multisite data collection efforts during the COVID-19 pandemic and help transform current manual data abstraction approaches into reliable, real-time, automated information exchange. Future research is needed to expand the ability to perform automated multisite data extraction during a PHE and beyond. © Joan Brown, Manas Bhatnagar, Hugh Gordon, Jared Goodner, J Perren Cobb, Karen Lutrick.
|
650 |
0 |
4 |
|a clinical research design
|
650 |
0 |
4 |
|a design tenet
|
650 |
0 |
4 |
|a disaster management
|
650 |
0 |
4 |
|a EDCT
|
650 |
0 |
4 |
|a electronic data
|
650 |
0 |
4 |
|a electronic data capture
|
650 |
0 |
4 |
|a informatics
|
650 |
0 |
4 |
|a public health emergencies
|
650 |
0 |
4 |
|a public health emergency
|
650 |
0 |
4 |
|a real time data
|
700 |
1 |
0 |
|a Bhatnagar, M.
|e author
|
700 |
1 |
0 |
|a Brown, J.
|e author
|
700 |
1 |
0 |
|a Cobb, J.P.
|e author
|
700 |
1 |
0 |
|a Goodner, J.
|e author
|
700 |
1 |
0 |
|a Gordon, H.
|e author
|
700 |
1 |
0 |
|a Lutrick, K.
|e author
|
773 |
|
|
|t JMIR Human Factors
|
856 |
|
|
|z View Fulltext in Publisher
|u https://doi.org/10.2196/35032
|
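For downstream processing, a record in this pipe-delimited display layout can be parsed back into tag/indicator/subfield triples. The sketch below is a minimal, hand-rolled parser for this particular export format only; the function name `parse_display_record` and the assumed cell layout (a tag cell, optional one-character indicator cells, `|x value` subfield cells, and bare `|` separators) are assumptions about this display, not part of any standard MARC API. Real ISO 2709 MARC data would instead be read with a proper library such as pymarc.

```python
import re

def parse_display_record(text: str) -> list[dict]:
    """Parse a pipe-delimited MARC display record into a list of
    {tag, indicators, subfields} dicts.

    Assumed layout (matching the record above): each cell is on its own
    line and may end with a '|' separator; a cell holding a 3-digit tag
    or 'LEADER' starts a new field; one-character cells are indicators;
    cells beginning '|x ' are subfields; anything else is a raw value
    (e.g. the leader string or a control-field value), stored under '_'.
    """
    fields: list[dict] = []
    current = None
    for raw in text.splitlines():
        # Drop the trailing cell separator and surrounding whitespace.
        cell = raw.strip().rstrip("|").strip()
        if re.fullmatch(r"\d{3}|LEADER", cell):
            # New field tag.
            current = {"tag": cell, "indicators": [], "subfields": []}
            fields.append(current)
        elif current is None or not cell:
            continue  # bare '|' separator or leading debris
        elif re.match(r"\|[a-z0-9] ", cell):
            # Subfield cell like '|a An Example Title'.
            current["subfields"].append((cell[1], cell[3:]))
        elif len(cell) == 1:
            # Single-character indicator cell.
            current["indicators"].append(cell)
        else:
            # Raw value: leader string or control-field (001/008) data.
            current["subfields"].append(("_", cell))
    return fields
```

With this sketch, `parse_display_record("245 |\n1 |\n0 |\n|a An Example Title\n|\n")` yields one field with tag `245`, indicators `["1", "0"]`, and subfield `("a", "An Example Title")`.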