Utilizing User-Centered EHR Design for Systematic Deep Brain Stimulation Data Collection


AMIA Jt Summits Transl Sci Proc. 2020; 2020: 527–532.

Published online 2020 May 30.

PMCID: PMC7233084

PMID: 32477674

Daniel Robins, MD; Martijn Figee, MD, PhD; Helen Mayberg, MD; and Joseph Finkelstein, MD, PhD


Abstract

This project aims to assess usability and acceptance of a customized Epic-based flowsheet designed to streamline the complex workflows associated with care of patients with implanted Deep Brain Stimulators (DBS). DBS patient care workflows are markedly fragmented, requiring providers to switch between multiple disparate systems. This is the first attempt to systematically evaluate usability of a unified solution built as a flowsheet in Epic. Iterative development processes were applied, collecting formal feedback throughout. Evaluation consisted of cognitive walkthroughs, heuristic analysis, and ‘think-aloud’ technique. Participants completed 3 tasks and multiple questionnaires with Likert-like questions and long-form written feedback. Results demonstrate that the strengths of the flowsheet are its consistency, mapping, and affordance. System Usability Scale scores place this first version of the flowsheet above the 70th percentile with an ‘above average’ usability rating. Most importantly, a copious amount of actionable feedback was captured to inform the next iteration of this build.

Introduction

Deep Brain Stimulation (DBS) is now an established treatment for Parkinson’s disease, essential tremor and dystonia, and it holds potential to treat numerous other neurological and neuropsychiatric disorders. This modality has seen rapid adoption accompanied by a quickly growing industry; there are multiple manufacturers of DBS devices, and each manufacturer provides its own proprietary data entry platforms, interfaces, and suggested workflows. Although DBS products may use similar data elements and track similar metrics, the totality of this data is not available in a single source. Patient visits are therefore complicated, requiring the capture of diverse, heterogeneous data about the patient as well as device specifications, performance variables, and stimulation parameters. DBS providers examine data from these devices and from ongoing non-DBS treatments, so they must switch between multiple platforms and different applications in addition to the Electronic Medical Record (EMR) to find critical information about patient care.

These complexities are multiplied by frequent patient visits, as there are numerous ongoing adjustments to optimize DBS therapy. Furthermore, since DBS is already the most frequently performed surgical procedure for the treatment of advanced Parkinson’s Disease1, the volume of patients is expected to increase. Specialized multidisciplinary care centers that focus on DBS are already being formed; in New York, The Mount Sinai Health System now hosts the Nash Family Center for Advanced Circuit Therapeutics (C-ACT) to facilitate coordinated and comprehensive care. This comprehensive treatment model is involved in all stages of DBS therapy, including patient screening, imaging-guided surgical planning, implantation surgery, intraoperative testing, therapeutic parameter selection, ongoing testing and optimization, and long-term management with ongoing medication adjustments and rehabilitation throughout the treatment trajectory for each patient.

Unfortunately, excitement about DBS as a modality is somewhat tempered, owing to its complex and fragmented workflows. Providers, including physicians, PAs, nurses, and neurosurgeons, must endure multiple points of data entry across disparate systems and even on paper. Retrieving information is equally complex, as there is no single ‘source of truth’. This presents an increased risk for errors in patient care, and it is detrimental to any underlying research. Moreover, it contributes to significant documentation-related stress, adding to frustration, decreasing job satisfaction, and ultimately leading to burnout. Usability analysis is crucial to resolve these issues in this nascent workflow.

An effective solution to streamline the process may be to create a single ‘source of truth’. As the most comprehensive patient data already exists in the EMR, we chose this to be the single point of entry and reference for our staff. There are no pre-existing tools in the EMR that fit the needs of this specialty service. This project represents the first attempt to integrate all components of the DBS workflow into a single source by following user-centered design principles throughout all iterations of development.

Methods – System Design

User-centered design principles guided development at all phases.

Initial specifications were obtained through meetings with stakeholders and one-on-one sit-downs with experts; these formative activities defined users, user needs, and basic workflows2. This feedback shaped the first functional version of the flowsheet (Figure 1).

The flowsheet was built in Epic 2019 with the standard interface of this build. A ‘table of contents’ that displays multiple groups arranged in the predicted workflow order is visible on the left side of the screen. Each group corresponds to a different component of the patient encounter; many of these groups represent the disparate systems we seek to unite into a single interface. For example, the ‘Device Settings’ groups are intended to record DBS device-specific facts such as voltage, pulse width, and impedance, which are traditionally stored on a manufacturer-supplied device. Subsequent groups represent surveys such as the UPDRS or Y-BOCS, which capture information from later in the patient encounter.

Explanations for the fields (called ‘Row Information’) are visible on the right side of the screen. Field contents can be restricted to certain values relevant to that row of the flowsheet. Some fields require the entry of free text, whereas others require users to pick an item from a predefined list.
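To make this structure concrete, the following is a minimal Python sketch of how groups, rows, and value restrictions could be modeled. The class names and example fields are hypothetical illustrations for this paper's description, not Epic's internal data model or API.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class FlowsheetRow:
    """One flowsheet row, mirroring the 'Row Information' concept."""
    name: str                                   # display name shown to the user
    description: str                            # explanation shown in the Row Information pane
    allowed_values: Optional[list[str]] = None  # restriction list; None means free text

    def accepts(self, value: str) -> bool:
        """Free-text rows accept anything; restricted rows require a listed value."""
        return self.allowed_values is None or value in self.allowed_values

@dataclass
class FlowsheetGroup:
    """One 'table of contents' group covering a component of the encounter."""
    title: str
    rows: list[FlowsheetRow] = field(default_factory=list)

# Hypothetical 'Device Settings' group with one restricted row
device_settings = FlowsheetGroup(
    title="Device Settings",
    rows=[
        FlowsheetRow("Voltage (V)", "Stimulation amplitude in volts"),
        FlowsheetRow("Pulse width (µs)", "Duration of each stimulation pulse"),
        FlowsheetRow("Impedance status", "Result of the lead impedance check",
                     allowed_values=["Normal", "High", "Low"]),
    ],
)

assert device_settings.rows[2].accepts("Normal")
assert not device_settings.rows[2].accepts("Unknown")
```

In this sketch, a row with `allowed_values=None` corresponds to a free-text field, while a populated list corresponds to the predefined-choice restriction described above.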

Methods – Study Design

The research assistant prepared a workstation already logged in to an Epic 2019 training environment with a mock patient encounter open and the flowsheet present on the screen. Upon sitting down at the workstation, the participant was presented with a standardized packet of surveys and instructions. Participants first filled out a demographic survey before reading the instruction packet and then performing 3 tasks.

Task 1 prompted the user to enter the current date into the flowsheet; this task was relatively simple and could be completed with few actions. Task 2 was more complex: the participant entered mock data into multiple fields and could refer to the on-screen row information for guidance. Task 3 expanded on this format, requiring users to find the ‘Settings’ section of the flowsheet, which cascaded open upon data entry.

Participants were instructed to ‘think aloud’ during each task, and participants’ comments were recorded by the research assistant. Participants were prompted to keep talking if they stopped.

At the completion of each task, subjects were asked to grade that task on a scale of 1 to 5 using a 3-item Task Self-Assessment survey (Figure 2), which included the following questions: 1) How difficult or easy was it to complete this task? 2) How satisfied are you with using this application/system to complete this task? 3) How would you rate the amount of time it took to complete this task?


Figure 2.

Task Self-Assessment (Post-Task Questionnaire)
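Each questionnaire item is answered on a 1-to-5 scale, and per-task means and standard deviations are reported in the Results (Table 2). As a minimal sketch of that aggregation in Python, assuming made-up responses rather than the study's raw data:

```python
from statistics import mean, pstdev

# Hypothetical 1-5 Likert responses from 12 participants for one item
# (illustrative values only, not the study's raw data)
difficulty = [5, 4, 5, 5, 4, 5, 5, 4, 5, 5, 4, 5]

# Report mean (SD) in the format used by Table 2
print(f"Difficulty: {mean(difficulty):.1f} ({pstdev(difficulty):.1f})")  # Difficulty: 4.7 (0.5)
```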

Upon completion of all tasks, participants filled out a heuristic evaluation (Figure 3), presented as a short 6-item survey based on Design Principles for Usability3, with Likert-like responses.


Figure 3.

Heuristics Evaluation Survey

Finally, participants were given an exit survey and the System Usability Scale (SUS).

Results

Twelve subjects participated in this phase of the formal usability evaluation.

All tasks were successfully completed by all 12 participants, and there were no requests for help. On average, Task 1 required the least amount of time to complete, whereas Task 3 required the most time (Table 1).

Table 1.

Task Performance Summary

Task #   Task Accomplished (%)   Help Needed (%)   Time to Accomplish (sec), Mean ± SD
Task 1   100                     0                 9.5 ± 8.5
Task 2   100                     0                 81.9 ± 19.2
Task 3   100                     0                 216.0 ± 81.1


Task Self-Assessment ratings were highest for Task 1 and lowest for Task 3 (Table 2).

Table 2.

Task Self-Assessment Averages

                 Task 1 Rating (SD)   Task 2 Rating (SD)   Task 3 Rating (SD)
Difficulty       4.7 (0.4)            4.3 (0.9)            3.4 (1.1)
Satisfaction     4.9 (0.3)            4.5 (0.9)            3.0 (1.2)
Amount of Time   4.7 (0.6)            4.4 (0.8)            3.0 (1.2)


Heuristic scores were highest for Consistency, Mapping, and Affordance; scores for Constraints were lowest (Table 3).

Table 3.

Heuristic Evaluation Averages

Heuristic     Average Score (SD)
Consistency   4.1 (0.8)
Visibility    3.8 (0.8)
Affordance    4.0 (0.8)
Mapping       4.1 (0.7)
Feedback      3.9 (1.0)
Constraints   3.0 (1.1)


The System Usability Scale (SUS) mean of 76.7 indicates ‘above average’ usability, placing this flowsheet above the 70th percentile.

Discussion

DBS patient encounters are complex, and a single visit involves capturing both device data and other neuropsychiatric data such as movement disorder scales and depression scales, acquired by multiple caregivers from various specialties. Providers’ original workflow involved switching between multiple platforms within the same visit: repeatedly entering secure login credentials, identifying the correct patient, opening the correct encounter, and then finding the relevant information and documentation. This complexity creates opportunities for errors of commission and errors of omission2, and it also contributes to EMR frustration. As clinician dissatisfaction and burnout have now reached epidemic proportions4, it is incumbent on us to perform formal usability evaluations of our tools to reduce the frustrations contributing to this crisis.

Flowsheet design began with interviews of the experts and stakeholders during one-on-one sit-downs. These encounters outlined the extent of the data that needed to be captured in a single source. Since DBS patient care involves an interdisciplinary team across multiple settings, the flowsheet would need to capture extraordinarily diverse data. Multiple mock-ups and static images were brought to the sit-down sessions, and these were useful in shaping the overall design and feature set. These versions became the model for the first functional flowsheet.

To best capture actionable feedback about this first version, a structured evaluation with both quantitative and qualitative elements was applied. Testing took place in a Playground environment that functionally matched the Production version. Study participants were chosen from the pool of clinicians and support staff who regularly need to access, record, and retrieve information in patients’ health records.

Cognitive walkthroughs were chosen since they evaluate how well the interface supports “exploratory learning,” or how well a first-time user can perform a task without formal training; they may uncover errors in design that interfere with task completion, and they explain mismatches between the users’ and the designers’ conception of a task5. When coupled with the think-aloud technique, this combination provided copious feedback.

Tasks were selected for scenario-based testing and arranged in order of increasing complexity, appropriate for the end-user role of a clinician interviewing a patient. The first task was relatively simple, requiring few actions and little time, whereas subsequent tasks required the user to fill in multiple fields and to explore distant parts of the flowsheet. Task 3 required users to enter information in a particular sequence to trigger a cascade: certain rows that were hidden to reduce visual clutter would appear once users entered data in a select field. More complicated tasks were expected to require more time to complete, and this matches the Task Performance Summary results; the longer tasks also offered more time for think-aloud feedback.
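The cascade behavior can be viewed as conditional row visibility: rows stay hidden until a designated trigger field receives data. The Python sketch below illustrates that logic under that assumption; the rule table and field names are hypothetical, and this is not the mechanism Epic implements internally.

```python
# Hypothetical cascade rule: rows stay hidden until a trigger field is filled.
CASCADE_RULES = {
    # trigger row -> rows revealed once the trigger holds a value
    "Stimulation Mode": ["Voltage (V)", "Pulse width (µs)", "Frequency (Hz)"],
}

def visible_rows(base_rows: list[str], entered: dict[str, str]) -> list[str]:
    """Return the base rows plus any cascade rows whose trigger has data."""
    rows = list(base_rows)
    for trigger, hidden_rows in CASCADE_RULES.items():
        if entered.get(trigger):        # trigger field has a value
            rows.extend(hidden_rows)    # reveal the hidden rows
    return rows

# Before data entry only the trigger row shows; entering a value cascades open the rest.
print(visible_rows(["Stimulation Mode"], {}))
print(visible_rows(["Stimulation Mode"], {"Stimulation Mode": "Monopolar"}))
```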

Think-aloud feedback captured from participants during their tasks yielded multiple action points (Table 4).

Table 4.

Think-aloud feedback

Think-aloud Feedback                              Action to Resolve
Table of contents order does not match workflow   Change group order and nesting
List of selection choices incomplete              Add list entries for relevant selections
Incorrect field names                             Update display names for rows
Redundant fields                                  Consolidate rows where applicable
Row descriptions do not show all choices          Make row descriptions more verbose
Fields out of order                               Rearrange row order
Fields not present                                Add missing rows
Row restrictions missing or inappropriate         Update row restrictions


Although the general workflow was outlined prior to development during formative sit-downs, clinical staff quickly discovered avenues for further optimization.

Users were particularly sensitive to row information (Figure 4) and row restrictions, when applicable.


Figure 4.

Row Information

Heuristic evaluation also offered valuable insight into the users’ experience with the flowsheet interface. This inspection method was chosen since it may prospectively uncover both major and minor problems with the user interface, indicate their severity, and, importantly, suggest solutions. Additionally, it may capture issues that are missed by other methods6. For this study, a quick 6-item survey based on Norman’s Design Principles of Usability was provided to participants. The lowest scores, for ‘Constraints’ and ‘Visibility’, emphasize the importance of building row restrictions (Figure 5) and an orderly table of contents, respectively. The high scores for Consistency, Mapping, and Affordance reflect the strengths of this flowsheet interface.


Figure 5.

Row Restrictions

System Usability Scale scores were normalized and calculated in the usual way (Figure 6). The SUS average of 76.7 places this first iteration of the flowsheet above the 70th percentile. While this confers an ‘above average’ usability rating, it may simply reflect Epic 2019’s general flowsheet design. Users are already familiar with basic flowsheets and how to navigate them. However, this serves as a useful baseline for future builds, which will need to accommodate more functionality and capture more complex data.
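For reference, the “usual way” is the standard SUS scoring procedure: each of the ten items is answered on a 1–5 scale, odd-numbered (positively worded) items contribute (response − 1), even-numbered (negatively worded) items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score. A minimal Python sketch, using made-up responses rather than study data:

```python
def sus_score(responses: list[int]) -> float:
    """Standard SUS scoring for one participant's ten 1-5 responses."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # items 1,3,5,7,9 vs. items 2,4,6,8,10
        for i, r in enumerate(responses)
    ]
    return 2.5 * sum(contributions)

# Made-up responses (not study data); yields a score near the reported mean
print(sus_score([4, 2, 4, 1, 4, 2, 4, 2, 4, 2]))  # 77.5
```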


Figure 6.

System Usability Scale (SUS) Score Distribution

Overall, the results from this first version of the flowsheet highlight the importance of following formal usability practices during design, such as beginning with a formative evaluation and then engaging the target audience for structured feedback. Baseline metrics and actionable data are the crucial results from this stage of development. Quantitative and qualitative measures (the heuristic evaluation and the Think-aloud, in particular) provided valuable insights that will lead to a redesign of the flowsheet.

Conclusion

This is the first attempt to systematically evaluate the usability of an integrated DBS flowsheet. Now that structured, actionable feedback has been obtained, this information will be used to create a mature digital workflow that best fits the needs of DBS patients and the patient care team.


References

1. Fasano A, Daniele A, Albanese A. Treatment of motor and non-motor features of Parkinson's disease with deep brain stimulation. Lancet Neurol. 2012;11(5):429.

2. HIMSS EHR Usability Task Force. Defining and Testing EMR Usability: Principles and Proposed Methods of EMR Usability Evaluation and Rating. 2009.

3. Norman D. The Design of Everyday Things. New York: Basic Books; 1988.

4. Shanafelt TD, Hasan O, Dyrbye LN, Sinsky C, Satele D, Sloan J, West CP. Changes in Burnout and Satisfaction With Work-Life Balance in Physicians and the General US Working Population Between 2011 and 2014. Mayo Clin Proc. 2015 Dec;90(12):1600–13. doi: 10.1016/j.mayocp.2015.08.023.

5. Wharton C, Rieman J, Lewis C. The cognitive walkthrough method: A practitioner's guide. In: Nielsen J, Mack RL, editors. Usability Inspection Methods. New York: John Wiley & Sons; 1994. pp. 105–40.

6. EHR Usability Toolkit: A Background Report on Usability and Electronic Health Records. AHRQ Publication No. 11-0084-EF; August 2011.

