  • Research article
  • Open access

Development, implementation and user experience of the Veterans Health Administration (VHA) dialysis dashboard

Abstract

Background

Adults with end-stage renal disease (ESRD) requiring chronic dialysis continue to suffer from poor health outcomes and represent a population rightfully targeted for quality improvement. Electronic dashboards are increasingly used in healthcare to facilitate quality measurement and improvement. However, detailed descriptions of the creation of healthcare dashboards are uncommonly available, and formal inquiry into the perceptions, satisfaction, and utility reported by clinical users has rarely been conducted, particularly in the context of dialysis care. Therefore, we characterized the development, implementation, and user experience of the Veterans Health Administration (VHA) dialysis dashboard.

Methods

A clinical-quality dialysis dashboard was implemented, which displays clinical performance measures (CPMs) for Veterans with ESRD receiving chronic hemodialysis at all VHA facilities. Data on user experience and perceptions were collected via an e-mail questionnaire to dialysis medical directors and nurse managers at these facilities.

Results

Since 2016, the dialysis dashboard has reported monthly on CPMs for approximately 3000 Veterans receiving chronic hemodialysis across 70 VHA dialysis facilities. Of 141 dialysis medical directors and nurse managers, 61 completed the questionnaire. Sixty-six percent of respondents did not find the dashboard difficult to access, 64% agreed that it is easy to use, 59% agreed that its layout is good, and the majority agreed that presentation of data is clear (54%), accurate (56%), and up-to-date (54%). Forty-eight percent of respondents indicated that it helped them improve patient care while 12% did not. Respondents indicated that they used the dialysis dashboard for clinical reporting (71%), quality assessment/performance improvement (QAPI) (62%), and decision-making (23%).

Conclusions

Most users of the VHA dialysis dashboard found it accurate, up-to-date, easy to use, and helpful in improving patient care. It meets diverse user needs, including administrative reporting, clinical benchmarking and decision-making, and quality assurance and performance improvement (QAPI) activities. Moreover, the VHA dialysis dashboard affords national-, regional- and facility-level assessments of quality of care, guides and motivates best clinical practices, targets QAPI efforts, and informs and promotes population health management improvement efforts for Veterans receiving chronic hemodialysis.


Background

The delivery of high quality and cost-effective chronic dialysis care for adults with end-stage renal disease (ESRD) remains a daunting challenge as evidenced by stagnant poor health outcomes and high costs across the United States (U.S.) [1, 2]. The prevalence of ESRD among Veterans is disproportionately high and approximately double that of non-Veterans, owing in part to high rates of predisposing comorbid illnesses, including diabetes mellitus, hypertension, older age and other sociodemographic risk factors [3]. Like their non-Veteran counterparts, Veterans receiving chronic hemodialysis experience an annual mortality rate greater than 15%, spend more than 30 days hospitalized annually, and have annual mean total healthcare costs approaching $140,000 [3, 4]. Such poor outcomes underscore the critical need to improve quality of chronic dialysis care.

Quality measurement for chronic dialysis care is well established in the U.S. Beginning in the 1990’s, the Centers for Medicare and Medicaid Services (CMS) required ESRD Networks to monitor and improve the quality of dialysis care. Over the past decade, CMS has iteratively developed and refined a variety of clinical performance measures (CPMs) for ESRD for public reporting (e.g., Dialysis Facility Compare) and value-based purchasing (e.g., Quality Incentive Program), using clinical practice guidelines as an initial framework but extending beyond consensus guidelines. Although there is little formal evidence that use of quality measures improves healthcare quality [5], they have become an essential facet of the healthcare environment [6].

To further improve quality of healthcare, the U.S. government has prioritized and devoted substantial resources to promoting adoption of electronic health records and more broadly spurring on health information technology in medicine as illustrated by the Health Information Technology for Economic and Clinical Health Act in 2009 [7]. One such example is a healthcare dashboard, which is a data visualization tool designed to provide summary data on performance, as compared with standard metrics, in a clear, timely, and efficient manner [8, 9]. Healthcare dashboards have been increasingly adopted for a variety of purposes including decision support, monitoring quality and safety improvement efforts, optimizing clinical management systems, surveillance of adverse clinical events, financial reporting for clinical units, and for academic detailing of clinical specialists to remedy suboptimal performance [8,9,10,11,12].

Surprisingly, formal inquiry into the impact of dashboards on end-users in the healthcare environment has been rarely conducted [8]. While dashboards may facilitate improvements in adherence to clinical guidelines and safety measures, resource optimization and efficiency, and staff accountability and engagement, they may also entail unintended consequences such as greater staff workload, disruptions to clinical workflow, and improper focus and use of quality measures [8, 10, 11, 13,14,15,16]. Few studies have examined these potential tradeoffs by characterizing the creation, implementation, and use of healthcare dashboards [9,10,11,12, 14, 16, 17], and only two evaluated user perceptions and satisfaction [18, 19]. Moreover, although considerable heterogeneity exists in the clinical foci and settings of previously described dashboards, none focused on kidney disease [8]. Because of their unusually high costs and poor outcomes, adults with ESRD requiring chronic dialysis are a highly deserving clinical population for quality improvement [1], which could be facilitated by dashboard reporting.

The Veterans Health Administration (VHA) has been a leader in healthcare quality and leveraging quality measurement and health information technology [20,21,22]. In 2012, the VHA National Kidney Program began a process of leveraging multiple kidney disease and dialysis informatics initiatives to build an internal kidney disease surveillance system, which included the development of a VHA dialysis dashboard whose primary goal was to guide and motivate high quality care by providing national and facility level assessments of the quality of chronic dialysis care across VHA. In this manuscript, we report on the development, implementation, and user experience with the VHA dialysis dashboard.

Methods

We employed a four-phase process to create a national VHA dialysis dashboard for capturing and reporting CPMs for approximately 3000 Veterans with ESRD receiving chronic hemodialysis at 70 VHA dialysis facilities across the U.S. (Fig. 1). Creation of the dialysis dashboard began in 2013 and by 2016 there was 100% dialysis facility participation. The primary expected user population consists of VHA nephrologists and dialysis staff (e.g., nurses, technicians).

Fig. 1 Development Process Used for the VHA Dialysis Dashboard

Analysis

The analysis phase involved establishing goals, forming a dashboard team, and selecting CPMs. There was broad consensus amongst VHA specialty care clinicians that the overarching goal was to report quality of VHA dialysis care through CPMs on an electronic dashboard that would be accessible to diverse stakeholders for patient care. A VHA dialysis dashboard team was assembled to serve as a steering committee. The team attended a dialysis summit meeting in November 2012 with representatives from various program offices that included the VHA National Program for Kidney Disease and Dialysis, VA Center for Applied System Engineering (VA CASE), VA Inpatient Evaluation Center (IPEC), VA Allocation Resource Center (ARC), VA Support Services Center (VSSC), VA Decision Support System (DSS), VA Patient Care Services (PCS), VA Health Services Research and Development (HSR&D), and VA Office of Information and Analytics (OIA). Early in this process, a CPM subcommittee with representation from leadership, operations, clinical services, and research was assembled to review, discuss, and select CPMs for this evaluation. The diversity of this stakeholder committee was deemed essential to achieve broad consensus for CPMs and ensure acceptance in the field.

Selecting ESRD CPMs for inclusion in the VHA dialysis dashboard consisted of an 8-month-long transparent, stakeholder-driven process. The first several meetings focused on establishing the principles to guide CPM selection. In consideration of known principles guiding selection criteria for performance measures, the CPM subcommittee sought to include measures that were timely, clinically relevant to chronic hemodialysis care, informed by solid evidence, and feasible and usable in VHA to enable robust improvement in the quality of VHA ESRD care [23]. The subcommittee desired a mix of process and outcome measures that address different aspects of ESRD care [24]. Like other large healthcare reporting systems, the committee decided to use only explicit measures. Additionally, because the main purpose of the VHA dialysis dashboard is quality assessment and improvement at the facility level, only facility-level measures were chosen. Finally, because the initial focus of the VHA dialysis dashboard would be benchmarking to community performance, only mature, vetted measures were considered.

The next several meetings of the performance measure subcommittee focused on determining sources for CPMs. The subcommittee agreed to adopt CPMs from those already established and promulgated as mature measures by leading organizations instead of developing and vetting measures de novo, because these measures had already gained acceptance, would afford benchmarking, and, lastly, because of internal resource constraints. An environmental scan revealed five key sources for the potential ESRD CPMs: CMS, the American Medical Association-Physician Consortium for Performance Improvement (AMA-PCPI), the Kidney Care Quality Alliance (KCQA), and organizations that review, endorse, and adopt measures (the National Quality Forum (NQF) and the Ambulatory Care Quality Alliance (AQA)). A total of 78 measures from these organizations were identified, spanning broad ESRD clinical domains such as mortality, hospitalization, dialysis adequacy, vascular access, anemia, bone and mineral metabolism, infection, immunization, transplant, and end-of-life planning (Supplementary Table 1).

The last component of this process was a series of teleconference discussions and interval web-based voting and commenting by members of the subcommittee on these 78 CPMs, which lasted several months. Ultimately, 11 mature facility-level CPMs were selected out of the initial 78 measures identified from reference organizations. The final performance measures encompass dialysis solute clearance, vascular access, and surveillance measures of anemia management, bone and mineral metabolism, infection, and immunization (Table 1).

Table 1 VHA Dialysis Dashboard Clinical Performance Measures

Design

Concurrently, the technical subcommittee investigated the availability of VHA data for the CPMs. They determined that gathering the extensive data needed would require not only leveraging the VHA electronic medical record and corporate data warehouse (CDW) but also creating a web-based application for field reporting. Significant human resources and close collaboration across VA program offices and service lines were required to design the dashboard, carried out through coordinated teleconferences over a 12-month period. The CPM subcommittee detailed specifications of the required data elements for each CPM, including acceptable ranges for laboratory values and proper calculation of derived values. For each selected CPM, the CPM subcommittee specified criteria for patient eligibility and inclusion/exclusion from the numerator and denominator. A patient was eligible for inclusion in the CPM denominator if the patient had been receiving in-center outpatient hemodialysis treatments at the facility for at least 30 days as of the last day of the study period (e.g., calendar month). Secondary filters (e.g., ESRD duration, number of HD sessions in the calendar month) were applied when applicable, in accordance with the technical specifications of the source from which the measure was adopted (e.g., NQF, CMS QIP, KCP). The subcommittee found that numerator and denominator specifications for established CPMs were often incomplete or insufficient for implementation in healthcare systems. Therefore, additional details were sought from available technical specification documents of measure developers, and/or reasonable standards were adopted based on similar measures used elsewhere. Specifications and eligibility naturally impacted CPM reporting and results. For example, a few CPMs (e.g., minimum delivered hemodialysis dose) imposed an additional requirement of ≥90 days since initiating maintenance HD for denominator inclusion. Also, some CPMs (e.g., monthly measurement of hemodialysis dose) required a patient to have ≥7 outpatient HD treatments at the dialysis facility during the calendar month for inclusion (in concurrence with CMS QIP criteria at that time), while others (e.g., maximizing AVFs) required ≥4 outpatient hemodialysis treatments during the calendar month. The time period for numerator specifications also varied across CPMs: the majority required monthly assessment of whether patients met numerator criteria, but a few, such as the hypercalcemia CPM, required a 3-month rolling average assessment.
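The layered eligibility logic described above can be sketched as a simple filter. This is only an illustrative sketch: the record fields and function names are hypothetical, and the actual dashboard followed the measure developers' technical specifications rather than this simplified code.

```python
from dataclasses import dataclass

@dataclass
class PatientMonth:
    """Illustrative per-patient record for one reporting month (hypothetical fields)."""
    days_at_facility: int           # days on in-center outpatient HD at this facility
    days_since_hd_start: int        # days since initiating maintenance HD
    outpatient_hd_sessions: int     # outpatient HD treatments this calendar month

def eligible_for_denominator(p: PatientMonth,
                             min_days_on_hd: int = 0,
                             min_sessions: int = 0) -> bool:
    """Primary filter: >=30 days at the facility as of the last day of the month;
    secondary filters (HD vintage, session count) vary by CPM."""
    if p.days_at_facility < 30:
        return False
    if p.days_since_hd_start < min_days_on_hd:
        return False
    if p.outpatient_hd_sessions < min_sessions:
        return False
    return True

# Example: a minimum-delivered-dose-style CPM adds >=90 days on maintenance HD
# and >=7 outpatient sessions in the month (per CMS QIP criteria at that time).
p = PatientMonth(days_at_facility=120, days_since_hd_start=200,
                 outpatient_hd_sessions=12)
print(eligible_for_denominator(p, min_days_on_hd=90, min_sessions=7))  # True
```

A measure such as maximizing AVFs would instead pass `min_sessions=4`, reusing the same primary filter with a different secondary threshold.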

The technical subcommittee investigated the specific data elements needed for the CPMs and designed the general dashboard format based on technology available in the VSSC. Analogous to the experience of others with national dashboards [10], significant variation in laboratory test names existed among VA facility laboratories across the United States. Although VHA is a national integrated healthcare system with a national data repository, laboratory test names are locally determined. To standardize the laboratory categories and facilitate extraction from VA electronic data warehouses, the dashboard team needed to correctly identify and map the laboratory names to standardized categories. Moreover, the team must continuously monitor local test names as new tests are made available within the system, and the mapping program must be updated to reflect these changes. We leveraged strategies and standards noted by others who have worked with VA data [25], including using the Logical Observation Identifiers Names and Codes (LOINC) framework to correctly structure and organize the data for abstraction. Methods for data mining and for synchronized nightly extraction of data elements from the various VA data sources were specified via SQL queries. To supplement the elements extractable from existing electronic databases, a web application was constructed to provide an interface for dialysis clinical staff to report data that were otherwise unavailable.
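The local-to-standard laboratory name mapping might look like the following minimal sketch. The mapping table and test names here are invented for illustration; a production version would key on LOINC codes where available and, as the text notes, surface unmapped local names for ongoing manual review.

```python
from typing import Optional

# Hypothetical mapping from locally determined laboratory test names to the
# standardized categories consumed by the dashboard's CPM calculations.
LOCAL_TO_STANDARD = {
    "HEMOGLOBIN": "hemoglobin",
    "HGB": "hemoglobin",
    "HEMOGLOBIN (POC)": "hemoglobin",
    "CALCIUM": "calcium",
    "CA, SERUM": "calcium",
}

def standardize(local_name: str) -> Optional[str]:
    """Return the standardized category, or None so that new local test
    names can be flagged for LOINC-based mapping review."""
    return LOCAL_TO_STANDARD.get(local_name.strip().upper())

# Names without a mapping are collected for the team's continuous monitoring.
unmapped = []
for name in ["Hgb", "Ca, Serum", "IONIZED CALCIUM"]:
    if standardize(name) is None:
        unmapped.append(name)
print(unmapped)  # ['IONIZED CALCIUM']
```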

The resultant web-based dashboard design was iteratively reviewed by human factors experts, who applied usability heuristics and recommended format changes [26]. The team made several key design decisions, including:

  • Numerical values achieve the goal of reporting CPM values better than charts [27]

  • Including a goal of ‘H’ (i.e., high) or ‘L’ (i.e., low), or a value where available, close to the actual values for each measure allows for comparisons [28]

  • Presentation of historical values allows for local trend analysis [27]

  • Use of color:

    • Horizontal and vertical color shading separates CPM and historical groupings to reduce conjunction search time [29]

    • Muted colors were selected for their non-distracting appeal

    • Color was not used to indicate CPM value proximity to goal because it would overwhelm end-users [30]

  • Linking to a file containing data definitions and to an e-mail to the help desk if needed (help and documentation heuristic and a VSSC standard)

  • Allowing export to other applications if needed (flexibility and efficiency of use heuristic and a VSSC standard)

Implementation

The implementation phase of the VHA dialysis dashboard included pilot testing, a coordinated series of national rollout calls, and user acceptance testing. First, several VA dialysis facilities volunteered for pilot testing, which involved validating eligible patient identification and the extracted laboratory and medication data displayed on the dashboard. This process was critical in improving and refining the use of electronically abstracted data (e.g., administrative codes, laboratory values) for CPM calculation and patient identification. Identifying eligible patients from electronic records who met inclusion criteria was difficult because of issues such as patient vacations, interfacility transfers, and hospitalizations. Therefore, iterative modifications to the selection algorithms were required, with validation by these facilities. The pilot testing also allowed users to provide feedback regarding the web application for data entry. Second, a series of formal training sessions and rollout calls were held via teleconference with dialysis facility managers and the VHA dialysis dashboard team, which were critical for introducing the dashboard and receiving feedback. These sessions allowed for a detailed description of the functionality of the dashboard, including links to data definitions and patient-level data, ways to export data for reports, and help desk links. Data security and dashboard access policies were reviewed at these sessions because of the importance of protecting patient confidentiality. All stakeholders were reminded of VHA policies regarding patient identifying information (PII) and patient health information (PHI), including the required training for, and regular review of, all who access these data in conjunction with their job description and role in VHA. Moreover, access to the dashboard was subsequently enforced through network security measures. Finally, in collaboration with a usability lab, the VHA dialysis dashboard offered an extended period of user acceptance testing that was key to promoting an open forum for feedback and further validation from the field.

The final dashboard displays VHA dialysis facility performance on the CPMs in table form. It affords the user the ability to compare VHA national and facility performance with CMS benchmarks, other VHA facilities, and the VHA national average (Fig. 2). It displays details of each measure (numerators and denominators) and provides a comprehensive description when a measure is selected. Furthermore, the user may drill down to patient-level data for each CPM to explicitly determine the inputs for the performance calculation. Figure 2a shows a portion of the dashboard format for national summary data for each CPM, while Fig. 2b shows the more detailed facility format and Fig. 2c shows the patient-level view. The user logon filters the visibility of facility and patient data.

Fig. 2 a VHA Dialysis Dashboard: National Level Data. b VHA Dialysis Dashboard: Facility Level Data. c VHA Dialysis Dashboard: Patient Level Data

Maintenance

During the maintenance phase, while the field used the dashboard, the development team constructed a questionnaire to evaluate the constructs of Perceived Ease of Use and Usefulness as defined in the Technology Acceptance Model (TAM) [31]. According to TAM, these constructs form the basis for actual system use (Fig. 3).

Fig. 3 The Technology Acceptance Model

The questionnaire consisted of three parts: seven questions about demographics and professional experience, seven closed- and open-ended questions about use and application of the dashboard, and fifteen questions to assess perceptions of the dialysis dashboard on a Likert scale from one to five, representing ‘strongly disagree’ to ‘strongly agree’. A mix of negative and positive questions was used to reduce acquiescence bias [32]. Approximately 6 months after full implementation, an e-mail was sent to VHA nurse managers and medical directors at all 70 VHA outpatient dialysis clinics inviting them to participate in a confidential self-administered questionnaire (Additional file 3). This e-mail contained a short paragraph explaining the purpose of the questionnaire, which was to determine the role and usefulness of the VHA dialysis dashboard by assessing end-users’ use, satisfaction, and application of it to Veteran care. The survey was closed to end user respondents 6 weeks after the invitation distribution.
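Scoring a questionnaire that mixes positively and negatively worded items requires reverse-coding the negative items before aggregation, so that higher scores consistently indicate a favorable view. The item keys below are illustrative placeholders, not the actual survey items.

```python
# Illustrative 1-5 Likert responses (1 = strongly disagree, 5 = strongly agree).
# Negatively worded items (e.g., "the dashboard is difficult to access") are
# reverse-coded: a 1-5 score x becomes 6 - x.
NEGATIVE_ITEMS = {"difficult_to_access", "not_useful"}  # hypothetical item keys

def reverse_code(responses: dict) -> dict:
    """Return responses with negatively worded items reverse-coded."""
    return {item: (6 - score if item in NEGATIVE_ITEMS else score)
            for item, score in responses.items()}

resp = {"easy_to_use": 4, "difficult_to_access": 2, "not_useful": 1}
scored = reverse_code(resp)
print(scored)  # {'easy_to_use': 4, 'difficult_to_access': 4, 'not_useful': 5}
```

After reverse-coding, agreement percentages can be tabulated uniformly across all fifteen perception items.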

Results

Participant characteristics

Of the 141 VHA dialysis medical directors and nurse managers who were invited, 61 (43%) responded. Among the 61 respondents, 55 were aware of the VHA dialysis dashboard and had accessed it, 3 were aware of it but had not accessed it, and 3 were not aware of its availability. The final analytic cohort consisted of the 52 respondents who were aware of the VHA dialysis dashboard, had accessed it, and completed at least a portion of the questionnaire.

More than two-thirds of the respondents were 41 to 60 years of age (71%), female (69%), and nurse managers (73%) (Table 2). Nearly all respondents (88%) had been in their position for at least 1 year. Approximately one-half of respondents work with computers for more than 6 h per workday, three-quarters (73%) characterize their computer skills and experience as average, and 65% characterize their skill and experience with medical information systems as average. A substantial majority of respondents (77%) use the VHA dialysis dashboard on a monthly or weekly basis.

Table 2 Characteristics of Respondents to VHA Dialysis Dashboard Survey (n = 52)

VHA Dialysis dashboard and clinical management

Dialysis staff indicated the predominant uses of the dialysis dashboard included clinical reporting (71%) and quality assessment and improvement (62%) (Table 3). They also used the dashboard data for clinical decision making (23%) and for comparisons with their own internal data and comparisons of their facility’s performance with other VA facilities. Sixty-five percent of respondents indicated that they used the dashboard with other applications including the Centers for Medicare and Medicaid Services (CMS) CROWNWeb national ESRD patient registry and quality measure reporting system, the VHA Medical Director toolkit, and National Kidney Foundation patient education tools.

Table 3 Uses of the VHA Dialysis Dashboard in Clinical Management (n = 52)

Participant perceptions and experience with the VHA Dialysis dashboard

Sixty-six percent of respondents do not find the dashboard difficult to access, and 64% agree that the dashboard is easy to use (Table 4). Approximately one-half of respondents agreed that the layout of the dialysis dashboard is good (59%) and that the presentation of data on the dialysis dashboard is clear (54%) and organized (48%), while few disagreed with these characterizations (≤ 20%). A substantial plurality of respondents (41%) did not find it difficult to find data on the dialysis dashboard. Slightly more than 50% of respondents agreed that data for their facility on the dialysis dashboard are accurate and up-to-date. Few respondents (12%) agreed with the statement that the dialysis dashboard is not useful in taking care of dialysis patients, while 48% indicated that it had helped them improve the care of dialysis patients. More respondents were satisfied (39%) than not satisfied (25%) with the dashboard’s design, and nearly 50% indicated that helpful answers to questions and support for the dashboard are sufficient. Additional written comments in the space provided on the questionnaire included three respondents who stated that the dialysis dashboard is a ‘nice’ or ‘excellent tool’ and two respondents who desired more ‘up-to-date’ or ‘timely’ data on the dashboard. In subgroup analysis, responses among nurse managers/others and physician medical directors did not demonstrate substantial differences and revealed a high level of thematic convergence (Supplementary Table 2).

Table 4 Level of Agreement with Statements Regarding the VHA Dialysis Dashboard (n = 52)

VHA Dialysis dashboard clinical performance measure results

During the 12-month period following full implementation of the VHA dialysis dashboard (i.e., the end of 2016 until the end of 2017), reporting of process measures (e.g., measurement of delivered hemodialysis dose, assessment of iron stores, measurement of serum calcium) on the dashboard revealed consistently high performance across all facilities with rather small performance gaps requiring improvement. In contrast, reporting of several intermediate outcome measures (e.g., placement of AVFs, use of chronic dialysis catheters) revealed a consistent performance gap across most facilities and a substantial variation in performance across facilities.

Discussion

Since initial implementation, the VHA dialysis dashboard has reported monthly clinical performance for 11 quality measures for approximately 3000 Veterans at 70 VHA maintenance HD facilities across the United States. This electronic dashboard is accessible by VHA operations and clinical dialysis staff, and is being used for administrative reporting as well as quality assurance and performance improvement (QAPI) activities in the important clinical areas of dialysis adequacy, vascular access, anemia, bone and mineral metabolism, and infection surveillance. Among VHA dialysis facility medical directors and nurse managers using the dialysis dashboard 1 year after its implementation with 100% facility participation, the majority found it accurate, up-to-date, easy to use, and helpful in improving the care of their patients. Nearly two-thirds of these users not only used the dialysis dashboard for clinical reporting and QAPI activities but also in combination with other clinical applications to enhance patient care.

As recently noted, healthcare organizations are increasingly implementing quality or clinical dashboards to improve patient care [8]. Similar to the electronic medical record, healthcare dashboards will continue to have a growing role in healthcare delivery, and healthcare providers will need to integrate them into day-to-day care delivery. Nevertheless, only 14 published articles have formally evaluated the impact of dashboards on clinical outcomes or examined clinician perceptions of dashboards and their utility [8,9,10,11]. Among these studies, 12 focused on single departments or facilities with relatively small sample sizes [8], and only one described a quality dashboard [18]. While clinical dashboards provide timely clinical data to clinicians so that they can make informed daily decisions about patient care, quality dashboards display information on performance measures at a unit or organizational level to assist administrators and managers with decision-making [8].

Although considerable heterogeneity exists in the clinical foci and settings of dashboards in previously published studies, none focused on kidney disease [8]. Considering their increasing size, rising healthcare costs, and stagnating poor health outcomes, adults with ESRD requiring chronic dialysis are a highly deserving clinical population for quality improvement [1], which could be facilitated by dashboard reporting. The VHA dialysis dashboard combines features of both quality and clinical dashboards to drive improvement. It displays facility-level performance on quality measures with comparisons against recognized CMS benchmarks as well as other VHA facilities, and it is engineered with drill-down capability to patient-level clinical data. Hence, our report characterizing the creation and user experience of a clinical-quality dashboard used by dialysis facilities throughout a large national integrated healthcare system makes an important and timely contribution to the literature.

Only two prior studies queried user satisfaction and perceptions of their dashboards [18, 19]. In three clinical areas of a National Health Service mental health trust in the United Kingdom, 21 mental health team members (nurses, medical staff, clerical staff) completed an anonymous questionnaire about their experiences with a recently implemented clinical-quality dashboard [18]. This dashboard used a variety of different graphics with color coding to present information ranging from clinical data such as number of available beds to quality measures such as percentage of fall assessments completed. Key findings included that 38% found the dashboard to be useful, 71% found it easy to use, and 86% found the format easy to understand. The most commonly cited benefits were timely access to information, increased communication and information-sharing, and data quality while difficulties included staff access, inaccurate data, and increased workload. In 2009, 175 clinical personnel (medical trainees, clerical staff, nurses, physicians) completed an anonymous written two-page survey regarding a recently implemented clinical dashboard in an emergency room in Beirut, Lebanon [19]. Key features of this dashboard included a patient severity index score as well as clinical laboratory and radiology results being laid out in a color-coded cubical form to designate whether the information had been reviewed by a team member. According to the few Likert scale questions that assessed user experience, 93% concurred (i.e., strongly agreed or agreed) that the dashboard was easy or extremely easy to use, 84% concurred that the dashboard allowed more time to take care of patients, and 83% concurred that the layout helped keep them organized [19]. In contrast, 38% of users felt that too many clicks were needed to navigate the dashboard.

The experience and satisfaction of nurse and physician users of the dialysis dashboard appears similar to that of these prior reports [18, 19]. Like prior studies, we designed our questionnaire to assess user perceptions around themes of access, ease of use, design, and clinical usefulness. In these domains, we noted similar findings: approximately 65% found the dialysis dashboard easy to use, less than 5% found it difficult to access or navigate, nearly 50% found that it helped to improve patient care, and a plurality if not a majority found the design clear, organized, and well laid out. We attribute some of these results to our extended period of user acceptance testing, feedback, and iterative refinement, which is well known to be critical to successful usability [10, 33]. Recognizing their importance in sustained dashboard implementation [8], we also included questions to assess user perceptions of data integrity and support. Given the reported wide variation in accuracy of electronically reported quality measures [34], we directed substantial effort to data validation with the user field before dashboard implementation. Roughly 50% or more users of the dialysis dashboard viewed the data as timely and accurate and programmatic support for the dashboard as sufficient, while 25% or fewer held an opposing viewpoint.

In contrast to prior studies, we also sought to understand how the dialysis dashboard was being used and incorporated by providers into patient care. Consistent with the objectives for creating the dialysis dashboard, nearly two-thirds of physicians and nurse managers utilized the dashboard for clinical reports and QAPI activities. Moreover, users appear to be integrating the dashboard with other clinical applications, such as existing CMS reporting portals and patient education materials, for care delivery. However, the impact of the dialysis dashboard alone or in combination with other tools on processes of care and patient outcomes could not be assessed in this study. As noted elsewhere, there is scant evidence that clinical and quality dashboards have a positive effect on health outcomes, and more investigation is needed [8]. Moreover, quality dashboards may warrant scrutiny considering the recognized pitfalls and unintended consequences of quality measures, which include reporting burden, insufficient evidence, lack of validity, improper endorsement, inappropriate application, lack of patient-centeredness, tunnel vision, measurement fixation, and misuse of healthcare resources [6, 35,36,37,38,39].

There are limitations to our study. First, although we sampled all end users of the dialysis dashboard, our response rate was 43% and a higher proportion of nurse managers responded than physician medical directors; our findings may therefore be subject to non-response bias. This concern is lessened because healthcare providers are generally more homogeneous than the general public with regard to knowledge, attitudes, and behavior [40]. Moreover, subgroup analyses comparing physician medical directors with nurse managers/others revealed a high level of thematic convergence in responses. Additionally, to reduce bias, we purposely created a mix of questions in which both high and low scores on Likert scales connoted a positive user response. Second, we assessed user perceptions and experience during a single time period; we recognize that such perceptions of the dashboard may change over time and affect sustainability. Lastly, while we surveyed users about how the dashboard was incorporated into clinical care, we had only a short period of observation (i.e., 12 months) to analyze changes in achievement of clinical performance measures coincident with questionnaire administration. Although this period of observation limits inferences about the effect of the dashboard on clinical practice and patient outcomes, these preliminary observations have already prompted the VHA to focus efforts on best practices in these clinical areas.

Conclusions

The VHA clinical-quality dialysis dashboard affords national- and facility-level assessment of the quality of chronic dialysis care, guides and motivates best clinical practices, targets QAPI efforts at the facility and patient levels, and promotes a cohesive national approach to process improvement for population management. The detailed, multistep, interdisciplinary, stakeholder- and user-driven process used to develop and implement the dialysis dashboard can serve as a prototype for dashboards addressing other chronic illnesses. Furthermore, characterizing the data selection and extraction processes for quality measurement, and conveying an understanding of these methods, is especially important because they strongly affect the reliability of quality measure reporting, which is essential for the integrity of enterprise-wide assessments and comparisons as well as for stakeholder buy-in. While the dialysis dashboard offers tremendous potential to facilitate improvement in health outcomes for Veterans with ESRD requiring chronic dialysis, future work is needed to formally evaluate its impact on health outcomes.

Availability of data and materials

The datasets generated and/or analyzed during the current activity are not publicly available due to VHA policies, but are available from the corresponding author on reasonable request and with VHA approval.

Abbreviations

AQA: Ambulatory Care Quality Alliance

AMA-PCPI: American Medical Association-Physician Consortium for Performance Improvement

ARC: VA Allocation Resource Center

CDW: Corporate Data Warehouse

CMS: Centers for Medicare and Medicaid Services

CPMs: Clinical Performance Measures

DSS: VA Decision Support System

ESRD: End-Stage Renal Disease

HSR&D: VA Health Services Research and Development

IPEC: VA Inpatient Evaluation Center

KCQA: Kidney Care Quality Alliance

NQF: National Quality Forum

OIA: VA Office of Information and Analytics

PCS: VA Patient Care Services

QAPI: Quality Assurance and Performance Improvement

TAM: Technology Acceptance Model

U.S.: United States

VA CASE: VA Center for Applied System Engineering

VHA: Veterans Health Administration

VSSC: VA Support Services Center

References

  1. United States Renal Data System: Available at: http://www.usrds.org. Accessed April 1, 2019.

  2. Parker T, Hakim R, Nissenson AR, et al. Dialysis at a crossroads: 50 years later. Clin J Am Soc Nephrol. 2011;6:457–61.

  3. Identifying VA’s Internal and External Options for Optimizing Delivery of Veteran Dialysis Care. VA Dialysis Strategic Plan. Department of Veterans Affairs. March 2, 2012.

  4. Hynes DM, Stroupe KT, Fischer MJ, et al. Comparing VA and private sector healthcare costs for end-stage renal disease. Med Care. 2012;50(2):161–70.

  5. Saver BG, Martin SA, Adler RN, et al. Care that matters: quality measurement and health care. PLoS Med. 2017;12(11):1–10.

  6. Panzer RJ, Gitomer RS, Greene WH, et al. Increasing demands for quality measurement. JAMA. 2013;310(18):1971–80.

  7. Murdoch TB, Detsky AS. The inevitable application of big data to health care. JAMA. 2013;309:1351–2.

  8. Dowding D, Randell R, Gardner P, et al. Dashboards for improving patient care: review of the literature. Int J Med Inform. 2015;84:87–100.

  9. Weiner J, Balijepally V, Tanniru M. Integrating strategic and operational decision making using data-driven dashboards: the case of St. Joseph mercy Oakland hospital. J Healthc Manag. 2015;60(5):319–30.

  10. Stadler JG, Donlon K, Siewert JD, et al. Improving the efficiency and ease of healthcare analysis through use of data visualization dashboards. Big Data. 2016;4(2):129–35.

  11. Shaw SJ, Jacobs BJ, Stockwell DC, et al. Effect of a real-time pediatric ICU safety bundle dashboard on quality improvement measures. Jt Comm J Qual Patient Saf. 2015;41(9):414–20.

  12. Lau MK, Bounthavong M, Kay CL, et al. Clinical dashboard development and use for academic detailing in the US Department of veterans affairs. J Am Pharm Assoc. 2019;59(2S):S96–S103.

  13. Fracasso MR, Sanders B. Two keys to deliver better care and measure quality: pod implementation and dashboards. Physician Exec. 2012;38(6):48–52.

  14. Loeb BB. A dashboard for medical staff goals. Trustee. 2010;63(3):35–6.

  15. Ash JS, Berg M, Coiera E. Some unintended consequences of information technology in health care: the nature of patient care information system-related errors. J Am Med Inform Assoc. 2004;11:104–12.

  16. Koopman RJ, Kochendorfer KM, Moore JL, et al. A diabetes dashboard and physician efficiency and accuracy in accessing data needed for high-quality diabetes care. Ann Fam Med. 2011;9(5):398–405.

  17. Ghazisaeidi M, Safdari R, Torabi M, et al. Development of performance dashboards in healthcare sector: key practical issues. Acta Inform Med. 2015;23(5):317–21.

  18. Daley K, Richardson J, James I, et al. Clinical dashboard: use in older adult mental health wards. Psychiatrist. 2013;37:85–8.

  19. Batley NJ, Osman HO, Kazzi AA, et al. Implementation of an emergency department computer system: design features that users value. J Emerg Med. 2011;41(6):693–700.

  20. Kizer KW, Demakis JG, Feussner JR. Reinventing VA health care. Med Care. 2000;38(suppl):17–116.

  21. Jha AK, Perlin JB, Kizer KW, et al. Effect of the transformation of the veterans affairs health care system on quality of care. N Engl J Med. 2003;348:2218–27.

  22. Atkins D, Clancy CM. Advancing high performance in veteran affairs health care. JAMA. 2017;318(9):1927–8.

  23. Chassin MR, Loeb JM, Schmaltz SP, et al. Accountability measures – using measurement to promote quality improvement. N Engl J Med. 2010;363(7):683–8.

  24. Smith KA, Hayward RA. Performance measurement in chronic kidney disease. J Am Soc Nephrol. 2011;22(2):225–34.

  25. McGinnis KA, Skanderson M, Levin FL, Brandt C, Erdos J, Justice AC. Comparison of two VA laboratory data repositories indicates that missing data vary despite originating from the same source. Med Care. 2009;47(1):121–4.

  26. Nielsen J. Enhancing the explanatory power of usability heuristics. Boston, MA: Proc. ACM CHI ‘94 Conf; 1994. p. 152–8.

  27. Vessey I. Cognitive fit theory: a theory-based analysis of the graphs versus tables literature. Decis Sci. 1991;22(2):219–40.

  28. Carswell CM. Choosing Specifiers: An evaluation of the basic tasks model of graphical perception. Hum Factors. 1992;34(5):535–54.

  29. Treisman A, Gelade G. A feature integration theory of attention. Cogn Psychol. 1980;12:97–136.

  30. Wickens CD, Hollands JG. Engineering psychology and human performance. Englewood Cliffs: Prentice Hall; 1999.

  31. Davis F. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1986;13(3):319–39.

  32. Jackson DN. Acquiescence response styles: problems of identification and control. In: Berg IA, editor. Response set in personality measurement. Chicago: Aldine; 1967.

  33. Johnson CM, Johnson TR, Zhang J. A user-centered framework for redesigning health care interfaces. J Biomed Inform. 2005;38(1):75–87.

  34. Kern LM, Malhotra S, Barron Y, et al. Accuracy of electronically reported “meaningful use” clinical quality measures. Ann Intern Med. 2013;158:77–83.

  35. Wilensky G. The need to simplify measuring quality in health care. JAMA. 2018;319(23):2369–70.

  36. Powell AA, White KM, Partin MR, et al. Unintended consequences of implementing a National Performance Measurement System into local practice. J Gen Int Med. 2012;27:405–12.

  37. MacLean CH, Kerr EA, Qaseem A. Time out – charting a path for improving performance measurement. N Engl J Med. 2018;378(19):1757–61.

  38. Walter LC, Davidowitz NP, Heineken PA, et al. Pitfalls of converting practice guidelines into quality measures. JAMA. 2004;291(20):2466–70.

  39. Casalino LP. The unintended consequences of measuring quality on the quality of medical care. N Engl J Med. 1999;341:1147–50.

  40. Kellerman SE, Herold J. Physician response to surveys. Am J Prev Med. 2001;20(1):61–7.

Acknowledgements

The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs or the United States government.

Funding

This non-research activity was conducted as part of quality assessment and improvement by the authors, who are employees of VHA; no specific funding was provided.

Author information

Contributions

MJF: study design, data analyses, contributed to drafting the manuscript, approved the final version of the manuscript. WMK: study design, contributed to drafting the manuscript, approved the final version of the manuscript. KS: study design, data collection, contributed to drafting the manuscript, approved the final version of the manuscript. KF: study design, data collection, contributed to drafting the manuscript, approved the final version of the manuscript. CG: study design, approved the final version of the manuscript. NL: study design, contributed to drafting the manuscript, approved the final version of the manuscript. SL: study design, data collection, contributed to drafting the manuscript, approved the final version of the manuscript. KM: study design, approved the final version of the manuscript. GR: study design, data collection, approved the final version of the manuscript. PMP: study design, contributed to drafting the manuscript, approved the final version of the manuscript. STC: study design, contributed to drafting the manuscript, approved the final version of the manuscript. All authors have read and approved this manuscript.

Corresponding author

Correspondence to Michael J. Fischer.

Ethics declarations

Ethics approval and consent to participate

This manuscript contains survey data from VHA dialysis staff (medical directors and nurse managers) assessing their use, satisfaction, and application of the dialysis dashboard to Veteran care. The VHA National Kidney Disease and Dialysis Program office conducted this confidential electronic survey (questionnaire). Participation and completion of the survey by dialysis staff were completely voluntary, and participants were assured that all individual survey results would be kept confidential and that no identifying information would be used in reports. The Edward Hines VA Institutional Review Board (IRB) formally reviewed this project protocol and all accompanying products. The Edward Hines VA IRB determined that the project does not meet the criteria for research under the Common Rule at 38 CFR 16.102 and HHS regulations at 45 CFR 46, and therefore formal IRB review and approval for research were not required. Additionally, informed consent was deemed unnecessary by the Edward Hines VA IRB because this employee survey was completely voluntary and not considered research.

Consent for publication

Not applicable. This manuscript does not contain any individual person’s data (e.g., individual details, images, or videos). Rather, it contains aggregate de-identified data.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1: Supplementary Table 1.

Initial Clinical Performance Measures Identified by the VHA Dialysis Dashboard Committee.

Additional file 2: Supplementary Table 2.

Level of Agreement with Statements Regarding the VHA Dialysis Dashboard: Subgroup Findings.

Additional file 3.

Survey of Dashboard Users – Medical Directors and Nurse Managers.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Fischer, M.J., Kourany, W.M., Sovern, K. et al. Development, implementation and user experience of the Veterans Health Administration (VHA) dialysis dashboard. BMC Nephrol 21, 136 (2020). https://doi.org/10.1186/s12882-020-01798-6
