The Digital Dashboard: The Solution to Keep Your Eyes on the Road
The PACS road is a well-traveled one. While many facilities have deployed PACS, many more are just jumping into the digital era. Still more facilities are shopping for or deploying second- or third-generation systems. While lots of people are ‘driving’ PACS, travel is not as smooth as it could be. Digital image management has not solved the radiology workflow conundrum; in some cases, it has created new roadblocks. For example, radiologists and administrators often remain unaware of bumps in the road or travel hazards like network bottlenecks or piles of unsigned reports. The upshot? Driving on the digital radiology highway is not as efficient or effective as it could be.
Digital dashboards offer a software solution to common radiology business bottlenecks by providing a bird’s eye view of the department, facilitating maximum productivity among individuals and systems. Like an air traffic controller, the dashboard keeps patient image and information ‘traffic’ (a.k.a. workflow) smooth and high-speed.
The “views” are tailored by need to the radiologist, the radiology administrator and the PACS or IT administrator. Measured parameters include RVUs, unsigned reports, unread studies, average patient wait times and scanner utilization, including total scans per day and per year and when service is scheduled. Other parameters can include priorities according to patient acuity, patient wait times for imaging studies and the best methods for notifying physicians and administrators to resolve issues such as unsigned and unread studies, as well as offering proactive support.
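To make that parameter list concrete, here is a minimal sketch of the kind of data structure a dashboard might poll and render. The field names are illustrative assumptions, not drawn from any vendor’s product:

```python
from dataclasses import dataclass

# Illustrative only: field names are assumptions, not taken from
# any specific vendor's dashboard.
@dataclass
class DashboardMetrics:
    unsigned_reports: int      # reports awaiting a radiologist's signature
    unread_studies: int        # studies not yet interpreted
    avg_wait_minutes: float    # average patient wait time for imaging
    rvus_today: float          # relative value units generated today
    scans_today: int           # scanner utilization: total scans per day
    scans_year_to_date: int    # scanner utilization: total scans per year
    next_service_date: str     # when scanner service is scheduled (ISO date)
```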
A quick look in the rear view mirror
In the analog era, workflow management was a fairly simple process. A stack of unread films provided a tangible cue of mounting work. In the filmless, paperless environment, however, it’s not clear which studies are stat, which images have been interpreted or which reports need a signature. The challenge for the PACS administrator is equally tough, as there is no barometer to gauge the performance of the PACS. In many cases, there is a slow degradation in performance until…the inevitable crash.
While digital has been touted as the solution, radiology departments must undergo the next phase of the digital evolution before realizing the promise of PACS. “Going digital is not the whole story,” asserts Matthew Morgan, MD, MS, a radiology resident at the University of Pittsburgh Medical Center (UPMC) who in April presented “Flying Blind: Using a Digital Dashboard to Navigate a Complex PACS Environment” at the annual meeting of the Society for Computer Applications in Radiology/Society for Imaging Informatics in Medicine, which sponsored the research. Deploying digital image management is a mere first step in the process, says Morgan.
Optimizing workflow is an entirely different, and the next, task. Paul Nagy, PhD, director of informatics research and assistant professor of radiology at the University of Maryland School of Medicine in Baltimore, agrees. He says the PACS industry is maturing and the next step is a system to optimize PACS.
The ultimate radiology department is an assembly line that incorporates technology to facilitate real-time practice, says Bruce Reiner, MD, director of research at the VA Maryland Health Care System in Baltimore. The assembly line model relies heavily on immediate and comprehensive access to patient data. From the moment the patient enters the imaging chain, users have access to the data needed to interpret studies, communicate results and proactively meet the patient’s needs, such as additional interventions and imaging studies.
The digital radiology dashboard is the cornerstone of the next phase in the PACS evolution. It can deliver productivity and performance improvements as well as the business intelligence necessary to optimize the digital department. It’s a tall, and complex, order. And yet, at the same time, the dashboard must be simple for users. If it’s too complex and requires one more password or log-in, it becomes ineffective and will not aid workflow. One common visual representation is a traffic light to alert users to potential problems: red lights signal urgent matters, yellow lights urge caution and green lights mean workflow is running smoothly. Other dashboards provide graphical information for at-a-glance trending data.
The dashboard is not a one-size-fits-all product or solution. It can take a number of forms, target various users and be customized to individual facility needs.
Dashboard models
Reiner defines dashboards by the target audience: radiologists, technologists or IT/PACS administrators. Each type of user requires a different vehicle to reach the end destination and optimize workflow. Yet one design element remains common across all drivers. Each model tracks metrics in an automated fashion.
The radiologists’ dashboard might monitor the number of studies in the queue, how long studies have sat in the queue, report turnaround times and workload and productivity of individual radiologists. Chris Petillo, director for PACS at New York University Medical Center (NYUMC) in New York City, further refines the radiology dashboard. It can serve as a tool for individual radiologists to monitor daily workflow by tracking the number of cases, waiting time and other parameters.
Like the radiologists’ dashboard, the radiology administrator dashboard might incorporate dynamic updates about where patients are in the queue, patient wait times, backlogs, problems with equipment and throughput, enabling the department to better allocate staff and resources, says Reiner. Similarly, the dashboard could provide department and section heads with data on the entire staff and allow them to re-direct work or balance workloads as needed.
Nagy employs a slightly different approach, dividing dashboards into three categories: operational, tactical and strategic. An operational dashboard provides data about the current state of the PACS or radiology department. A tactical dashboard analyzes historical trends and resources to better forecast future needs. Finally, a strategic dashboard adopts a business intelligence stance to provide a scorecard for the radiology department.
The PACS dashboard
During Nagy’s tenure at the Medical College of Wisconsin in Milwaukee, he and his colleagues realized there was no way to monitor the performance and health of the PACS, so they constructed PACSPulse, a performance-monitoring dashboard to facilitate improved resource allocation and capacity management.
PACSPulse provides a graphic analysis of PACS performance by tracking usage by network, server, workstation, type of traffic and time of day. “PACSPulse helped us focus resources,” explains Nagy.
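As a rough illustration of that tracking (PACSPulse itself is open source, but its internals are not detailed in this article), the sketch below aggregates hypothetical per-transfer log records into the hourly volume and throughput figures the tool graphs. The record fields are assumptions:

```python
from collections import defaultdict

def hourly_usage(transfers):
    """Aggregate PACS transfer logs into per-hour volume and throughput.

    `transfers` is assumed to be an iterable of dicts with keys
    'hour' (0-23), 'megabytes' and 'seconds' -- a simplification of
    the per-transfer records a tool like PACSPulse might log.
    """
    volume = defaultdict(float)   # total MB moved in each hour
    seconds = defaultdict(float)  # total transfer time in each hour
    for t in transfers:
        volume[t['hour']] += t['megabytes']
        seconds[t['hour']] += t['seconds']
    # Return (volume, throughput in MB/s) per hour; guard empty hours.
    return {h: (volume[h], volume[h] / seconds[h] if seconds[h] else 0.0)
            for h in range(24)}
```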
For example, when ER docs clamored for additional workstations, the PACS administrator could see that the current inventory sufficed for the department. On the other hand, the administrator could see which departments were bursting at the seams and reallocate equipment from lesser-used areas.
PACSPulse allows the administrator to get under the PACS hood in other ways as well. Another hard-to-forecast critical need is image storage: PACSPulse counts the number of slices produced by CT scanners to allow better forecasting of storage space. The system also provided the data to justify a WAN in an unexpectedly high-volume area. At the same time, it clearly showed that a fiber channel storage system was not needed to run the PACS and that more economical serial ATA storage could meet the department’s needs. “The digital dashboard is the way to manage and optimize PACS,” sums up Nagy.
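A back-of-the-envelope version of that storage forecast might look like this sketch. The 0.5 MB-per-slice figure (roughly an uncompressed 512 x 512, 16-bit CT image) and the growth rate are planning assumptions, not numbers from PACSPulse:

```python
def forecast_ct_storage(slices_per_day, mb_per_slice=0.5,
                        growth_rate=0.10, years=3):
    """Rough storage forecast from CT slice counts.

    Assumes ~0.5 MB per uncompressed 512 x 512, 16-bit CT slice and a
    yearly growth rate in slice volume -- both planning assumptions.
    """
    total_mb = 0.0
    daily = slices_per_day
    for _ in range(years):
        total_mb += daily * 365 * mb_per_slice
        daily *= 1 + growth_rate
    return total_mb / 1_000_000  # terabytes (decimal)

# e.g., 50,000 slices/day over three years at 10% annual growth:
print(f"{forecast_ct_storage(50_000):.1f} TB")  # ~30.2 TB
```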
The homegrown radiology dashboard
“There are not a lot of commercial dashboard solutions available yet. Most [like PACSPulse] have been created within a facility,” notes Reiner. Like Medical College of Wisconsin, UPMC also took the dashboard matter into its own hands. One advantage of this approach, says Morgan, is that the individual institution can craft the dashboard to meet site-specific needs.
UPMC opted to integrate multiple radiology systems into a single meta-system that queries each system and pulls the data together for pre-emptive monitoring at the individual, division and system levels. The initial UPMC dashboard focused on two parameters: report signing and delinquent reports.
UPMC relies on voice recognition and natural language processing, with transcriptionists placing reports in the queue for radiologists to sign. Pre-dashboard, average turnaround time hovered in the 22.5-hour range for all users. “Ideally the radiologist should be alerted when it’s a good time to sign reports,” explains Morgan. The UPMC dashboard counts the number of unsigned reports, and the color of the traffic light changes accordingly. A green light signifies no unsigned reports, and a yellow light represents between one and 20 unsigned reports. When the unsigned report count exceeds 20, the light changes to red. At 30 or more, the light blinks red. A web-based link allows radiologists to launch the report-signing application without having to navigate through a separate system. The ability to take action based on the dashboard is a critical element of the system, says Morgan. Since deploying the dashboard, average UPMC report turnaround time has dropped to 17.7 hours, a clinically and statistically significant improvement.
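The thresholds Morgan describes map naturally to a simple function. This sketch encodes the UPMC scheme exactly as reported above:

```python
def unsigned_report_light(count):
    """Map an unsigned-report count to the UPMC-style traffic light:
    green at zero, yellow for 1-20, red above 20, blinking red at 30+."""
    if count == 0:
        return "green"
    if count <= 20:
        return "yellow"
    if count < 30:
        return "red"
    return "blinking red"
```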
The other arm of the UPMC experiment analyzed delinquent dictations, i.e., completed but not dictated cases. Examples include difficult sub-specialty cases that came in over the weekend or misrouted exams. Prior to the dashboard, the PACS administrator printed delinquent dictations once a week and traipsed back and forth across the department, haranguing division heads as needed. The weekly rounds are hardly the stuff of a streamlined digital department, but they are a reality for nearly every institution. At UPMC, it took an average of 13.6 days to resolve a delinquent dictation.
“This function couldn’t be automated completely,” explains Morgan. “We didn’t want the dashboard crying wolf.” That is, it could not misidentify a report as delinquent. If the dashboard demonstrated a pattern of jumping the gun and mislabeling reports, radiologists might begin to ignore it, says Morgan.
UPMC opted for a hybrid solution. If an exam remains in the queue for longer than 72 hours without a dictation, the dashboard notifies the PACS administrator, who flags the exam and launches a form to notify the involved division. The message also includes an active link so the exams can be loaded and read. Post-dashboard, delinquent dictation resolution sits at a respectable 2.8 days. “Also, the number of outstanding exams that exceed 72 hours prior to dictation fell from 49 out of 33,000 to zero out of 37,000,” reports Morgan. “We eliminated delinquent exams and the duplication of effort that accompanied those studies.”
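The automated half of that hybrid check is straightforward to sketch. The exam fields below are assumptions, and, per the UPMC design, flagged exams go to the PACS administrator for human review rather than being labeled delinquent automatically:

```python
from datetime import datetime, timedelta

DELINQUENCY_THRESHOLD = timedelta(hours=72)

def flag_delinquent(exams, now=None):
    """Return exams completed more than 72 hours ago with no dictation.

    Each exam is assumed to be a dict with 'completed_at' (datetime)
    and 'dictated' (bool). Flagged exams are surfaced to the PACS
    administrator for confirmation -- this sketch only flags them.
    """
    now = now or datetime.now()
    return [e for e in exams
            if not e['dictated']
            and now - e['completed_at'] > DELINQUENCY_THRESHOLD]
```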
The mega-dashboard
A few years ago, the radiology department at Cincinnati Children’s Hospital Medical Center might have been considered state-of-the-art. The department featured an integrated RIS/PACS and voice recognition. “Those technologies were beneficial, but there were a lot of workflow issues that weren’t addressed,” states Mark J. Halsted, MD, chief of the radiology informatics research core.
Workflow busters included items like communication with referring physicians and exam reading order. “These aren’t niche problems,” asserts Halsted. Radiology departments and hospitals everywhere are affected by these dilemmas on a daily basis. The ability of the radiology department to efficiently allocate resources and read exams impacts the pace and delivery of care throughout the entire hospital, says Halsted.
“If there were 30 stat cases in the queue, I would be lost and would read on a first-come, first-served basis,” explains Halsted. That led to phone calls from concerned clinicians, which, in turn, disrupted reading.
Halsted and his colleagues devised RadStream workflow software to eliminate radiology-based bottlenecks and address the soft-copy workflow conundrum at Cincinnati Children’s. RadStream incorporates an algorithm that measures several parameters, such as medical acuity and patient anxiety, to create a prioritized worklist. The result? “Radiologists always read the most acute cases first,” explains Halsted. “RadStream pre-empts phone call disruptions because the cases that are the most likely to generate a phone call are read first.”
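The actual RadStream algorithm is proprietary, but the general idea of an acuity-weighted worklist can be sketched as follows. The weights and field names here are invented for illustration only:

```python
# A minimal sketch of acuity-weighted worklist ordering. The real
# RadStream algorithm is proprietary; these weights and the use of
# 'anxiety' as a phone-call-likelihood proxy are assumptions.
def prioritize(worklist, w_acuity=0.7, w_anxiety=0.3):
    """Sort cases so the most acute (and most likely to trigger a
    clinician phone call) are read first."""
    def score(case):
        return w_acuity * case['acuity'] + w_anxiety * case['anxiety']
    return sorted(worklist, key=score, reverse=True)
```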
The software also closes the communication loop by brokering communication of results. If a case is identified as a Call Request Report, RadStream routes the case to an operator as soon as it is read. The case pops up on screen with physician contact information, patient data and the complete signed report. The operator calls the clinician to notify him or her about the report.
A related feature tackles the minefield of unexpected findings. Before RadStream, when a radiologist encountered an unexpected finding, he or she would have to decide whether or not to disrupt workflow for five to 10 minutes to track down the clinician. RadStream triggers the call at the click of a button — without breaking the radiologist’s workflow. An analogous button handles critical findings in a similar fashion.
RadStream has improved workflow at Cincinnati Children’s, says Halsted. Researchers at the University of Cincinnati College of Business found that turnaround time decreased 40 percent after the hospital deployed RadStream. In addition, interruptions to radiologists dropped 22 percent, which translates into a savings of 2,000 radiologist hours per year within the department. Halsted estimates that overall departmental productivity increased as though two additional FTE radiologists were working.
Radiologists at the hospital’s satellite facilities also realized some major gains. Before RadStream, findings were faxed for every stat case, and positive findings required a phone call to verbally convey information. (In other words: breaking workflow, gathering thoughts, dialing, paging, transferring, waiting and, in many cases, repeating.) Now, RadStream brokers the calls, saving two to three hours per radiologist each day at the satellites.
The list of features continues, says Halsted. “RadStream allows all users to have their fingers on the pulse of the department. It connects the department and allows any point of contact to answer questions.” For example, it’s not uncommon for a radiologist to receive a phone call from a clinician with questions about a patient or exam. Instead of breaking workflow or transferring the clinician to another staff member who might (or might not) have the information, the radiologist can click on the patient name to tell the caller which service the patient is on, who’s reading the exam, status of the study and more. “It takes 30 seconds and saves time and hassle for several staffers,” notes Halsted.
The software also provides administrators and department heads with a bird’s eye view of the entire radiology department. With the click of a button, users can check the vital signs of the department: how far behind any service is, the number of active cases, total acuity scores and more.
RadStream technology is expected to be commercially available through Amicas, Inc. The PACS vendor acquired exclusive rights to the software this spring. Amicas plans to sell the software as a standalone product utilizing Amicas integration capabilities and Amicas Insight Services, in cooperation with the Cincinnati Children’s team, for deployment to any vendor’s PACS. The company also plans to embed RadStream into its Vision Series PACS for a fully integrated offering.
The vendor collaboration approach
NYUMC is taking a different path to dashboard deployment. “We have a vision of what we want and are working very closely with our partner Siemens Medical Solutions to develop a solution,” explains Petillo. The vision is a one-stop shop that allows users to see how the radiology department is functioning on both a technical and a workflow level.
NYUMC has tackled what it defines as the first step to dashboard development and implemented an integrated RIS/PACS. “Simplicity and integration are key. With a single vendor department, we don’t have to deal with the nitty-gritty details of integration. We can address technology on a higher level and determine how to make the department more efficient,” opines Petillo. He says that it is possible, but difficult, to create a dashboard in a multi-vendor environment.
The next step is the front-end dashboard graphical user interface (GUI). One major dashboard challenge is to provide each user with the at-a-glance information he or she needs to improve workflow and productivity. The PACS administrator, for example, requires data about network bottlenecks, down systems and archive capacity. Radiologists, on the other hand, need a system to help them re-route work and fill in if colleagues are overburdened. “Right now, we lack a trigger to re-route work. Radiologists would pick up the slack for each other if it didn’t require multiple log-ins,” says Petillo. He and his colleagues aim to solve these dilemmas collaboratively with Siemens as they engineer a dashboard solution.
Dashboard essentials
Although technology has continued to advance, many radiology solutions continue to rely on a one-size-fits-all approach in which the end-user is forced to adapt to the hardware or software, says Reiner. “Technology needs to be more adaptive to the end user to make all users more productive,” continues Reiner. In the dashboard world, this might translate into a smart system that provides each individual user with the preferred data and settings based on the type of study. At the same time, the dashboard cannot disrupt or distract the user; simple interfaces such as the traffic light seem to work best.
Radiology departments may find it difficult to purchase a commercial off-the-shelf dashboard; however, it may be possible to invest in a mega-system as a commodity and configure it locally, says UPMC’s Morgan. “Local developers are needed to make the dashboard work,” confirms Morgan. He says UPMC’s Philips Medical Systems iSite PACS incorporates an application programming interface (API) that allows developers to poke and peek under the hood without writing C++ code. In addition, the web-based PACS can query other systems to obtain information to feed the dashboard.
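The article does not document the iSite API itself, but the general pattern of polling a web-based system over HTTP to feed a dashboard might look something like this sketch, with a hypothetical endpoint name:

```python
import json
from urllib.request import urlopen

# Hypothetical endpoint -- not the actual iSite API, whose calls are
# not documented here. The point is only that a web-based PACS can
# be polled over HTTP to feed a local dashboard.
DASHBOARD_FEED = "https://pacs.example.org/api/worklist/summary"

def fetch_dashboard_feed(url=DASHBOARD_FEED):
    """Pull summary counts (e.g., unsigned reports) from a PACS feed."""
    with urlopen(url) as response:
        return json.load(response)
```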
A sneak peek at the dashboard horizon
Dashboard technology is in its infancy; advances and new options are guaranteed. After UPMC deployed its dashboard, Morgan surveyed users about other features or tools they would like incorporated into the system. While stock quotes, cafeteria menus and local movie times were high on many users’ wish lists, developers adopted a more practical approach and focused on methods to re-assign studies and alert administrators to mis-routed exams.
Reiner ties the dashboard to the ultimate goal of healthcare: optimum patient care. The dashboard could facilitate improved patient care by making it easier to measure (and reward) quality, says Reiner.
The end of the road?
The radiology dashboard is in its early phase. Anyone who takes a quick tour of trade show floors will find various iterations of the dashboard at major PACS vendors’ booths. For example, Agfa Healthcare’s Enterprise Clinical Dashboard Services provide web-accessible, single sign-on to disparate systems to increase productivity and improve patient care. GE Healthcare’s Centricity Radiology Business Intelligence Portal is designed to evaluate business operations by comparing current equipment and process performance against industry standards and benchmarks.
Whether the dashboard is a vendor solution or homegrown software, it promises to boost the efficiency, productivity and workflow of the digital department by proactively identifying network and system issues, offering a succinct view of the department and streamlining radiology processes. Essentially, these new software solutions will serve as the vehicle to deliver on the promise of PACS.
How to Assess Your PACS Health
This graph shows the usage and performance dashboard at the Medical College of Wisconsin Department of Radiology, PACSPulse. Blue bars represent imaging volume in numbers of studies, while the superimposed gray line shows the speed at which the data traveled in megabytes per second (MB/s). The x-axis is the hour of the day, from 0 to 23. This graphical synopsis shows the operational “health” of the entire system and allows a troubleshooter to quickly focus on problems. It details the performance of data going out to the workstations, which reflects the time the radiologist spends waiting for data. When a PACS or network is saturated and becomes overloaded, performance degrades as volume increases; if performance does not dip as a function of volume, the system has not reached capacity and additional workstations can be added. PACSPulse is an open-source tool developed to identify and analyze performance bottlenecks of PACS. For more information, visit http://radiographics.rsnajnls.org/cgi/content/full/23/3/795.
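The sidebar’s rule of thumb (no throughput dip at peak volume means the system is not saturated) can be expressed against the hourly statistics produced by the earlier aggregation sketch. The 0.8 dip cutoff is an assumption:

```python
def is_saturated(hourly_stats, dip_fraction=0.8):
    """Apply the sidebar's rule of thumb to the {hour: (volume, MB/s)}
    mapping from the hourly_usage() sketch above: if throughput at the
    peak-volume hour dips well below the off-peak average, the system
    is likely saturated. The 0.8 cutoff is an assumption."""
    peak_hour = max(hourly_stats, key=lambda h: hourly_stats[h][0])
    peak_mbs = hourly_stats[peak_hour][1]
    others = [mbs for h, (vol, mbs) in hourly_stats.items()
              if h != peak_hour and mbs > 0]
    if not others:
        return False
    return peak_mbs < dip_fraction * (sum(others) / len(others))
```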