Availability, Applications Drive New Archive Strategies

Lost access to healthcare data can affect cash flow, reducing revenues, and, more importantly, can negatively impact patient care, according to John S. Koller, president of KAI Consulting. Citing a study by Sunquest Information Systems of Tucson, Ariz., Koller says that when a hospital billing system went down, the manual recovery process for data lost during that period required about eight hours of work for every hour of system downtime. Koller spoke at the 2008 Digital Healthcare Information Management System (DHIMS) conference in San Antonio, Texas.

“Donald Holmquest, MD, president and CEO of CalRHIO [California Regional Health Information Organization], stated that missing information affects patients by delaying care 59.5 percent of the time and is judged to adversely affect patients 44 percent of the time,” Koller says.

High availability of healthcare data is key, given these sobering statistics.

“All data must be protected and recoverable,” Koller advises. “There must be a plan to respond to an event that interrupts the access to data and restore access.”

This plan must address both business continuity, the people, processes, and technology required to continue delivering the mission of a department or enterprise during an event, and disaster recovery, the people, processes, and technology required to restore operations and data access after an interruption, he says.

“No organization is static; the plans need to change as the organization does,” he notes.

Edward M. Smith, ScD, professor of imaging sciences at the University of Rochester Medical Center in Rochester, N.Y., recommends that diagnostic imaging practices apply availability metrics according to need.

For example, storage media and network hardware should have 99.999 percent availability while storage, servers, the RIS, and the PACS broker should achieve 99.99 percent availability, according to Smith. Critical-use diagnostic workstations should have 99.9 percent availability while clinical review and teleradiology workstations can suffice with 99 percent availability.
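
To put these tiers in perspective, a quick back-of-the-envelope calculation (not part of Smith's presentation) converts each availability figure into the maximum downtime it permits per year:

```python
# Rough illustration: maximum annual downtime implied by each availability tier
# cited above. The tier labels come from the article; the arithmetic is the
# standard "nines" calculation (hours in a year x unavailable fraction).

HOURS_PER_YEAR = 24 * 365

tiers = {
    "Storage media, network hardware": 0.99999,    # five nines
    "Storage, servers, RIS, PACS broker": 0.9999,  # four nines
    "Critical-use diagnostic workstations": 0.999,
    "Clinical review / teleradiology workstations": 0.99,
}

for component, availability in tiers.items():
    downtime_hours = HOURS_PER_YEAR * (1 - availability)
    print(f"{component}: {availability:.3%} -> "
          f"~{downtime_hours * 60:.0f} minutes of downtime per year")
```

At five nines that works out to roughly five minutes of downtime a year, versus nearly four days at 99 percent, which is why the costliest tiers are reserved for the components where interruptions hurt most.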

“None of us have unlimited funding when we’re dealing with implementation projects,” Koller says. “[Smith’s recommendations] will probably give you the best balance between the cost and the availability you can achieve with the money.”


Virtual availability

One of the means by which high availability can be achieved is a virtual storage network. Virtual storage allows data to be accessed across an enterprise through a single interface, which routes user requests to the data repository regardless of its physical location, such as spinning disk or tape, and delivers the data regardless of the operating system platform from which the request was generated.

Storage virtualization can be accomplished with file-based systems, such as network attached storage (NAS), via a distributed file system across a redundant array of independent nodes. It also can be used for in-box systems such as a storage area network (SAN) by virtualizing the heterogeneous disks behind the system.
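
No specific product or design is described here, but the underlying idea can be sketched in a few lines of Python; the class and method names below are hypothetical, meant only to show requests being routed to whichever tier physically holds the data.

```python
# Conceptual sketch of storage virtualization: callers ask one interface for a
# study and never learn whether it came from spinning disk, tape, or another
# node. All class and method names are illustrative, not from any vendor.

from abc import ABC, abstractmethod


class StorageBackend(ABC):
    @abstractmethod
    def contains(self, study_id: str) -> bool: ...

    @abstractmethod
    def read(self, study_id: str) -> bytes: ...


class DiskTier(StorageBackend):
    def __init__(self, store: dict):
        self.store = store

    def contains(self, study_id):
        return study_id in self.store

    def read(self, study_id):
        return self.store[study_id]


class TapeTier(DiskTier):
    """Same interface; in reality reads would be far slower."""


class VirtualStore:
    """Single access point that hides the physical location of data."""

    def __init__(self, backends):
        self.backends = backends

    def read(self, study_id: str) -> bytes:
        for backend in self.backends:          # location is resolved here,
            if backend.contains(study_id):     # not by the requesting user
                return backend.read(study_id)
        raise KeyError(f"{study_id} not found in any tier")


store = VirtualStore([DiskTier({"CT-123": b"pixels"}),
                      TapeTier({"CT-001": b"old pixels"})])
print(len(store.read("CT-001")))  # caller neither knows nor cares it sat on 'tape'
```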

“Virtualization provides better performance, scalability and redundancy,” Koller says.

According to Koller, the benefits of virtualization in healthcare are lower costs and faster response to change. In addition, it allows non-disruptive upgrades and migrations as well as faster recovery from unplanned events.

Citing a Gartner Group study, Koller notes that by 2010, virtualization will be the most important technology in IT.

A new storage virtualization technology enjoying some currency outside the healthcare archive environment is multi-system capacity recovery. This schema recovers unused disk space across application servers, and in some cases other systems, and aggregates it into a network storage grid. Koller notes that multi-system capacity recovery’s performance with diagnostic image archives is unknown.
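
The mechanics of such a grid are not detailed here; the fragment below is only an assumed illustration of the general idea, with invented server names and capacities, showing scattered free space being pooled and allocated as one logical resource.

```python
# Hypothetical sketch of the idea behind multi-system capacity recovery:
# unused space reported by individual application servers is pooled into one
# logical "grid" that can be allocated from. Names and figures are invented.

free_space_gb = {
    "billing-app-server": 120,
    "ris-app-server": 310,
    "web-portal-server": 85,
}


def grid_capacity(pool: dict) -> int:
    """Total unused capacity recovered across participating systems."""
    return sum(pool.values())


def allocate(pool: dict, needed_gb: int) -> dict:
    """Greedily carve the requested amount out of whichever servers have room."""
    allocation, remaining = {}, needed_gb
    for server, free in sorted(pool.items(), key=lambda kv: -kv[1]):
        take = min(free, remaining)
        if take:
            allocation[server] = take
            remaining -= take
    if remaining:
        raise RuntimeError("grid does not have enough recovered capacity")
    return allocation


print(grid_capacity(free_space_gb))   # 515 GB visible as a single pool
print(allocate(free_space_gb, 400))   # spread across the servers with free space
```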

The critical elements in deploying a virtual storage system are standards and standardization. “When you start dealing with proprietary anything, it starts closing your options,” Koller notes. “You need to have a management environment that understands virtualization so that you can take advantage of automation. That allows you to be more responsive to your user community and minimize the downtime. The last, most critical piece is that you need to get buy-in (for virtualization) from your clinical application vendor. That, unfortunately, is why we’re one of the slower adopters of this technology.”


Advanced visualization restructures architecture

The use of large image datasets with advanced visualization capabilities has demonstrated great clinical utility; however, the increase in near-isotropic image datasets presents a non-trivial data management and informatics challenge for PACS administrators, according to Paul J. Chang, MD.

Chang, professor and vice-chairman of radiology informatics, medical director of pathology informatics at the University of Chicago School of Medicine, and medical director of enterprise imaging at the University of Chicago Hospitals, discussed advanced visualization archive strategies during the 2008 DHIMS conference.

He notes that there is a definite trend toward on-demand archive design, with less emphasis on hierarchical storage models.

“The penalty for migration of very large datasets from slower media is increasingly unacceptable,” he says.

Users of advanced visualization tools want on-demand study retrieval across the image enterprise. System administrators seeking to meet the advanced visualization archive challenge have six models, with variations, from which to select, according to Chang, each with advantages and disadvantages to weigh before determining which best meets the needs of an institution.

The Selective Archive model has the advantages of being trivial to implement and permanently storing only “thick” slices, which is the least demanding on PACS archive storage requirements, Chang says. However, because only the stored key images are available, interactive advanced visualization cannot be used on prior or even relatively recent studies.

“It’s not able to take full advantage of future visualization and analysis tools on prior studies,” he notes.

A Selective Archive model with a temporary cache strategy allows the use of interactive advanced visualization on all current and relatively recent prior studies, depending on the size of the cache, Chang says. The downside is that once prior studies are no longer resident in the temporary cache, these tools can no longer be used on them.
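
Chang does not specify how the temporary cache is managed; one plausible approach, sketched below purely as an assumption, is a least-recently-used policy in which the oldest untouched priors are evicted once the cache fills.

```python
# Sketch of a temporary thin-slice cache under a least-recently-used policy,
# one assumed (not article-specified) way to keep "relatively recent" priors
# available for interactive advanced visualization. Sizes are arbitrary.

from collections import OrderedDict
from typing import Optional


class ThinSliceCache:
    def __init__(self, capacity_studies: int):
        self.capacity = capacity_studies
        self._cache = OrderedDict()

    def add(self, study_id: str, thin_slices: bytes) -> None:
        self._cache[study_id] = thin_slices
        self._cache.move_to_end(study_id)
        while len(self._cache) > self.capacity:
            evicted, _ = self._cache.popitem(last=False)  # oldest study drops out
            print(f"{evicted}: thin slices gone from cache")

    def get(self, study_id: str) -> Optional[bytes]:
        if study_id in self._cache:
            self._cache.move_to_end(study_id)  # mark as recently used
            return self._cache[study_id]
        return None                            # only thick slices / key images remain


cache = ThinSliceCache(capacity_studies=2)
cache.add("CT-2008-001", b"...")
cache.add("CT-2008-002", b"...")
cache.add("CT-2008-003", b"...")   # evicts CT-2008-001
print(cache.get("CT-2008-001"))    # None: this prior fell out of the temporary cache
```

The eviction step is exactly the trade-off Chang describes: once a prior falls out of the cache, interactive advanced visualization on it is no longer possible.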

A Complete Dataset model permanently stores both thick and thin slices, which allows advanced visualization technology to be used on all current and prior studies. The disadvantage is that this greatly increases the PACS permanent archive storage requirements, Chang notes. “Storing both thick and thin slices within a PACS archive is wasteful and redundant.”
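
Some purely illustrative arithmetic makes the redundancy concrete; the slice thicknesses, coverage, and matrix size below are assumptions chosen for the example, not figures from Chang.

```python
# Purely illustrative arithmetic (parameters are assumptions, not from Chang):
# size of thick vs. thin reconstructions for a single CT acquisition, to show
# what storing both in the permanent archive duplicates.

coverage_mm = 400             # assumed scan coverage (e.g., a chest CT)
matrix_bytes = 512 * 512 * 2  # one uncompressed 512x512 slice at 16 bits/pixel


def series_size_mb(slice_thickness_mm: float) -> float:
    slices = coverage_mm / slice_thickness_mm
    return slices * matrix_bytes / 1e6


thin_mb = series_size_mb(0.625)   # near-isotropic thin slices
thick_mb = series_size_mb(5.0)    # conventional thick slices for review

print(f"thin series:  ~{thin_mb:.0f} MB ({coverage_mm / 0.625:.0f} slices)")
print(f"thick series: ~{thick_mb:.0f} MB ({coverage_mm / 5.0:.0f} slices)")
print(f"archiving both stores ~{100 * thick_mb / thin_mb:.0f}% extra data that "
      f"could be re-derived from the thin slices on demand")
```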

A Complete Dataset model with a Pseudo-Integrated Workstation scheme provides the advantage of making advanced visualization tools available via a “plug-in” and is less disruptive to normal workflow, Chang observes. Like the previous model, the drawback to this schema is that it calls for the storage of both thick and thin slices.

A Complete Dataset archive with an integrated workstation model permanently stores thin slices, which permits the use of advanced visualization components on all current and prior studies. Integrated tools in the workstation provide convenience to the interpreting physician and support workflow requirements, and the architecture eliminates the inefficiency of storing redundant thick slices, Chang says. The disadvantage is that it increases the PACS permanent archive storage requirements.

The Complete Dataset archive with thin-client/server communication model holds the most advantages, according to Chang.

He says that it permanently stores thin slices, allowing the use of current and future advanced visualization software on prior and current studies; it supports workflow requirements; it allows for automated generation and transmission of thin slices; it reduces hardware and network requirements for workstations; it can improve client-workstation performance; and it allows for the utilization of full-featured advanced visualization tools via web viewers. However, it does increase PACS archive storage requirements.
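
As with the other models, the description stays at the architecture level; the sketch below is an assumed illustration of the interaction pattern, in which the thin-slice volume never leaves the server and the client receives only small rendered views. All names and dimensions are hypothetical.

```python
# Assumed illustration of thin-client/server advanced visualization: the full
# thin-slice volume stays server-side; the client only ever requests and
# receives small rendered views. Names and parameters are hypothetical.

import numpy as np


class RenderingServer:
    """Holds the thin-slice volume next to the archive."""

    def __init__(self, volume: np.ndarray):
        self.volume = volume                 # e.g. 640 x 512 x 512 voxels

    def render_mpr(self, plane: str, index: int) -> np.ndarray:
        # Server-side multiplanar reformat: only a single 2D image is returned.
        if plane == "axial":
            return self.volume[index]
        if plane == "coronal":
            return self.volume[:, index]
        return self.volume[:, :, index]      # sagittal


class ThinClientViewer:
    """A web viewer that never downloads the full dataset."""

    def __init__(self, server: RenderingServer):
        self.server = server

    def show(self, plane: str, index: int) -> int:
        image = self.server.render_mpr(plane, index)
        return image.nbytes                  # payload actually sent to the client


volume = np.zeros((640, 512, 512), dtype=np.int16)   # ~336 MB stays on the server
viewer = ThinClientViewer(RenderingServer(volume))
print(viewer.show("coronal", 256), "bytes cross the network for one view")
```

Because only rendered views cross the network, workstation hardware and bandwidth needs stay modest even as archived datasets grow, which is consistent with the advantages Chang lists for this model.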

The need for more storage to meet advanced visualization demands may not play as negative a role as one might think, primarily due to technology advances and economies of scale in the image archive market.

“Current and even near-future storage requirements for large image datasets track reasonably well with continuously improving technical and economic efficiencies related to mass storage,” Chang notes. “These trends have significantly influenced archive persistence models for large image datasets.”
