How industry trends are shaping hospital infrastructure requirements

Austin Park, Principal Consultant, LCMC Health Epic Infrastructure Team, Sapphire Health

Consolidation, reduced IT spending, increasing requirements for real-time patient data, and the need to manage tremendous volumes of data are dictating how hospitals shape their IT infrastructure.

Consolidation drives need for agility

Over the past three to five years, consolidation has become more common in the healthcare industry than at any time in its history. In the past, the biggest challenges were integrating medical staff, nursing staff, and other clinicians while working to streamline processes for efficiency.

Today, while consolidation and efficiency at the service-line level are still important, one of the most difficult and expensive tasks organizations face is integrating disparate IT systems and moving and protecting critical data. While other industries have successfully used public cloud infrastructure for many of these tasks, healthcare carries regulatory and privacy burdens that many other industries lack, limiting the viability of some cloud solutions.

Hospitals are faced with the added challenge that not all data is created equal, requiring varying levels of protection. Healthcare institutions, especially those engaging in, considering or preparing for M&A activities, must give special consideration to both data security and agility.

The goal is the ability to move data between the institution's data centers, private clouds, cloud service providers, and cloud hyperscalers, while retaining local control of essential data. This protects critical patient and business data while adhering to healthcare data governance requirements.
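
As a rough illustration of this kind of tiering, the sketch below (written in Python, with hypothetical sensitivity labels and placement targets rather than any vendor's actual API) shows how a simple policy might keep protected health information on local infrastructure while allowing less sensitive data to move to a public cloud tier.

    # Illustrative sketch only: a simplified data-placement policy.
    # Sensitivity labels and target names are hypothetical, not a real API.
    from dataclasses import dataclass

    @dataclass
    class Dataset:
        name: str
        sensitivity: str  # "phi", "business-critical", or "general"

    # Hypothetical placement targets, ordered from most to least controlled.
    PLACEMENT_POLICY = {
        "phi": "on-prem-datacenter",           # protected health information stays local
        "business-critical": "private-cloud",
        "general": "public-cloud-object-tier",
    }

    def place(dataset: Dataset) -> str:
        """Return the storage target this dataset should land on under the policy."""
        # Default to the most controlled tier when a label is missing or unknown.
        return PLACEMENT_POLICY.get(dataset.sensitivity, "on-prem-datacenter")

    for ds in [Dataset("mri-archive", "phi"),
               Dataset("claims-feed", "business-critical"),
               Dataset("public-web-assets", "general")]:
        print(f"{ds.name} -> {place(ds)}")

The point is the default: anything unlabeled falls back to the most controlled tier, which mirrors the local-control requirement described above.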

Reduced IT spending drives need for simplicity

Hospital and health system net profit margins have declined significantly and, for many organizations, are at all-time lows. While IT spending in healthcare lags behind many other industries, averaging 2 to 3 percent of overall operational spend, technology leaders are being asked to become leaner and more efficient, even with the additional workload required to support advanced clinical systems and manage consolidation efforts.

Effectively meeting this mandate requires leaders to attack the problem on multiple fronts. Capital expenditures are typically addressed first, and with good reason: those reductions are immediate and carry fewer political implications.

Operational efficiencies include not only reducing general OPEX but also rethinking the business, and often the processes by which the job is done. In an industry rife with tradition, IT administrators often get stuck in the "we have always done it this way" mentality. While new thinking lays the groundwork for new opportunities, change management can be complex, difficult, painful and, in many cases, outside the leader's core competencies.

The reality is that all of these areas must be evaluated and streamlined, as each contributes to overall efficiency. Technology architectures have evolved significantly over the past several years, and modern infrastructures lend themselves to agility and to time-saving automation of historically manual tasks. With the challenges facing healthcare today, IT leaders must demand a top-to-bottom re-evaluation of their critical technology infrastructure, as well as the processes that support it.

Shift from volume- to value-based care drives need for access to real-time patient data

For years, critical patient data was housed on “Hospital Information Systems,” and it was acceptable for some amount of downtime to occur for system upgrades, enhancements, and other operational maintenance. 

Today, clinicians, nurses, and other caregivers throughout the institution rely on IT systems to inform the hospital's mission-critical work: patient care. Downtime of any kind is no longer acceptable, and even the best available downtime protocols are insufficient to maintain effective clinical management when systems become unavailable.

Healthcare organizations need to evaluate the entire "stack" that supports these critical systems, including the physical plant (electrical systems, HVAC, etc.), data storage, network, compute, and the applications themselves, to ensure these systems are engineered to deliver always-on availability.
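
A useful back-of-the-envelope exercise is to treat those layers as a serial chain: if every layer must be up for clinicians to reach the application, the layers' availabilities multiply, so the weakest link dominates the result. The short Python sketch below uses hypothetical availability figures, not measurements from any real deployment, to show how quickly strong per-layer numbers erode end to end.

    # Illustrative sketch only: end-to-end availability of a serial stack.
    # Layer names and figures are hypothetical examples, not measurements.
    from functools import reduce

    layers = {
        "power/HVAC":  0.9999,
        "network":     0.9995,
        "storage":     0.9999,
        "compute":     0.9995,
        "application": 0.9990,
    }

    # When every layer is required, availabilities multiply.
    overall = reduce(lambda a, b: a * b, layers.values())
    downtime_minutes_per_year = (1 - overall) * 365 * 24 * 60

    print(f"overall availability: {overall:.4%}")
    print(f"expected downtime: ~{downtime_minutes_per_year:.0f} minutes/year")

Even with every layer at 99.9 percent or better, the stack as a whole lands near 99.78 percent, on the order of 19 hours of downtime a year, which is why each layer has to be engineered for redundancy rather than assuming the others will compensate.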

Anyone in healthcare IT understands that, in most cases, the clinical applications an institution relies on do not support the same levels of redundancy and failover that the foundational technology can, so every layer of the system must be designed and engineered to provide these capabilities. This is one reason organizations should work with solution providers like NetApp that have the experience and expertise to support clinical applications, and the certifications and validations to prove it.

Growing volumes of clinical data drive need for an infrastructure designed and built to scale

As financial pressures have escalated, three-year technology refresh cycles have evolved into five-year refresh cycles. Making do with what you have has become the norm. 

To avoid getting caught in this trap, savvy IT leaders invest in systems that can scale over time rather than being locked into finite refresh cycles, over and over. Moore's Law has created an industry that moves faster than anyone imagined a few decades ago, and as a result, investments must be future-proofed.

Imagine an organization that purchased a storage array three years ago with 300 GB, 15K RPM spinning disks from any one of a number of storage vendors. When the organization needs to expand its storage today, its IT leaders are told they must purchase entirely new systems to take advantage of today's solid-state flash technologies.

Rather than simply doing what has always been done and embarking on yet another forklift upgrade, organizations should evaluate the market and embrace technologies that future-proof against the need for complete system overhauls. IT administrators should insist that their storage providers offer a way to mix drive capacities in the future and to upgrade code without taking systems offline. Controllers and shelves should be easy and seamless to add, without making users endure downtime.
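
As a simple illustration of that scale-over-time principle, the Python sketch below models a storage pool that accepts drives of mixed capacities as they are added over the years. The class, method, and drive names are hypothetical and are not meant to represent any vendor's product.

    # Illustrative sketch only: a pool that grows with mixed drive capacities
    # instead of forcing a wholesale replacement. Names are hypothetical.
    class StoragePool:
        def __init__(self):
            self.drives = []  # list of (model, capacity_tb) tuples

        def add_drives(self, model: str, capacity_tb: float, count: int) -> None:
            """Expand the pool in place; existing drives keep serving I/O."""
            self.drives.extend([(model, capacity_tb)] * count)

        @property
        def raw_capacity_tb(self) -> float:
            return sum(cap for _, cap in self.drives)

    pool = StoragePool()
    pool.add_drives("15K-300GB-HDD", 0.3, 24)   # original purchase
    pool.add_drives("NVMe-7.6TB-SSD", 7.6, 12)  # later expansion, mixed in
    print(f"raw capacity: {pool.raw_capacity_tb:.1f} TB across {len(pool.drives)} drives")

The design choice worth noticing is that expansion happens in place: the original drives keep serving data while new, denser media join the same pool, rather than forcing a migration to an entirely new system.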

IT leaders who take a close look now and evaluate what can be done to update, improve, and future-proof their hospital's infrastructure will see that effort pay dividends in the form of always-on operation for every hospital, clinician, and patient, now and in the years to come.

To learn more about how Sapphire Health and NetApp teamed to build a state-of-the-art infrastructure for UMC New Orleans in just three months, click HERE.