Managing healthcare these days is as much about managing data as it is about managing patients. The tsunami of data washing over the healthcare industry is the result of technological advancements and regulatory requirements converging all at once. But when it comes to saving lives, the healthcare industry cannot allow IT deficiencies to become the problem rather than the solution.

The healthcare system generates about a zettabyte (a trillion gigabytes) of data each year, with sources including electronic health records (EHRs), diagnostics, genetics, wearable devices and much more. While this data can help improve our health, reduce healthcare costs and predict diseases and epidemics, the technology used to process and analyze it is a major factor in its value.

According to a recent report from International Data Corporation, the volume of data processed in the overall healthcare sector is projected to increase at a compound annual growth rate of 36 percent through 2025, significantly faster than in other data-intensive industries such as manufacturing (30 percent projected CAGR), financial services (26 percent) and media and entertainment (25 percent).
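
To get a feel for what a 36 percent compound annual growth rate implies, the short Python sketch below compounds that rate over a few years. The baseline volume is an arbitrary illustrative assumption, not a figure from the IDC report.

    # Illustrative only: the baseline volume is an assumption, not an IDC figure.
    start_volume = 1.0   # baseline data volume in year 0 (arbitrary units)
    cagr = 0.36          # 36 percent compound annual growth rate

    for year in range(1, 6):
        projected = start_volume * (1 + cagr) ** year
        print(f"Year {year}: {projected:.2f}x the baseline volume")

At that rate, the volume more than quadruples within five years.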

Healthcare faces many challenges, but one that cannot be ignored is information technology. Without adequate technology to handle this growing tsunami of often-complex data, medical professionals and scientists can’t do their jobs. And when they can’t, we all pay the price.

Electronic Health Records

Over the last 30 years, healthcare organizations have moved toward digital patient records, with 96 percent of U.S. hospitals and 78 percent of physicians’ offices now using EHRs, according to the National Academy of Medicine. A recent report from market research firm Kalorama Information states that the EHR market topped $31.5 billion in 2018, up 6 percent from 2017.

Ten years ago, Congress passed the Health Information Technology for Economic and Clinical Health (HITECH) Act and invested $40 billion in health IT implementation.

The adoption of EHRs is supposed to be a solution, but instead it is straining an overburdened healthcare IT infrastructure. This is largely because of the lack of interoperability among the more than 700 EHR providers. Healthcare organizations, primarily hospitals and physicians’ offices, end up with duplicate EHR data that requires extensive (not to mention non-productive) search and retrieval, which degrades IT system performance.

More Data, More Problems

IT departments are struggling to keep up with demand. Like the proverbial Dutch boy with his finger in the dike, IT staff find it difficult to manage the sheer volume of data, much less meet the performance demands of users.

We can all relate to this problem. All of us are users of massive amounts of data, and we have little patience for slow downloads, uploads, processing or long waits for systems to refresh. IT departments are generally measured on three fundamentals: the efficacy of the applications they provide to end users, system uptime and speed (the user experience). Applications are getting more robust and systems are generally more reliable, but speed (performance) is a constant challenge that can worsen by the day.

From an IT investment perspective, improvements in technology have given us much faster networks, much faster processing and huge amounts of storage. Virtualization of the traditional client-server IT model has provided massive cost savings. New hyperconverged systems can also improve performance in certain instances. And cloud computing has given us economies of scale.

But costs will not easily be contained as the mounting waves of data continue to pound against the IT breakwaters.

Containing IT Costs

Traditional thinking about IT investments goes like this: We need more compute power; we buy more systems. We need faster network speeds; we increase network bandwidth and buy the hardware that goes with it. We need more storage; we buy more hardware. Costs continue to rise in proportion to the demand for the three fundamentals (applications, uptime and speed).

However, there are solutions that can help contain IT costs. Data Center Infrastructure Management (DCIM) software has become an effective tool for analyzing and then reducing the overall cost of IT. In fact, the U.S. government’s Data Center Optimization Initiative claims to have saved nearly $2 billion since 2016.

Other solutions that don’t require new hardware to improve performance and extend the life of existing systems are also available.

What is often overlooked is that processing and analyzing data depend on the overall system’s input/output (I/O) performance, also known as throughput. Many large organizations performing data analytics require a computer system to access multiple, widespread databases, pulling information together through millions of I/O operations. The system’s analytic capability depends on the efficiency of those operations, which in turn depends on the efficiency of the computer’s operating environment.
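
To make the point concrete, the short Python sketch below times reads of the same data using different request sizes; the file name and sizes are illustrative assumptions. On most systems, issuing many small requests typically delivers noticeably lower throughput than issuing fewer, larger ones, because each operation carries fixed overhead.

    # Minimal sketch: how I/O request size affects throughput.
    # The test file name and sizes below are illustrative assumptions.
    import os
    import time

    PATH = "iotest.bin"
    SIZE = 128 * 1024 * 1024   # 128 MB of test data

    # Create the test file once.
    if not os.path.exists(PATH):
        with open(PATH, "wb") as f:
            f.write(os.urandom(SIZE))

    def read_throughput(chunk_size):
        """Read the whole file in fixed-size chunks and return MB/s."""
        start = time.perf_counter()
        with open(PATH, "rb", buffering=0) as f:
            while f.read(chunk_size):
                pass
        elapsed = time.perf_counter() - start
        return (SIZE / (1024 * 1024)) / elapsed

    for chunk in (4 * 1024, 64 * 1024, 1024 * 1024):
        print(f"{chunk // 1024:>5} KB chunks: {read_throughput(chunk):8.1f} MB/s")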

In the Windows environment especially (which runs about 80 percent of the world’s computers), I/O performance degradation progresses over time. This degradation, which can lower the system’s overall throughput capacity by 50 percent or more, happens in any storage environment: Windows imposes a performance penalty because of server inefficiencies in the handoff of data to storage. It occurs in any data center, whether in the cloud or on premises, and it gets worse in a virtualized computing environment. In a virtual environment, the many systems all sending I/O up and down the stack to and from storage generate tiny, fractured, random I/O that creates a “noisy” environment and slows down application performance. Left untreated, it only worsens with time.
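
As a rough, hypothetical illustration of that effect (not a measurement of any real virtualized workload), the Python sketch below writes the same amount of data twice: once as a single large sequential write and once as thousands of small writes at random offsets. File names and sizes are assumptions chosen for illustration.

    # Minimal sketch: one large sequential write vs. the same data issued as
    # many tiny writes at random offsets (a stand-in for fractured, random I/O).
    import os
    import random
    import time

    SIZE = 64 * 1024 * 1024    # 64 MB of data in total
    SMALL = 4 * 1024           # 4 KB per small write
    payload = os.urandom(SIZE)

    def timed(label, fn):
        start = time.perf_counter()
        fn()
        print(f"{label}: {time.perf_counter() - start:.2f} s")

    def sequential():
        with open("seq.bin", "wb") as f:
            f.write(payload)
            f.flush()
            os.fsync(f.fileno())

    def fragmented():
        offsets = list(range(0, SIZE, SMALL))
        random.shuffle(offsets)            # same bytes, written in random order
        with open("frag.bin", "wb") as f:
            f.truncate(SIZE)
            for off in offsets:
                f.seek(off)
                f.write(payload[off:off + SMALL])
            f.flush()
            os.fsync(f.fileno())

    timed("one large sequential write", sequential)
    timed("many small random writes  ", fragmented)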

Even experienced IT professionals mistakenly think that new hardware will solve these problems. Since data is so essential to running organizations, they are tempted to throw money at the problem by buying expensive new hardware. While additional hardware can temporarily mask the degradation, targeted software can improve system throughput by 30 to 50 percent or more. Software like this has the advantage of being non-disruptive (no ripping and replacing hardware), and it can be transparent to end users because it works in the background. Thus, a software solution can handle more data by eliminating overhead, increase performance at a much lower cost and extend the life of existing systems.
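
As a purely hypothetical sketch of the general principle at work here (coalescing many small writes into fewer, larger I/O operations), and not a description of how DymaxIO or any other product is actually implemented, the Python example below buffers small application writes and flushes them in larger batches.

    # Hypothetical sketch of write coalescing: many small application writes
    # become a handful of larger I/O operations. Not any product's actual design.
    import os

    class CoalescingWriter:
        """Buffers small writes and flushes them as one larger write."""

        def __init__(self, path, flush_threshold=1024 * 1024):
            self._file = open(path, "wb")
            self._buffer = bytearray()
            self._threshold = flush_threshold   # flush once ~1 MB accumulates

        def write(self, data):
            self._buffer.extend(data)
            if len(self._buffer) >= self._threshold:
                self.flush()

        def flush(self):
            if self._buffer:
                self._file.write(self._buffer)  # one large I/O instead of many
                self._buffer.clear()

        def close(self):
            self.flush()
            self._file.close()

    # Usage: ten thousand 512-byte writes become a handful of ~1 MB writes.
    writer = CoalescingWriter("coalesced.bin")
    for _ in range(10_000):
        writer.write(os.urandom(512))
    writer.close()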

With the tsunami of data threatening IT, solutions like these should be considered in order to contain healthcare IT costs.

Download DymaxIO to solve I/O performance degradation and boost performance.