Healthcare data is sexy – well, in an offensive-lineman kinda way

While the industry’s attention is almost solely focused on AI, organizations will struggle if the basic blocking and tackling is ignored.



I’ve spent most of my career chasing the “sexy” parts of healthcare transformation, the technologies closest to patients and clinicians, the innovations that show up in demos, and the breakthroughs that make headlines. 

But the deeper I’ve gone into healthcare, the more my perspective has shifted. The unsexy layer has become the most interesting layer – the part of the game the commentators don’t talk about much, yet it decides everything. 

That’s why, in a recent HDM ROUNDtable with five Fellows from Harmony Healthcare IT – Tom Liddell (CEO), Brian Liddell (president and CFO), Jim Hammer (COO), Amanda Mais (vice president of data integration) and Erik Johnson (vice president of marketing), I started the discussion with a football analogy: if you look at any championship team, the highlights tend to focus on the quarterback, the receivers, the playmakers. But championships are won in the trenches, on the offensive line and defensive line. 

In healthcare, that line is data. 

And we are entering an AI era in which organizations are eager to apply new models to historical datasets, automate workflows and unlock new clinical and operational intelligence. The truth is uncomfortable but unavoidable – if the data layer isn’t sound and reliable, none of the “playmakers” can shine. 

Where we are today 

From Brian Liddell’s seat – the financial seat – the industry has already amassed an extraordinary volume of data – terabytes and petabytes across imaging and discrete data, spanning lifetimes of care. The asset is there. 

The problem is what happens next. 

Even something as basic as a patient moving from one health system to another still runs into friction, in the form of limited sharing, real cost and real risk. He named the two constraints most leaders feel immediately – financial feasibility and security exposure. And he pushed the conversation toward what I believe is the difficult end-state – not just health systems treating data as an asset, but patients being able to own and move it more easily across settings. 

Amanda Mais then took us one layer deeper, the layer that makes many “data strategy” discussions feel abstract until you hit reality. The data isn’t just siloed; it’s often locked inside complex, proprietary structures. What looks like a simple field on a screen can be something entirely different behind the curtain, encoded in ways that require interpretation, business logic and deep technical knowledge to reconstruct. 

In other words, even if you have access, you may not have usability. And even if you have usability in the application, you may not have portability outside of it. 

This all matters in a world where leaders keep saying, “Let’s unify the data” or “Let’s build a new analytics layer” or “Let’s train models on the history.” There is real work required to unlock meaning from the underlying structures at scale, across many systems, without breaking trust. 

Is the juice worth the squeeze? 

At that point in the conversation, I asked, with budgets tight and complexity high, is the juice worth the squeeze? 

Brian Liddell’s answer was honest: it depends on mission. If you’re pursuing the best patient care or research, or a complete longitudinal record that supports understanding not only what happened, but why it happened, then the value is real. But the financial environment is also constrained – rates make investments harder, and smaller organizations feel these pressures differently than large systems. 

Jim Hammer added the governance layer to the equation. Health systems often have immediate needs – migrations, acquisitions, urgent operational demands – and the “day job” can swallow the space required for long-term strategy. The CDIO/CDO movement is happening, he said, but slower than many hoped, in part because the urgent keeps beating the important. 

Tom Liddell described Harmony’s posture as trust-centric and efficiency-driven. Customers trust them with sensitive data and complex projects, and in return, Harmony’s mandate is to drive costs down and de-risk environments. But he also described something that should catch every leader’s attention – the asset is “burgeoning,” yet still largely untapped, especially when the time horizon is extended and you consider what becomes possible as the numerator and denominator of data expand over years. 

What progress looks like 

When I asked for examples of what “progress” looks like, the answer wasn’t a single shiny use case. It was a lifecycle. 

Hammer laid out a continuum that many organizations are beginning to pursue – migrate data from a legacy application, including discrete data, documents and even imaging; then de-identify it; then apply artificial intelligence/large language models across imaging studies, combined with reports and discrete data. 

The operational lesson is that repeatability depends on forethought. If you have the vision from the start, if governance and strategy are established upfront, the elimination of redundancy can be significant. But if you treat de-identification and secondary use as an afterthought, you create rework, repeat effort and additional cost. 

Tom Liddell added another reality – once data is enabled and consolidated, use cases multiply: cohort communication, analytics, audits, regulatory reporting and operational efficiency. Sometimes the transformation is simply moving from “a hundred different places” to one place where the data is accessible and usable – what he called “the ables,” being able to do things.

The playbook 

So, what’s the playbook for leaders trying to turn data into an institutional asset? 

Brian Liddell went straight to governance and mindset. If you have hundreds of legacy systems, the risk isn’t only that you have too many tools – it’s that you end up with an incomplete record if you only bring some of them forward. And after you accept the importance of completeness, you naturally end up at prioritization – what are you still paying maintenance on, what’s falling off vendor support, what’s disconnected from the network, what do records teams lack access to, and where can you create ROI by retiring cost and restoring access? 

Hammer reinforced the operational ingredient – cross-functional ownership with a regular cadence: leaders who can decide what happens to each application and, more importantly, what the entire lifecycle of that data should be. Some data is retention-only. Some becomes research-ready. Some becomes future optionality. And given what AI is already capable of, let alone what it will be capable of next, he argued it can be worth keeping data in inexpensive long-term storage rather than expunging it, because we don’t yet know what future models will unlock. 

Tom Liddell framed it as a three- to five-year commitment to data lifecycle management – more affordable in the long run, lower risk and a foundation for opportunities we can’t fully articulate yet. And then he landed on the truth hiding in plain sight – healthcare is an information business. We absorb information, and we try to give clinicians the best tools possible to do the right thing for the person in front of them. 

What data leaders should do 

If you’re a health data leader reading this, the temptation right now is to leap straight to AI. 

But if this Roundtable surfaced anything clearly, it’s that the AI moment is downstream of the data moment. And the data moment is downstream of governance, architecture and lifecycle discipline. 

The “unsexy” work of unlocking proprietary complexity, reducing fragmentation, consolidating access, sequencing ROI and building repeatable lifecycle strategies is the offensive line. When you build it, the rest of the field opens up. And when you don’t, the play collapses before it starts. 

Watch the full HDM ROUNDtable interview here.  

Mitchell Josephson is CEO of Health Data Management.
