
Healthcare Data Interoperability Tools: #3 of 3

This is Blog #3 in a 3-part blog series on HL7 and healthcare data interoperability.

In the first two blog posts of this series (click here to access Blog #1 and Blog #2), we showed you how to use Smolder to parse your HL7v2 data and build a medallion architecture within Databricks.

Let's look at the bigger picture, though. As a healthcare organization, you likely have many data sources beyond HL7v2. These sources might be based on standards like FHIR or on common data models like OMOP.

This raises some new questions: How do I take advantage of all of this data and perform quick analytics when it's being collected in so many different formats? How can I easily move between these formats? How can I integrate all of these different standards in the same pipeline?

These questions lead us to the overarching idea of healthcare interoperability: the ability to share and use data from various health standards securely and quickly. 

Designing an Interoperability Tool

Developing a well-designed interoperability tool is easier said than done. 

In our hypothetical architecture, we'll start with the assumption that each health data source is represented by its own data model: one data model for FHIR, a separate data model for OMOP, and so on. To move from one data model to another, we need to create mappings between these data models. Let's start simple and look at two data models: HL7v2 and FHIR.

In the diagram below, we see that we need to create two mappings: one to go from HL7v2 to FHIR, and one to go from FHIR back to HL7v2.
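To make the pair of mappings concrete, here is a minimal sketch in Python. The field positions and FHIR fields are heavily simplified and the function names are our own; a real pipeline would parse HL7v2 with a library like Smolder and emit full FHIR resources.

```python
# Illustrative sketch only: a simplified PID segment and a minimal FHIR
# Patient resource. Real HL7v2 messages and FHIR resources carry many
# more fields and edge cases than shown here.

def pid_to_fhir_patient(pid_fields):
    """Map a parsed HL7v2 PID segment (list of fields) to a FHIR Patient dict."""
    # PID-3 = patient identifier, PID-5 = name (family^given), PID-7 = birth date
    family, given = pid_fields[5].split("^")[:2]
    return {
        "resourceType": "Patient",
        "identifier": [{"value": pid_fields[3]}],
        "name": [{"family": family, "given": [given]}],
        "birthDate": pid_fields[7],
    }

def fhir_patient_to_pid(patient):
    """Inverse mapping: rebuild the simplified PID fields from a Patient dict."""
    fields = [""] * 8
    fields[3] = patient["identifier"][0]["value"]
    name = patient["name"][0]
    fields[5] = f'{name["family"]}^{name["given"][0]}'
    fields[7] = patient["birthDate"]
    return fields

pid = ["PID", "", "", "12345", "", "DOE^JANE", "", "19800101"]
patient = pid_to_fhir_patient(pid)
```

Note that even this toy example needs two hand-written functions for a single pair of data models, which is exactly what makes the pairwise approach expensive.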


Now, what if we add OMOP as a third data model into the mix? We now need to implement triple the number of mappings in our architecture. This highlights a key challenge for interoperability: data models have a many-to-many relationship, and implementing and maintaining all of the pairwise mappings becomes unrealistic as the number of data models grows.
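The scaling is easy to quantify: with n data models, each mapped directly to every other in both directions, you need n × (n − 1) mappings, so going from two models to three triples the count from 2 to 6:

```python
def direct_mappings(n):
    """Bidirectional mappings needed when every data model maps
    directly to every other data model."""
    return n * (n - 1)

# 2 models -> 2 mappings; 3 models -> 6; growth is quadratic.
counts = {n: direct_mappings(n) for n in range(2, 7)}
```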

One way to overcome this issue is to introduce an intermediate data model. Let's continue with the previous example, but designate the OMOP data model as the intermediate.

The diagram below illustrates the new architecture. For each additional data model, we only need to create a mapping between that data model and the intermediate data model. This significantly reduces the complexity, especially as the number of data models grows, since we no longer have to map each new data model to every existing one.
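A minimal hub-and-spoke sketch of this idea, assuming each model registers just two converters, one to and one from the intermediate model. The class, field names, and converters here are hypothetical, not part of any real interoperability library:

```python
class InteropHub:
    """Toy converter registry: every model maps only to and from
    a single intermediate (e.g. OMOP-like) representation."""

    def __init__(self):
        self._to_hub = {}
        self._from_hub = {}

    def register(self, model, to_hub, from_hub):
        """Register the pair of converters for one data model."""
        self._to_hub[model] = to_hub
        self._from_hub[model] = from_hub

    def convert(self, record, source, target):
        # Compose the two mappings: source -> intermediate -> target.
        return self._from_hub[target](self._to_hub[source](record))

hub = InteropHub()
# Hypothetical converters keyed on a shared "person_id" field in the
# intermediate representation.
hub.register("hl7v2",
             lambda r: {"person_id": r["pid"]},
             lambda i: {"pid": i["person_id"]})
hub.register("fhir",
             lambda r: {"person_id": r["id"]},
             lambda i: {"id": i["person_id"]})

patient = hub.convert({"pid": "12345"}, source="hl7v2", target="fhir")
```

With n data models plus the intermediate, this design needs only 2n converter functions in total, instead of the quadratic number required by direct pairwise mappings.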

There's a lot more that goes into creating a well-designed healthcare interoperability tool than we've discussed in this blog post; so much so that it could be the topic of its own multi-part series. To continue the discussion, join us at the Databricks Data + AI Summit, where we dive deeper into interoperability, its importance, and the solutions we've been developing to answer the questions outlined in this post.

Lovelytics is a preferred partner of Databricks and has helped many clients install and configure their Databricks instances. To learn more, please visit us at www.lovelytics.com/partners/databricks or connect with us via email at [email protected].

Healthcare interoperability is a focus area for Lovelytics and Databricks. Please join us at the Databricks Data + AI Summit, June 27–30, 2022, in San Francisco, to see healthcare interoperability in action. Click here to register to attend.
