Gap Analysis in HL7 Interface Deployment, Part 2

In Part 1 of this series on HL7 gap analysis, I covered the need to identify gaps before starting to configure an interface, and explained why gaps happen. Today, I’m going to talk about gap analysis steps and the limitations you need to plan for.

Gap Analysis Steps
1. Obtain vendor conformance profile.
Vendor analysts start out with an HL7 conformance profile for the product they’re deploying. The profile documents the specific trigger events and segments used by the application. It won’t necessarily cover every segment mentioned in the HL7 standard, just the relevant ones. The document also covers fields and related tables. The purpose, constraints, prerequisites, and expectations for each field need to be clearly defined in the vendor profile.
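As a rough, simplified illustration (real conformance profiles are more formal documents, and every value below is hypothetical), an excerpt for an ADT feed might look something like this:

```
ADT^A01 (Admit/Visit Notification): supported
  MSH  Message Header           Required
  EVN  Event Type               Required
  PID  Patient Identification   Required
    PID-3  Patient Identifier List   Required; facility MRN expected
    PID-5  Patient Name              Required; max length 250
    PID-8  Administrative Sex        Optional; restricted to F, M, U
  PV1  Patient Visit            Required
```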

2. Obtain hospital specs and/or message log.
If available, the hospital IT team provides up-to-date documentation for the applications and systems that will need to pull data from, or push data to, the new product. In practice, these specifications are often unavailable, incomplete, or outdated (they can change with every product upgrade and interface engine tweak), so reliable documentation can be hard to come by.

An alternative is to have the hospital share a sample message log with the vendor. The message log can be from a test or production system as long as it represents the “real thing” the vendor has to interface with. The sample message log should be relevant in terms of trigger events used, data exchange workflows, reference data (user tables) and customized data elements. The more use cases you can cover in the sample log, the fewer unidentified and unexpected gaps you can expect down the road.

If the log happens to contain protected health information, both parties need to be mindful of encryption, usage restrictions, and HIPAA compliance.
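For illustration, here’s the kind of message such a log contains. Everything below is synthetic (made-up identifiers and demographics), which is also the safest kind of sample to circulate:

```
MSH|^~\&|ADT1|GENERAL HOSPITAL|FLOWSHEET|ICU|202301150830||ADT^A01|MSG00001|P|2.6
EVN|A01|202301150830
PID|1||123456^^^GENHOSP^MR||DOE^JANE^Q||19740312|F|||123 MAIN ST^^SPRINGFIELD^IL^62701
PV1|1|I|ICU^201^A||||1234^WELBY^MARCUS
```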

3. Document gaps between the product and the hospital’s system.
This is mostly a manual task. The analyst simply opens the vendor conformance profile and the hospital spec (or sample message log), then starts comparing the two documents side-by-side on a computer screen. The analyst lists all the items that will need to be configured or customized. See Part 1 of the Gap Analysis series for details about gap types that analysts look for.
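Parts of this comparison can be scripted. Here’s a minimal sketch of the idea in Python, assuming pipe-delimited HL7 v2.x text and an invented list of expected fields; it simply counts how often each expected field is actually populated in a sample log:

```python
# Sketch: tally which fields a (hypothetical) vendor profile expects
# against the fields actually populated in a sample HL7 v2.x message log.
from collections import Counter

# Invented expectations; in practice these come from the vendor conformance profile.
EXPECTED_FIELDS = {("PID", 3), ("PID", 5), ("PID", 8), ("PV1", 19)}

def populated_fields(message: str):
    """Yield (segment ID, field index) pairs that carry a value in one message."""
    for line in message.strip().splitlines():
        fields = line.split("|")
        # Note: MSH numbering is off by one here because MSH-1 is the field
        # separator itself; good enough for a sketch.
        for i, value in enumerate(fields[1:], start=1):
            if value.strip():
                yield (fields[0], i)

def field_coverage(messages):
    """Count how many messages populate each expected field."""
    counts = Counter()
    for msg in messages:
        seen = set(populated_fields(msg))
        counts.update(field for field in EXPECTED_FIELDS if field in seen)
    return counts

# Usage: field_coverage(sample_log), where sample_log is a list of raw messages.
# Expected fields with low counts are candidates for the gap list.
```

It’s no replacement for reading the messages, but it helps flag which fields deserve a closer look.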

Gap Analysis Limitations
1. Complete, up-to-date documentation is hard to find.
That’s just a fact. Even when documentation exists, it still needs to be validated. It helps to have it on hand, but most of the time it’s simply not there.

2. Log querying is still manual.
With a manual gap analysis process, the limiting factor boils down to getting a sample big enough to catch the rare values that would impact the interface configuration. Yet the log needs to be small enough to be manageable on a human scale. Even with 5,000 to 10,000 messages on hand, chances are an analyst will only have time to sample 50 messages at a time and read through perhaps 500 messages per gap issue. In the real world, with tight project timelines, no one has the time to parse every single message. Or catch every single gap.
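To put a number on that: suppose a problematic value shows up in only 1 out of every 1,000 messages (an assumption for illustration). A quick back-of-the-envelope calculation shows how likely a 500-message reading pass is to miss it entirely:

```python
# Back-of-the-envelope: chance of missing a rare value during manual sampling.
# Assumes the value appears independently in 0.1% of messages.
rate = 0.001        # the value occurs in 1 out of 1,000 messages
sample_size = 500   # messages an analyst realistically reads per gap issue

p_miss = (1 - rate) ** sample_size
print(f"Chance of never seeing the value: {p_miss:.0%}")  # roughly 61%
```

In other words, even a diligent reading pass can miss a value that will break the interface in production.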

3. Maintenance is repetitive.
Every time an interface changes (or the semantics of the data flowing through it change), the gap analysis should be repeated for every affected interface. Again, in practice this is very difficult to achieve: it wouldn’t be just one gap analysis, but many more. Organizations sometimes choose to work reactively and solve problems on the fly rather than proactively reduce the risk.

Of course, this has an impact downstream during software implementations. Other teams run into bugs and defects that a constrained gap analysis failed to catch. The result is that vendors and hospitals spend more time iterating through corrections and fixes, both pre- and post-go-live.

These limitations are why we’re seeing a need to automate some of this work and increase gap identification earlier in the process.

Do you see risk reduction in gap analysis as a need in the industry? Do you see any other issues related to the gap analysis phase in HL7 interfacing? Share your thoughts in the comments.

Gap Analysis in HL7 Interface Deployment, Part 1

Nine times out of ten, when a hospital deploys a new software system, the new system needs to exchange data with existing information systems in order to deliver its expected value. Even with fully integrated vendors like Epic, hospitals still need to pull in flowsheet data from monitoring systems and medical devices through an interface.

Many vendors provide connectivity libraries, and most hospitals deploy interface engines to cope with connectivity and interfacing issues. But before they can use these tools, they need to work through a gap analysis phase.

What is Gap Analysis?
In this context, gap analysis is the phase in a deployment project where analysts map the data elements in the product they are installing to the corresponding elements in the hospital’s existing information systems. In most software or medical technology deployments, vendors are responsible for configuring and delivering a testable interface to the hospital.

Even when the new and legacy systems are based on the same HL7 version, the gap can be considerable:

[Figure: the gap between two systems based on the same HL7 version]

Why Do Gaps Exist?
Fundamentally, because HL7 v2.x is a loose standard.  HL7 was built to adapt to the many different environments where healthcare data integration was needed (as you know, every single provider organization is unique… 😉 ).  Standards developers aimed to ensure that messaging stayed independent of system architecture and that custom interfacing development was minimized.  The HL7 standard recommends an exchange format but also provides capabilities to extend and adapt the standard to real-world use.

The result is an 80-20 situation.  An HL7-compliant product will probably allow you to complete 80% of the interface with 20% of the effort.  For the remainder, you don’t need to change existing organizational processes or data structures. Instead, you adapt the way data is exchanged.  This strategy was fine for promoting adoption. But over the long run, as the number of interfaces grows and they become more complex, hospital IT teams can face significant challenges.  Especially when that custom 20% is different for each and every system…

Gap Sources
Gaps happen for several reasons.

1. Data structure: HL7 specifies (or rather, recommends) a data structure made up of trigger events, segments, fields, and data types. Because clinical processes and the data that represents them are fairly complex, the recommended structure has to account for that complexity, and deployed systems rarely conform to it in full. One example: a data structure gap based on the maximum length of data elements. What does the new system do if it receives a patient name longer than what its database can store? Does it truncate the data? How will clinical end-users react?
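To make the truncation question concrete, here’s a hedged sketch; the 50-character column and the truncate-and-flag policy are assumptions for illustration, not any particular product’s rules:

```python
# Sketch: handling a patient family name longer than the receiving database column.
# The 50-character limit and the truncate-and-flag policy are illustrative assumptions.
MAX_NAME_LENGTH = 50

def store_family_name(raw_name: str) -> tuple[str, bool]:
    """Return the value to store and a flag indicating whether data was lost."""
    if len(raw_name) <= MAX_NAME_LENGTH:
        return raw_name, False
    # Truncating silently hides the gap; the flag lets the interface team
    # log the event and warn clinical end-users that the name is incomplete.
    return raw_name[:MAX_NAME_LENGTH], True

value, truncated = store_family_name("VAN DER BERG-VILLANUEVA DE LA CRUZ Y FERNANDEZ DE CORDOBA")
if truncated:
    print(f"Family name truncated to {MAX_NAME_LENGTH} characters")
```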

2. Data tables: HL7 suggests (…this is the term used in the specification) data sets. You probably already know that HL7 v2.6 “suggests” 6 different values for patient gender. But most installed systems don’t handle the whole list. Even worse, they might use a completely different set of terms to indicate gender.
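In practice, this turns into a translation table during configuration. A minimal sketch, assuming a legacy system that only understands three values (the legacy codes are invented for illustration):

```python
# Sketch: mapping HL7 Table 0001 (Administrative Sex) codes to a legacy system
# that only handles three values. The legacy codes are hypothetical.
HL7_TO_LEGACY = {
    "F": "FEMALE",
    "M": "MALE",
    "O": "UNKNOWN",   # no "Other" concept on the legacy side: a gap to document
    "U": "UNKNOWN",
    "A": "UNKNOWN",
    "N": "UNKNOWN",
}

def map_gender(hl7_code: str) -> str:
    # Unexpected or locally customized codes surface here instead of silently
    # breaking downstream; they belong on the gap list too.
    return HL7_TO_LEGACY.get(hl7_code.strip().upper(), "UNKNOWN")
```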

3. Data meaning (or data semantics): There’s a well-known rule in the data exchange world: “Just because you call it the same thing doesn’t mean you mean the same thing.” Each HL7 interface interprets the standard through its own view of the world. For instance, which information would a system use to uniquely identify a patient? Several fields are potential candidates: PID-2 Patient ID, PID-3 Patient Identifier List, PID-18 Patient Account Number, PV1-19 Visit Number, and so on. A combination of fields can even be used. But would the same information, semantically speaking, always be in the same location across the various messages?
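One common way to pin this down during configuration is an explicit, ordered list of candidate fields. The sketch below is illustrative only; the priority order is an assumption that the gap analysis itself would have to confirm:

```python
# Sketch: resolving "the" patient identifier from a raw HL7 v2.x message.
# The priority order is an assumption to be confirmed per interface.
ID_CANDIDATES = [
    ("PID", 3),   # Patient Identifier List
    ("PID", 2),   # Patient ID (kept for backward compatibility in later versions)
    ("PID", 18),  # Patient Account Number
    ("PV1", 19),  # Visit Number
]

def resolve_patient_id(message: str):
    segments = {}
    for line in message.strip().splitlines():
        fields = line.split("|")
        segments.setdefault(fields[0], fields)  # keep the first segment of each type
    for seg_id, index in ID_CANDIDATES:
        fields = segments.get(seg_id)
        if fields and len(fields) > index and fields[index].strip():
            # First repetition, first component: the identifier value itself.
            return fields[index].split("~")[0].split("^")[0]
    return None
```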

4. Z-segments: These custom segments are used when the standard doesn’t provide guidance for a piece of information you need to exchange. But sometimes development teams resort to Z-segments to save time and cost, or to work around technical limitations. When you come across Z-segments, expect just about any piece of data you need from them to turn into a gap.
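For example, a vendor might tuck bed-management details into a custom segment like the one below. The ZBD segment and its fields are entirely made up, which is exactly the point: nothing in the standard tells you what they mean, so each field has to be documented and mapped by hand.

```
ZBD|1|ICU^201^A|ISOLATION^Y|HOUSEKEEPING^PENDING
```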

5. Legacy: Here the thought process is, “If it ain’t broke, don’t fix it.” But systems and data exchange mechanisms evolve. For instance, the way allergies were handled in early versions of the HL7 standard is different from the way they’re handled in the most recent ones. If you don’t have an interface engine, or if the software system doesn’t make these changes transparent, you’re going to be facing some gaps.

In the next blog post, I’ll be covering gap analysis steps and limitations. But there’s so much more to discuss. We want to hear about your experience with gap analysis – please let us know in the comments.

Pinpoint Software for HL7 Interface Gap Analysis

We’ve had great feedback from early users of Pinpoint software. They aren’t just using Pinpoint for interface troubleshooting. They’re also incorporating the software into their gap analysis workflow. In this context, gap analysis is the process of documenting the HL7 interface gaps between new software and legacy systems.

Here’s one way Pinpoint can make this process a little quicker.

Start with a Message Log
Analysts usually start with a raw message log from the provider organization, covering a few days of transactions. The log contains many segment types, reflecting whatever data the provider exchanges with the software currently in place.

Filter Out Unneeded Segments
Many of these segments won’t be relevant to the gap analysis, but with a plain text editor like Notepad, analysts have no choice but to wade through them. For instance, for a new flowsheet interface, an analyst might need to screen the PID (patient identification), PV1 (patient visit), and OBX (observation) segments, but not the insurance segments (IN1, IN2, IN3), which carry payer-related data.

With Pinpoint, analysts can filter out the extra segments before starting to document gaps. This means they don’t have to read and scroll through extraneous data, which makes a big difference when they’re sampling batches of 50 to 100 messages at a time, out of a total volume of 5,000 messages.

[Screenshot: filter out extra segments with Pinpoint (demo data generated by Caristix software)]
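If you’re curious what that filtering step amounts to under the hood, here’s a minimal sketch of the idea in plain Python; Pinpoint does this interactively, and the segment list below is just the flowsheet example from above:

```python
# Sketch: keep only the segments relevant to a flowsheet gap analysis.
KEEP = {"MSH", "PID", "PV1", "OBX"}

def filter_segments(message: str) -> str:
    kept = [seg for seg in message.strip().splitlines()
            if seg.split("|", 1)[0] in KEEP]
    return "\n".join(kept)

# Usage: filtered_log = [filter_segments(msg) for msg in raw_log]
```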

And that’s a quick tip for using Pinpoint during the HL7 interface gap analysis process. We’ll be posting an article on gap analysis workflow and potential pitfalls later this week.

How are you using Pinpoint? Share other uses in the comments below.

HL7 Interface Troubleshooting? Introducing Pinpoint Software

Now that we’ve gone live with our new website, we’ve also launched our first product, Caristix™ Pinpoint software for troubleshooting HL7 interfaces.

What Does Pinpoint Do?

If you’ve ever received an HL7 message log with thousands of messages and segments, you know you’re in for a few hours of scrolling… just to find a handful of messages that you need for a troubleshooting task. Pinpoint lets you find the target messages in a few minutes rather than a few hours.

Who Is Pinpoint Designed For?

Pinpoint is for interface analysts and engineers who work on implementing and maintaining HL7 interfaces. They work for HIT and healthcare technology vendors as well as hospitals.

Pinpoint 30-Day Trial

We hope you’ll take Pinpoint out for a spin. Download a trial.

More About Pinpoint

Check out the product page.

Any questions or comments? We’d love to hear back from trial users. Leave a comment or question below, or contact us at support@caristix.com.

Introducing the Caristix Blog

Hello and welcome to the new Caristix blog! In this first post, we’ll cover a little background about what you can expect from this blog. And we’ll introduce the people who’ll be writing here.

HL7 Is Where It’s At

We’ll be covering interfaces, integration, and interoperability in healthcare information technology. Topics will include:

  • software development and testing. What makes healthcare so different? We’ll be talking about that.
  • HL7 interface scoping and configuration. HL7 is all about configurability. Which is why… Every. Single. Implementation. Is. Different. (whether we like it or not)
  • interface maintenance and troubleshooting. Updates, bug fixes, upgrades, new products. Interfaces change all the time. How do you keep up?
  • customer acceptance testing. Test patients, test orders, test messages. How do we as an industry make this easier for clinicians and vendors?

Plus we’ll provide customer stories, product news, tips and tricks, and opinions. It’ll be a mix of technical and business topics for interface analysts, software developers, testers, team leaders, and healthcare executives.

Meet the Bloggers

Jean-Luc Morin

Jean-Luc is the R&D lead at Caristix. Jean-Luc chose to work on the issues we address at Caristix because of something he’s noticed each time he’s worked with hospitals both big and small. He says that whenever the subject turned to interfacing, it seemed like the walls would start to shake. So Jean-Luc focuses on delivering tools that will make software implementations and interface configuration accessible and effortless for both vendors and hospitals.

On a personal level, Jean-Luc loves getting knee-deep in data, syntactically and semantically speaking. And interfacing is the perfect blend of both.

Learn more about Jean-Luc on LinkedIn

Stéphane Vigot

Stéphane is the sales lead at Caristix. Stéphane chose to work on the issues that Caristix addresses because he knows we’ve identified a real need that interface analysts and their executives face. He feels that there has got to be a better, simpler way to deploy healthcare software and interfaces than what he’s seen in previous implementations: plain text editors for HL7 message logs and long lists of paper-based specs. Healthcare technology can be a whole lot easier for hospitals (and vendors) to digest if they have the right tools.

He’s also excited about working with the folks at Caristix. We’re a group of people who’ve worked together over the past several years, and we’re a team with a singular focus on HL7 and healthcare. So things look promising.

Learn more about Stéphane on LinkedIn

Sovita Chander

Sovita is the marketing lead at Caristix. Sovita chose to work on the issues that Caristix addresses because she’s seen the roadblocks that implementation teams face with interfacing, both on the business side with delayed go-lives and on the people side with user frustration and poor product adoption. With Caristix, there’s a real opportunity to change things and smooth out some of those bumps.

Sovita is really looking forward to the customer feedback. The most successful products she’s worked on have all included strong voice-of-the-customer input. And we want that at Caristix.

Learn more about Sovita on LinkedIn

Donald Marcotte

Donald is the QA/quality lead at Caristix. Throughout his career, he’s been responsible for making sure that product development teams actually deliver the products they say they’re going to. And he chose to work on the issues that Caristix addresses because of the complexity people face in healthcare. When it comes to product quality in healthcare, the stakes are high. And Donald thinks that’s a great incentive to work on tools that make product development and implementation smoother.

He sees the tools we’re developing at Caristix as technology enablers. For Donald, our work lets the big brains in healthcare — the technology innovators and pioneers — focus on making better products that help more patients. We just help them take care of getting the left hand to talk to the right.

Learn more about Donald on LinkedIn

Stay in Touch and Spread the Word

Bookmark this blog, email links to a friend/colleague, or add us to your RSS reader. We’d love to get your feedback on Caristix and this blog.

And of course, send us your requests (info@caristix.com). We want to know what you’d like to see us cover in the future.