
Low-code demo: transforming HL7 SIU messages to Kafka, then consuming Kafka messages to store them in IRIS via SQL


Production Configuration

This demo has an interoperability production with 16 items. 

Production Configuration: HL7 + Kafka Producer

The first part of this demonstration consists of sending an HL7 SIU file, which is forwarded to the two other HL7 flows (HTTP and TCP) and also transformed and sent to the Kafka server. The HTTP and TCP flows transform the HL7 messages in the same way before sending them to Kafka as well.

  • 3 HL7 Business Services
  • 1 HL7 router
  • 2 HL7 Business Operations
  • 1 Business Operation sending the transformed messages to Kafka

Business Rule

The production has a business process which is an HL7 router; it transforms HL7 messages and sends them to Kafka.
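
As a rough sketch, such a router is driven by a routing rule class. The class, transformation, and target names below are illustrative placeholders, not the actual ones shipped with this package:

Class demo.SIURouterRule Extends Ens.Rule.Definition
{

/// Assist class that tells the rule editor this rule is for an HL7 message router
Parameter RuleAssistClass = "EnsLib.HL7.MsgRouter.RuleAssist";

XData RuleDefinition [ XMLNamespace = "http://www.intersystems.com/rule" ]
{
<ruleDefinition alias="" context="EnsLib.HL7.MsgRouter.RoutingEngine">
<ruleSet name="SIU to Kafka" effectiveBegin="" effectiveEnd="">
<rule name="RouteSIU">
<when condition="1">
<send transform="demo.SIUToKafka" target="ToKafka"/>
<return/>
</when>
</rule>
</ruleSet>
</ruleDefinition>
}

}

The condition here is deliberately trivial (route everything); the real rule would typically constrain on the SIU document type before applying the transformation and sending the result to the Kafka operation.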

Data Transformation

The Data Transformation Builder lets you edit the definition of the transformation from HL7 v2 SIU sources into Kafka messages.
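
For orientation, such a transformation is a DTL class taking EnsLib.HL7.Message as its source and a Kafka message as its target. The class name, topic, and field mappings below are illustrative guesses rather than the exact transformation used by the demo:

/// Minimal DTL sketch from an HL7 v2.5 SIU message to a Kafka message;
/// the topic, key and value mappings assume the EnsLib.Kafka.Message target class
Class demo.SIUToKafka Extends Ens.DataTransformDTL [ DependsOn = (EnsLib.HL7.Message, EnsLib.Kafka.Message) ]
{

Parameter IGNOREMISSINGSOURCE = 1;

XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='EnsLib.HL7.Message' targetClass='EnsLib.Kafka.Message' sourceDocType='2.5:SIU_S12' create='new' language='objectscript'>
<assign value='"scheduling"' property='target.topic' action='set'/>
<assign value='source.{MSH:10}' property='target.key' action='set'/>
<assign value='source.OutputToString()' property='target.value' action='set'/>
</transform>
}

}

Here the whole HL7 message is serialized into the Kafka value with OutputToString(), with the MSH-10 control ID as the key; the actual demo may map individual fields instead.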

Visual Trace

After an HL7 message has been processed (e.g. by copying some messages from /data/HL7/test into the /data/HL7/in directory), you can see its Visual Trace, including the message with its I/O and the HL7 ACK.

Kafka Manager

Then you can check the messages in Kafka, using the Kafka Manager interface to fetch data from the different topics and look at the content of each one.

Production Configuration: Kafka Consumer + IRIS SQL

The second part of this demonstration consists of consuming Kafka messages and routing them to IRIS tables through SQL components.

  • 3 Kafka Business Services consuming 3 Kafka topics
  • 1 router
  • 3 SQL Business Operations inserting data into the IRIS database

Business Rule

The production has a business process which is a Kafka router; it sends Kafka messages to IRIS SQL components.

Visual Trace

Each time a message is consumed from a Kafka topic, it is sent to the Kafka router process, which performs content-based routing of the Kafka messages to the appropriate SQL tables in IRIS. If you look carefully at the messages, you can notice that each message is sent to IRIS without being transformed (same message ID). You can see here the message with its I/O and the SQL insert result.

SQL

You can then see the results inside the IRIS database through SQL queries.

  • TrakCare table
  • Surg table
  • And thanks to inheritance, you can also query all the data by just querying the root table, here data.kafka (see the query sketch below)
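
For instance, from an IRIS terminal or a class method, a couple of lines of dynamic SQL in ObjectScript read back what the consumers have stored (data.kafka is the root table mentioned above; adjust the table name to your own schema):

// Query the root table; rows stored in the child tables (TrakCare, Surg) are returned too, thanks to inheritance
Set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT TOP 10 * FROM data.kafka")
Do rs.%Display()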

ClassExplorer

The Class Explorer allows you to see the data model of the IRIS classes.

Default Settings

In order to simplify copying a production definition from one environment to another, and to ensure watertight separation between the parameters of the different environments, it is recommended to define settings outside of the production class, in the System Default Settings. These settings then appear in blue in the production configuration.
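
These settings are usually entered through the System Default Settings page of the Management Portal, but as a purely hypothetical sketch they can also be created programmatically; the host class, setting name, and value below are placeholders, not the demo's real configuration:

// Hypothetical example: register a system default setting outside the production class
Set ds = ##class(Ens.Config.DefaultSettings).%New()
Set ds.ProductionName = "*"                      // apply to any production
Set ds.ItemName = "*"                            // apply to any production item
Set ds.HostClassName = "EnsLib.Kafka.Operation"  // placeholder host class
Set ds.SettingName = "Servers"                   // placeholder setting name
Set ds.SettingValue = "kafka:9092"               // placeholder value
Set sc = ds.%Save()                              // sc is a %Status; 1 on success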

Prerequisites

Make sure you have Git and Docker Desktop installed.

Installation: ZPM

Open an IRIS namespace with interoperability enabled. Open a Terminal and call:

USER>zpm "install hl7v2-to-kafka"

Installation: Docker

1. Clone/git pull the repo into any local directory

$ git clone https://github.com/SylvainGuilbaud/hl7v2-to-kafka.git

2. Open the terminal in this directory and run:

$ docker-compose build

3. Run the IRIS container with your project:

$ docker-compose up -d

How to Run the Sample

  1. Copy some HL7 messages from /data/HL7/test into /data/HL7/in
  2. Check the Visual Trace
  3. See a full trace
  4. Go to the Kafka Manager and fetch data from the different topics