Hi Michael,

To summarize, HealthConnect is built on top of Ensemble. If you are getting HealthConnect, you must also have Ensemble. 

Here is the main difference: Ensemble is an Interface Engine that is used primarily to receive data in most common formats, transform it, and re-send the transformed data to another destination. Ensemble supports FTP File Transfer, SQL Transfer (with a connection to another DB), HTTP Transfer, and TCP Transfer. As part of TCP Transfer there is also support for the HL7 standard. The outbound support is the same as the inbound support. As part of Ensemble there are transformation languages (BPL, DTL) that allow you to transform your data.

HealthConnect has all of that, but because it is specialized it also implements, in addition to the above transfer modes, the IHE IT Infrastructure Technical Framework (if you search for it you will find its documentation on the IHE web site). In order to implement the framework, in addition to the Interface Engine you have to have a set of Configuration Databases (Registries), and this is what HealthConnect provides. In a nutshell, with HealthConnect you can process and transform not only HL7 messages but also Continuity of Care Documents (CCDs) - XML-based documents that are typically delivered via SOAP messages with a specialized SOAP Body (an IHE transaction) and a payload/attachment (the CCD document itself, typically Base64-encoded). HealthConnect also has a native intermediate XML format (SDA) and provides ready-made (and customizable) XSLT transforms that convert CCDs to and from this native format.

I hope this helps.

Hi Kurro,

1) The best way to debug SOAP issues is to enable SOAP Logging on the client side.

On your Cache server, use Terminal to zn to the namespace from which the request originates and type:

NAMESPACE> set ^ISCSOAP("LogFile")="C:\temp\SOAP.log"

NAMESPACE> set ^ISCSOAP("Log")="ios"

and then retry your GET/POST. You should see messages in the file above.

To turn off logging, type:

NAMESPACE> set ^ISCSOAP("Log")=""

2) If you try this you will likely see the same error in the SOAP log. So the most likely issue is the ContentType property of your HttpRequest object.

In your Business Operation, you probably have code like this somewhere:

Set tHttpRequest = ##class(%Net.HttpRequest).%New()

Set tHttpRequest.ContentType = "text/xml" // is this property set?

Here is what the class documentation says about its default behavior:

/// Sets/gets the 'Content-Type:' entity header field in the HTTP request. If it
/// is not specified and there is an <PROPERTY>EntityBody</PROPERTY> then it default
/// to 'text/html'.<p>

...

Property ContentType As %String [ Calculated ];
 

So if you used HttpRequest.EntityBody for your payload, you probably need to set ContentType explicitly.
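For illustration, here is a minimal sketch of an outbound call where the content type is set explicitly (the server, URL and payload below are placeholders, not taken from your code):

Set tHttpRequest = ##class(%Net.HttpRequest).%New()
Set tHttpRequest.Server = "my.endpoint.example.com"        // placeholder host
Set tHttpRequest.ContentType = "text/xml; charset=utf-8"   // set explicitly when you fill EntityBody
Do tHttpRequest.EntityBody.Write("<soap:Envelope>...</soap:Envelope>")
Set tSC = tHttpRequest.Post("/myservice")                  // placeholder path
If $$$ISERR(tSC) Do $System.Status.DisplayError(tSC)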

I hope this helps.

Hi Frances,

You are saying that you are getting an Order message. Is it an ORM message or an ORU message? (The latter typically contains OBX segments and is of type Observation/Result.)

In general, you would probably have to do something like the following:

1) During message processing, concatenate the content of every OBX:5 field into a single variable. To avoid running into the string-length ("Long String") limit in Cache, I would open a GlobalCharacterStream or FileCharacterStream and write each new OBX:5 value to the stream as you loop over the segments (see the sketch after this list).

2) In order to convert anything into PDF, you would need an external Rendering Engine. There is one I found on Open Exchange:

https://openexchange.intersystems.com/package/iris-pdf-generator

Also, Cache provides the Apache FOP PDF engine. Here is some documentation on how to run it:

https://xmlgraphics.apache.org/fop/

https://xmlgraphics.apache.org/fop/2.5/running.html

The remaining issue is how to invoke the rendering engine from your ObjectScript code. This thread might be helpful:

https://community.intersystems.com/post/how-create-pdf-file-html

You can then write your PDF to a file.

3) Once you have written your PDF to a file, the following article shows how to embed it into the HL7 message:

https://community.intersystems.com/post/ensemble-how-embed-pdf-file-hl7-...
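For step 1, a rough sketch of the loop could look like this (assuming pRequest is the inbound EnsLib.HL7.Message and the OBX segments sit at the top level of the document structure):

Set tStream = ##class(%Stream.GlobalCharacter).%New()
For i=1:1:pRequest.SegCount {
    Set tSeg = pRequest.GetSegmentAt(i)
    // OBX:5 is the Observation Value field
    If tSeg.Name = "OBX" Do tStream.Write(tSeg.GetValueAt(5))
}
// tStream now holds the concatenated OBX:5 content, ready for rendering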

This is a complex project, but the tools listed here should help you.
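One more note on step 2: if your version supports $ZF(-100), a rough sketch of shelling out to the FOP command line could look like the snippet below. FOP takes XSL-FO as input, so you would first wrap your text in a simple XSL-FO document; the file locations and the assumption that the fop launcher is on the PATH are mine, so adjust to wherever FOP lives on your server:

// Render C:\temp\report.fo (XSL-FO) into C:\temp\report.pdf using the FOP command line
Set tRC = $ZF(-100, "/SHELL", "fop", "-fo", "C:\temp\report.fo", "-pdf", "C:\temp\report.pdf")
If tRC'=0 Write "FOP returned error code ",tRC,!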

Hi Blakely,

I assume you already have code that goes over your XML and extracts data into the HL7 fields.

So when you get to the Par_Location element, do something like this:

Set tPar_Location_Text = "M071|M074|..."  // the pipe-delimited text of the Par_Location element
For i=1:1:$LENGTH(tPar_Location_Text,"|") {  // number of "|"-delimited pieces, not characters
    Set word = $PIECE(tPar_Location_Text,"|",i)
    Do createSegment("IVT", i, word, field3, ...)
}

Note I assume you have a routine that creates the segments...

I hope this helps.

Vitaly

I think Dmitriy is right. You would probably need to create an intermediate Object Model - a class or classes that model your XML structure and another class that models your JSON structure. The XML model extends %XML.Adaptor and imports your XML using tools like %XML.Reader, while the JSON model extends %JSON.Adaptor. Then you would step through your XML model, copying your XML properties to the properties of your JSON-aware class.
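To make this concrete, here is a minimal sketch assuming a simple <Patient> element with two fields (all class and property names below are made up for illustration):

Class Demo.Model.PatientXML Extends (%RegisteredObject, %XML.Adaptor)
{
Parameter XMLNAME = "Patient";
Property Name As %String;
Property DOB As %String;
}

Class Demo.Model.PatientJSON Extends (%RegisteredObject, %JSON.Adaptor)
{
Property Name As %String;
Property DOB As %String;
}

Reading the XML and producing JSON would then look something like this:

Set tReader = ##class(%XML.Reader).%New()
Set tSC = tReader.OpenStream(tXmlStream)              // tXmlStream: your incoming XML stream
Do tReader.Correlate("Patient", "Demo.Model.PatientXML")
While tReader.Next(.tXmlObj, .tSC) {
    Set tJsonObj = ##class(Demo.Model.PatientJSON).%New()
    Set tJsonObj.Name = tXmlObj.Name, tJsonObj.DOB = tXmlObj.DOB   // copy property by property
    Do tJsonObj.%JSONExportToString(.tJson)
    Write tJson,!
}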

Looking at the code, I see that EnsLib.RecordMap.Service.FileService extends EnsLib.RecordMap.Service.Standard. You can try to create a custom Business Service (for example, EnsLib.RecordMap.Service.StreamService) extending the same parent class and overriding the OnProcessInput method to process your stream.

One issue you might have is that FileService is bound to EnsLib.File.InboundAdapter, which does a lot of the work for the service. There are other out-of-the-box adapters - such as EnsLib.HTTP.InboundAdapter - that you could use here, but your code would have to be customized accordingly.
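A very rough, untested sketch of what that might look like (the class name is just an example, and I am assuming the generated RecordMap class exposes GetObject the way FileService uses it):

Class Demo.RecordMap.StreamService Extends EnsLib.RecordMap.Service.Standard
{
Parameter ADAPTER = "EnsLib.HTTP.InboundAdapter";

Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
{
    Set tStatus = $$$OK
    Try {
        // Parse records off the incoming stream with the configured RecordMap
        // and forward each object to the targets, much as FileService does
        While 'pInput.AtEnd {
            Set tStatus = $classmethod(..recordMapFull, "GetObject", pInput, .tObject)
            If $$$ISERR(tStatus) Quit
            Set tStatus = ..SendRequest(tObject, '..SynchronousSend)
            If $$$ISERR(tStatus) Quit
        }
    } Catch ex {
        Set tStatus = ex.AsStatus()
    }
    Quit tStatus
}
}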

Please note that this might not be officially supported by ISC.

I have not worked with DICOM for quite a while, but my first tack would be to enable logging on the receiving PACS and try to see what the errors look like.

Hi Mary,

I just went through a similar exercise, although in my case, I was importing JSON Data directly into SDA.

I can think of two ways of going about this:

The first one is what you are doing - that is, extracting JSON data using %Get(), %GetNext(), etc., and saving the data to the properties of the correct EnsLib.HL7.Message object (based on the message type). This is the most straightforward way, but I found that it has drawbacks. The biggest one is that you have to initialize every variable that is being saved to an object property. The reason is that you usually won't know whether a certain field that you expect is populated or not (unless you have a way to check this by running the JSON files through some kind of schema first). So if you try to store something that you expect but that does not exist in this particular JSON snippet, your processing will stop. Everything that is being set to a property should be initialized (usually to "").

Because you are on IRIS, you have another option. IRIS now supports JSON adaptors (I am not sure if your particular version supports them). You could create a data model: a separate object for each HL7 message, all extending %JSON.Adaptor. This is rather like working with %XML.Adaptor: you should be able to read the JSON messages directly, without using Dynamic Objects. The advantage would be that before storing the data in the HL7 Message object, you would store it in an intermediate object (there is a short sketch after the list below). I see the following advantages:

1) If you need to make a change because message structure changes, you would make a change only to the particular object, not the whole Business Process/Operation that would be extracting all of your JSON files.

2) It is easier to troubleshoot and maintain for the same reason.

3) It is easier to scale for different participants (if they have differing JSON objects).

4) These objects would be %Persistent so you could pass them to Methods and in Ensemble messages.
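To illustrate option 2, here is a minimal sketch of one such model class (the class and property names are made up; you would shape them after the actual JSON you receive):

Class Demo.JSONModel.ADT Extends (%Persistent, %JSON.Adaptor)
{
Property PatientId As %String;
Property FamilyName As %String;
Property GivenName As %String;
}

Reading an incoming message into it would then be as simple as:

Set tObj = ##class(Demo.JSONModel.ADT).%New()
Set tSC = tObj.%JSONImport(tJsonStream)   // tJsonStream: the incoming JSON as a stream (a string works too)
If $$$ISERR(tSC) Do $System.Status.DisplayError(tSC)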

That said, because you are dealing with JSON objects that mimic HL7 messages, this method may not give you much extra advantage. HL7 messages are discrete, so if you create a Business Service for each message you could create a Business Process for each one as well, making it easier to troubleshoot.

I hope this helps.

Yes, of course... in this case, you would return patients that have an instance of one Facility but not the other.

Hi George,

I would start by reading the following:

https://docs.intersystems.com/irislatest/csp/docbook/Doc.View.cls?KEY=EF...

This explains how the FTP Inbound Adapter works and how to create a Business Service with the ADAPTER parameter. As an example, take a look at the EnsLib.FTP.PassthroughService class.

An Ensemble FTP Service is essentially a poller - it connects to the remote FTP server and pulls files from the remote folder specified in the File Path setting on the Business Service.
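As a starting point, a minimal custom service bound to the FTP adapter could look something like this (the class and target names are placeholders; EnsLib.FTP.PassthroughService does essentially the same thing):

Class Demo.FTP.InboundService Extends Ens.BusinessService
{
Parameter ADAPTER = "EnsLib.FTP.InboundAdapter";

Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
{
    // pInput is the stream of one file pulled from the remote folder;
    // wrap it in a stream container and send it on to a target
    Set tMsg = ##class(Ens.StreamContainer).%New(pInput)
    Quit ..SendRequestAsync("Demo.Target.Process", tMsg)
}
}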