In the Caché product line, you would need to use Dynamic Objects. In order to iterate through a JSON object, you need to know its structure. Most likely it is a nested object, so you would have to drill down into it.

To read your JSON object into a %DynamicObject, use:

Set tDynObject = {}.%FromJSON(yourJSONString)
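
If you do not know the structure up front, a %DynamicObject also gives you an iterator you can walk. A minimal sketch, assuming tDynObject was set as above:

Set tIterator = tDynObject.%GetIterator()
While tIterator.%GetNext(.tKey,.tValue) {
    If $ISOBJECT(tValue) {
        // nested object or array: drill down by calling tValue.%GetIterator() in turn
        Write tKey," is a nested object/array",!
    } Else {
        Write tKey," = ",tValue,!
    }
}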

Here is the reference documentation for Dynamic Objects:

https://docs.intersystems.com/latest/csp/docbook/Doc.View.cls?KEY=GJSON

In the IRIS product line, in addition to Dynamic Objects, you have the JSON Adaptor. If your class inherits from %JSON.Adaptor, it can recognize JSON key/value pairs and assign them directly to the Properties of your object.

Here is a reference for JSON Adaptors:

https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls...
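
As a small illustration of the JSON Adaptor approach (the class name Demo.Patient and its properties are hypothetical, and yourJSONString is assumed to look like {"Name":"Smith","MRN":"12345"}):

Class Demo.Patient Extends (%RegisteredObject, %JSON.Adaptor)
{
Property Name As %String;
Property MRN As %String;
}

// then, in your code:
Set tPatient = ##class(Demo.Patient).%New()
Set tSC = tPatient.%JSONImport(yourJSONString)
If $SYSTEM.Status.IsOK(tSC) Write tPatient.Name,!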

This use case looks like a good fit for the ISC product "HealthConnect"... it is designed to accept data, transform it, and output new data. It includes built-in XSLT Transforms that can transform CCD Documents into SDA (the XML format for the ISC Data Model) and back to CCD.

If you have ISC HealthConnect, you can do the following:

1) Use one of the CCDA-to-SDA transforms to create an SDA XML Document

2) Use the TransformIntoObject() method of HS.Util.XSLTTransformer to get an SDA Object with Properties

3) Change whatever you need and save a new SDA Object

4) Use the TransformFromObject() method of the same class to get a new SDA Document

5) Use one of the SDA-to-CCDA XSLT Transforms to create a new CCDA Document.

All of this is basically out of the box... you just need to find a way to get your documents into the system. If you use FTP, you can either use the Ensemble FTP Service or write a script to transfer the files onto the file system, where an Ensemble File Service would pick them up.

You would probably need to create a Business Operation that invokes the Transforms and manipulates your data; after that you can pass your output to an FTP Operation that sends the files out to your recipients.

If you don't have HealthConnect, I would recommend getting it - XSLT is very fast compared to reading the XML directly into objects. ISC has basically solved this problem for you (for extra money, of course...). If that is not possible, you may need to write your own XML-to-object conversion... I don't think there is anything like that in Ensemble...

You can try the ParseFile() method of the class %XML.TextReader. The method returns a %Status, but its output parameter gives you a %XML.TextReader object. You can then loop over all of the nodes of the XML document using

While (textreader.Read()) {
    ...
}

Your XML Elements will be objects, and the Attributes of an Element will be strings. So your ClinicalDocument object will have Properties for its direct attributes, and a Property "Observation" which will be another object. To get the Attributes, you can do something like this inside your Element:

Do textreader.MoveToAttributeName("xmlns")
If textreader.LocalName = "xmlns" Set tXMLNS = textreader.Value

And so on for all attributes.

%XML.TextReader has other methods so please explore them as well...
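
Putting the pieces together, here is a minimal sketch; the file path is hypothetical, and the node handling is just an example of what you can pull out:

Set tSC = ##class(%XML.TextReader).ParseFile("c:\temp\ccd.xml",.textreader)
If $SYSTEM.Status.IsError(tSC) { Do $SYSTEM.Status.DisplayError(tSC) Quit }
While (textreader.Read()) {
    If (textreader.NodeType = "element") {
        Write "Element: ",textreader.Name,!
        // walk every attribute of the current element
        For i=1:1:textreader.AttributeCount {
            Do textreader.MoveToAttributeIndex(i)
            Write "  ",textreader.Name," = ",textreader.Value,!
        }
    } ElseIf (textreader.NodeType = "chars") {
        Write "  Text: ",textreader.Value,!
    }
}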


Hi Frances,

You say that you are getting an Order message. Is it an ORM message or an ORU message? (The latter typically contains OBX segments and is of type Observation/Result.)

In general, you would probably have to do something like the following:

1) During message processing, concatenate the content of every OBX:5 field into a single variable. To avoid the "long string" problem in Caché, I would open a GlobalCharacterStream or FileCharacterStream and keep writing each new OBX:5 field to the stream as you loop over the segments (see the sketch after this list).

2) In order to convert anything into PDF, you would need an external Rendering Engine. There is one I found on Open Exchange:

https://openexchange.intersystems.com/package/iris-pdf-generator

Also, Caché provides the Apache FOP PDF engine. Here is some documentation on how to run it:

https://xmlgraphics.apache.org/fop/

https://xmlgraphics.apache.org/fop/2.5/running.html

The remaining issue is how to invoke the rendering engine from your ObjectScript code (there is a small sketch of that after this list as well). This thread might be helpful:

https://community.intersystems.com/post/how-create-pdf-file-html

You can then write your PDF to a file.

3) Once you have written your PDF to a file, you can use the following article on how to embed it into the HL7 message:

https://community.intersystems.com/post/ensemble-how-embed-pdf-file-hl7-...
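
Returning to step 1, here is a minimal sketch, assuming you are in a custom Business Process or Operation method where pRequest is the inbound EnsLib.HL7.Message (the generic segment API is used so the code does not depend on schema-specific paths):

Set tStream = ##class(%Stream.GlobalCharacter).%New()
For i=1:1:pRequest.SegCount {
    Set tSeg = pRequest.GetSegmentAt(i)
    // append each OBX:5 (observation value) to the stream to stay clear of the long-string limit
    If tSeg.Name = "OBX" Do tStream.Write(tSeg.GetValueAt("5"))
}

For step 2, one common way to invoke an external renderer such as Apache FOP is to shell out with $ZF(-1) (on IRIS, $ZF(-100) is the preferred call). The command and file names below are placeholders; you would first need to produce the XSL-FO input from the text you collected:

Set tCmd = "fop -fo c:\temp\report.fo -pdf c:\temp\report.pdf"
Set tExit = $ZF(-1,tCmd)
If tExit '= 0 Write "FOP returned exit code ",tExit,!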

This is a complex project, but the tools listed here should help you.

Hi Blakely,

I assume you already have code that goes over your XML and extracts data into the HL7 fields.

So when you get to the Par_Location element, do something like this:

Set tPar_Location_Text = "M071|M074|..."
// $LENGTH with a delimiter returns the number of "|"-separated pieces
For i=1:1:$LENGTH(tPar_Location_Text,"|") {
    Set word = $PIECE(tPar_Location_Text,"|",i)
    Do createSegment("IVT",i,word,field3, ...)
}

Note that I assume you have a routine that creates the segments...

I hope this helps

Vitaly

Hi Steve,

Thank you for the answer. It does sound like upgrading to the IRIS product line would resolve the issue. The extension of %Set and %Get is a nice addition; I did not know about that. It means we can continue using Dynamic Objects, at least for outbound JSON, and rely on the stream>base64 option so the base64 encoding happens in the %ToJSON() call rather than beforehand.

In trying different things, I used Dynamic Objects to break the long string into chunks of at most 3,641,144 characters and store them as Dynamic Array elements. That allows us to get the data out of Caché without hitting the long string problem. This will solve our problem if the receiving system can accommodate an array with multiple elements and re-assemble the string.
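
For reference, a minimal sketch of that chunking approach, assuming the long value is held in a character stream tStream (for a plain string you could do the same with $EXTRACT):

Set tMaxChunk = 3641144
Set tChunks = []
Do tStream.Rewind()
While 'tStream.AtEnd {
    Do tChunks.%Push(tStream.Read(tMaxChunk))
}
Set tPayload = {}
Do tPayload.%Set("chunks",tChunks)
Write tPayload.%ToJSON()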

Hi Yakov,

I have not worked with these adapters. I looked at EnsLib.TCP.CountedXMLInboundAdapter; its (brief) description states that it is an adapter for an XTE server. Nonetheless, it does appear to read counted blocks from the TCP socket into a stream and write them into an XML-enabled object. I would try it with your custom Business Service.

There is an EnsLib.XML.TCPService which extends EnsLib.TCP.PassthroughService, but the only new thing it does is add processing for an XML SearchTableClass. There is also EnsLib.TCP.CountedXMLInboundAdapter (which inherits from Ens.Util.XML.Reader).

You can try to extend EnsLib.TCP.PassthroughService by overriding its OnProcessInput method and binding it to EnsLib.TCP.CountedXMLInboundAdapter (you may want to explore the adapter further to see whether it has any useful methods that might help you extract your XML).
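
Here is a bare-bones sketch of a custom Business Service bound to that adapter; the class names are hypothetical, and you should check the adapter's source for the exact type of pInput it delivers (shown here extending Ens.BusinessService for simplicity):

Class Demo.XML.TCPService Extends Ens.BusinessService
{
Parameter ADAPTER = "EnsLib.TCP.CountedXMLInboundAdapter";

Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
{
    // hand the input off to a Business Process that does the real XML work;
    // you may need to copy pInput into a persistent message class first
    Quit ..SendRequestAsync("Demo.XML.Process",pInput)
}
}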

In any case, a Business Service simply reads messages from an input (File, TCP, HTTP, etc.) into an object. So you could also send that object to a Business Process which will process your XML (it would likely need to extend %XML.Adaptor).

When you talk about writing data into SQL Server, do you mean simply storing data in a Caché table? Or do you actually need to write to a real SQL Server? In the latter case, you would need a Business Operation that binds to something like EnsLib.SQL.OutboundAdapter and calls your external database with the data extracted from your XML.
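
A rough sketch of such an operation, where the request class Demo.XML.Record, the table, and the columns are all hypothetical:

Class Demo.SQL.Operation Extends Ens.BusinessOperation
{
Parameter ADAPTER = "EnsLib.SQL.OutboundAdapter";

Parameter INVOCATION = "Queue";

Property Adapter As EnsLib.SQL.OutboundAdapter;

Method InsertRecord(pRequest As Demo.XML.Record, Output pResponse As Ens.Response) As %Status
{
    // one row per record extracted from the XML; the "?" placeholders are bound from the request
    Set tSQL = "INSERT INTO MyTable (Field1, Field2) VALUES (?, ?)"
    Quit ..Adapter.ExecuteUpdate(.tRows,tSQL,pRequest.Field1,pRequest.Field2)
}

XData MessageMap
{
<MapItems>
  <MapItem MessageType="Demo.XML.Record">
    <Method>InsertRecord</Method>
  </MapItem>
</MapItems>
}
}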

Hi Anna,

This error points to an Authentication issue. I see only two areas where the problem could be:

1) Incorrect username/password (unlikely, as you have probably already checked it)

2) Mismatched Certificate

I assume you created an SSL Configuration in System Administration on the Server that is sending the email. Did you add any Certificate file to that configuration? If you did, I would check the Common Name (CN) - is it set to "smtp.gmail.com"?

I found a thread in which a setting in the Postfix configuration on the local server caused the Common Name to be replaced with the Fully Qualified Domain Name obtained during DNS lookup, which was different from "smtp.gmail.com" and resulted in a similar error:

https://www.experts-exchange.com/questions/21813187/530-5-7-0-Must-issue...