Hi Steve,

Thank you for the answer. It does sound like upgrading to the IRIS product line would resolve the issue. The extension of %Set() and %Get() is a nice addition; I did not know about that. That means we can continue using Dynamic Objects, at least for outbound JSON, and let the "stream>base64" option do the encoding at %ToJSON() time rather than encoding beforehand.
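
For example, a minimal sketch on IRIS, assuming a version where %Set() accepts the third "type" argument; the variable and property names are just illustrative:

// Build a dynamic object whose "document" property is a stream that
// %ToJSON() will base64-encode on output (IRIS-only third argument of %Set())
Set stream = ##class(%Stream.GlobalBinary).%New()
Do stream.Write(binaryData)   // binaryData stands in for whatever you need to send

Set payload = {}
Do payload.%Set("document", stream, "stream>base64")

// Serialize to a stream so we never have to build one long string
Set out = ##class(%Stream.GlobalCharacter).%New()
Do payload.%ToJSON(out)   // assuming your IRIS version supports the output-stream argument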

In trying different things, I used Dynamic Objects to break the long string into chunks of at most 3,641,144 characters and store them as Dynamic Array elements. That allows us to get the data out of Cache without hitting the long string limit. This will solve our problem as long as the receiving system can accept an array with multiple elements and re-assemble the string.
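
Roughly what I did, as a sketch (sourceStream stands in for wherever the oversized data lives):

// Read the data in chunks that fit within the Cache string limit
// and push each chunk onto a dynamic array
Set chunkSize = 3641144
Set chunks = []
Do sourceStream.Rewind()
While 'sourceStream.AtEnd {
    Do chunks.%Push(sourceStream.Read(chunkSize))
}

// Calling %ToJSON() via DO (return value unused) writes the JSON to the
// current device, so the serialization never has to fit in a single long string
Do chunks.%ToJSON()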

Hi Yakov,

I have not worked with these adapters before. I looked at EnsLib.TCP.CountedXMLInboundAdapter; its (brief) description states that it is an adapter for an XTE server. Nonetheless, it does appear to read counted blocks from the TCP socket as a stream and write them into an XML-aware object. I would try it with your custom Business Service.

There is an EnsLib.XML.TCPService, which extends EnsLib.TCP.PassthroughService, but the only new thing it adds is processing for an XML SearchTableClass. There is also EnsLib.TCP.CountedXMLInboundAdapter (which inherits from Ens.Util.XML.Reader).

You can try extending EnsLib.TCP.PassthroughService, overriding its OnProcessInput method, and binding it to EnsLib.TCP.CountedXMLInboundAdapter (you may want to explore that adapter further to see whether it has any useful methods that might help you extract your XML).
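
Something along these lines - an untested skeleton; the class names My.XML.TCPService and My.XML.Process are made up, and you would need to confirm what kind of object the adapter actually hands to OnProcessInput:

Class My.XML.TCPService Extends EnsLib.TCP.PassthroughService
{

Parameter ADAPTER = "EnsLib.TCP.CountedXMLInboundAdapter";

Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
{
    // Check what the adapter delivers (a stream or an XML-enabled object)
    $$$TRACE("OnProcessInput received "_$CLASSNAME(pInput))

    // If it is a stream, wrap it so it can travel as an Ensemble message,
    // then hand it to the component that does the real XML work
    Set tRequest = ##class(Ens.StreamContainer).%New(pInput)
    Quit ..SendRequestAsync("My.XML.Process", tRequest)
}

}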

In any case, a Business Service simply reads messages from an input (File, TCP, HTTP, etc.) into an object. So you could also send this object to a Business Process that does the actual XML processing (the object would likely need to extend %XML.Adaptor).

When you talk about writing data into SQL Server, do you mean simply storing data in a Cache table, or do you actually need to write to a real SQL Server? In the latter case, you would need a Business Operation, perhaps bound to EnsLib.SQL.OutboundAdapter, that calls your external database with the data extracted from your XML.
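
In that case the operation might look roughly like this - a sketch only; the message class, table, and column names are invented for illustration:

Class My.SQL.WriteOperation Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.SQL.OutboundAdapter";

Method InsertRow(pRequest As My.ExtractedRow, Output pResponse As Ens.Response) As %Status
{
    // The parameter markers are filled from values extracted from your XML
    Set tSQL = "INSERT INTO dbo.MyTable (RecordId, Payload) VALUES (?, ?)"
    Quit ..Adapter.ExecuteUpdate(.tRows, tSQL, pRequest.RecordId, pRequest.Payload)
}

XData MessageMap
{
<MapItems>
  <MapItem MessageType="My.ExtractedRow"><Method>InsertRow</Method></MapItem>
</MapItems>
}

}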

Hi Anna,

This error points to an Authentication issue. I see only two areas where the problem could be:

1) Incorrect username/password (unlikely, as you have probably already checked it)

2) Mismatched Certificate

I assume you created an SSL Configuration in System Administration on the Server that is sending the email. Did you add any Certificate file to that configuration? If you did, I would check the Common Name (CN) - is it set to "smtp.gmail.com"?

I found a thread in which a setting in the Postfix configuration on the local server caused the Common Name to be substituted with the Fully Qualified Domain Name obtained during DNS lookup - which was different from "smtp.gmail.com" and resulted in a similar error:

https://www.experts-exchange.com/questions/21813187/530-5-7-0-Must-issue...

Hi Rob,

This discussion I had with Mary George a couple of weeks ago might help:

https://community.intersystems.com/post/hl7-message-json-input-file

Here the issue is accepting HL7 messages encoded as JSON; you are trying to do the reverse.

Basically, you would first have to save your HL7 data into the correct EnsLib.HL7.Message object (in this case the message type is ADT). This object has several useful properties and methods that let you get the number of segments in the message and retrieve a segment object by index. You can then loop over the segments and extract the data from each one.
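
For example, something along these lines (a sketch; pHL7Msg is illustrative, and I am going from memory on SegCount/GetSegmentAt(), so check them against your version):

// Loop over the segments of an EnsLib.HL7.Message and pull out field values
Set segCount = pHL7Msg.SegCount
For i = 1:1:segCount {
    Set seg = pHL7Msg.GetSegmentAt(i, .tSC)
    If $$$ISERR(tSC) Quit
    If seg.Name = "PID" {
        // Field 3 of PID, for example - the patient identifier list
        Set patientId = seg.GetValueAt(3)
    }
}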

Once you have the data in the Properties of the object, there are two ways to continue:

1) If you were on IRIS, your custom object could also extend %JSON.Adaptor. The data from the message that you stored in properties could then be exported to JSON by calling yourObject.%JSONExport() (or %JSONExportToString() if you need it as a string). However, this option is not available in the Cache product line.

2) The Cache product line does have support for Dynamic Objects. These are JSON-like objects that cannot be persisted or passed in Ensemble messages directly - you would need to serialize them into a String or a Character Stream (the serialized form is the JSON text).

Here is the link to the documentation on JSON:

https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls...

(Because this is for IRIS, it also includes a chapter on the JSON Adaptor.)

For example, in the same loop where you process your message, you could create a Dynamic Object for each Segment's fields and call fieldObject.%Set("PID:"_i, valueOfField), where i is the field number and valueOfField is the value you extracted for PID:i.
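
Illustratively (seg would be the segment object from the loop above, and maxField is however many fields you care about):

// Build a dynamic object keyed by "PID:<field number>"
Set fieldObject = {}
For i = 1:1:maxField {
    Do fieldObject.%Set("PID:"_i, seg.GetValueAt(i))
}
// Attach it to the dynamic object that represents the whole message
Do msgDynamicObject.%Set("PID", fieldObject)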

At the very end, when you have built up your complete message object, you would call

Set jsonString = msgDynamicObject.%ToJSON() - that serializes the object into a JSON string, which you can then assign to a property of an HttpRequest and post to the Salesforce API. (I would create a separate Business Operation for that and use EnsLib.HTTP.OutboundAdapter.) You would then need to process their JSON response by doing

Set responseObject = {}.%FromJSON(HttpResponse.Data)
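
A rough shape for that Business Operation (a sketch; the class names are placeholders, and I believe the adapter's Post() method sends the data argument as the request body when no form-variable names are given - verify against the documentation for your version):

Class My.Salesforce.Operation Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.HTTP.OutboundAdapter";

Method SendJSON(pRequest As Ens.StringRequest, Output pResponse As Ens.Response) As %Status
{
    // pRequest.StringValue holds the JSON produced by %ToJSON();
    // the server, port and URL come from the production settings
    Set tSC = ..Adapter.Post(.tHttpResponse, "", pRequest.StringValue)
    Quit:$$$ISERR(tSC) tSC

    // Parse the JSON reply back into a dynamic object
    Set responseObject = {}.%FromJSON(tHttpResponse.Data)
    $$$TRACE("Salesforce HTTP status: "_tHttpResponse.StatusCode)
    Quit $$$OK
}

XData MessageMap
{
<MapItems>
  <MapItem MessageType="Ens.StringRequest"><Method>SendJSON</Method></MapItem>
</MapItems>
}

}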

I hope this helps. Unfortunately, there is no easy way to do this.

Hi Michael,

Are you talking about Translation Maps in HealthShare Terminology Management?

I think Dmitriy is right. You would probably need to create an intermediate Object Model - a class or classes that model your XML structure and another class (or classes) that models your JSON structure. The XML model extends %XML.Adaptor and is populated from your XML using tools like %XML.Reader, while the JSON model extends %JSON.Adaptor. Then you would step through your XML model, copying your XML properties to the properties of your JSON-aware class.
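
A very rough sketch of that flow (all class and property names here are invented, just to show the shape):

// Read the XML into the XML-aware model with %XML.Reader
Set reader = ##class(%XML.Reader).%New()
Set tSC = reader.OpenStream(xmlStream)
Do reader.Correlate("Patient", "My.Model.XML.Patient")
While reader.Next(.xmlPatient, .tSC) {
    // Copy into the JSON-aware model that extends %JSON.Adaptor
    Set jsonPatient = ##class(My.Model.JSON.Patient).%New()
    Set jsonPatient.Name = xmlPatient.Name
    Set jsonPatient.DOB  = xmlPatient.DOB
    // Export the copied data as JSON
    Set tSC = jsonPatient.%JSONExportToString(.json)
}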

Looking at the code, I see that EnsLib.RecordMap.Service.FileService extends EnsLib.RecordMap.Service.Standard. You could try creating a custom Business Service (for example, EnsLib.RecordMap.Service.StreamService) that extends the same parent class and overrides the OnProcessInput method to process your stream.
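
A skeleton of that might look like the following - untested, and how much more OnProcessInput has to do depends on how much work the parent class expects the adapter to have done already (use FileService's own OnProcessInput as your model):

Class EnsLib.RecordMap.Service.StreamService Extends EnsLib.RecordMap.Service.Standard
{

/// Illustrative choice - any adapter that delivers a stream would do
Parameter ADAPTER = "EnsLib.HTTP.InboundAdapter";

Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
{
    $$$TRACE("Received stream of size "_pInput.Size)
    // TODO: parse pInput with the configured RecordMap and send the resulting
    // record objects to the target config items, the way FileService does
    Quit $$$OK
}

}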

One issue you might have is that FileService is bound to EnsLib.File.InboundAdapter, which does a lot of the work for the service. There are other out-of-the-box adapters - like EnsLib.HTTP.InboundAdapter - which you could use here, but your code would have to be customized accordingly.

Please note that this might not be officially supported by ISC.

I have not worked with DICOM for quite a while, but my first tack would be to enable logging on the receiving PACS and try to see what the errors look like.

Hi Mary,

I just went through a similar exercise, although in my case, I was importing JSON Data directly into SDA.

I can think of two ways of going about this:

The first one is what you are doing - that is, extracting the JSON data using %Get(), %GetNext(), etc., and saving it to the properties of the correct EnsLib.HL7.Message object (based on the message type). This is the most straightforward way, but I found that it has drawbacks. The biggest one is that you have to initialize every variable that is saved to an object property. The reason is that you usually won't know whether a certain field you expect is actually populated (unless you can check this by running the JSON files through some kind of schema first). So if you try to store something you expect but that does not exist in this particular JSON snippet, your processing will stop with an error. Everything that is set to a property should be initialized (usually to "").
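
For example (illustrative key and path names; I am assuming the two-argument %Get(key, default) available on IRIS - otherwise initialize each local variable to "" first):

// Every value that will be stored gets a definite value, even when the
// field is missing from this particular JSON snippet
Set patientId  = jsonMsg.%Get("patientId", "")    // "" when the key is absent
Set familyName = jsonMsg.%Get("familyName", "")

Do target.SetValueAt(patientId, "PID:3")
Do target.SetValueAt(familyName, "PID:5.1")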

Because you are on IRIS, you have another option. IRIS now supports the JSON Adaptor (I am not sure whether your particular version includes it). You could create a data model: a separate object for each HL7 message type, all of which extend %JSON.Adaptor. This is rather like working with %XML.Adaptor: you should be able to read the JSON messages directly, without using Dynamic Objects. The idea is that before storing the data in the HL7 Message object, you would store it in an intermediate object (a small sketch follows the list below). I see the following advantages:

1) If you need to make a change because the message structure changes, you would change only the particular object, not the whole Business Process/Operation that extracts all of your JSON files.

2) It is easier to troubleshoot and maintain for the same reason.

3) It is easier to scale for different Participants (if they have differing JSON objects).

4) These objects would be %Persistent, so you could pass them to methods and include them in Ensemble messages.
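
To make this concrete, a minimal sketch (class, property, and key names are invented; check that your IRIS version includes %JSON.Adaptor):

Class My.Model.ADT Extends (%Persistent, %JSON.Adaptor)
{

Property PatientId As %String;

Property FamilyName As %String;

}

Reading one incoming JSON message would then be roughly:

Set adt = ##class(My.Model.ADT).%New()
Set tSC = adt.%JSONImport(jsonStream)   // accepts a string, a stream, or a dynamic object
// Later, copy adt.PatientId etc. into the EnsLib.HL7.Message via SetValueAt()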

Because you are dealing with JSON objects that mimic HL7 messages, this method may not give you many additional advantages. HL7 messages are discrete, so if you create a Business Service for each message type, you could create a Business Process for each one as well, making it easier to troubleshoot.

I hope this helps.