This may be slightly dodgy, as it depends on assumptions about how the DTL will be compiled, but I have used CODE actions to implement a loop around a set of standard DTL actions.

In one case the source message had a stream containing a document, and it needed to be chopped up into chunks, each one being put in a field of an HL7 v2 OBX segment. So an initial CODE action opens the loop with something like "Do {", gets the next chunk and quits if there is none. Then there are some normal ASSIGN actions setting up the new segment, which are easy to put in and appear in the DTL display. Finally there is a second CODE action to end the loop with "}".
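To give a feel for it, the compiled Transform method might end up containing something like the sketch below. The stream property name, chunk size and OBX field are hypothetical, and the middle lines stand in for whatever the ASSIGN actions compile to:

    // first CODE action: open the loop and read a chunk
    Set tSeg = 0
    Do {
        Set tChunk = source.Document.Read(200)
        Quit:(tChunk = "")
        Set tSeg = tSeg + 1
        // ...compiled ASSIGN actions that build the new segment, e.g.
        Do target.SetValueAt(tChunk, "OBX("_tSeg_"):5")
    // second CODE action: close the loop
    } While ('source.Document.AtEnd)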

It works, but only as long as InterSystems keeps the compiled code for the ASSIGN actions nice and simple.

A similar use was when dealing with XML documents. Again I needed a loop enclosing normal ASSIGN actions, but this time for things like stepping backwards through repeating elements and deleting unwanted ones, or creating them from an array.
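For example, the two CODE actions could wrap the enclosed actions in a backwards For loop like this (the collection and property names are hypothetical); going backwards means RemoveAt does not shift the indices still to be visited:

    // first CODE action: loop backwards over a repeating element
    For i = target.Items.Count():-1:1 {
        // ...enclosed ASSIGN/CODE actions, e.g. dropping unwanted entries
        If (target.Items.GetAt(i).Wanted = 0) {
            Do target.Items.RemoveAt(i)
        }
    // second CODE action: close the loop
    }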

Also, I used a CODE action to work out if a DTL was running inside a business process, or just in the Studio for testing, so that some expected properties could be created to allow the test to run. Just for debugging.
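A minimal sketch of that check, assuming the generated Transform method's aux argument, which in my experience holds the calling business host when run from a production but is not an object when run from the testing tool (the property being created here is hypothetical):

    // CODE action: detect a standalone test run and fake up expected inputs
    If ('$IsObject($Get(aux))) {
        Set source.Visit = ##class(MyPkg.Visit).%New()
    }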

Regards,

Mike

I agree, the Production class is a major problem for Configuration Management. In the past we tried System Defaults and found them very awkward to use. On the last project we started with separate classes for dev, test and live. The updates to the test and live versions were done in a release preparation namespace, with comparisons done to ensure they were in step, as mentioned earlier. A release was then built from that and installed in the test slot, and later in the live slot.

But it all got harder and harder to handle, and when the system went fully live we stopped doing full releases and went over to releasing only the changed classes, etc., and never releasing the production classes. Now the problem is that changes to the production have to be made manually via the front end. Fortunately, there are far fewer of these now.

Regards,

Mike

Hi Steve,

I have done something like you describe. I used BPL, and at the time tried to keep away from using bits of code, but it got complex, and in retrospect I'm not sure it was the best way. The diagrams are nice, but I think a bit of well-written code might have been easier to follow!

First I created a "TempStore" class with an "MRN" (Medical Record Number) property and no permanent storage. This is used as the target class for a transform that pulls out the patient ID and puts it in that property.
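A sketch of that class (the package name is hypothetical); extending %RegisteredObject is what gives it no permanent storage:

    Class MyPkg.TempStore Extends %RegisteredObject
    {
    Property MRN As %String;
    }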

In the BPL process I added an instance of the TempStore class to the BPL context object, and the first activity in the diagram is the transform, with Source set to "request" and Target set to "context.TempStore".

With the MRN found, I then use code like the following in the Value of an "assign" activity to open the stored target object into another context property, "context.BNetEpisode", which is already declared as the same class.

##class(...).MRNIndexOpen(context.TempStore.MRN,4)

An "if" activity with a Condition like "$IsObject(context.BNetEpisode)" is used to see if anything was found, and create a new one if required by setting the "context.BNetEpisode.MRN" property equal to "context.TempStore.MRN".

The "context.BNetEpisode" property is then be used as the Target for "transform" activities later on with Create = "existing" used. Ensemble does a save automatically when the Process completes.

I hope this makes sense. (I cannot provide the full code, as it belongs to the customer, and anyway it gets a lot more complex: there are three types of inbound message, one of which was an HL7 v3 document, and it was actually using an XML document inside the stored object to hold much of the data. But that's another story.)

Mike

Hi Andrew,

Thanks for posting this. I am in the process of coding a similar automated statistics email, so I am reading your code with interest and may well use some of it. My only comment is that since this is for monitoring Ensemble, I actually decided to use Ensemble itself to implement it. So I have a message class with properties like "Subject", etc., and this is sent to an Operation that uses the Ensemble email adapter to send it. There is also a Service that runs once a day to pull together the required information and build the email message.
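For comparison, a stripped-down version of that pattern is sketched below. The class names are hypothetical, but EnsLib.EMail.OutboundAdapter and %Net.MailMessage are the standard pieces; the mail server, recipients and so on come from the adapter settings in the production.

    Class MyPkg.Msg.StatsEmail Extends Ens.Request
    {
    Property Subject As %String;
    Property Body As %String(MAXLEN = "");
    }

    Class MyPkg.StatsEmailOperation Extends Ens.BusinessOperation
    {
    Parameter ADAPTER = "EnsLib.EMail.OutboundAdapter";

    Method OnMessage(pRequest As MyPkg.Msg.StatsEmail, Output pResponse As Ens.Response) As %Status
    {
        // build the mail from the request message and hand it to the adapter
        Set tMail = ##class(%Net.MailMessage).%New()
        Set tMail.Subject = pRequest.Subject
        Do tMail.TextData.Write(pRequest.Body)
        Quit ..Adapter.SendMailMessage(tMail)
    }
    }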

Mike

There may be arguments in favour of either solution, depending on the types of data involved and programmer preference, but if you are to embrace the full Ensemble "model", then I think the second option is far better.

By putting the non-HL7 data into a message sent to a business process, it gets stored and becomes visible in Ensemble in its raw form (or as close as you can make it) on the message queue into that process. This makes support much easier, as you can see the before and after messages in the Ensemble GUI. Also, a Business Service should do a minimum of work, so that messages are input as fast as possible.

Using a DTL is also the cleanest option, since it is meant to transform messages, but I admit that sometimes this is more effort than it is worth. I have had to deal with complex XML documents, and ended up writing methods in my custom message class to make the extraction easier to understand in the DTL.
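As a sketch of what I mean, the message class can carry a helper like the one below, so a DTL Value can just be source.GetElementText("/Document/Patient/Id") instead of a pile of XPath plumbing. The method and property names are hypothetical, and it assumes the document is held in a character stream property and uses the standard %XML.XPATH.Document class:

    Method GetElementText(pXPath As %String) As %String
    {
        Set tValue = ""
        Set tSC = ##class(%XML.XPATH.Document).CreateFromStream(..Document, .tDoc)
        If $$$ISERR(tSC) Quit tValue
        // string(...) asks for a scalar value rather than a DOM subtree
        Set tSC = tDoc.EvaluateExpression("/", "string("_pXPath_")", .tResults)
        If $$$ISERR(tSC) || (tResults.Count() = 0) Quit tValue
        Quit tResults.GetAt(1).Value
    }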

Mike