So I forgot that there's a Pad() method in the DTL function list that would likely work better for your purposes than $EXTRACT() and $JUSTIFY(). You can use it to zero- or space-fill fields to the required width. The first argument is the value to pad, the 2nd is the width (positive to pad on the right, negative to pad on the left), and the 3rd is the character to pad with.
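As a rough sketch (the field paths, widths and variable names here are made up; adjust them to the vendor spec, and this assumes Pad() lives in Ens.Util.FunctionSet like the other built-in functions), a code action using Pad() might look like this:

// zero-fill the quantity on the left to a width of 6
set tQuantity = ##class(Ens.Util.FunctionSet).Pad(source.GetValueAt("ORCgrp(1).ORC:7.1"), -6, "0")
// space-fill the unit of measure on the right to a width of 10
set tUOM = ##class(Ens.Util.FunctionSet).Pad(source.GetValueAt("ORCgrp(1).ORC:7.4"), 10, " ")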

Your update to the requirements is incomplete; it doesn't specify what, if anything, goes in the 2nd (and subsequent) row(s) of the output after the ItemCodeExternal.Identifier value, whether the fractional value in the Quantity field is right- or left-justified and zero-filled, or whether the UnitofMeasure and DateNeeded values are padded to keep the line length consistent across all records.

Here's an example of what it might look like; it will need to be adjusted to accommodate your vendor's spec:

The code rules that write the records to the stream would need to be adjusted to eliminate the "|" delimiters and insert the renamed/added variables:

This should get you to where you need to be.

Hi Doug,

Looks like only two lines need to be changed in what I posted previously; they're numbered 20 and 26 in the screen shot. You'd use a combination of string concatenation and $EXTRACT(), along with $JUSTIFY(), to line things up according to the specification. Alternately, you can set variables to the justified/aligned/padded versions of the values extracted from the HL7 message and then simply concatenate them, without the pipe characters, in lines 20 and 26.
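For example (the variable names are hypothetical; the real ones are in the screen shot), line 20 might become a single concatenation of pre-padded values rather than a pipe-delimited list:

// concatenate the already-padded values with no delimiters between them
set tRecord = tItemCode_$JUSTIFY(tQuantity,8)_tUOM_tDateNeeded
do target.Stream.WriteLine(tRecord)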

If I have time over the weekend I'll take a stab at it, but if you figure it out beforehand, please post your solution here.

I went ahead and created a DTL that appears to do what you requested and does not require a custom File Operation to work; it assumes you're using EnsLib.File.PassthroughOperation as the outbound operation class.

The filename is created using the value set for target.OriginalFilename in Ens.StreamContainer in the DTL, so you could base it on something from the HL7 message itself or just set it to a static value (as I've done). You can use date/time tokens in the outbound operation's File Name field to aggregate multiple messages per file, or just let it create uniquely named files for each message with the default pattern.
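As a concrete sketch (the literal filename is just an example), the DTL assignment could be as simple as:

set target.OriginalFilename = "ORDERS.txt"

The %f token in the operation's File Name setting picks that value up, and any date/time tokens you add around it determine how messages get grouped into files.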

Here's the DTL Configuration:

And the rules:

To test, I created a HL7 file with repeating ORC groups based on the sample provided in your post, but the DTL will work whether it's repeating or not:

The Filename pattern I used in the outbound operation:

This file was created:

And contained this output:

Hope this helps.

If the 3rd argument to EnableConfigItem() is 1, the method will update the production on each call. That can be time-consuming, so it might be worth setting it to 0 and then calling Ens.Director.UpdateProduction() after the loop completes.
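For example (a sketch; how you build the list of item names is up to you), deferring the update until after the loop might look like this:

// disable each matching config item without updating the production on every call
for i=1:1:$LISTLENGTH(tItems) {
    set tSC = ##class(Ens.Director).EnableConfigItem($LISTGET(tItems,i), 0, 0)
}
// apply all of the changes with a single production update
set tSC = ##class(Ens.Director).UpdateProduction()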

The other issue is that simply disabling a Production Config Item will only shut it down at the next polling interval or on completion of the currently-processing request. This is generally a good thing, but it can take time for some interfaces.

For @Eduard Lebedyuk's benefit ... the regex @Scott Roth referred to is most likely intended to allow the selective shutdown of interfaces by name pattern to accommodate outages/upgrades of external systems, or alternately to disable inbound interfaces before outbound interfaces so that messages don't queue.

Well I guess there IS a setting (thanks, @Eduard Lebedyuk!)

The parameter Undefined specifies the behavior when ObjectScript attempts to fetch the value of a variable that has not been defined. The value of Undefined may be 0, 1, or 2:

  • 0 - Always throw an <UNDEFINED> error. (default)
  • 1 - If the undefined variable has subscripts, return a null string, but if the undefined variable is single-valued, throw an <UNDEFINED> error.
  • 2 - Always return a null string.

You can change that setting in System Administration | System Configuration | Additional Settings | Compatibility.
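For example, with the default setting of 0, referencing an undefined local in Terminal throws the error (roughly):

USER> kill x
USER> write x

WRITE x
^
<UNDEFINED> *x

With Undefined set to 2, the same WRITE displays a null string instead of throwing.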

What does your method's argument list look like? If it's something like this:

Method Encrypt(pVarA As %String = "", pVarB As %String = "", pVarC As %String = "") As %Status

The pVar* variables above should automatically default to empty strings when the method is called as shown in your first example.

I'm not aware of any system setting that would affect the behavior of unsupplied method arguments when they're not declared with an initial value (unlike those in my snippet above).

That doesn't mean that there isn't one, though ...
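One option, if the arguments don't have declared defaults, is to normalize them with $GET() at the top of the method; a minimal sketch:

Method Encrypt(pVarA As %String, pVarB As %String, pVarC As %String) As %Status
{
    // $GET() returns "" for any argument the caller didn't supply, avoiding <UNDEFINED>
    Set pVarA = $GET(pVarA), pVarB = $GET(pVarB), pVarC = $GET(pVarC)
    // ... encryption logic here ...
    Return $$$OK
}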

Not sure what version of Caché or IRIS you're on; for future reference it's helpful to include that information. In IRIS 2021.2, you can do this from the IRIS SQL Shell:

JEFF>do $system.SQL.Shell()
SQL Command Line Shell
----------------------------------------------------
The command prefix is currently set to: <<nothing>>.
Enter <command>, 'q' to quit, '?' for help.

[SQL]JEFF>>set displaypath /home/jeff/tmp/
displaypath = /home/jeff/tmp/

[SQL]JEFF>>set displayfile sqlout
displayfile = sqlout

[SQL]JEFF>>set displaymode csv
displaymode = csv

[SQL]JEFF>>set selectmode display
selectmode = display

[SQL]JEFF>>select top 100 * from Ens_Util.Log
13.     select top 100 * from Ens_Util.Log

/home/jeff/tmp/sqlout.csv
/home/jeff/tmp/sqloutMessages.txt

statement prepare time(s)/globals/cmds/disk: 0.0002s/6/831/0ms
          execute time(s)/globals/cmds/disk: 0.0035s/467/20822/0ms
                          cached query class: %sqlcq.JEFF.cls115
---------------------------------------------------------------------------

The default delimiter is comma, but you can change that. For example, the tab character:

[SQL]JEFF>>set displaydelimiter = $C(9)

A Business Process Component is the BPL analogue of a subroutine or function and is called exclusively from a BPL. The idea is that it can be a reusable component applicable to multiple, different business processes. I don't think that's really what you're looking for.

If you don't have a FIFO concern with this database processing and are thinking that increasing the number of parallel processes performing these database activities might improve performance, you could try increasing the pool size for the BP.

There are methods for dealing with what are essentially embedded streams in HL7 Objects. See the methods GetFieldStreamRaw() and StoreFieldStreamRaw() in class EnsLib.HL7.Message; these are useful for copying streams from one message to another. If the need is to extract the Base64 stream as a binary stream for writing to a file, there's also GetFieldStreamBase64() in the same class; the stream obtained from it can be used with file-based streams to write to a disk file.
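A minimal sketch of pulling a Base64-encoded field out to a disk file, assuming the encoded content is in OBX:5 of the first OBX (the field path, filename and message variable are all hypothetical):

// binary file stream that will receive the decoded content
Set tFileStream = ##class(%Stream.FileBinary).%New()
Set tFileStream.Filename = "/tmp/report.pdf"
// decode the Base64 field directly into the file stream
Set tSC = pHL7Msg.GetFieldStreamBase64(.tFileStream, "ORCgrp(1).OBXgrp(1).OBX:5")
If $$$ISOK(tSC) Set tSC = tFileStream.%Save()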

I'm not sure whether this will work in Ensemble 2018.1, but it does seem to work fine in IRIS for Health Interoperability 2022.2.

I'm testing with a simple JSON file that looks like this:

{
    "mrn": 12345678,
    "name": "Johann Smythe",
    "firstname": "Johann",
    "lastname": "Smythe",
    "dob": "1989-03-21 14:20:00",
    "phone": "(555) 555-4917",
    "mobile": "(555) 555-6401",
    "email": "johann@smythe.com",
    "address": "123 Anystreet St",
    "city": "Anytown",
    "state": "ME",
    "zip": "04121"
}

I've used the File Passthrough Service (EnsLib.File.PassthroughService) to read the JSON document into a stream, delivered in the message class Ens.StreamContainer. Because this isn't an HL7 object, my router is based on the "General Message Routing Rule" rule type, and my constraint consists of the source service name with a message class of Ens.StreamContainer.

In the DTL called by the send action, I use Ens.StreamContainer as the source message type. The target message type is EnsLib.HL7.Message with whatever Document Category and Type is needed.

The first rule in the DTL is this:

After setting a number of default values for the target HL7 message (Event Type/Trigger, Date/Time of message, etc.) I populate the PID fields as follows:
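In rough ObjectScript terms (a sketch of the idea, not the exact DTL actions), the transformation boils down to this:

// parse the inbound JSON stream into a dynamic object
set tObj = {}.%FromJSON(source.Stream)
// populate PID fields from the JSON properties (field paths are illustrative)
do target.SetValueAt(tObj.mrn, "PID:3(1).1")
do target.SetValueAt(tObj.lastname, "PID:5(1).1")
do target.SetValueAt(tObj.firstname, "PID:5(1).2")
do target.SetValueAt($translate($piece(tObj.dob," ",1),"-",""), "PID:7")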

And I now have an HL7 message created from JSON.

This isn't going to work with a batch of patient records in a JSON array; you'd need to create a BPL to process that. But for input that consists of a simple structure like the example I used, you can accomplish what you need without building a custom service or creating a BPL.

Are you using LDAP for authentication? I seem to remember running into this when the web applications created as part of enabling Ensemble/Interoperability weren't set to support LDAP.

Compare the settings for the web applications created for your new namespace in Security | Applications | Web Applications with those from other (working) Ensemble-enabled namespaces.

For those who use Interoperability/HealthConnect, nc/netcat is also an excellent tool for verifying that remote ports are accessible for HL7 MLLP, HTTP or other protocols that require a TCP socket client connection.

And while this thread is specifically for Unix/Linux, there's a Windows PowerShell analogue named Test-NetConnection (alias tnc) that provides a subset of nc's features.
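For example, to check whether a remote HL7 listener on port 5000 is reachable (host and port are just placeholders):

nc -vz remotehost 5000

and roughly the same check in PowerShell:

Test-NetConnection remotehost -Port 5000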

Something like this, perhaps?

Class User.Util.StringFunctions Extends Ens.Util.FunctionSet
{

ClassMethod ReReplace(pStr As %String, pPat As %String, pRepl As %String = "") As %String
{
    // $LOCATE returns the 1-based position of the first regex match (0 if no match)
    // and sets tEnd to the position immediately following the match
    Set tStrt = $LOCATE(pStr,pPat,,tEnd) - 1
    // in case the pattern isn't found, return source string
    Return:(tStrt < 0) pStr
    // everything before the match, then the replacement, then everything after it
    Set tPrefix = $EXTRACT(pStr,1,tStrt)
    Set tSuffix = $EXTRACT(pStr,tEnd,*)
    Return tPrefix_pRepl_tSuffix
}

}
USER> set mystr = "REASON->Blood(1.23)"
USER> set newstr = ##class(User.Util.StringFunctions).ReReplace(mystr,"->\w+")
USER> write newstr
REASON(1.23)
USER> set altstr =  ##class(User.Util.StringFunctions).ReReplace(mystr,"->\w+","-CODE")
USER> write altstr
REASON-CODE(1.23)