If the embedded file is long enough, that GetValueAt call will truncate it; use GetFieldStreamRaw instead.  You do not need to save the file locally in order to attach it to an email.  If you want to do both, use a file operation to save the file to disk and a separate email operation to send the email.

What I would do is extract the file from the HL7 with GetFieldStreamRaw and then store it in a message object.

To make Ensemble send an email you need a custom email operation, one that uses EnsLib.EMail.OutboundAdapter.

Your message object that you send to the email operation needs to have a stream property to hold the file contents.

Your custom email operation will need to create a %Net.MailMessage object to send the mail.  This object is where you attach the file.  You can use the AttachStream method to attach the stream as a file.
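As a rough sketch, such an operation might look like the following (the class names Demo.EmailFileOperation and Demo.FileMessage, the FileStream property, and the attachment filename are all illustrative, not part of any standard API):

```objectscript
Class Demo.EmailFileOperation Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.EMail.OutboundAdapter";

Method OnMessage(pRequest As Demo.FileMessage, Output pResponse As Ens.Response) As %Status
{
    set mail = ##class(%Net.MailMessage).%New()
    set mail.Subject = "Extracted HL7 attachment"
    // AttachStream copies the stream's contents into the message as a named attachment
    set tSC = mail.AttachStream(pRequest.FileStream, "extracted-file")
    quit:$$$ISERR(tSC) tSC
    // The adapter sends the message using its configured SMTP server and recipients
    quit ..Adapter.SendMail(mail)
}

}
```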


I think the way I would code this is: create a new message object which contains properties for all the fields that might appear in the file.  Then create a new custom file service that uses the adapter EnsLib.File.InboundAdapter.  In the OnProcessInput of that service, follow Dmitry and Carlos's suggestion: read the file line by line with ReadLine(), parse each line to extract the property name and value, and store the value in the appropriate property of the new message object.
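A rough sketch of that service (the class names, the target name "RouterName", and the assumed "PropertyName=Value" line format are all illustrative):

```objectscript
Class Demo.FileService Extends Ens.BusinessService
{

Parameter ADAPTER = "EnsLib.File.InboundAdapter";

Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
{
    set msg = ##class(Demo.ParsedFileMessage).%New()
    while 'pInput.AtEnd {
        set line = pInput.ReadLine()
        // assuming each line looks like "PropertyName=Value"
        set name = $piece(line,"="), value = $piece(line,"=",2,*)
        // $property stores the value into the message property matching the parsed
        // name; the property must actually exist on the message class
        if name'="" set $property(msg,name) = value
    }
    // hand the populated message off to the router
    quit ..SendRequestAsync("RouterName", msg)
}

}
```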

Then, have the service send that message object to a router with a DTL transform which will convert from your new message object class into an HL7 message and send that message to the target.

I don't think there is a way to repeatedly receive queue count alerts if the queue remains high.

What you could do is use Managed Alerts rather than simple alerts and you can configure the managed alert to repeatedly send notification emails until the person responsible closes the alert.

Doing it this way will require a person to actually look at the queue and either verify it's gone down and close the alert, or take action to reduce the queue.

I just thought of another method.  Getting the count should work for checking for existence as well.  If using DOM-style paths, then get the count with [*] and if using VDoc style paths, then get the count with (*).  Using the VDoc path example from above:

<if condition='source.{element1.element2.element3(*)}&gt;0'>

This will return True if there are any element3 elements inside element2, and False if there are not.

If you just want to know whether there is a value in element1.element2.element3, compare it to "".  The 'not equal' operator ('=) in Caché contains an apostrophe, so it must be escaped in the XML.  (if a '= b write "a is not equal to b")

<if condition='source.{element1.element2.element3}&apos;=""'>

This will return True if element1.element2.element3 has a value.  It will return False if element1.element2.element3 does not exist, or if it exists but contains no value, like this:

<element1><element2><element3></element3></element2></element1>

If you need to differentiate between these two cases, things get more difficult.  There is no method in XML VDoc that can tell you whether an element exists or not.  In general with XML, the existence of an element shouldn't be used as a logical boolean value.  Instead you should use a boolean datatype in your XML schema.
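For example, in an XSD schema you could model the flag explicitly instead of relying on element presence (the element name here is illustrative):

```xml
<xs:element name="element3Present" type="xs:boolean"/>
```

and then test source.{element1.element2.element3Present} directly in your condition.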

If you must check whether element3 exists or not, you could do it like this:

Say your XML document looks like this:

<element1><element2><element3>sometext</element3></element2></element1>

If you were to get the value of element2 with an assign action like this:

<assign value='source.{element1.element2}' property='e2' action='set' />

then after that line the variable e2 will hold the value "<element3>sometext</element3>".  To check for the existence of element3, you can search e2 for the text "<element3>" with the ..Contains function:

<if condition='..Contains(e2,"&lt;element3&gt;")' >

If your element3 might be self-closed (like this: <element3 />) then you'll need to account for that possibility as well.
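One way to cover the self-closed form as well (assuming element3 carries no attributes) is to test for both spellings:

```xml
<if condition='..Contains(e2,"&lt;element3&gt;") || ..Contains(e2,"&lt;element3 /&gt;")' >
```

Note this still misses a self-closed tag written without the space (<element3/>); adding a third ..Contains term would cover that variant too.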

Any given core can only run one process at a time, so it is surprising that you saw faster performance with a pool size of 100 than with a pool size of 50.  If anything, the larger pool should cause more context switching and therefore reduce performance.

I am seeing the same thing.  OutputToString() internally uses OutputToIOStream() but it sets the CharEncoding on the stream to "binary" before passing it.  I think this is the source of the problem.

I was able to work around it using OutputToLibraryStream instead:

ENSEMBLE>set msg = ##class(EnsLib.EDI.XML.Document).ImportFromString("<Test>מִבְחָן</Test>")
ENSEMBLE>write msg.OutputToString()
ENSEMBLE>set stream = ##class(%Stream.TmpCharacter).%New()
ENSEMBLE>write msg.OutputToLibraryStream(.stream)
ENSEMBLE>write stream.Read()

Just to add on to this, lookup tables are stored in ^Ens.LookupTable, subscripted by the table name and then the key.  The value of each node is the lookup value of the key in the subscript.

For example, if you have a table named Codes which contains a key named 123 that maps to the value "ABC", it will look like this in the global:

^Ens.LookupTable("Codes","123") = "ABC"

The reason reverse lookups are not supported is that we allow many keys to map to the same value.  So if you need to find a key, given a value, there may be many keys that match.

Writing code to perform a reverse lookup will involve copying the ^Ens.LookupTable global into a local array subscripted by value rather than by key, and then performing lookups on that array.
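A minimal sketch of that copy loop for a single table (the table name "Codes" and the local array name reverse are illustrative):

```objectscript
 set key = ""
 for {
     // walk every key in the Codes table; $order returns the next key
     // and, via its third argument, the value stored at that node
     set key = $order(^Ens.LookupTable("Codes",key),1,value)
     quit:key=""
     // subscript by value first, since many keys can map to the same value
     set reverse(value,key) = ""
 }
```

After the loop, $order-ing over reverse(someValue) yields every key that maps to someValue.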