I did:

Class OrdRes.VendorMDM Extends Ens.DataTransformDTL [ DependsOn = EnsLib.HL7.Message ]
{

Parameter IGNOREMISSINGSOURCE = 1;
Parameter REPORTERRORS = 1;
Parameter TREATEMPTYREPEATINGFIELDASNULL = 0;
XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='EnsLib.HL7.Message' targetClass='EnsLib.HL7.Message' sourceDocType='2.3:ORU_R01' targetDocType='2.5:MDM_T02' create='new' language='objectscript' >
<assign value='source.{MSH}' property='target.{MSH}' action='set' />
<assign value='"MDM"' property='target.{MSH:MessageType.MessageCode}' action='set' />
<assign value='"T02"' property='target.{MSH:MessageType.TriggerEvent}' action='set' />
<assign value='"2.5"' property='target.{MSH:VersionID.VersionID}' action='set' />
<assign value='source.{MSH:DateTimeofMessage}' property='target.{EVN:2}' action='set' />
<assign value='source.{PIDgrpgrp(1).PIDgrp.PID}' property='target.{PID}' action='set' />
<assign value='source.{PIDgrpgrp(1).PIDgrp.PV1grp.PV1}' property='target.{PV1}' action='set' />
<assign value='source.{PIDgrpgrp(1).ORCgrp(1).ORC}' property='target.{ORCgrp(1).ORC}' action='set' />
<assign value='source.{PIDgrpgrp(1).ORCgrp(1).OBR}' property='target.{ORCgrp(1).OBR}' action='set' />
<assign value='source.{PIDgrpgrp(1).ORCgrp(1).NTE()}' property='target.{ORCgrp(1).NTE()}' action='set' />
<assign value='"Endoscopy Image"' property='target.{TXA:DocumentType}' action='set' />
<assign value='"AU"' property='target.{TXA:DocumentCompletionStatus}' action='set' />
<assign value='"AV"' property='target.{TXA:DocumentAvailabilityStatus}' action='set' />
<foreach property='source.{PIDgrpgrp(1).ORCgrp(1).OBXgrp()}' key='k1' >
<assign value='source.{PIDgrpgrp(1).ORCgrp(1).OBXgrp(k1).OBX:SetIDOBX}' property='target.{OBXgrp(k1).OBX:SetIDOBX}' action='set' />
<assign value='source.{PIDgrpgrp(1).ORCgrp(1).OBXgrp(k1).OBX:ValueType}' property='target.{OBXgrp(k1).OBX:ValueType}' action='set' />
<assign value='source.{PIDgrpgrp(1).ORCgrp(1).OBXgrp(k1).OBX:ObservationIdentifier}' property='target.{OBXgrp(k1).OBX:ObservationIdentifier}' action='set' />
<assign value='source.{PIDgrpgrp(1).ORCgrp(1).OBXgrp(k1).OBX:ObservationSubID}' property='target.{OBXgrp(k1).OBX:ObservationSubID}' action='set' />
<assign value='source.{PIDgrpgrp(1).ORCgrp(1).OBXgrp(k1).OBX:Units.identifier}' property='target.{OBXgrp(k1).OBX:5.3}' action='set' />
<assign value='source.{PIDgrpgrp(1).ORCgrp(1).OBXgrp(k1).OBX:Units.identifier}' property='target.{OBXgrp(k1).OBX:5.4}' action='set' />
<if condition='source.{PIDgrpgrp(1).ORCgrp(1).OBXgrp(k1).OBX:SetIDOBX}' >
<true>
<code>
<![CDATA[ do source.GetFieldStreamRaw(.tStream,"PIDgrpgrp(1).ORCgrp(1).OBXgrp("_k1_").OBX:5(1).1",.tRem)
 // Replace the remainder string so the fields following the
 // stream field are populated with the desired literal values
 set tRem = "|PDF|||||F|"
 //
 // Store the stream to the appropriate target field
 do target.StoreFieldStreamRaw(tStream,"OBXgrp("_k1_").OBX:5(1).5",tRem)]]></code>
</true>
<false>
<assign value='source.{PIDgrpgrp(1).ORCgrp(1).OBXgrp(k1).OBX}' property='target.{OBXgrp(k1).OBX}' action='set' />
</false>
</if>
</foreach>
<assign value='source.{PID:18}' property='target.{TXA:12.3}' action='set' />
</transform>
}

}

Now, I used PDFs rather than BMPs, and I'm a little OCD, so my output looks slightly different from yours. But it does work. Notice, though, that I used the numeric syntax to reference the components of OBX:5. There are no symbolic names for those components in HL7, but they're still recognized via the numeric syntax.

Also, I think one of the OBX:5 components should contain "Base64", since that's most likely how the data in OBX:5.5 is encoded.
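For example, in the HL7 ED (Encapsulated Data) type the 4th component is the encoding, so an assignment like this sketch (verify the component position against your receiving system's expectations) would mark the data as Base64:

```xml
<!-- No symbolic names exist for OBX:5's components, so numeric syntax is used -->
<assign value='"Base64"' property='target.{OBXgrp(k1).OBX:5.4}' action='set' />
```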

Here's the output:

Hi Anthony,

I think the issue is that you're using GetFieldStreamRaw() against the entire OBX segment when you should be using it against the field that contains the stream: OBX:5.1. The method takes 3 arguments, the 3rd being a variable passed by reference that receives the remainder of the current OBX segment. That variable is of type %String and can be modified to supply different values for the remaining fields, then passed as the 3rd argument to StoreFieldStreamRaw() ... which you would use to populate OBX:5.5.

These methods are usually used in a code block, where passing a variable by reference is supported (precede it with a period). You'll need to do that with both the 1st and 3rd arguments of GetFieldStreamRaw().

It's also important to note that once you've used StoreFieldStreamRaw(), the target segment becomes immutable; no further changes can be made to it. That's why the remainder variable is so important: it populates the rest of the segment at the time the stream is stored to the field.

The DTL flow would look like this:

  1. Populate everything in the target message, up to the OBX
  2. In a Foreach over the OBX:
    1. Populate everything in the target OBX preceding OBX:5.5
    2. Execute a code block similar to the following:
// Get the stream data (no need to instantiate a stream object in advance)
do source.GetFieldStreamRaw(.tStream,"PIDgrpgrp(1).ORCgrp(1).OBXgrp("_k1_").OBX:5(1).1",.tRem)
//
// Insert code here to modify tRem to accommodate any changes needed to 
// fields after OBX:5(1).5
//
// Store the stream to the appropriate target field
do target.StoreFieldStreamRaw(tStream,"OBXgrp("_k1_").OBX:5(1).5",tRem)

Then populate any remaining segments as you normally would.

You cannot import formatted text into Excel with a tab-delimited text file as the source. The file must be created in either native Excel format or HTML.

There are many posts and articles on this Developer Community that discuss generating Excel-compatible files that support text formatting; too many options to list them all here. Search for "Excel files" and you may find an answer that works for you.
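As a minimal illustration (mine, not from the thread): an HTML file like the following, saved with an .xls extension, opens in Excel with the bold formatting intact, which a tab-delimited file cannot do:

```html
<!-- save as report.xls; Excel renders the HTML, preserving formatting -->
<table border="1">
  <tr><th>Result</th><th>Status</th></tr>
  <tr><td>CBC Panel</td><td><b>Final</b></td></tr>
</table>
```

(Excel may warn about the extension/content mismatch before opening the file.)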

I ran into this problem as well, and I vaguely recalled that the passthrough file service has issues with multiple targets. So I went to the documentation looking for any reference to the need to ensure the operations are called synchronously and (initially) couldn't find one. So I went ahead and configured the services with Work and Archive directories and waited for something bad to happen.

And of course, something bad did happen.

I later found the file adapter documentation that provides a clear description of how the Work and Archive directories impact synchronous delivery. It seems to indicate that the Work directory is more likely to break sync than the Archive directory, and that you can have an Archive directory configured while still supporting sync as long as the Work directory is not configured.

Oh yes, it finally settled down and worked normally, but it seemed to be spending a lot of time talking to the servers before displaying the package listing. The servers are on AWS and I'm connected via a VPN, but my connection is quite fast for a home office and I've never noticed that before.

I've also exited and relaunched VS Code since then without any significant delay, so it must've been a one-off.

I upgraded to the new beta that uses the proposed APIs (v2.12.9-beta.1), and when I reconnected to the server, it took quite some time (5-10 minutes) before I could get to work. Since then, performance has been normal. Is there some new feature that indexes classes/routines locally or something? I couldn't find anything in the usual logs that indicated what was going on.

While the EnsLib.FTP.OutboundAdapter class has no Filename property, the FTP Operation classes do. The problem is that you can't obtain the Session Id in the ISC-supplied operation classes, but you can insert it into the message's Source property (or the OriginalFilename property of Ens.StreamContainer) via a DTL or other Business Process. That value is what the %f token uses to provide the filename in those Operations.

The mechanism for obtaining the Session Id differs by context: in a DTL, the macro $$$JobSessionId should work; in a BPL, ..%PrimaryRequestHeader.SessionId should provide the same.
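In a DTL, that could look like the following code action (a sketch; it assumes both source and target are Ens.StreamContainer messages):

```objectscript
 // Hypothetical snippet: carry the Session Id into the filename
 // that the Operation's %f token will later use
 set target.OriginalFilename = $$$JobSessionId_"_"_source.OriginalFilename
```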

If you're building your own Operation class and are, for example, subclassing EnsLib.FTP.PassthroughOperation, you can override OnMessage() and prepend $$$JobSessionId to the filename variable passed to ..Adapter.PutStream().
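A minimal sketch of that override (hypothetical class name; the real PassthroughOperation's OnMessage() does more, such as applying the Filename spec, which is omitted here):

```objectscript
Class User.FTP.SessionIdPassthrough Extends EnsLib.FTP.PassthroughOperation
{

Method OnMessage(pRequest As Ens.StreamContainer, Output pResponse As %Persistent) As %Status
{
    // Prepend the Session Id so the delivered file can be
    // correlated with the interoperability session
    set tFilename = $$$JobSessionId_"_"_##class(%File).GetFilename(pRequest.OriginalFilename)
    quit ..Adapter.PutStream(tFilename, pRequest.Stream)
}

}
```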