Oh yes, it finally settled down and worked normally, but it seemed to be spending a lot of time talking to the servers before displaying the package listing. The servers are on AWS and I'm connected via a VPN, but my connection is quite fast for a home office and I've never noticed that before.

I've also exited and relaunched VS Code since then without any significant delay, so it must've been a one-off.

I upgraded to the new beta that uses the proposed APIs (v2.12.9-beta.1), and when I reconnected to the server, it took quite some time (5-10 minutes) before I could get to work. Since then, performance has been normal. Is there some new feature that indexes classes/routines locally or something? I couldn't find anything in the usual logs that indicated what was going on.

While the EnsLib.FTP.OutboundAdapter class has no Filename property, the FTP Operation classes do. The problem is you can't obtain the Session Id in the ISC-supplied operation classes, but you can insert it in the message Source property (or OriginalFilename property of Ens.StreamContainer) via a DTL or other Business Process. That value is what the %f token uses to provide the filename in those Operations.

The mechanism for obtaining the Session Id differs by the process used: in a DTL, the macro $$$JobSessionId should work; in a BPL, ..%PrimaryRequestHeader.SessionId should provide the same value.
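
For example, a DTL set action (or the equivalent BPL assign) along these lines would stuff the Session Id into the filename field; this is just a sketch that assumes the inbound message is an Ens.StreamContainer:

Set target.OriginalFilename = $$$JobSessionId_"_"_source.OriginalFilename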

If you're building your own Operation class and are, for example, subclassing EnsLib.FTP.PassthroughOperation, you can override OnMessage() and prepend $$$JobSessionId to the filename variable passed to ..Adapter.PutStream().
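
A rough sketch of that kind of subclass is below; the class name is made up, and it skips the stock operation's handling of the Filename setting (%f and friends), so compare it with the OnMessage() source shipped in your version before relying on it:

Class HICG.Sample.FTPSessionOperation Extends EnsLib.FTP.PassthroughOperation
{
Method OnMessage(pRequest As Ens.StreamContainer, Output pResponse As %Persistent) As %Status
{
    Quit:'$IsObject(pRequest.Stream) $$$ERROR($$$EnsErrGeneral,"No stream in request")
    // Prepend the Session Id to the filename the message arrived with
    Set tFilename = $$$JobSessionId_"_"_##class(%File).GetFilename(pRequest.OriginalFilename)
    Quit ..Adapter.PutStream(tFilename, pRequest.Stream)
}
}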

Yes, there's a "raw" syntax, but I think it's counterproductive in the long run. Segments and fields can be addressed numerically, i.e. target.{1:4} would reference MSH:4. Not very descriptive; one of the beauties of using the DTL editor and message DocTypes is that your transformations become somewhat self-documenting.
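
For example, these two set actions move the same field, but only the second tells you anything at a glance (the symbolic name assumes a standard 2.x DocType):

Set target.{1:4} = source.{1:4}
Set target.{MSH:SendingFacility} = source.{MSH:SendingFacility}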

You could attempt to build an HL7 "SuperSchema" DocType/Category, I suppose, if your intent is to address message elements using the "symbolic" Virtual Document syntax. For that, you need a DocType.

$E is the shorthand version of $EXTRACT ...

ConvertDateTime(source.{PID:DateTimeOfBirth},"%Y%m%d%H%M%S","%Y%m%d") should work just fine. $E(source.{PID:DateTimeOfBirth},1,8) by itself should provide the same result.

But ConvertDateTime($E(source.{PID:DateTimeOfBirth},1,8)) is basically taking the 8-digit date returned from $E() and converting it without any input and output patterns ... which gets you the exact same 8-digit date back.
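
A quick way to convince yourself from a terminal session (the DOB value here is made up):

// Both lines should print 19750822
Set tDOB = "19750822143015"
Write $EXTRACT(tDOB,1,8),!
Write ##class(Ens.Rule.FunctionSet).ConvertDateTime(tDOB,"%Y%m%d%H%M%S","%Y%m%d"),!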

The field definitions are properties of the *.Record class, so you could perform a property query against %Dictionary.PropertyDefinition using the *.Record class as the parent.

SELECT Name
FROM %Dictionary.PropertyDefinition
WHERE parent = 'OSUMC.RecordMap.Patient.Record' AND Name >='A' AND Name <= 'z'
ORDER BY SequenceNumber ASC

That would get you the field names in the same order as the data and exclude any percent properties.
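
If you'd rather pull that list from ObjectScript than the SQL shell, here's a %SQL.Statement sketch (using the example class name from above):

Set tStmt = ##class(%SQL.Statement).%New()
Set tSC = tStmt.%Prepare("SELECT Name FROM %Dictionary.PropertyDefinition WHERE parent = ? AND Name >= 'A' AND Name <= 'z' ORDER BY SequenceNumber")
Set tRS = tStmt.%Execute("OSUMC.RecordMap.Patient.Record")
While tRS.%Next() { Write tRS.%Get("Name"),! }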

Sorry Scott, nothing so straightforward as that 😉

When you create a RecordMap, you usually create up to 3 classes, depending on whether or not you're using a Batch Class.

So you'll have something like:

  • OSUMC.RecordMap.Patient (the "template")
  • OSUMC.RecordMap.Patient.Record (the actual record class)
  • OSUMC.RecordMap.Patient.Batch (if you're using a Batch class)

If the RecordMap is the source object in your DTL, it should be an instance of OSUMC.RecordMap.Patient.Record and will be the first argument in the method below.

You'll need to create an instance of OSUMC.RecordMap.Patient with %New(), and pass it as the second argument.

Class HICG.Util.RecordMap [ Abstract ]
{
ClassMethod GetRecordAsString(pRec As %Persistent, pTmpl As %RegisteredObject) As %String
{
    Set tStream = ##class(%Stream.TmpCharacter).%New()
    Set tIOStream = ##class(%IO.MetaCharacterStream).%New(tStream)
    Set tSC = pTmpl.PutObject(tIOStream,pRec)
    If $$$ISOK(tSC) {
        Do tStream.Rewind()
        Return tStream.Read(tStream.Size)
    }
    // Empty string if PutObject fails *shrug*
    Return ""
}
}

In the DTL:
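
Something along these lines in a code action should do it (tRecStr is just an illustrative variable name; the class names are the ones from the example above):

Set tRecStr = ##class(HICG.Util.RecordMap).GetRecordAsString(source, ##class(OSUMC.RecordMap.Patient).%New())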

The value in tRecStr should be the formatted record.

Or you could write a custom Business Process to do it. Here's an example with inadequate error processing (😉) that should give you some ideas:

/// Business Process to Modify the MSH:7 field
Class HICG.Sample.SetMSHDate Extends Ens.BusinessProcess [ ClassType = persistent ]
{
/// Downstream processes or operations to send messages to
Property TargetConfigNames As %String(MAXLEN = 1000);

Parameter SETTINGS = "TargetConfigNames:Basic:selector?multiSelect=1&context={Ens.ContextSearch/ProductionItems?targets=1&productionName=@productionId}";

/// Clone, modify and send the message downstream
Method OnRequest(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
{
  Set tClone = pRequest.%ConstructClone()
  Set tCurDtTm = ##class(Ens.Rule.FunctionSet).CurrentDateTime("%Y%m%d%H%M%S")
  Do tClone.SetValueAt(tCurDtTm,"MSH:7")
  Do tClone.%Save()
  For i=1:1:$LENGTH(..TargetConfigNames,",") {
    Set tSC = ..SendRequestAsync($PIECE(..TargetConfigNames,",",i),tClone,0)
    Return:$$$ISERR(tSC) tSC
  }
  Return $$$OK
}

/// Return an array of connections for drawing lines on the config diagram
ClassMethod OnGetConnections(Output pArray As %String, pItem As Ens.Config.Item)
{
    Do ##super(.pArray,pItem)
    If pItem.GetModifiedSetting("TargetConfigNames",.tValue) {
        For i=1:1:$LENGTH(tValue,",") { Set tOne=$ZSTRIP($P(tValue,",",i),"<>W")  Continue:""=tOne  Set pArray(tOne)="" }
    }
}
}

The reason you need to clone the inbound message is that Start-of-Session messages are immutable. You must clone the message, modify the clone, and send the clone downstream.

The code block action in a DTL is for writing arbitrary ObjectScript, not JavaScript. It's commonly used for data manipulation that can't be satisfied by the methods available in the FunctionSet; for example, extracting and decoding a base64-encoded PDF from an OBX:5.5 field and writing it to a file. It can also be used to interact with globals to maintain state between invocations of the DTL, perform a database lookup, or even write values to the default device that will display in the Test tool. Very useful for debugging.
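
For example, a code action along these lines could handle the PDF case (the OBX path and output directory are made up, and Base64Decode works on strings, so very large payloads would need a stream-based approach instead):

// Grab the base64 payload from the first OBX and decode it to a file
Set tB64 = source.GetValueAt("OBXgrp(1).OBX:5.5")
Set tFile = ##class(%Stream.FileBinary).%New()
Do tFile.LinkToFile("/tmp/"_source.GetValueAt("MSH:10")_".pdf")
Do tFile.Write($SYSTEM.Encryption.Base64Decode(tB64))
Do tFile.%Save()
// Anything written to the current device shows up in the Test tool output
Write "Decoded ",tFile.Size," bytes",!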

I would not recommend using it for operations that could potentially block; there's no built-in mechanism for setting a timeout, so use a BPL for those cases.