Jeffrey Drumm · Sep 22, 2024

While the EnsLib.FTP.OutboundAdapter class has no Filename property, the FTP Operation classes do. The problem is you can't obtain the Session Id in the ISC-supplied operation classes, but you can insert it in the message Source property (or OriginalFilename property of Ens.StreamContainer) via a DTL or other Business Process. That value is what the %f token uses to provide the filename in those Operations.

The mechanism to obtain the Session Id differs by the process used: in a DTL, the macro $$$JobSessionId should work. In a BPL, ..%PrimaryRequestHeader.SessionId should provide the same.

If you're building your own Operation class and are, for example, subclassing EnsLib.FTP.PassthroughOperation, you can override OnMessage() and prepend $$$JobSessionId to the filename variable passed to ..Adapter.PutStream().
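For that last case, a minimal sketch of the override might look like the following. The class name is hypothetical, and I'm assuming the request arrives as Ens.StreamContainer (as it does for the passthrough operation) and that the Filename setting expansion can be done with Ens.Util.File's CreateTimestamp() before prepending the Session Id:

```objectscript
/// Hypothetical subclass that prepends the Session Id to the output filename
Class HICG.Sample.FTPOperation Extends EnsLib.FTP.PassthroughOperation
{
Method OnMessage(pRequest As Ens.StreamContainer, Output pResponse As %Persistent) As %Status
{
    // Expand the Filename spec (%f, %Q, etc.) against the original filename
    Set tFilename = ##class(Ens.Util.File).CreateTimestamp(##class(%File).GetFilename(pRequest.OriginalFilename), ..Filename)
    // Prepend the Session Id so each output file is traceable to its session
    Set tFilename = $$$JobSessionId_"_"_tFilename
    Quit ..Adapter.PutStream(tFilename, pRequest.Stream)
}
}
```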

Jeffrey Drumm · Sep 19, 2024

I've used autofs to mount CIFS/Windows shares on demand. It works well, and the credentials are stored outside of Cache/IRIS; this way the file operation doesn't need to authenticate.

Jeffrey Drumm · Sep 17, 2024

Yes, there's a "raw" syntax, but I think it's counterproductive in the long run. Segments and fields can be addressed numerically, e.g. target.{1:4} would reference MSH:4. Not very descriptive; one of the beauties of using the DTL editor and message DocTypes is that your transformations become somewhat self-documenting.

You could attempt to build an HL7 "SuperSchema" DocType/Category, I suppose, if your intent is to address message elements using the "symbolic" Virtual Document syntax. For that, you need a DocType.

Jeffrey Drumm · Sep 17, 2024

The MSH segment is itself a document element, so there's no way to reference it without a document structure (which is associated with a document type).

Jeffrey Drumm · Sep 13, 2024

Any of the tokens supported by the method FormatDateTime() in class Ens.Util.Time can be used to generate a dynamic value for the archive filename; the default of 1 uses the pattern %f_%Q, where %f is the original filename and %Q is the current time in ODBC format.
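If you want to preview how a spec will expand before committing to it, Ens.Util.File has (if I remember correctly) a CreateTimestamp() classmethod you can call from the terminal; a quick sketch:

```objectscript
// Terminal sketch: expand the default archive filename spec against a sample name.
// Output varies with the current clock, e.g. original.hl7_2024-09-13 14:03:27.125
Set tSpec = "%f_%Q"
Write ##class(Ens.Util.File).CreateTimestamp("original.hl7", tSpec)
```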

Jeffrey Drumm · Sep 12, 2024

$E is the shorthand version of $EXTRACT ...

ConvertDateTime(source.{PID:DateTimeOfBirth},"%Y%m%d%H%M%S","%Y%m%d") should work just fine. $E(source.{PID:DateTimeOfBirth},1,8) by itself should provide the same result.

But ConvertDateTime($E(source.{PID:DateTimeOfBirth},1,8)) is basically taking the 8-digit date returned from $E() and converting it without any input and output patterns ... which gets you the exact same 8-digit date.

Jeffrey Drumm · Sep 11, 2024

I'm not sure why ConvertDateTime() failed; informat should be "%Y%m%d%H%M%S" and outformat "%Y%m%d" for your example transformation. Regardless, you can also use $EXTRACT(<sourcedate>,1,8). There's also the SubString() method in the DTL function list, which takes the same arguments.

Jeffrey Drumm · Aug 22, 2024

Editors and other features generally open in new tabs. I think a "Close" button was simply seen as redundant ...

Jeffrey Drumm · Aug 13, 2024

Ya see, I think this is the root cause of the sad face. The new editor is depressed because it's feeling unloved 😢

Jeffrey Drumm · Aug 12, 2024

The field definitions are properties of the *.Record class, so you could perform a property query against %Dictionary.PropertyDefinition using the *.Record class as the parent.

SELECT Name
FROM %Dictionary.PropertyDefinition
WHERE parent = 'OSUMC.RecordMap.Patient.Record' AND Name >= 'A' AND Name <= 'z'
ORDER BY SequenceNumber ASC

That would get you the field names in the same order as the data and exclude any percent properties.

Jeffrey Drumm · Aug 12, 2024

Use either the breadcrumb link above the "New/Open/Save" buttons, or the Menu button in the upper right corner to navigate to where you want to go next. Once the DTL is saved, you can navigate away from the page without getting the "unsaved changes" prompt. Until it's compiled, though, the changes won't be available to your production.

Jeffrey Drumm · Aug 10, 2024

Sorry Scott, nothing so straightforward as that 😉

When you create a RecordMap, you usually create up to 3 classes, depending on whether or not you're using a Batch Class.

So you'll have something like:

  • OSUMC.RecordMap.Patient (the "template")
  • OSUMC.RecordMap.Patient.Record (the actual record class)
  • OSUMC.RecordMap.Patient.Batch (if you're using a batch class)

If the RecordMap is the source object in your DTL, it should be an instance of OSUMC.RecordMap.Patient.Record and will be the first argument in the method below.

You'll need to create an instance of OSUMC.RecordMap.Patient with %New(), and pass it as the second argument.

Class HICG.Util.RecordMap [ Abstract ]
{
ClassMethod GetRecordAsString(pRec As %Persistent, pTmpl As %RegisteredObject) As %String
{
    Set tStream = ##class(%Stream.TmpCharacter).%New()
    Set tIOStream = ##class(%IO.MetaCharacterStream).%New(tStream)
    Set tSC = pTmpl.PutObject(tIOStream,pRec)
    If $$$ISOK(tSC) {
        Do tStream.Rewind()
        Return tStream.Read(tStream.Size)
    }
    // Empty string if PutObject fails *shrug*
    Return ""
}
}

In the DTL:

The value in tRecStr should be the formatted record.
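A code action along these lines (using the class names above; tTmpl and tRecStr are assumed local variable names, not part of any generated code) would produce it:

```objectscript
// DTL <code> action: create the template instance, then serialize the source record
Set tTmpl = ##class(OSUMC.RecordMap.Patient).%New()
Set tRecStr = ##class(HICG.Util.RecordMap).GetRecordAsString(source, tTmpl)
```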

Jeffrey Drumm · Aug 1, 2024

Or you could write a custom Business Process to do it. Here's an example with inadequate error processing (😉) that should give you some ideas:

/// Business Process to Modify the MSH:7 field
Class HICG.Sample.SetMSHDate Extends Ens.BusinessProcess [ ClassType = persistent ]
{
/// Downstream processes or operations to send messages to
Property TargetConfigNames As %String(MAXLEN = 1000);

Parameter SETTINGS = "TargetConfigNames:Basic:selector?multiSelect=1&context={Ens.ContextSearch/ProductionItems?targets=1&productionName=@productionId}";

/// Clone, modify and send the message downstream
Method OnRequest(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
{
  Set tClone = pRequest.%ConstructClone()
  Set tCurDtTm = ##class(Ens.Rule.FunctionSet).CurrentDateTime("%Y%m%d%H%M%S")
  Do tClone.SetValueAt(tCurDtTm,"MSH:7")
  Do tClone.%Save()
  For i=1:1:$LENGTH(..TargetConfigNames,",") 
  {
    Set tSC = ..SendRequestAsync($PIECE(..TargetConfigNames,",",i),tClone,0)
    Return:$$$ISERR(tSC) tSC
  }
  Return $$$OK
}

/// Return an array of connections for drawing lines on the config diagram
ClassMethod OnGetConnections(Output pArray As %String, pItem As Ens.Config.Item)
{
    Do ##super(.pArray,pItem)
    If pItem.GetModifiedSetting("TargetConfigNames",.tValue) {
        For i=1:1:$LENGTH(tValue,",") { Set tOne=$ZSTRIP($P(tValue,",",i),"<>W")  Continue:""=tOne  Set pArray(tOne)="" }
    }
}
}

The reason you need to clone the inbound message is that start-of-session messages are immutable. You must clone them, modify the clone, and send the clone.

Jeffrey Drumm · Jul 31, 2024

Thanks! I knew it was something like that but didn't get it quite right.

And it appears the online documentation is back up again ...

Jeffrey Drumm · Jul 26, 2024

The code block action in a DTL is for writing arbitrary ObjectScript, not JavaScript. It's commonly used for data manipulation that can't be satisfied by the methods available in the FunctionSet; for example, extracting and decoding a base64-encoded PDF from an OBX:5.5 field and writing it to a file. It can also be used to interact with globals to maintain state between invocations of the DTL, perform a database lookup, or even write values to the default device that will display in the Test tool. Very useful for debugging.

I would not recommend using it for operations that could potentially block. There's no built-in mechanism for setting a timeout, so use a BPL for those cases.
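As a trivial sketch of that debugging trick (assuming an HL7 source document):

```objectscript
// In a DTL <code> action: anything written to the default device
// shows up in the output of the DTL Test tool
Write "MSH:10 is: ", source.GetValueAt("MSH:10"), !
```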

Jeffrey Drumm · Jul 24, 2024

You'll need to generate your own XML schema by importing either an xsd or wsdl. Once you've done that, you'll be able to use the schema in your DTL by selecting the Source or Target class as EnsLib.EDI.XML.Document, with the Document Type set to the name of your imported XML.

Jeffrey Drumm · Jul 23, 2024

I came across this article while troubleshooting a connectivity issue with %Net.SSH.Session and needing to use a public/private key pair for authentication. For those who also end up here because they're unable to establish a session with an ssh-rsa key:

The SHA1 signing algorithm has been deprecated for a few years and is now disabled in the latest versions of many Linux flavors. That affects ssh-rsa, as it uses SHA1. You can enable SHA1 via /etc/crypto-policies/config on RHEL 9, but you probably shouldn't.

Fortunately, ed25519 is supported and can be used with %Net.SSH.Session. The default format for both the public and private keys works; no need to create the private key in PEM format (and you likely can't anyway since ssh-keygen ignores the -m directive with ed25519).

$ ssh-keygen -t ed25519

Jeffrey Drumm · Jul 23, 2024

The WRC recommended I try signing the key with ed25519, and that works without having to re-enable SHA1.

$ ssh-keygen -t ed25519

Copy the id_ed25519.pub file from the .ssh directory to authorized_keys in the remote host's .ssh directory, and make sure the permissions are set to 700 for ~/.ssh and 600 for the files within.

Jeffrey Drumm · Jul 23, 2024

So ... after trying a LOT of different options, I finally uncovered the issue. The version of %Net.SSH.Session in the HealthConnect release I'm working with (2023.1.2) requires ssh-rsa to be enabled on the remote server. And ssh-rsa relies on the deprecated SHA1 algorithm, which is disabled on RHEL 9.

The workaround is to issue the following command as root:

[root ~]# update-crypto-policies --set DEFAULT:SHA1

I'm hoping there's an update that eliminates the need to do this; the WRC has been notified.

Jeffrey Drumm · Jul 22, 2024

Was there a resolution for this issue? I'm encountering the exact same error on Red Hat Linux 9. I've verified that the public and private keys are in the correct formats and that the permissions are properly set for the files and directories. But AuthenticateWithKeyPair() generates the same error.

The same key pair works properly for initiating an ssh/scp/sftp session from the Linux shell. The keys are in the .ssh directory under the irisusr account, which is the account under which HealthConnect runs. $ZV: IRIS for UNIX (Red Hat Enterprise Linux 9 for x86-64) 2023.1.2 (Build 450U) Mon Oct 16 2023 11:29:24 EDT.

Jeffrey Drumm · Jul 18, 2024

Also pointed out by others was not to have the Response object be anything other than Ens.Response. Any other type would cause an orphaned message to be created even if you don't use it.

Is this true even when the response object extends Ens.Response? That's a bit surprising ...

Jeffrey Drumm · Jul 18, 2024

An option that can be performed without Studio, also nice! (You do need VS Code, though.)

And @Robert Cemper's solution can be performed exclusively via the Management Portal, which is also a great alternative.

I'm guessing that the WebSocket Terminal would also provide IRIS command shell access without an ssh session but I haven't played with that yet.

Jeffrey Drumm · Jul 16, 2024

I wrote a quick classmethod in my custom FunctionSet class to test your observation and found that I can use the full mnemonic property path name, for example:

ClassMethod GetControlID(pMsg As EnsLib.HL7.Message) As %String
{
    // Also works with "MSH:10"
    Return pMsg.GetValueAt("MSH:MessageControlID")
}

Example from a rule (I used Document, but HL7 also works):

And the resulting trace from the Visual Trace:

I'm thinking that your inbound messages might not have the DocCategory (ex. "2.3.1") and DocName (ex. "ADT_A01") properties set ... ?

Jeffrey Drumm · Jul 12, 2024

I'm pretty sure it's because a message header isn't created for an ACK, since (in most cases) it's not sent anywhere. They're tracked in Ens_Util.IOLogObj, and cleaned up from there if selected in the Message Purge task.

Jeffrey Drumm · Jul 8, 2024

Yes, the process runs as the irisusr account, but it does not have irisusr's environment; it has root's.

Jeffrey Drumm · Jul 8, 2024

Great article, @Sylvain Guilbaud!

One suggestion I would make is to configure the User value as irisusr (assuming that was the user specified at installation for stopping/starting IRIS). Otherwise IRIS obtains the root environment, which can have unexpected consequences.