Assuming you cut and pasted from your DTL, the double-quote characters around the 2nd dash are incorrect. They appear to be the distinct open and close quote characters that Word automatically substitutes for the "standard" double-quote character:

set            tSSN                                           source.{PID:SSNNumberPatient}
set            target.{PID:SSNNumberPatient}                  $E(tSSN,1,3)_"-"_$E(tSSN,4,5)_“-”_$E(tSSN,6,9)

If you create the following class in IRIS Studio, a "FormatSSN()" function will be available in the DTL editor's expression editor, and you can use it in your set rules:

Class Misc.Util.StringFunction Extends Ens.Rule.FunctionSet
{ 
ClassMethod FormatSSN(pStr As %String) As %String
{
   Set tStr = $ZSTRIP(pStr,"*AWP")
   Return $E(tStr,1,3)_"-"_$E(tStr,4,5)_"-"_$E(tStr,6,9)
}
}

The method strips all non-numeric characters from the value passed to it, then formats it per the SSN format. It doesn't verify that the proper number of digits is present, but that's something that can easily be added; a sketch of one way to do that follows.
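For example, a length check could be as simple as this; returning an empty string on a bad digit count is just my choice here, and you might prefer to return the unformatted value instead:

ClassMethod FormatSSN(pStr As %String) As %String
{
   // Strip everything but the digits, as before
   Set tStr = $ZSTRIP(pStr,"*AWP")
   // Refuse to produce a malformed SSN when the digit count is wrong
   If $LENGTH(tStr) '= 9 Return ""
   Return $E(tStr,1,3)_"-"_$E(tStr,4,5)_"-"_$E(tStr,6,9)
}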

In a DTL action, it would look like this:

<assign value='$PIECE($ZCVT(source.{ibex_medical_chart.patient_info.admdoc.name},"i","XML"),",",1)' property='target.{PV1:AdmittingDoctor(1).FamilyName}' action='set' />
<assign value='$PIECE($ZCVT(source.{ibex_medical_chart.patient_info.admdoc.name},"i","XML"),",",2)' property='target.{PV1:AdmittingDoctor(1).GivenName}' action='set' />

Or a bit more efficiently:

<assign value='$ZCVT(source.{ibex_medical_chart.patient_info.admdoc.name},"i","XML")' property='tFullName' action='set' />
<assign value='$PIECE(tFullName,",",1)' property='target.{PV1:AdmittingDoctor(1).FamilyName}' action='set' />
<assign value='$PIECE(tFullName,",",2)' property='target.{PV1:AdmittingDoctor(1).GivenName}' action='set' />

Note that this isn't actually replacing the comma character so much as it's splitting the full name value supplied in the source path on that character and assigning the individually extracted values to their associated HL7 components in the Admitting Doctor field.

Depends on how the attachment is stored by the sender. If it's a binary attachment, the file data will be represented as a stream object in Part.BinaryData. If it's text data, Part.TextData. You just need to create a %Stream object and copy Part.BinaryData (or Part.TextData) to it:

Set stream=##class(%Stream.FileBinary).%New()
Set sc=stream.LinkToFile("/path/to/"_Part.FileName)
Do stream.CopyFrom(Part.BinaryData)
Do stream.Flush() // may not be necessary
Do stream.%Save() // ditto, but better safe than sorry ya know?

You can determine the type of data in the Part via the IsBinary boolean property ... 
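Putting that together, a sketch of the copy logic that handles both cases (this assumes Part is a %Net.MailMessagePart handed to you by the inbound email adapter, and that the target directory is writable):

// Pick a stream class that matches the attachment's content type
If Part.IsBinary {
    Set stream = ##class(%Stream.FileBinary).%New()
} Else {
    Set stream = ##class(%Stream.FileCharacter).%New()
}
Set sc = stream.LinkToFile("/path/to/"_Part.FileName)
If sc {
    // Copy from whichever stream property is populated for this part, then save
    Do stream.CopyFrom($SELECT(Part.IsBinary:Part.BinaryData,1:Part.TextData))
    Set sc = stream.%Save()
}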

You would only need to remove the repetition value from the argument passed to GetFieldNameFromNumber(); you would still continue to pass the full field index to GetValueAt(). Something like this:

Set segment = msg.getSegmentByIndex(3)
Set fieldIndex = "3(2).1"
Set fieldNameIndex = ##class(User.Util.StringFunctions).ChangePattern(fieldIndex,"\(\d+\)","()")
Set fieldName = ##class(EnsLib.HL7.Schema).GetFieldNameFromNumber("2.5", segment.Name, fieldNameIndex)
Set fieldValue = segment.GetValueAt(fieldIndex)

The ChangePattern method:

Class User.Util.StringFunctions Extends Ens.Util.FunctionSet
{
/// Replaces every substring of <var>pStr</var> that matches the regular expression
/// <var>pPat</var> with <var>pRep</var>. Assumes the replacement text does not itself
/// match the pattern (otherwise the loop would never terminate).
ClassMethod ChangePattern(pStr As %String, pPat As %String, pRep As %String) As %String
{
	Set tOut = pStr
	Do {
		// Kill the match variable so a failed $LOCATE doesn't reuse the previous match
		Kill tFnd
		Set tLoc = $LOCATE(tOut,pPat,,,tFnd)
		Set:$DATA(tFnd) tOut = $REPLACE(tOut,tFnd,pRep)
	} While tLoc '= 0
	Return tOut
}
}
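For the field index in the example above, the call should behave like this (typed at a terminal prompt; the output is what I'd expect rather than a captured session):

USER> Write ##class(User.Util.StringFunctions).ChangePattern("3(2).1","\(\d+\)","()")
3().1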

If you're trying to get the 4-digit year from an HL7-formatted date/time string (YYYYMMDDHHMM...), your method should look like this (using GetYear rather than DateTime as the method name, for clarity):

ClassMethod GetYear(pDate As %String) As %String
{
   If ($LENGTH(pDate) > 4)
   {
      Return $EXTRACT(pDate,1,4)
   }
   Return ""
}

You would then call that class as follows:

Set Year = ##class(CUSTSOM.Training.Functions).GetYear(source.GetValueAt("PIDgrpgrp(1).PIDgrp.PID:7.1"))

The variable Year should then contain the 4 digit year, or the empty string if the value in PID:7.1 is 4 characters or less.

No, I don't think that's the issue.

My suspicion is that somewhere in your BPL you're cloning a message body and then not sending it anywhere, meaning that it never gets "attached" to a message header record. The records in Ens.MessageHeader are queried for the message body IDs to delete in the purge process, so if there's no header record with that message's ID as its MessageBodyID, it's by definition an orphaned message and does not get purged.

The source message is of course referenced by a header record, but a cloned message body doesn't get a header until you pass it to another process or operation.

Hi Scott,

You can get a list of message body IDs (and their associated event types) that have no corresponding header record with this query:

SELECT HL7.ID, HL7.Name FROM EnsLib_HL7.Message HL7 LEFT JOIN Ens.MessageHeader hdr ON HL7.Id=hdr.MessageBodyId WHERE hdr.MessageBodyId IS NULL AND HL7.OriginalDocId IS NULL

And yes, this query can take a very long time to run ... long enough that it will usually time out in the Management Portal's SQL page. It shouldn't time out if you run it from the Caché (or IRIS) SQL Shell, though.

What would you want to use for WHERE or ORDER BY criteria? The list of "fields" (properties) available to reference directly via SQL can be viewed in the Body panel of an HL7 message displayed in the Message Viewer, but their usefulness in determining what made them orphans is limited.
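Still, if you just want to work through them oldest-first, or limit the search to a date range, the body's own TimeCreated property is probably the most useful column to add (assuming it's populated for your messages; the cutoff date below is just an example):

SELECT HL7.ID, HL7.Name, HL7.TimeCreated
FROM EnsLib_HL7.Message HL7
LEFT JOIN Ens.MessageHeader hdr ON HL7.Id = hdr.MessageBodyId
WHERE hdr.MessageBodyId IS NULL
  AND HL7.OriginalDocId IS NULL
  AND HL7.TimeCreated < '2022-01-01 00:00:00'
ORDER BY HL7.TimeCreated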

The problem with orphans is that knowing "where they came from" is a bit challenging. The source and destination services/processes/operations in the Production are stored in the message header table and not the body table; the reason the messages are orphans is because they're no longer linked (by MessageBodyId) to any records in the Message Header table.

The most common reason for orphans is the configuration of the message purge task. There's a "bodies too" checkbox that, if left unchecked, will prevent message bodies from being deleted. The headers still get deleted, though, and that makes the bodies "orphans."
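If you run the purge from code rather than from the scheduled task, the same setting is the BodiesToo property on the purge task class. Something like this should be equivalent to ticking the checkbox (property names are as I remember them; double-check the class reference for Ens.Util.Tasks.Purge in your version):

Set task = ##class(Ens.Util.Tasks.Purge).%New()
Set task.TypesToPurge = "messages"
Set task.NumberOfDaysToKeep = 30
Set task.BodiesToo = 1      // the "bodies too" checkbox
Set task.KeepIntegrity = 1  // only purge messages from completed sessions
Set sc = task.OnTask()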

Are you using the private key you created at the time you generated the CSR, or one provided by your server folks? You need to use the one you generated.

And I'm assuming your private key is encrypted, and therefore has the following header in the file:

-----BEGIN ENCRYPTED PRIVATE KEY-----

Have you tried decrypting it with openssl?

openssl rsa -in /etc/pki/tls/private/ssl_vd01.key -text

You should be prompted for a passphrase; use the one you provided when you generated the CSR. If it decrypts OK, you'll get something similar to this:

RSA Private-Key: (2048 bit, 2 primes)
modulus:
    00:98:42:c5:37:28:e4:b9:69:e4:a0:45:86:b1:20:
    39:5f:78:36:96:14:f8:e9:4f:49:7d:44:31:16:3c:
<remainder elided>

If you don't get something like this, the passphrase is wrong for the key file. If you do, verify that you've provided the proper passphrase in both the %SuperServer SSL configuration and the Web Gateway Server Access configuration.
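If you want to confirm that the key really is the one your certificate was issued against, comparing the modulus digests of the two files is a quick check (the certificate path below is a placeholder); the two hashes should be identical:

openssl x509 -noout -modulus -in /path/to/your_certificate.crt | openssl md5
openssl rsa -noout -modulus -in /etc/pki/tls/private/ssl_vd01.key | openssl md5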

You'll also need to provide the passphrase when starting httpd, which may not be obvious if SELinux is blocking it; running the following command will allow the prompting for a password when starting/restarting httpd:

setsebool -P httpd_read_user_content 1

I've installed standalone web servers/gateways on both the same host as IRIS and on separate servers that support multiple standalone IRIS instances. I often use the standalone web host/gateway as the arbiter host for a mirrored server pair. The customer I'm currently working with has all of their PROD and STAGE hosts (mirror pairs and DR servers) accessible through two separate web server/arbiter hosts.

If you're doing this to enhance security, it's not just about installing certificates and turning on TLS for the web server. You also need to make sure the communication between the web gateway module and the IRIS server is encrypted via TLS (particularly when the gateway runs on a host separate from the IRIS instance). And if you're doing that, you probably ought to make sure that ODBC/JDBC and Studio connections are also encrypted, that STARTTLS is enabled for LDAP/AD authentication, that TLS is used for mirror synchronization, and so on.

set tmpStr = ##class(%Stream.FileCharacter).%New()
do tmpStr.LinkToFile("/some/writeable/location/temp.json")
do tmpStr.CopyFrom(newMsg)
do tmpStr.%Save()

You should then be able to open the JSON file outside of Caché/Ensemble, from the location to which it was written.

EDIT: The location could potentially be /<cache-install-dir>/csp/user/<filename> ... in which case you may be able to access it through the Caché/Ensemble web server and display it in your web browser:

http://<hostname>:<port>/csp/user/<filename>

For those who use a Windows workstation but code on a Linux/Unix-based server, here's a VS Code configuration that provides a remote IRIS terminal session. It uses the ssh client that is included with Windows 10 (I'm assuming there's one in Windows 11 as well).

Add it to your user settings to make it available across all of your projects, or to your workspace settings to have a custom terminal session per workspace:

    "terminal.integrated.profiles.windows": {
        "IRIS Session": {
            "overrideName": true,
            "path": "C:\\Windows\\System32\\OpenSSH\\ssh.exe",
            "args": [
                "-t",
                "<user>@<hostname>",
                "iris session <instance>"
            ]
        }
    }
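If you'd like that profile to open by default rather than PowerShell, you can optionally add:

    "terminal.integrated.defaultProfile.windows": "IRIS Session"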

Do you have a corporate mail server that supports SMTP relay? You would need to get the details from the administrator of the mail system. Some require TLS, some don't. Some require credentials, others don't. 

You can use GMail, in which case you would need a configuration of something like this:

Mail server? smtp.gmail.com 
Mail server port? 587
Mail server SSLConfiguration? <SSLConfigName> <-- You would need to create this in Security | SSL/TLS Configurations
Mail server UseSTARTTLS? 1 
 

You will also need to supply your Gmail email address for the username, and an app password (created in your Google Account Security settings) for the Set Authentication option.
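If you'd like to sanity-check the server, port, TLS and credential values before committing them to the configuration, a quick test from a terminal session with %Net.SMTP might look something like this (the addresses and SSL configuration name are placeholders):

Set smtp = ##class(%Net.SMTP).%New()
Set smtp.smtpserver = "smtp.gmail.com"
Set smtp.port = 587
Set smtp.UseSTARTTLS = 1
Set smtp.SSLConfiguration = "<SSLConfigName>"
Set auth = ##class(%Net.Authenticator).%New()
Set auth.UserName = "you@gmail.com"
Set auth.Password = "<app password>"
Set smtp.authenticator = auth
Set msg = ##class(%Net.MailMessage).%New()
Set msg.From = "you@gmail.com"
Do msg.To.Insert("someone@example.com")
Set msg.Subject = "SMTP test"
Do msg.TextData.Write("It works.")
Set sc = smtp.Send(msg)
Write $SELECT(sc=1:"sent OK",1:$SYSTEM.Status.GetErrorText(sc))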

Your best bet would be to discuss your email delivery requirements with the mail server administrator of your organization; they should be able to provide you with the required values. If this is for a private/personal installation of Caché or IRIS, GMail is probably the easiest to configure and best documented.

If you create a class that extends Ens.Rule.FunctionSet, you can have a method that's selectable from the expression editor in either a DTL or a Routing Rule ...

Class User.Util.MetaData Extends Ens.Rule.FunctionSet
{
/// Retrieves the UserValue associated with key <var>pKey</var> from the message object supplied as
/// <var>pMsg</var> (normally <strong>source</strong> in a DTL or <strong>Document</strong> in
/// a Routing Rule) as a %String. Returns an empty string if the key is undefined.
ClassMethod UserValueGet(pMsg As EnsLib.HL7.Message, pKey As %String) As %String
{
    If pMsg.UserValues.IsDefined(pKey)
    {
        Return pMsg.UserValues.GetAt(pKey)
    }
    Return ""
}
}
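In a DTL, a call to it would then look something like this (the key name is just an example, and since the class is a FunctionSet the method should also be offered in the expression editor's function list without the ##class syntax):

<assign value='##class(User.Util.MetaData).UserValueGet(source,"SomeKey")' property='target.{MSH:SendingFacility.NamespaceID}' action='set' />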

I haven't found support for this yet. I have a need for this as well and am considering writing an adapter to support get/put operations using smbclient. If there's a better/quicker way to accomplish this, I'd be very excited to learn about it.

We had toyed with the idea of mounting all of the shares as cifs filesystems on the RHEL 8.5 hosts, but there are quite a few ... Dynamically establishing a connection and then performing the required operation is preferred.
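If I do end up going the smbclient route, the core of the operation would probably just be a call out to the OS, something along these lines (share, credentials file and paths are all placeholders, and you'd want real error handling around the exit status):

// Fetch one file from an SMB share by shelling out to smbclient; $ZF(-100) returns the program's exit code
Set tStatus = $ZF(-100,"","smbclient","//fileserver/interfaces","-A","/secure/smb.cred","-c","get inbound/orders.csv /tmp/orders.csv")
Write:tStatus'=0 "smbclient failed with exit code ",tStatus,!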