When I run the message through Interoperability | Interoperate | HL7 v2.x | HL7 v2.x Message Viewer with the following options:

I get the following errors:

Segments required by the HL7 2.5 OML_O21 specification are not present, and the message fails to parse after the first ORC. You can create a custom Doc Category with a DocType Structure that eliminates the required segment(s) and associate it with the OML_O21 message type (alternatively, you can simply assign the ORM_O01 structure to the OML_O21 message type).

Then use that custom Doc Category as the Message Schema Category in the inbound service and Doc Category/Doc Type in your DTL.

Without seeing the source message, it will be difficult to provide a definitive answer. My guess would be that there are segments in the message that would normally appear in an OBRgrp (or sub-group) but no OBR segment is present.

Also ... your screenshot indicates that the target message type/trigger event is being set to OML^O21, not ORM^O01. This differs from the description.

@Robert Cemper pointed you in the right direction. However, the USER namespace is not normally enabled for Interoperability. You can create your own database(s)/namespace(s) in the Management Portal and they will default to being interoperability-enabled, or you can enable USER for Interoperability with the following command:

Do ##class(%EnsembleMgr).EnableNamespace("USER",1)

The credentials entered at the Web Gateway login page are not related to any credentials stored in IRIS. They're controlled solely by the Username and Password entries in the [SYSTEM] section of CSP.ini. If those entries are deleted from CSP.ini, you should bypass the login page completely.

Note that if you're using a standalone web server on an IRIS installation that's been upgraded from an earlier version, there may be multiple CSP.ini files. For example, I work with an IRIS installation that has been upgraded many times and is now using a standalone web server; the original CSP.ini for the previously-included "private" web server is in the <install-dir>/sys/csp/bin/ folder, but the active CSP.ini is in /opt/webgateway/conf/.

There are (at least) two sets of credentials used in the web gateway: one to control access to the web gateway's configuration forms, and the other to authenticate the web gateway to an IRIS instance. Removing the Password entry from the [SYSTEM] section of CSP.ini will give you unauthenticated access to the web gateway's configuration, as long as you're either accessing it locally or are remote but have an IP address that matches the filter set for the System_Manager entry.
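For reference, the relevant entries in CSP.ini look roughly like this (the values are placeholders, and newer gateways store the password in an encoded form):

```ini
[SYSTEM]
Username=<management username>
Password=<encoded password>     ; remove this entry for unauthenticated access
System_Manager=127.0.0.1        ; IP filter controlling remote management access
```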

Once you have access to the management pages, you can then configure the gateway's credentials for connecting to IRIS in the Configuration | Server Access | [Select Server] | Edit Server page. The credentials entered in the Connection Security section must match the Web Gateway credentials in IRIS (usually user CSPSystem with whatever password was originally set at installation). If you don't have access to that password anymore, you can sign on via IRIS terminal, and in the %SYS namespace execute d ^SECURITY. Option 1, User Setup, allows you to change passwords for users, as long as your account has the necessary roles/permissions.
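The terminal session for that is short; a sketch (menu text varies by version):

```
%SYS>do ^SECURITY
 
1) User setup
...
Option? 1
```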

Back in the Web Gateway's configuration forms, you can add a password for access to its configuration in the Configuration | Default Parameters page.

^Ens.AppData is the target of the $$$EnsStaticAppData macro, which is referenced in these classes:

Ens.Adapter.cls
Ens.Util.File.cls
EnsLib.EDI.EDIFACT.Operation.BatchStandard.cls
EnsLib.EDI.X12.Document.cls
EnsLib.EDI.X12.Operation.BatchStandard.cls
EnsLib.EDI.X12.Operation.FileOperation.cls
EnsLib.File.InboundAdapter.cls
EnsLib.FTP.InboundAdapter.cls
EnsLib.HL7.Operation.BatchStandard.cls
EnsLib.RecordMap.Operation.FileOperation.cls
EnsLib.SQL.InboundAdapter.cls
EnsLib.SQL.InboundProcAdapter.cls
EnsLib.SQL.Operation.GenericOperation.cls
EnsLib.SQL.Snapshot.cls

You can't specify a DocType Name in a routing rule, at least not directly. By specifying the docName, you're both selecting the message by Message Type/Trigger Event and identifying the structure (DocType Name) that will be used to parse the message in the rule. If you look at the HL7 v2.3 DocType Category via the Management Portal in Interoperability | Interoperate | HL7 v2.x | HL7 v2.x Message Structures | 2.3, then select the ADT_A04 Message Type, you'll see this:

This means that an A04 event will be evaluated/parsed using the structure defined for an A01; the DocType Name (Message Structure) is ADT_A01.

docName is not the same as DocType Name. The former is the HL7 message type/trigger event (e.g. "ADT_A04") while the latter is the message structure associated with that event. Many events share the same structure, so there's no one-to-one correspondence. For example, A01, A04, A05 and A08 messages all use the ADT_A01 DocType Name.
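You can check that resolution from a terminal. This assumes the EnsLib.HL7.Schema API (method name from memory, so verify it against your version):

```objectscript
 // Resolve a message type to the DocType (structure) used to parse it
 Set tDocType = ##class(EnsLib.HL7.Schema).ResolveSchemaTypeToDocType("2.3","ADT_A04")
 Write tDocType  // something like "2.3:ADT_A01"
```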

In your trace statements, you should use the "_" character for string concatenation, not the "&" character; that's the full-evaluation logical AND operator.
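For example (pPatientClass is a hypothetical variable):

```objectscript
 $$$TRACE("Patient class is: "_pPatientClass)  // "_" concatenates strings
 // "&" would instead evaluate both sides as booleans and trace a 0 or 1
```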

My guess at this point is that PV1:PatientClass is not equal to "E".

So I've created a custom message structure for stuffing PDFs extracted from HL7 messages into a COLD feed. I've been using %Stream.FileBinary as the object type for the stream property in the class. I hadn't given much thought to the fact that those streams might hang around after a message purge, so I went back and modified the class to use %Stream.TmpBinary. I mean, that seems to make sense, right?

Except that with %Stream.TmpBinary, the stream goes away as soon as the message leaves the business process, and no file gets written by the operation. Oops.

So I'm back to using %Stream.FileBinary ... I would hope that the Interoperability message purge task would "do the right thing" and delete the stream since the message object extends Ens.Request, but I suppose I should do some experimentin' 😉
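For context, a minimal version of such a message class might look like this (class and property names are illustrative):

```objectscript
/// Illustrative request carrying a PDF extracted from an HL7 message;
/// the stream property is persisted alongside the message, which is why
/// purge behavior matters.
Class My.COLD.PDFRequest Extends Ens.Request
{

/// PDF payload extracted from the HL7 message
Property PDFStream As %Stream.FileBinary;

/// Destination filename for the COLD feed
Property FileName As %String(MAXLEN = 255);

}
```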

Are you dealing with multiple DocTypes and Categories of messages going to Epic, or are all messages the same schema?

I know you're not crazy about a custom operation, but if you take a look at the source for EnsLib.HL7.Operation.TCPOperation, you'll see that it would be dead simple to copy/extend it, add a check for the population of the Ordering Provider field, and the logic to populate it with a default if it's empty.
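A sketch of what that could look like — EnsLib.HL7.Operation.TCPOperation and OnMessage() are real, but the DefaultOrderingProvider setting and the "OBR:16" path are assumptions (adjust the path to your schema; grouped structures need the full group path):

```objectscript
Class My.HL7.TCPOperation Extends EnsLib.HL7.Operation.TCPOperation
{

/// Value to use when the Ordering Provider field is empty
Property DefaultOrderingProvider As %String;

Parameter SETTINGS = "DefaultOrderingProvider:Basic";

Method OnMessage(pRequest As EnsLib.HL7.Message, Output pResponse As EnsLib.HL7.Message) As %Status
{
    If pRequest.GetValueAt("OBR:16")="" {
        // messages are immutable once saved, so work on a clone
        Set tMsg = pRequest.%ConstructClone()
        Do tMsg.SetValueAt(..DefaultOrderingProvider,"OBR:16")
        Quit ##super(tMsg,.pResponse)
    }
    Quit ##super(pRequest,.pResponse)
}

}
```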

From within your task, you can obtain the task's classname with $CLASSNAME() and query the %SYS.Task table using the TaskClass of your task to fetch the OutputFilename column.

If you want to use that file under your direct control, you can set "Open output file when task is running" to "yes," enter the full path of the filename, then set the previous setting back to "no." The filename will remain in the table.

If you're calling the same class under multiple Task schedules or with different configurations, the schedule values and settings are also available to help you refine your selection. Settings are stored in $LIST() format.
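Putting those pieces together, a sketch of fetching the configured filename from within the task (table and column names as described above; the query itself is untested):

```objectscript
/// Look up this task's OutputFilename in the %SYS.Task table
Method GetOutputFile() As %String
{
    Set tRS = ##class(%SQL.Statement).%ExecDirect(,
        "SELECT OutputFilename FROM %SYS.Task WHERE TaskClass = ?",
        $CLASSNAME())
    Quit $Select(tRS.%Next():tRS.OutputFilename,1:"")
}
```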

EDIT: You can also define a Filename property in the task class, assuming it's a custom class. It will show up as a configurable field in the task editor. That way you don't have to deal with the OutputFilename settings.
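A minimal sketch of that approach — %SYS.Task.Definition is the standard superclass for custom tasks, and its properties surface as fields in the task editor (class name and body are illustrative):

```objectscript
Class My.Task.Export Extends %SYS.Task.Definition
{

/// Appears as an editable field when scheduling the task
Property Filename As %String(MAXLEN = 255);

Method OnTask() As %Status
{
    // write your output to ..Filename here
    Quit $$$OK
}

}
```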

Usually, OBX segments are either defined as repeating segments or members of repeating segment groups. The syntax you'll use will vary depending on the HL7 Document Category and Structure you're using in your DTL. In HL7 2.5.1, the OBX segment itself is non-repeating, but is part of a repeating group (OBXgrp) inside another repeating group (ORCgrp) inside yet another repeating group (PIDgrpgrp).

You first need to get the count of the total number of OBX segments, which you can do by supplying the "*" character in the iteration argument to the source path's OBXgrp(). Add 1 to that, and you have your iteration for the new OBX segment.

Use that value as the iteration for the new OBX segment and populate the fields as needed, as in the example below:
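The original example isn't reproduced here, but a sketch of the DTL actions (2.5.1 group paths as described above; group indices and field values are illustrative) might look like:

```xml
<!-- "*" as the repeat index returns the repetition count -->
<assign property='tCount' value='source.{PIDgrpgrp(1).ORCgrp(1).OBXgrp(*)}' action='set' />
<!-- tCount+1 addresses the new, appended OBXgrp iteration -->
<assign property='target.{PIDgrpgrp(1).ORCgrp(1).OBXgrp(tCount+1).OBX:SetIDOBX}' value='tCount+1' action='set' />
<assign property='target.{PIDgrpgrp(1).ORCgrp(1).OBXgrp(tCount+1).OBX:ValueType}' value='"TX"' action='set' />
<assign property='target.{PIDgrpgrp(1).ORCgrp(1).OBXgrp(tCount+1).OBX:ObservationValue(1)}' value='"Example note"' action='set' />
```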

The above assumes that the OBX segments are the last segments in the message. If they're not and the message requires another OBX at the very end, it's a bit more complicated ... you'd create a new segment object, populate it, then use the AppendSegment method to slap it on the end of the target:
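A sketch of that, suitable for a DTL code action — AppendSegment is real, but the ImportFromString signature may vary by version, and the segment content is illustrative:

```objectscript
 // Build a mutable OBX segment from a string using the target's separators,
 // then append it to the end of the target message
 Set tSeg = ##class(EnsLib.HL7.Segment).ImportFromString("OBX|1|TX|||Trailing note",.tSC,target.Separators)
 If $$$ISOK(tSC) Set tSC = target.AppendSegment(tSeg)
```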