What version are you running where $EXTRACT is not available? Not sure I’ve heard of such a situation.  
 

Edit: I noticed you referenced ..ReplaceStr, which is an Interoperability function. There are semi-equivalents of $EXTRACT and $FIND in there as well; the $EXTRACT equivalent is ..SubString. But note that if you use the solution I or David presented, don't put .. in front of $EXTRACT, $FIND, or $PIECE, as these aren't Interoperability functions but pure ObjectScript functions.
 

My suggestion, which I know works since we do something similar, is as follows:

"("_$PIECE(input,"(",2)

We have to re-add the opening paren since we're using it as the splitter, but some find this easier than chaining multiple functions together. David's solution is certainly valid too.
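
A quick sketch of the behavior in the Terminal, using a made-up input value:

Set input = "discard this(keep,these)"
Write "("_$PIECE(input,"(",2)
// prints: (keep,these)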

https://docs.intersystems.com/iris20221/csp/docbook/Doc.View.cls?KEY=RCO...

Hey @Jimmy Christian - I read through your responses to others to understand a bit more of what you're after. Short of creating a custom process class, there's no way to expose a setting on the Router in the Management Portal Production UX.

That said, if I understand what you are ultimately trying to achieve, might I suggest a simple Lookup Table named something like 'RouterDowntimeSettings', with entries defined like:

Key                      Value
MyPkg.Rules.ClassName    1

Then inside your rules where you want to use this, you simply use the built-in Lookup function on that table and pass in the class name that you specified as the key. It might look something like this:

 

<rule name="Discard" disabled="false">
  <when condition="Lookup('RouterDowntimeSettings','MyPkg.Rules.ClassName')=1">
    <return></return>
  </when>
</rule>

Since Lookup returns an empty string when the key is not found, this only evaluates true when a key was explicitly specified in the table, making it easy to implement as needed without causing unexpected errors or behavior.
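
If you'd rather flip the setting from the Terminal than through the portal's Lookup Table editor, lookup tables are backed by the ^Ens.LookupTable global (subscripted by table name, then key), so a sketch like this should work:

// enable 'downtime' for that router
Set ^Ens.LookupTable("RouterDowntimeSettings","MyPkg.Rules.ClassName") = 1
// disable it again
Kill ^Ens.LookupTable("RouterDowntimeSettings","MyPkg.Rules.ClassName")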

Just an alternate approach that may work for you and reduce the amount of effort (no custom class creation that would need to be maintained). 

Edit: Fixed the boolean, didn't see your original was using returns.

While technically this could be written using a custom class extending Ens.BusinessService, what you describe has you playing the 'Operation' role more than the 'Service' role. We do this with many integrations and have a design pattern that works well for us.

In short, you need:

  • Custom Adapterless Trigger Service (extends Ens.BusinessService). Its only purpose is to send a simple Ens.Request to a Business Process (a BPL or a custom class that extends Ens.BusinessProcess) on a timed interval, either using a schedule or the call interval. A sketch follows this list.
  • Custom Business Operation, likely extending EnsLib.HTTP.GenericOperation or something similar.
  • Custom Business Process to handle the business logic flow:
    • On receiving the Ens.Request from the trigger service, it formats a request object and sends it to the Business Operation, which executes your GET call against the web service to receive the JSON payload.
    • The JSON payload is returned by the Business Operation to the Business Process, ideally as a custom message object (no longer JSON); from there, any manner of normal Ensemble workflows can take place (data transforms, ID logic, call-outs to other business operations, and so forth).
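
A minimal sketch of that trigger service, assuming made-up class/property names and using the base Ens.InboundAdapter purely as a timer (its Call Interval setting drives how often OnProcessInput fires):

Class MyPkg.Services.Trigger Extends Ens.BusinessService
{

/// Base inbound adapter used purely as a timer via its Call Interval setting
Parameter ADAPTER = "Ens.InboundAdapter";

/// Business Process to wake up on each interval
Property TargetConfigName As Ens.DataType.ConfigName;

Parameter SETTINGS = "TargetConfigName:Basic";

Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
{
    // Fire-and-forget a simple request at the Business Process
    Quit ..SendRequestAsync(..TargetConfigName, ##class(Ens.Request).%New())
}

}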

You appear to be on a very old version of Ensemble, so I'm not sure how much of the recent documentation will be relevant to your use case, and you'll likely face a lot more difficulty using Ensemble 2014.1 than you would with something 2019.x or newer, but here are a few reference links to get the thought processes going:

Using the HTTP Outbound Adapter | Using HTTP Adapters in Productions | InterSystems IRIS for Health 2021.2

Creating REST Operations in Productions | Using REST Services and Operations in Productions | InterSystems IRIS for Health 2021.2
 

I may be misunderstanding your complete use case, but we are also doing something similar for 21st Century Cures, and the easiest way to handle this (that I identified, anyway) is through dynamic terminology mapping.

So DS can go in for the DocumentType.Code per usual, but if you look at the FHIR Annotations/Mappings, you'll note DocumentType is a CodeTableDetail item (HS.SDA3.CodeTableDetail.DocumentType), so you can set up a simple CodeTable to map from DS to the appropriate LOINC without needing to mess with extensions and custom pairs. Just ensure you have an appropriate Translation Profile set up for the access gateway that serves out FHIR (ODS, likely) so that it picks up the map to get from your HL7v2 DocumentType code table to LOINC.

EDIT to add: thus far, I have not once had to modify a built-in transform to support USCDI requirements. I suspect that with the UCR release coming out in the next month or two, most of the USCDI v2/v3 mappings will also be handled better, with standard CodeTable/Terminology Maps filling in any gaps.

It's unclear what you mean exactly, but if you're wondering whether IRIS will install on something like a Synology Disk Station: yes, it works just fine using either Docker with containers or a full VM server experience where you first install an OS like RedHat or Ubuntu into a VM and then install the appropriate kit.

Obviously this would only be done for development purposes and not a production-like scenario. Most NAS servers can't push the required IOPS to properly support a production IRIS server.

I have tackled a challenge like this, with additional complex wrinkles: the related records were all in separate files inside one big zip file pulled down via SFTP by my production Service, and I needed to generate a combined file for the Complex Record Map to process, with the appropriate prefixes in place.

I hope this will be useful to you. After getting it working, the vendor I had to write it for went belly up, so I didn't finish cleaning up my traces or comments, but this was fully working. The general flow is an SFTP pickup of a daily zip file that contains four comma-separated files: a PAT (patient) file plus data element 1, 2, and 3 files. The Patient Identifier in the first file links to the other three files via one of their columns.

In the end, I ended up with a single file that I pushed into the CRM, which looks like this (where the PAT ID is 123456):

PAT|123456,SMITH,JOHN,M,moredata,etc
DATA1|49492,123456,data1data,data1moredata,data1etc
DATA2|577545,123456,data1data,data1moredata,data1etc
DATA3|454543,123456,data1data,data1moredata,data1etc

I hope it's useful to you to at least get an idea of how to get started on your particular use case. Happy to try and clarify anything if needed.

 

/// Custom business service to handle ingesting multiple related delimited flat files (contained in a single ZIP!) and 
/// combining them into a single message per Patient that is then fed into a Complex RecordMap.
Class MyPkg.Services.CRM.BatchZip Extends EnsLib.RecordMap.Service.ComplexBatchStandard [ Final ]
{

Parameter ADAPTER = "EnsLib.FTP.InboundAdapter";

Parameter SETTINGS = "ZipUtility:Basic";

Parameter Slash = {$Case($System.Version.GetOS(),"Windows":"\",:"/")};

/// Operating System Utility, with parameters, that will be executed to extract zip file from vendor.
/// Use {filename} as placeholder for the dynamic ZIP file name within the parameters.
/// 
/// Note that any utility used must write the filenames of the contents to stdout for interrogation
/// by this service.
/// 
/// Default: unzip (GNU) linux utility for unix/linux based operating systems
Property ZipUtility As %String [ InitialExpression = "unzip -o {filename}" ];

Method OnProcessInput(
    pInput As %FileBinaryStream,
    pOutput As %RegisteredObject,
    ByRef pHint As %String) As %Status
{
    Set tSC = $$$OK
    Set instanceID = $System.Util.CreateDecimalGUID()
    $$$TRACE("Unique InstanceID: "_instanceID)
    Set ^MyPkg.Temp(instanceID) = "Creating CRM-compatible masterfile for this batch file input: "_pInput.Filename
    $$$TRACE("Starting to process "_##class(%File).GetFilename(pInput.Filename))
    $$$TRACE("Executing GetZipContents")
    Set tSC = ..GetZipContents(pInput.Filename, .files)
    If $$$ISERR(tSC) Quit tSC

    // Process each sub-file into a temporary global so we can add our CRM fixed leading data and join the records together
    $$$TRACE("Processing each file into a temporary global: ^MyPkg.Temp("_instanceID_")")
    Set ptr=0
    While $ListNext(files,ptr,file)
    {
        $$$TRACE("Processing file "_file)
        If ..startsWith($P(file,..#Slash,*,*),"patients_")
        {
            Set tSC = ..ProcessFile(file, ",", instanceID)
        }
        ElseIf ..startsWith($P(file,..#Slash,*,*),"dataElement1_")
        {
            Set tSC = ..ProcessFile(file, ",", instanceID)
        }
        ElseIf ..startsWith($P(file,..#Slash,*,*),"dataElement2_")
        {
            Set tSC = ..ProcessFile(file, ",", instanceID)
        }
        ElseIf ..startsWith($P(file,..#Slash,*,*),"dataElement3_")
        {
            Set tSC = ..ProcessFile(file, ",", instanceID)
        }
        Else
        {
            Do ##class(%File).Delete(file)
        }
    }
    
    $$$TRACE("Creating MasterInputFile that we'll feed into a Complex Record Map.")
    Set tSC = ..CreateMasterInputFile(pInput.Filename, instanceID, .masterInputFile)
    
    $$$TRACE("MasterInputFile: "_masterInputFile)
    $$$TRACE("Now processing MasterInputFile into Complex RecordMap.")
    Try {
        Set masterInputFileStream = ##class(%FileBinaryStream).%New()
        Set masterInputFileStream.Filename = masterInputFile
        Set tLookAhead = ""
        Set tIOStream = ##class(EnsLib.RecordMap.Service.FileServiceStream).%New(masterInputFileStream)
        Set tIOStream.Name = ..GetFileName(masterInputFileStream)
        
        While 'tIOStream.AtEnd {
            Set tPosition = tIOStream.Position
            Set tSC = ..GetBatch(tIOStream, .tBatch,,.tLookAhead)
            If $$$ISERR(tSC) || (tPosition=tIOStream.Position) Quit
            
            Set ..%SessionId = ""
            Set tStatus = ..ForceSessionId()
            If $$$ISERR(tStatus) Quit
            
            Set tSC = ..SendRequest(tBatch,'..SynchronousSend)
            If $$$ISERR(tSC) Quit
        }
        If $$$ISERR(tSC) Quit
        
        If 'tIOStream.AtEnd {
            $$$LOGWARNING($$$FormatText($$$Text("Failed to advance record stream. Stopped reading file '%1' at position %2, not at end.","Ensemble"),tIOStream.Name,tIOStream.Position))
        }
    }
    Catch ex {
        Set tSC = $$$EnsSystemError
    }
    If $get(tLookAhead) '= "" {
        $$$LOGINFO("Discarding trailing characters: '"_tLookAhead_"'")
    }
    
    $$$TRACE("Cleaning up the temporary global we created.")
    Set tSC = ..CleanUp(instanceID)
    $$$TRACE("Completed "_##class(%File).GetFilename(pInput.Filename))
    Quit tSC
}

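/// Reads one extracted CSV, skips its header row, and files each remaining row into the
/// temporary global ^MyPkg.Temp(pInstanceID,...), keyed so related records can be joined later
/// and prefixed with its CRM record-type tag (PAT|, DATA1|, ...). Deletes the file when done.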
Method ProcessFile(
    pFilename As %String,
    pDelimiter As %String = ",",
    pInstanceID As %String) As %Status [ Private ]
{
    Set tSC = $$$OK
    Set skipHeader = 1
    
    Set file=##class(%File).%New(pFilename)
    Set tSC = file.Open("RU")
    While 'file.AtEnd
    {
        Set line = file.ReadLine()
        If skipHeader
        {
            Set skipHeader = 0
            Continue    
        }
        
        If line '[ pDelimiter Continue
        
        If ..startsWith($P(pFilename,..#Slash,*,*),"patients_")
        {
            // How do we identify the 'key' value to link up the other pieces? Get a piece of the row and store it as a part of the global key!
            Set key = $Piece(line,pDelimiter,1,1)_","_$Piece(line,pDelimiter,2,2)
            // Let's give ourselves a prefix! PAT|
            Set ^MyPkg.Temp(pInstanceID,"PAT|",key) = "PAT|"_line
        }
        ElseIf ..startsWith($P(pFilename,..#Slash,*,*),"dataElement1_")
        {
            // Each dataElement has a key for itself but also a linking key to the PAT
            Set ^MyPkg.Temp(pInstanceID,"DATA1|",$Piece(line,pDelimiter,1,1),$Piece(line,pDelimiter,2,2)) = "DATA1|"_line
        }
        ElseIf ..startsWith($P(pFilename,..#Slash,*,*),"dataElement2_")
        {
            // Each dataElement has a key for itself but also a linking key to the PAT
            Set ^MyPkg.Temp(pInstanceID,"DATA2|",$Piece(line,pDelimiter,2,2),$Piece(line,pDelimiter,1,1)) = "DATA2|"_line
        }
        ElseIf ..startsWith($P(pFilename,..#Slash,*,*),"dataElement3_")
        {
            // Each dataElement has a key for itself but also a linking key to the PAT
            Set ^MyPkg.Temp(pInstanceID,"DATA3|",$Piece(line,pDelimiter,3,3),$Piece(line,pDelimiter,5,5)) = "DATA3|"_line
        }
    }
    
    Do file.Close()
    Do ##class(%File).Delete(pFilename)
    Quit tSC
}

/// Let's start putting everything together into one big file that CRM will process!
Method CreateMasterInputFile(
    pSourceFilename As %String,
    pInstanceID As %String,
    Output MasterInputFilename) As %Status [ Private ]
{
    Set tSC = $$$OK
    Set MasterInputFilename = $Replace(pSourceFilename,".zip",".txt")
    
    Set fileObj = ##class(%File).%New(MasterInputFilename)
    Set tSC = fileObj.Open("WSN")
    If $$$ISERR(tSC) {
        Do $System.Status.DisplayError(tSC)
        Quit tSC
    }
    
    Set key=$Order(^MyPkg.Temp(pInstanceID,"PAT|",""))
    While key'=""
    {
        Set patID = $Piece(key,",",2,2)
        
        // Write out PAT| 
        Do fileObj.WriteLine(^MyPkg.Temp(pInstanceID,"PAT|",key))
        
        // Get dataElement1 for that PAT next... patID Key 1, dataElement1 Key 2
        Set data1Key = $Order(^MyPkg.Temp(pInstanceID,"DATA1|",""))
        While data1Key'=""
        {
            If data1Key = patID
            {
                Set data1Key2 = $Order(^MyPkg.Temp(pInstanceID,"DATA1|",data1Key,""))
                While data1Key2'=""
                {
                    Do fileObj.WriteLine(^MyPkg.Temp(pInstanceID,"DATA1|",data1Key,data1Key2))    
                    Set data1Key2 = $Order(^MyPkg.Temp(pInstanceID,"DATA1|",data1Key,data1Key2))
                }
            }
            Set data1Key = $Order(^MyPkg.Temp(pInstanceID,"DATA1|",data1Key))
        }
        
        // Get dataElement2 for that PAT next... patID Key 1, dataElement2 Key 2
        Set data2Key = $Order(^MyPkg.Temp(pInstanceID,"DATA2|",""))
        While data2Key'=""
        {
            If data2Key = patID
            {
                Set data2Key2 = $Order(^MyPkg.Temp(pInstanceID,"DATA2|",data2Key,""))
                While data2Key2'=""
                {
                    Do fileObj.WriteLine(^MyPkg.Temp(pInstanceID,"DATA2|",data2Key,data2Key2))    
                    Set data2Key2 = $Order(^MyPkg.Temp(pInstanceID,"DATA2|",data2Key,data2Key2))
                }
            }
            Set data2Key = $Order(^MyPkg.Temp(pInstanceID,"DATA2|",data2Key))
        }

        // Get dataElement3 for that PAT next... patID Key 1, dataElement3 Key 2
        Set data3Key = $Order(^MyPkg.Temp(pInstanceID,"DATA3|",""))
        While data3Key'=""
        {
            If data3Key = patID
            {
                Set data3Key2 = $Order(^MyPkg.Temp(pInstanceID,"DATA3|",data3Key,""))
                While data3Key2'=""
                {
                    Do fileObj.WriteLine(^MyPkg.Temp(pInstanceID,"DATA3|",data3Key,data3Key2))    
                    Set data3Key2 = $Order(^MyPkg.Temp(pInstanceID,"DATA3|",data3Key,data3Key2))
                }
            }
            Set data3Key = $Order(^MyPkg.Temp(pInstanceID,"DATA3|",data3Key))
        }
        
        Set key = $Order(^MyPkg.Temp(pInstanceID,"PAT|",key))
    }
    
    Do fileObj.Close()
    
    Quit tSC
}

/// Using full path, will extract Zip file using $ZF(-100) - OS-level execution - and read in the filenames
/// of what was extracted for further processing, returning as a list to OnProcessInput
Method GetZipContents(
    pFilename As %String,
    Output pContentFilenames As %List) As %Status [ Private ]
{
    Set tSC = $$$OK, tempFilenames = ""
    Set stdoutFilename = ##class(%File).TempFilename("myTempCRMBatch")
    
    Set unzipCmd = $Replace(..ZipUtility,"{filename}",pFilename)
    $$$TRACE("Executing OS command: "_unzipCmd)
    Set workingDirectory = $Piece(pFilename,..#Slash,1,*-1)
    Set unzipCmd = "cd "_workingDirectory_";"_unzipCmd
    Set sc = $ZF(-100,"/SHELL /NOQUOTE /STDOUT+="""_stdoutFilename_""" /STDERR+="""_stdoutFilename_"""",unzipCmd)
    
    Set stdout=##class(%File).%New(stdoutFilename)
    Set stdout.LineTerminator = $char(10)
    Set tSC = stdout.Open("RU")
    While 'stdout.AtEnd
    {
        Set stdoutLine = stdout.ReadLine()
        If stdoutLine [ ".csv"
        {
            Set temp = $LFS(stdoutLine," ")
            Set ptr = 0
            While $ListNext(temp,ptr,piece)
            {
                If $ZStrip(piece,"*W") [ ".csv"
                {
                    //$$$TRACE("Found file in zip: "_$ZStrip(piece,"*W"))
                    If tempFilenames '= "" Set tempFilenames = tempFilenames_","
                    Set tempFilenames = tempFilenames_workingDirectory_..#Slash_$ZStrip(piece,"*W")
                }
            }
            
        }
    }
    
    Do stdout.Close()
    Do ##class(%File).Delete(stdoutFilename)
    Set pContentFilenames = $LFS(tempFilenames,",")
    Quit tSC
}

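/// Removes the temporary global subtree created for this batch run.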
Method CleanUp(pInstanceID As %String) As %Status [ Private ]
{
    Kill ^MyPkg.Temp(pInstanceID)
    Quit $$$OK
}

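/// Returns true if <var>value</var> begins with <var>string</var>.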
Method startsWith(
    value As %String,
    string As %String) As %Boolean [ CodeMode = expression, Internal, Private ]
{
($E($g(value),1,$L($g(string)))=$g(string))
}

}

Hey Scott - Your questions require a bit of clarification to answer best, but I can help a bit, as I just went through this for both an internally served and secured IRIS Management Portal and externally served and secured IRIS-hosted web services.
 

There are two layers of securing to consider, and that's where I would need clarification on which part your questions are after:

  • Mutual TLS 1.2 encryption to/from the Web Gateway module installed on Apache, which acts as a reverse proxy of sorts between the web server and the IRIS server's SuperServer port. (Actual users don't use this port directly in a web browser.)
  • HTTPS/SSL Encryption on the Apache Web Server that encrypts the traffic between the client browser and web server itself.

For a production-quality, secure setup, you want to achieve both of these, in my opinion.

For the first bullet, if you control both sides of the equation (the IRIS server and the web server), you could easily use a self-signed cert from your RedHat server's CA, since you can specify the CA chain of authority that validates the signed cert on both sides.
 

For the second bullet, you really want to use a certificate authority that your users' web browsers will natively trust. E.g., if you're just serving things up internally and all your users are joined to an internal domain, that domain's CA could generate a web server cert you could install for port 443 on Apache httpd, and your users' browsers will likely be fine with it, since domain CAs generally update their domain members' keystores on login (keyword being generally). That CA could also be used to generate appropriate Server/Client profile certs for the mutual TLS of the first bullet.

But the easiest approach for the second bullet is to use an external trusted CA (think Thawte, VeriSign, and many others), as browsers will generally trust these out of the box. External CAs can also be used for the mutual TLS piece, but that's generally overkill if the Web Gateway and IRIS server are all on the same internal network (again, in my opinion). Proper securing of private keys is especially important when using an internal CA for mutual TLS, but you really should be doing that anyway.

Reference for the mutual tls: https://docs.intersystems.com/irisforhealthlatest/csp/docbook/DocBook.UI...

I posted this on a similar question a couple weeks back - this worked for me on my MBP M1:

If you prefer or need to use a kit (like I do), give Multipass a look. It's super easy to spin up an Ubuntu VM in seconds.

Note that you need to pull down the arm64 version of Ubuntu AND the arm64 IRIS kits, not the traditional x86_64 architecture.

https://github.com/canonical/multipass
https://9to5linux.com/canonical-makes-it-easier-to-run-ubuntu-vms-on-app...


Your version is quite a bit older than the one I use, but I can say this works for me. I'm not sure whether it's technically supported, in the sense that a future upgrade might break it, but like you, I needed this at the top of a process instead of having separate classes for each source config (ew) that were otherwise mostly identical.

..%Process.%PrimaryRequestHeader.SourceConfigName

So tread carefully if you use it.

EDIT: I found in our historical documentation that one of the methods I wrote for an Ens.Util.FunctionSet was also able to get to it this way; that was on a version around 2017.1 at the time:

Set SourceConfigName = %Ensemble("%Process").%PrimaryRequestHeader.SourceConfigName
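
A minimal sketch of wrapping that in a custom function set (class and method names here are made up, and this leans on internal, unsupported context that may change between versions):

Class MyPkg.Util.Functions Extends Ens.Util.FunctionSet
{

/// Source config name of the primary request, or "" when no process context exists
ClassMethod GetSourceConfigName() As %String
{
    Set tProcess = $Get(%Ensemble("%Process"))
    Quit:'$IsObject(tProcess) ""
    Quit tProcess.%PrimaryRequestHeader.SourceConfigName
}

}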

I like your idea of handling this with an extended process; that's something I had not yet considered. I personally handle this by extending Operations (EnsLib.REST.Operation) to handle specific message types (XData blocks) and then use standard BPLs to manage the appropriate call-outs/responses to determine what to do next.

But generally speaking, Processes can be extended with a custom class as well by extending Ens.BusinessProcess and implementing overrides of OnRequest/OnResponse. There's more documentation here:

Defining Business Processes - Developing Productions - InterSystems IRIS Data Platform 2020.4

I'm interested to hear how you end up addressing this, and in suggestions others have as well. I feel like there are many ways to tackle this common need (REST API workflows), so there are probably several approaches I haven't considered.

If you're using a pre-built outbound operation, the SSL Configuration and SSL Check Server Identity settings on the operation are the key ones (the checkbox is what you're asking about).

In code on the %Net.HttpRequest object, you're looking for:

Set ..%HttpRequest.SSLConfiguration = "Default"  // or whatever your SSL configuration name is
Set ..%HttpRequest.SSLCheckServerIdentity = 1    // 1 for true, 0 for false

Your error appears to deal more with authentication/authorization (HTTP 403); I believe SSL handshake failures surface differently, but tinker with the settings above.

Hi Sebastian - You are correct in that I implement it all directly in the Operation and the BPL handler, with respect to getting the token (handled in the Operation) and storing it (handled in the BPL). I am not currently using any of the %OAuth2 framework functionality, though I have started to peek at it. It doesn't seem to match my use case, and perhaps that has something to do with how the vendor I'm working with has implemented their OAuth messaging.

My GetBearerToken method ends up looking something like this; below is an early working version. I have since cleaned things up and encapsulated a lot of the build-up of the %Net.HttpRequest object using a custom framework I developed, but this will give you the general idea.

My custom message classes extend Ens.Request (or Ens.Response, as appropriate) and %JSON.Adaptor. The properties defined within are aligned with what the vendor expects to receive or send back. For instance, when sending this request to the vendor, they typically expect a Client ID, Client Secret, and the Audience and Grant Type being requested for the token. My BPL sets all those properties dynamically before sending the call to the operation.

/// REST WS Method to fetch an environment-specific OAuth Bearer Token.
Method GetBearerToken(pRequest As MyCustomMessageClass.Vendor.Request.GetBearerToken, Output pResponse As MyCustomMessageClass.Vendor.Response.GetBearerToken) As %Status
{
    // Endpoint for retrieval of Bearer Token from Vendor
    Set tURL = "https://vendor.com/oauth/token"

    Try {
        Set sc = pRequest.%JSONExportToString(.jsonPayload)
        Throw:$$$ISERR(sc) ##class(%Exception.StatusException).CreateFromStatus($$$ERROR($$$GeneralError,"Couldn't execute object to JSON conversion"))

        // ..%HttpRequest is a reference to a class-defined property of type %Net.HttpRequest.
        // Set HTTP Request Content-Type to JSON
        Set tSC = ..%HttpRequest.SetHeader("content-type","application/json")

        // Write the JSON payload to the HttpRequest body
        Set tSC = ..%HttpRequest.EntityBody.Write(jsonPayload)

        // Call SendFormDataArray in the adapter to execute the POST. Response is contained in tHttpResponse
        Set tSC = ..Adapter.SendFormDataArray(.tHttpResponse,"POST",..%HttpRequest,"","",tURL)

        // Validate that the call succeeded and returned a response. If not, build an error status.
        If $$$ISERR(tSC)&&$IsObject(tHttpResponse)&&$IsObject(tHttpResponse.Data)&&tHttpResponse.Data.Size {
            Set tSC = $$$ERROR($$$EnsErrGeneral,$$$StatusDisplayString(tSC)_":"_tHttpResponse.Data.Read())
        }
        Quit:$$$ISERR(tSC)

        If $IsObject(tHttpResponse) {
            // Instantiate the response object
            Set pResponse = ##class(MyCustomMessageClass.Vendor.Response.GetBearerToken).%New()
            // Convert the JSON response payload into the response object
            Set tSC = pResponse.%JSONImport(tHttpResponse.Data)
        }
    }
    Catch {
        // If an error anywhere in the process was not caught previously, capture it here.
        Set tSC = $$$EnsSystemError
    }
    Quit tSC
}

We had this exact scenario a year or so ago, and the easiest solution for us, since we aren't using System Default Settings like Eduard mentioned, was to open up the production class in Studio and do a search/replace (Ctrl-H):

Search for: epicsup

Replace with: epictst

Then save/compile. Note it will take effect immediately. (Edit: I'm trying to recall whether we had to click the 'Update' button in the Production Monitor for the connections to disconnect from the old destination and reconnect to the new one. Definitely verify in your Production Monitor that the connections established to their new destination.)

From your screenshot, I see you're using a custom message class, ORMFARM.amplitudeHTTPRequest. Have you extended the class to include Ens.Request?

I recently had a similar project, and to get the request and response message bodies to show in the trace, I had to include Ens.Request and Ens.Response (respectively) in my class definitions:
 

Class MyRequestClassExample Extends (Ens.Request, Ens.Util.MessageBodyMethods, %JSON.Adaptor)

{

Property Example As %String;

}
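
And the response counterpart looks much the same (the class name here is illustrative):

Class MyResponseClassExample Extends (Ens.Response, Ens.Util.MessageBodyMethods, %JSON.Adaptor)
{

Property Example As %String;

}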

Good morning Eric -

The logical operator for OR in a DTL is a double pipe (||), whereas AND is a double ampersand (&&).

See: http://oairsintp.ssfhs.org:57772/csp/docbook/DocBook.UI.Page.cls?KEY=TCOS_Logical

So in your case, I'd recommend an IF/ELSE block (the parentheses matter, since ObjectScript evaluates binary operators strictly left to right):

IF (source.PROP45 = "whatever") || (source.PROP45 = "whateverelse")

set target.{PD1.AdvanceDirectiveCode(1).identifierST} = trueoption

ELSE

set target.{PD1.AdvanceDirectiveCode(1).identifierST} = falseoption
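
In the underlying DTL XData, that block would look roughly like this (the property and the trueoption/falseoption values are placeholders):

<if condition='(source.PROP45="whatever") || (source.PROP45="whateverelse")'>
<true>
<assign property='target.{PD1.AdvanceDirectiveCode(1).identifierST}' value='"trueoption"' action='set' />
</true>
<false>
<assign property='target.{PD1.AdvanceDirectiveCode(1).identifierST}' value='"falseoption"' action='set' />
</false>
</if>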

There are also some built-in functions the DTL editor provides to do it all in one line. It comes down to preference on readability, in my opinion.

EDIT: here's an example using the AND operator (&&) in a DTL; you could use the OR operator (||) in much the same way.

 

Good morning - I reported this issue when we installed 2017.1.0 last summer.

They recently addressed it in 2017.1.2 (released in November 2017, I think):

http://docs.intersystems.com/documentation/cache/releasenotes/201712/rel...

Summary: Ensure All Matching Results Appear in Message Viewer

Description:
In the Message Viewer, users can filter the results which will be displayed by various criteria. However, when the search criteria included certain extended criteria and the selected Page Size was smaller than the total number of messages, some of the matching results were not displayed even if you navigated to the next page. This change ensures that all matching messages are displayed.

We updated our environments to 2017.1.2 and this issue no longer occurs.

Hope this helps!

-Craig