Appreciate the detailed response Timothy - I'd be happy to show you what I've been working on. Still a bit of a work-in-progress but I do actually have it in a functional state - just some small nuggets for our non-developers I need to button-up.

Yes, very much 'rolling my own', though I learned a substantial amount from existing OEX solutions, of course - I refactored and cleaned up a lot of functionality that didn't fit our desired workflows, and integrated the GitHub Enterprise private-org App I created so the server can act on behalf of the user after the user authorizes it via OAuth2.

Git is used on the server simply as the transport mechanism for the 'server' to talk to GitHub as the App, though I may remove that crutch as well long term.

Ben is correct that the window launch would be from SMP and that's where I've struggled a bit. The hooks are a lot friendlier within VS Code or Studio but the SMP side leaves me wanting - not just for these modal windows but also the general dynamic-ness of the menus.

e.g., early on I toyed with the idea of some 'short-lived branches' for our developers, with a menu DisplayName updated to show their current branch. This worked beautifully in VS Code/Studio, but SMP essentially ignored any updates to the DisplayName. The lack of consistency was frustrating, so I just dropped that functionality for now.

In the scenario I'm referencing, the use would be inside the Management Portal, as our analysts do not and likely never will use VS Code or even the Studio IDE. They manage integration lookup tables within the Interoperability (formerly Ensemble) space and maybe a few other small items there, so we must rely on the source control hooks to a degree to ensure we can capture their changes as well.

I agree in principle, however, and for our developers we have another way to manage the 'backend' settings generally stored in globals, with a nicer frontend. But it's nice to have a one-stop shop whenever possible from a user-friendly standpoint (i.e., if it could be launched from a Source Control menu on the Production page within the Management Portal without being caught inside the iframe Dialog.Manager).

@Timothy Leavitt 
Does ISC have any plans to expand upon the server hooks further/modernize them a bit?

Using a detached model of source control (i.e., non-isfs mode of VS Code w/ ObjectScript) for our org that revolves around traditional hospital/health-org Ensemble usage doesn't really work well, in particular with our analysts that aren't developers in any sense but still make table updates and other minor tweaks here and there within the Management Portal that fall to the source control hooks to handle.

In particular, in the UserAction space, there isn't a clean way to launch a small modal window for providing a UI to manage a user's source control settings - as the source control hooks effectively assume whatever CSP you're launching from UserAction is going to show a document that needs to be 'controlled' by wrapping it in an iframe managed by %CSP.Portal.SourceControl.Dialog.Manager or similar.

I have figured out ways around this, but it's kind of clunky. So hopefully we could launch our own Angular page from UserAction so we can control the dialog size, UI, etc. (and maybe there already is an undocumented way I just haven't dug deep enough to find yet.)
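For context, the hook in question is UserAction from %Studio.Extension.Base. A minimal sketch of what I mean - the menu item name and target page here are hypothetical placeholders, not my actual implementation:

```objectscript
/// Sketch only - "%SourceMenu,Settings" and the CSP page are placeholders.
Method UserAction(Type As %Integer, Name As %String, InternalName As %String, SelectedText As %String, ByRef Action As %String, ByRef Target As %String, ByRef Msg As %String, ByRef Reload As %Boolean) As %Status
{
    If Name = "%SourceMenu,Settings" {
        // Action 2 = display the Target URL in a browser window/dialog
        Set Action = 2
        Set Target = "/csp/myapp/settings.csp"
    }
    Quit $$$OK
}
```

In VS Code/Studio this pops a dialog you control; from SMP, though, it ends up wrapped in the %CSP.Portal.SourceControl.Dialog.Manager iframe, which is exactly the limitation I'm describing.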

Your version is quite a bit older than the one I use but I can say this works for me. I'm not sure if it's technically supported in the sense that a future upgrade might break it but, like you, I needed this at the top of a process instead of having separate classes for each source config (ew) that were otherwise mostly identical.

..%Process.%PrimaryRequestHeader.SourceConfigName

So tread carefully if you use it.

EDIT: Found in our historical documentation that one of the methods I wrote for an Ens.Util.FunctionSet was also able to get to it this way, on a version around 2017.1 at the time:

Set SourceConfigName = %Ensemble("%Process").%PrimaryRequestHeader.SourceConfigName

Appreciate the info - we do supply chain stuff as well for our org, so I can relate; the message sizes there tend to be a lot smaller and less frequent than, say, an ADT and/or base64 PDF ORU result feed.

It's certainly not that cloud can't scale to support it - it definitely can - but if our clinical document imaging is on-prem and we're bouncing those PDF ORUs between on-prem and cloud, bandwidth usage/cost becomes a significant factor, especially given that's just a couple of integrations and we have around 400.

It'll be interesting to see how much this factors in long term to cloud plans.

Thanks Oliver - Could you speak to the workload of your Health Connect production? i.e., is it for traditional HL7-based integration between EMRs/clinical apps/etc for a healthcare org?

And if so, what are your thoughts/experiences with the egress/ingress traffic costs? Does most of your traffic exist in the same cloud to mitigate that, or are you living in the hybrid cloud world where a lot of traffic is still going to an on-prem data center?

Appreciate your thoughts (and anyone else's) on this as we've been exploring it as well.

I like your idea of handling this with an extended process - something I had not yet considered. I personally handle this by extending Operations (EnsLib.REST.Operation) to handle specific message types (XData blocks) and then use standard BPLs to manage the appropriate callouts/responses to determine what to do next.

But generally speaking, Processes can be extended with a custom class as well by extending Ens.BusinessProcess and implementing overrides to OnRequest/OnResponse. There's more documentation here:

Defining Business Processes - Developing Productions - InterSystems IRIS Data Platform 2020.4
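For reference, the skeleton of such a custom process might look roughly like this - a minimal sketch with hypothetical class/target names, not a drop-in implementation:

```objectscript
/// Sketch of a custom business process; class and target names are illustrative only.
Class Sample.Process.RestFlow Extends Ens.BusinessProcess
{

Method OnRequest(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
{
    // Kick off the flow: send an async request to a downstream operation.
    // The completion key ("RestCall") identifies the reply in OnResponse.
    Quit ..SendRequestAsync("My.Rest.Operation", pRequest, 1, "RestCall")
}

Method OnResponse(pRequest As Ens.Request, Output pResponse As Ens.Response, pCallRequest As Ens.Request, pCallResponse As Ens.Response, pCompletionKey As %String) As %Status
{
    // Decide what to do next based on which call completed.
    If pCompletionKey = "RestCall" {
        Set pResponse = pCallResponse
    }
    Quit $$$OK
}

}
```

OnResponse fires once the async call completes, keyed by the completion key you passed to SendRequestAsync.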

Interested to hear how you end up addressing this, and suggestions others have as well. I feel like there are many ways to tackle this common need (REST API workflows), so there are probably several approaches I hadn't considered.

I'm not really clear on what you mean by "standard Module.int", so it sounds like we may be approaching this in different ways - I apologize for any confusion I caused.

%HttpRequest is %Net.HttpRequest (you can find syntax for SetHeader here) and the Adapter in this case refers to the adapter attached to the EnsLib.REST.Operation class via Parameter, which in this case is EnsLib.HTTP.OutboundAdapter.
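In other words, the wiring looks roughly like this - a sketch with a hypothetical class name (EnsLib.REST.Operation already declares this parameter, so you normally just inherit it):

```objectscript
/// Sketch: a custom REST operation declaring its outbound adapter.
Class Sample.Operation.MyRestOperation Extends EnsLib.REST.Operation
{

// The ADAPTER parameter is what makes ..Adapter an
// EnsLib.HTTP.OutboundAdapter instance at runtime.
Parameter ADAPTER = "EnsLib.HTTP.OutboundAdapter";

}
```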

If using a pre-built outbound operation:

Those are the key settings (the checkbox is what you are asking about.)

In code on the %HttpRequest object, you're looking for

Set ..%HttpRequest.SSLConfiguration = "Default" (or whatever your SSL config name is)

Set ..%HttpRequest.SSLCheckServerIdentity = 1 (for true, 0 for false.)

Your error appears to deal more with authentication/authorization (HTTP 403); I believe SSL handshake failures surface as a different error, but tinker with the settings above.

I had the same challenges as you when I was tackling this - the documentation wasn't really fleshing it out well enough. An ISC Sales Engineer helped me work through it.

Here is what I have used with success to submit both an XML doc and a PDF doc to a vendor, along with two parameters associated with the request (ReportId and CustomerId.) It requires the use of MIME Parts. I hope this helps you. I had to genericize some of the code to share, but I've commented what each part does and how it pulls together.

Note this assumes you're passing in a variable pRequest that is a message class holding your data. Also, I am running this on 2019.1, not 2018, so I'm not sure of the differences when using things like %JSONImport (there may be none, but I don't know that for certain.)

        Set tURL = "fully_qualified_endpoint_url_here"
        
        // Instantiate reportId MIME Part
        Set reportId = ##class(%Net.MIMEPart).%New()
        // Define/Set the Content-Disposition header indicating how this MIME part is encoded and what it contains.
        // Final string looks like: form-data; name="reportId"
        Set tContentDisp1 = "form-data; name="_$CHAR(34)_"reportId"_$CHAR(34)
        Do reportId.SetHeader("Content-Disposition", tContentDisp1)
        // Get the ReportId from the incoming Request (from BPL) and write to the MIME Part body.
        Set tReportId = pRequest.ReportId
        Set reportId.Body = ##class(%GlobalCharacterStream).%New()
        Do reportId.Body.Write(tReportId)
        
        // Instantiate customerId MIME Part
        Set customerId = ##class(%Net.MIMEPart).%New()
        // Define/Set the Content-Disposition header indicating how this MIME part is encoded and what it contains.
        // Final string looks like: form-data; name="customerId"
        Set tContentDisp2 = "form-data; name="_$CHAR(34)_"customerId"_$CHAR(34)
        Do customerId.SetHeader("Content-Disposition", tContentDisp2)
        // Get the CustomerId from the incoming Request (from BPL) and write to the MIME Part body.
        Set tCustomerId = pRequest.CustomerId
        Set customerId.Body = ##class(%GlobalCharacterStream).%New()
        Do customerId.Body.Write(tCustomerId)
        
        // Instantiate file1 (XML Structured Doc) MIME Part
        Set file1 = ##class(%Net.MIMEPart).%New()
        // Define/Set the Content-Disposition header indicating how this MIME part is encoded and what it contains.
        // Final string looks like: form-data; name="file1"; filename="<pRequest.CaseNumber>.xml"
        Set tXmlFileName = pRequest.CaseNumber_".xml"
        Set tContentDisp3 = "form-data; name="_$CHAR(34)_"file1"_$CHAR(34)_"; filename="_$CHAR(34)_tXmlFileName_$CHAR(34)
        Do file1.SetHeader("Content-Disposition", tContentDisp3)
        // Get the XML as a Stream from the incoming Request (from BPL) and write to the MIME Part body.
        Set tStream = ##class(%GlobalCharacterStream).%New()
        Set tSC = pRequest.XmlDoc.OutputToLibraryStream(tStream)
        Set file1.Body = tStream
        Set file1.ContentType = "application/xml"
        
        // Instantiate file2 (PDF Report) MIME Part
        Set file2 = ##class(%Net.MIMEPart).%New()
        // Define/Set the Content-Disposition header indicating how this MIME part is encoded and what it contains.
        // Final string looks like: form-data; name="file2"; filename="<pRequest.CaseNumber>.pdf"
        Set tPdfFileName = pRequest.CaseNumber_".pdf"
        Set tContentDisp4 = "form-data; name="_$CHAR(34)_"file2"_$CHAR(34)_"; filename="_$CHAR(34)_tPdfFileName_$CHAR(34)
        Do file2.SetHeader("Content-Disposition", tContentDisp4)
        // Get the PDF Stream from the incoming Request (from BPL) and write to the MIME Part body.
        Set file2.Body = pRequest.PdfDoc.Stream
        Set file2.ContentType = "application/pdf"
        
        // Package sub-MIME Parts into Root MIME Part
        Set rootMIME = ##class(%Net.MIMEPart).%New()
        Do rootMIME.Parts.Insert(reportId)
        Do rootMIME.Parts.Insert(customerId)
        Do rootMIME.Parts.Insert(file1)
        Do rootMIME.Parts.Insert(file2)
        
        // Write out Root MIME Element (containing sub-MIME parts) to HTTP Request Body.
        Set writer = ##class(%Net.MIMEWriter).%New()
        Set sc = writer.OutputToStream(..%HttpRequest.EntityBody)
        If $$$ISERR(sc) { Do $SYSTEM.Status.DisplayError(sc) Quit }
        Set sc = writer.WriteMIMEBody(rootMIME)
        If $$$ISERR(sc) { Do $SYSTEM.Status.DisplayError(sc) Quit }
        
        // Set the HTTP Request Headers
        // Specify the Authorization header containing the OAuth2 Bearer Access Token.
        Set tToken = "set your token here or pull from wherever"
        Set tSC = ..%HttpRequest.SetHeader("Authorization","Bearer "_tToken)
        // Specify the Content-Type and Root MIME Part Boundary (required for multipart/form-data encoding.)
        Set tContentType = "multipart/form-data; boundary="_rootMIME.Boundary
        Set tSC = ..%HttpRequest.SetHeader("Content-Type",tContentType)

        // Call SendFormDataArray method in the adapter to execute POST. Response contained in tHttpResponse
        Set tSC=..Adapter.SendFormDataArray(.tHttpResponse,"POST", ..%HttpRequest, "", "", tURL)
        
        // Validate that the call succeeded and returned a response. If not, throw error.
        If $$$ISERR(tSC)&&$IsObject(tHttpResponse)&&$IsObject(tHttpResponse.Data)&&tHttpResponse.Data.Size {
            Set tSC = $$$ERROR($$$EnsErrGeneral,$$$StatusDisplayString(tSC)_":"_tHttpResponse.Data.Read())
        }
        Quit:$$$ISERR(tSC)

        If $IsObject(tHttpResponse) {
            // Instantiate the response object
            Set pResponse = ##class(Sample.Messages.Response.VendorResponseMsgClass).%New()
            // Convert JSON Response Payload into a Response Object
            Set tSC = pResponse.%JSONImport(tHttpResponse.Data)
        }

Apologies, as I haven't gotten into the whole Swagger-generated API thing yet (working in that direction though.) But for your desired output above, could you not do something like:

Set tRetObj = {}
Set tRetObj.article = {}.%FromJSON(article.%JSONExportToString())
Return tRetObj

Again, maybe I'm not fully understanding, so I'll butt out after this reply and maybe someone else can help better. :-) I do see your concern re: MAXSTRING, though, and have encountered it myself. Taking the export-to-string out of the return statement would, I think, allow you to handle that exception better.

Good morning Dmitriy - I'm not sure I 100% understand what you're asking, but in my experience with %JSON.Adaptor, there is only one additional step you need to get the string into a %DynamicObject:

Set tDynObj = {}.%FromJSON(output)

While I agree it would be handy for %JSON.Adaptor to have a way to do this with one of their export methods, I think the intent may be to allow us to immediately take the JSON as a string to write it out to an HTTP Request body, which is where I use it most:

Set sc = pRequest.%JSONExportToString(.jsonPayload)
THROW:$$$ISERR(sc) $$$ERROR($$$GeneralError, "Couldn't execute object to json conversion") 
// Set HTTP Request Content-Type to JSON
Set tSC=..%HttpRequest.SetHeader("content-type","application/json") 
// Write the JSON Payload to the HttpRequest Body
Set tSC = ..%HttpRequest.EntityBody.Write(jsonPayload)

Not sure what you're asking regarding the ID of the object - which object? If you're referring to a persistent message class that extends %JSON.Adaptor, there is %Id(), but I haven't used it, so I'm not sure if that's what you're after.

Hi Sebastian - You are correct in that I implement it all directly in the Operation and the BPL handler with respect to getting the token (Handled in the Operation) and the storing of it (handled in the BPL.) I am not currently using any %OAuth framework functionalities though I have started to peek at them. They don't seem to match my use-case and perhaps it has something to do with how the vendor I'm working with has implemented their OAuth messaging.

My GetBearerToken function ends up looking something like this - below is an early working version. I have since cleaned things up and encapsulated a lot of the build-up of the %Net.HttpRequest object using a custom framework I developed, but this will give you a general idea.

My custom message classes extend Ens.Request (or Ens.Response, as appropriate) and %JSON.Adaptor. The properties defined within are aligned with what the vendor expects to receive or send back. For instance, when sending this request to the vendor, they typically expect a Client Id, a Client Secret, and the Audience and Grant Type being requested for the token. My BPL defines all those properties dynamically before sending the call to the operation.

/// REST WS Method to fetch an environment-specific OAuth Bearer Token.
Method GetBearerToken(pRequest As MyCustomMessageClass.Vendor.Request.GetBearerToken, Output pResponse As MyCustomMessageClass.Vendor.Response.GetBearerToken) As %Status
{

// Endpoint for retrieval of Bearer Token from Vendor
Set tURL = "https://vendor.com/oauth/token" 

Try {

Set sc = pRequest.%JSONExportToString(.jsonPayload)
THROW:$$$ISERR(sc) $$$ERROR($$$GeneralError, "Couldn't execute object to json conversion") 

// ..%HttpRequest is a reference to a class defined variable of type %Net.HttpRequest.
// Set HTTP Request Content-Type to JSON
Set tSC=..%HttpRequest.SetHeader("content-type","application/json") 

// Write the JSON Payload to the HttpRequest Body
Set tSC = ..%HttpRequest.EntityBody.Write(jsonPayload)

// Call SendFormDataArray method in the adapter to execute POST. Response contained in tHttpResponse
Set tSC=..Adapter.SendFormDataArray(.tHttpResponse,"POST", ..%HttpRequest, "", "", tURL) 

// Validate that the call succeeded and returned a response. If not, throw error.
If $$$ISERR(tSC)&&$IsObject(tHttpResponse)&&$IsObject(tHttpResponse.Data)&&tHttpResponse.Data.Size {
    Set tSC = $$$ERROR($$$EnsErrGeneral,$$$StatusDisplayString(tSC)_":"_tHttpResponse.Data.Read())
}
Quit:$$$ISERR(tSC)

If $IsObject(tHttpResponse) {
    // Instantiate the response object
    Set pResponse = ##class(MyCustomMessageClass.Vendor.Response.GetBearerToken).%New()
    // Convert JSON Response Payload into a Response Object
    Set tSC = pResponse.%JSONImport(tHttpResponse.Data)
}
Catch {
// If error anywhere in the process not caught previously, throw error here.
Set tSC = $$$SystemError
}
Quit tSC
}

Good morning Sebastian -

I read your post yesterday and was hopeful someone from InterSystems might respond with the best practices here, as I tackled this situation myself a few months ago and had similar questions - the documentation isn't explicitly clear on how this should be handled, but certainly there are tools and classes available within HealthConnect to rig it up. As there has been no other response, I'll share how I handled this. Maybe others will chime in if they know of a better approach.

For my need, I am working with a vendor that requires we call their OAuth API with some initial parameters sent in a JSON body to receive back a response containing the Bearer token to be used in other API calls.

To achieve this, I created a custom outbound operation for this vendor that extends EnsLib.REST.Operation. Using an XData MessageMap, I defined a method that executes the API call to get the Bearer token from the vendor (with the JSON body attributes passed in as a custom message class), and another method that executes the API call that utilizes the Bearer token and passes along the healthcare data defined for this implementation (using a separate custom message class.) The XData MessageMap looks similar to this:

XData MessageMap
{
<MapItems>
  <MapItem MessageType="MyCustomMessageClass.VendorName.Request.GetBearerToken">
    <Method>GetBearerToken</Method>
  </MapItem>
  <MapItem MessageType="MyCustomMessageClass.VendorName.Request.SubmitResult">
    <Method>SubmitResult</Method>
  </MapItem>
</MapItems>
}

Within that GetBearerToken method, I define the %Net.HttpRequest parameters, including the JSON body that I extract from the custom message class using %JSON.Adaptor's %JSONExportToString function, and execute the call. On successful status, I take the response coming back from the vendor and convert it into another custom message class (e.g. MyCustomMessageClass.VendorName.Response.GetBearerToken.)

From here, I simply need a way to use my custom outbound operation, define the values in the request message class, and utilize the values coming back to me in the response message class. For that, I created a BPL that controls the flow of this process. In my case, the Bearer token defined by the vendor has a lifespan that is defined in the response message, so I can also store off the Bearer token for a period of time to reduce the number of API calls I make on a go-forward basis.

Here is an example BPL flow showing the call-outs to get the Bearer token:

So you can see at the top I'm executing a code block to check my current token for expiration based on a date/time I have stored with it. If expired, I call out to my custom operation with the appropriate message class (see the XData block I showed) to get a new token, and if successful, I execute another code block to read the custom response message class and store off the Bearer token and its expiration date.

From there, it's just a matter of stepping into the next flow of the BPL to send a result that utilizes that Bearer token for its API calls. Each time this BPL is executed, the Bearer token is checked for validity (i.e. not expired). If expired, it gets a new one; if not, it utilizes the one that was saved off. Then the next part of the BPL (not shown) crafts a message for the SubmitResult part of the custom operation, and inside that operation, SubmitResult utilizes the stored Bearer token as appropriate to execute its API call.
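The expiration check at the top of that BPL boils down to something like the following code block - a sketch with hypothetical global and context property names; store the token wherever makes sense in your environment:

```objectscript
// Hypothetical storage: ^MyApp("OAuth") holds the token plus an absolute
// expiry expressed as total seconds (days*86400 + seconds, $HOROLOG style).
Set tToken     = $Get(^MyApp("OAuth","token"))
Set tExpiresAt = +$Get(^MyApp("OAuth","expiresAt"))
Set tNowSecs   = ($Piece($Horolog,",",1)*86400) + $Piece($Horolog,",",2)
// Flag consumed by the BPL <if> that follows: refresh when missing or expired
Set context.NeedNewToken = (tToken = "") || (tNowSecs >= tExpiresAt)
```

The expiry value itself comes from the token lifespan the vendor returns in the GetBearerToken response.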

Hopefully I explained enough to get you going or basically echo what you were thinking of doing.

But I'd certainly be interested in hearing a better approach if there is one.

Regards,

Craig

We had this exact scenario a year or so ago, and the easiest solution for us, since we aren't using System Default Settings like Eduard mentioned, was to open up the production class in Studio and do a search/replace (Ctrl-H):

Search for: epicsup

Replace with: epictst

Then save/compile. Note it will take effect immediately. (Edit: trying to recall if we had to click the 'Update' button in the Production Monitor for the connections to disconnect from the old and reconnect to the new. Definitely verify in your Production Monitor that the connections established to their new destination.)

Good morning Gertjan - 

Thanks for confirming my suspicions re: a potential bug. Yes, I'm aware of the availability of local variables, certainly not a new user of InterSystems' products hence my confusion when what I suspected should work, based on my prior experiences, did not.

Since 'case' is a conditional and usually treated like a shorthand 'if' in compilers, I had thought the IRIS compiler would boil it down much the same.

I will open a WRC incident to let them know of the issue and utilize a workaround for now. Many thanks for your help proving I wasn't crazy!

Thanks for your response - here is a snippet that doesn't compile:

ERROR: FAIS.DTL.MyDtlName.cls(Transform+11) #1011: Invalid name : '(source.Items.(i).ItemNumber=".1")' : Offset:21

You can see it appears to be complaining about the reference to the source object (at least if I'm reading the error right... maybe I'm not!)

If I change line 3 (edit: and lines 4 and 5 too, of course, as appropriate) to source.Items.GetAt(i).ItemNumber=".1", it works as it should - it compiles and runs perfectly. But as you point out, the compiler should boil that down for me.

Here's the source:

From your screenshot, I see you're using a custom message class, ORMFARM.amplitudeHTTPRequest. Have you extended the class to include Ens.Request?

I recently had a similar project and to get the request and response message bodies to show in trace, I had to include Ens.Request and Ens.Response (respectively) in my class definition:
 

Class MyRequestClassExample Extends (Ens.Request, Ens.Util.MessageBodyMethods, %JSON.Adaptor)
{

Property Example As %String;

}