Hi

When did this issue start, and what happened before this error that required the system to be shut down and restarted? Was there a system problem, or was it just a normal restart?

I have found the answer. ZPM will only run on IRIS. So I will load it into my IRIS installation instead. 

Hi George

It is a top-level folder on my laptop, and no, I haven't exported all of the classes. I shall do that and confirm that it works; I will have an answer by tomorrow. I am busy deploying an application at a customer site but wanted to acknowledge that I have read your reply.

Thanks

Nigel

Ha ha ha, love the Zombies 🧟‍♂️. I have written little utilities based around ^$GLOBAL so that I can select a range of globals using a wildcard, *{partial_global_name}*, with an exclusion clause specified as -*{partial_global_name}* that removes those globals from the resulting selection. I then use the resulting array to perform actions such as exporting or deleting the globals, or killing the index globals for a range of data (D) globals and then running %BuildIndices().

I also wrote a global traverse function that walks a global, nesting down through sub-nodes to N levels, and then exports the resulting tree to file, converting $List values to delimited strings. This is obviously something that Management Portal -> Globals does, but that code is well hidden. You could also specify an array of pseudo node syntax (global.level1...,levelN) followed by a field number and either an expression or a classmethod call that would modify that value. The modified global(s) were then written to a new file which I could import. The import would kill the original globals and, during the import, convert the de-listed strings back into lists. It also allowed fields to be inserted or removed, again with an expression or method call to populate the inserted field with a value.

It was very useful when working with old-style globals and for retro-populating existing data for classes that had been modified: where the data type of a field had changed, a required field had been added, or a property had been removed and the storage definition had changed, leaving the global out of sync with the new storage definition.
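As a rough illustration of the selection side, here is a minimal sketch using the ^$GLOBAL structured system variable. The method name and the simple "contains" matching stand in for the real wildcard/exclusion parsing and are my assumptions, not the original utility:

/// Collect the names of globals whose name contains pInclude but not pExclude
ClassMethod SelectGlobals(pInclude As %String, pExclude As %String = "", Output pList) As %Integer
{
    Kill pList
    Set tCount = 0, tName = ""
    For {
        Set tName = $Order(^$GLOBAL(tName))
        Quit:tName=""
        If (tName [ pInclude) && ((pExclude = "") || '(tName [ pExclude)) {
            Set tCount = tCount + 1
            Set pList(tName) = ""
        }
    }
    Quit tCount
}

The resulting pList array is what you would then feed to the export, delete or %BuildIndices() style actions.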

It was very useful in the days when we created globals with many levels, as we used to do years ago. These days you seldom design globals like that, especially if you want to use SQL on them.

But it was fun to work out the traverse algorithm using @gblref@() syntax while avoiding stack overflows.
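For what it's worth, a stripped-down, iterative sketch of that kind of traversal (the names and the Write output are mine; the real utility exported to file and handled field edits). $Query plus indirection keeps it non-recursive, so deep trees do not blow the stack:

/// Walk every node of the global named in pGlobalRef (e.g. "^MyGlobal"), down to pMaxLevel subscripts
ClassMethod WalkGlobal(pGlobalRef As %String, pMaxLevel As %Integer = 999)
{
    Set tRef = $Query(@pGlobalRef@(""))
    While tRef '= "" {
        If $QLength(tRef) <= pMaxLevel {
            // De-list $List values so they can be written as plain text
            Write tRef, " = ", $Select($ListValid(@tRef):$ListToString(@tRef), 1:@tRef), !
        }
        Set tRef = $Query(@tRef)
    }
}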

It could also be the URL of the Patient Resource. In the REST Dispatcher's XData routing map block, check the URL syntax:

Normally, for FHIR REST Dispatchers, the routes look like this:

<Route Url="/Patient" Method="POST" Call="PatientHandler"/>

<Route Url="/Patient/:id" Method="PUT" Call="PatientHandler"/>

In IRIS for Health, the HS.FHIRServer.resthandler XData block looks like this:

XData UrlMap
{
<Routes>
<Route Url="/(.*)" Method="GET"    Call="processRequest"/>
<Route Url="/(.*)" Method="POST"   Call="processRequest"/>
<Route Url="/(.*)" Method="PUT"    Call="processRequest"/>
<Route Url="/(.*)" Method="DELETE" Call="processRequest"/>
<Route Url="/(.*)" Method="PATCH"  Call="processRequest"/>
<Route Url="/(.*)" Method="HEAD"   Call="processRequest"/>
</Routes>
}

The processRequest method will see that it is a Bundle and ultimately calls the class HS.FHIRServer.DefaultBundleProcessor. The Bundle processor analyzes the entries in the Bundle to check whether there are any dependencies between entries whose request method is POST. There is a comment in the code that says:

// Transaction Processing:
// 1. Map 'fullUrl' -> entry# for each POST entry
// 2. Visit all references, capture:
//      References that are "search urls"
//      References that target one of the 'POST' entry fullUrl values
// 3. Resolve the search urls to resource references
// 4. Resolve references for each POST result
// 5. Execute other operations and return result.
 

Further on there is the following code:

if isTransaction {
    if (reqObj.method = "POST") {
        // Grab the UUID if the fullUrl is specified
        Set referenceKey = ..ExtractUUID(entryObj.fullUrl)

It uses this for the response, where it returns the 'Location', and the response Bundle will specify entry.response.fullUrl.

In your example you have specified the 'url' as "Patient".

In all of the rest of the code I searched through, I could not find anything that checks this value for a leading "/".

Ultimately each POST is then dispatched to the FHIRService, which performs the POST (Interaction="Insert") and sends back the Response Status, which will potentially be converted into an OperationOutcome.

It's just a hunch, and I suspect that adding the "/" after R4 might have the same effect, but try putting a "/" into the url, i.e. "url" : "/Patient".

Or you could use the property 'fullUrl' instead of 'url' and see if that makes a difference.

Yours

Nigel  

That is very neat. I have a situation at the moment where I want to force a single object in a class, and I use %OnBeforeSave() to check that the field I have used as the PK, IDKey and Unique index has a specific value. Though this is not quite the same as a one:one relationship, it is similar in the sense that I am trying to do something that is not specifically addressed in the documentation or through a class attribute.
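For what it's worth, the check looks roughly like this. The class, property and permitted value are made up for the example; only the %OnBeforeSave() pattern is the point:

Class App.Config.Singleton Extends %Persistent
{

/// The field used as PK, IDKey and Unique index
Property Code As %String [ Required ];

Index CodeIdx On Code [ IdKey, PrimaryKey, Unique ];

/// Reject any object whose Code is not the single permitted value
Method %OnBeforeSave(insert As %Boolean) As %Status [ Private, ServerOnly = 1 ]
{
    If ..Code '= "SINGLETON" {
        Quit $$$ERROR($$$GeneralError, "Only the object with Code=SINGLETON may be saved in this class")
    }
    Quit $$$OK
}

}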

In general, the following concepts should be noted when working with REST.

When you create a Namespace definition, unless you specify otherwise, a default Web Application will be defined. If your namespace is named "Application-DEV", then when you save the Namespace definition a Web Application will be created in the form /csp/application-dev/.

Likewise, you are likely to have QC and PROD namespaces as well, and the default Web Applications will be /csp/application-qc/ and /csp/application-prod/.

This default Web Application is used by the "Management Portal" and "View Class Documentation".

If you design normal CSP pages bound to a class in your application, then the default Web Application will be used. You can customise this by specifying your own Login Page and a couple of other properties.

When it comes to REST, and specifically the REST Dispatcher, you will define one or more Web Applications, all of which point to the same namespace.

For example, if you have three namespaces, DEV, QC and PRD, and you have a Business Service and want to channel HTTP requests based on three different user roles, then for each namespace you define a Web Application for each role.

Let's assume that the roles are User, Administrator and SuperUser; then you would create the following Web Applications:

/csp/dev/user/

/csp/dev/administrator/

/csp/dev/superuser/

Generically, the format is /csp/{namespace_abbreviation}/{role}/.

When you define these Web Applications, you need to have written a class that inherits from %CSP.REST.

In your Interface Production you add three Business Services named "User Service", "Administrator Service" and "SuperUser Service". Every service production item has the same underlying Business Service class. In your Web Application definition there is a field called REST Dispatcher; you enter the name of your REST Dispatcher class there, and the rest of the form greys out.

In your REST Dispatcher class, let's call it REST.Dispatcher, there is an XData routing block that defines what to do when different HTTP methods are used to pass in your request. They are, very simply, POST, PUT, DELETE, GET and GET with parameters (essentially a search).

Let's assume that you are going to InsertData, UpdateData, DeleteData, FetchData and SearchData.

Then the XData route maps would look something like this:

<Route URL="/InsertData" Method="POST" Call "InsertDataMethod" />

<Route URL="/UpdateData/:id" Method="PUT" call "UpdateDataMethod" />

<Route URL="/SearchData/:id" Method="GET" call "SearchDataMethod" />

The methods are defined in the REST.Dispatcher class, and one of the variables available to you is %request.URL, which holds the full /csp/{ns}/{role}/... path; from this you can determine which production Service Name you want to pass the request to.
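To make that concrete, here is one hedged way it could be done: a small helper that picks the role out of %request.URL and maps it to the production item names from the example above. The method name, and the assumption that the role is always the fourth "/"-delimited piece, are mine:

/// Return the Production item name for the role embedded in the request URL
ClassMethod ResolveServiceName() As %String
{
    // %request.URL looks like /csp/dev/administrator/...; the role is the 4th "/"-piece
    Set tRole = $ZConvert($Piece(%request.URL, "/", 4), "L")
    Quit $Case(tRole, "user":"User Service", "administrator":"Administrator Service", "superuser":"SuperUser Service", :"")
}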

So your production item name turns out to be "Administrator Service" and is held in a variable tProductionItem.

You then execute the following line of code:

Set tSC = ##class(Ens.Director).CreateBusinessService(tProductionItem, .tService)
If $$$ISERR(tSC) Quit

You then create your Ensemble request message (call it tMyAppRequest) based on data from the %request object, the %request.Headers list, and %request.Data(tParam, tKey)=tValue. You $Order through the parameters, and then the keys, and for each combination of parameter and key there will be a value, even if it is null. Bear in mind that in a URL you can specify a parameter name more than once, so it is best to build a $List, tParamList = tParamList_$lb(tValue, tParam), and then insert the list into a property in your request message, e.g. tMyAppRequest.Parameters.Insert(tParamList); then you move on to the next parameter.
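Sketched out, that walk looks something like the following; tMyAppRequest and its Parameters list property are stand-ins for your own message class:

// Walk every parameter and every repetition of it, even when the value is empty
Set tParam = ""
For {
    Set tParam = $Order(%request.Data(tParam))
    Quit:tParam=""
    Set tKey = "", tParamList = ""
    For {
        Set tKey = $Order(%request.Data(tParam, tKey), 1, tValue)
        Quit:tKey=""
        // A parameter can appear more than once in the URL, so keep (value, name) per occurrence
        Set tParamList = tParamList_$ListBuild(tValue, tParam)
    }
    // Parameters is assumed to be a list property on the request message
    Do tMyAppRequest.Parameters.Insert(tParamList)
}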

Once your request message is constructed, you pass it to the instantiated Business Service as follows:

Set tSC = tService.ProcessInput(.tMyApplicationRequest, .tMyApplicationResponse), and that will invoke the OnProcessInput(tRequest, .tResponse) method of your Business Service.
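On the receiving side, the Business Service class behind all three production items is, in skeleton form, something like this. The class name, the message class names and the target "MyApp Process" are assumptions for the example:

Class REST.Service Extends Ens.BusinessService
{

Method OnProcessInput(pRequest As MyApplicationRequest, Output pResponse As MyApplicationResponse) As %Status
{
    // Forward the request synchronously to a Business Process (or Operation) in the Production
    Quit ..SendRequestSync("MyApp Process", pRequest, .pResponse)
}

}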

When it comes to CSP and REST, I suspect they invoke similar logic to determine which Business Service in which namespace the request will be directed to.

CSP REST calls do not use the REST Dispatcher, but you do instantiate the correct Business Service by name. I see no reason why the dispatcher couldn't be used, but I would need to dig deep into the documentation to make sure.

Nigel

By the way I edited my original reply and added some more information.

Nigel

That is why we differentiate between GlobalCharacter streams for plain text and binary streams for binary data such as photos. The reason for this is that binary data can contain characters that IRIS might interpret as terminators, or that have other undesirable effects, so we handle the content differently.

In character streams you can use any of the first 128 ASCII characters except for the control characters, whereas binary data can use the full 256-character set without worrying about control characters and the like.

You can of course convert the binary data to Base64 encoding, which results in a pure character stream without control characters, and then you could use character streams. When you want to export the data again you would have to Base64-decode it to get it back to pure binary (I might have got my encoding and decoding in the wrong order, but the documentation will assist you with that).
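As a rough sketch of the encode direction (the method name is mine; the chunk size is a multiple of 3 bytes so the Base64 blocks line up across reads):

/// Copy a binary stream into a character stream as Base64 text
ClassMethod EncodeToCharacter(pSource As %Stream.GlobalBinary) As %Stream.GlobalCharacter
{
    Set tTarget = ##class(%Stream.GlobalCharacter).%New()
    Do pSource.Rewind()
    While 'pSource.AtEnd {
        // 24000 is a multiple of 3, so each chunk encodes to complete Base64 groups
        Set tChunk = pSource.Read(24000)
        Do tTarget.Write($System.Encryption.Base64Encode(tChunk))
    }
    // $System.Encryption.Base64Decode() reverses the process on import
    Quit tTarget
}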

Nigel