I have found the answer. ZPM will only run on IRIS. So I will load it into my IRIS installation instead. 

In general, the following concepts should be noted when working with REST.

When you create a Namespace Definition, unless you specify otherwise, a default Web Application will be defined. If your namespace is named "Application-DEV", then when you save the namespace definition a Web Application will be created in the form /csp/application-dev/.

Likewise, you are likely to have QC and PROD namespaces as well, and the default Web Applications will be /csp/application-qc/ and /csp/application-prod/.

This default Web Application is used by the "Management Portal" and "View Class Documentation".

If you design normal CSP Pages bound to a class in your application then the default Web Application will be used. You can customise this by specifying your own Login Page and a couple of other properties.

When it comes to REST, and specifically the REST Dispatcher, you will define one or more Web Applications, all of which point to the same namespace.

For example, suppose you have three namespaces, DEV, QC and PRD, you have a Business Service, and you want to channel HTTP requests based on three different User Roles. In that case you define a Web Application for each Role in each namespace.

Let's assume that the roles are User, Administrator and SuperUser. Then you would create the following Web Applications:

/csp/dev/user, /csp/dev/administrator, /csp/dev/superuser
/csp/qc/user, /csp/qc/administrator, /csp/qc/superuser
/csp/prd/user, /csp/prd/administrator, /csp/prd/superuser
Generically, the format is /csp/{namespace_abbreviation}/{role}.

When you define these Web Applications you need to have written a class that inherits from %CSP.REST.

In your Interface Production you add three Business Services named "User Service", "Administrator Service" and "SuperUser Service". Every service production item has the same underlying Business Service class. In your Web Application definition there is a field called REST Dispatcher; you enter the name of your REST Dispatcher class there, and the rest of the form greys out.

In your REST Dispatcher class, let's call it REST.Dispatcher, there is an XData routing block that defines what to do when different HTTP methods are used to pass in your request. They are, very simply, POST, PUT, DELETE, GET and GET with parameters (essentially a search).

Let's assume that you are going to support InsertData, UpdateData, DeleteData, FetchData and SearchData.

Then the XData route map would look something like this:

<Route Url="/InsertData" Method="POST" Call="InsertDataMethod" />
<Route Url="/UpdateData/:id" Method="PUT" Call="UpdateDataMethod" />
<Route Url="/SearchData/:id" Method="GET" Call="SearchDataMethod" />

The methods are defined in the REST.Dispatcher class, and one of the variables available to you is %request.URL, which contains the full /csp/{ns}/{role}/... path. From this you can determine which Production Service Name you want to pass the request to.
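As a sketch of that idea (class, method and service names here are illustrative, not from a real project), the start of one such dispatcher method might look like this:

```objectscript
/// Sketch only: derive the Production Service Name from the role segment of the URL.
ClassMethod InsertDataMethod() As %Status
{
    set tSC=$$$OK
    try {
        // %request.URL contains the path, e.g. /csp/dev/administrator/InsertData,
        // so the role is the fourth "/"-delimited piece
        set tRole=$zconvert($piece(%request.URL,"/",4),"L")
        set tProductionItem=$case(tRole,
            "user":"User Service",
            "administrator":"Administrator Service",
            "superuser":"SuperUser Service",
            :"")
        if tProductionItem="" set tSC=$$$ERROR($$$GeneralError,"Unknown role: "_tRole) quit
        // ... instantiate the Business Service and build the request message here ...
    }
    catch ex { set tSC=ex.AsStatus() }
    quit tSC
}
```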

So your Production Item Name turns out to be "Administrator Service", and it is held in a variable, tProductionItem.

You then execute the following line of code:

Set tSC = ##class(Ens.Director).CreateBusinessService(tProductionItem,.tService) if 'tSC quit

You then create your Ensemble request message (call it tMyApplicationRequest) based on data from the %request object: the %request headers and the %request.Data(tParam,tKey)=tValue array. You $ORDER through the parameters, and then through the keys; for each combination of parameter and key there will be a value, even if it is null. Bear in mind that in a URL you can specify a parameter name more than once, so it is best to build a $LIST, tParamList=tParamList_$lb(tValue,tParam), and then insert the list into a property of your request message, e.g. tMyApplicationRequest.Parameters.Insert(tParamList), before you move on to the next parameter.
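The loop just described might be sketched like this (tMyApplicationRequest and its Parameters list property are assumed to exist on your custom Ensemble request class):

```objectscript
// Sketch: walk every parameter and key in %request.Data and build a $LIST
// per parameter; variable names follow the description above.
set tParam=""
for {
    set tParam=$order(%request.Data(tParam)) quit:tParam=""
    set tParamList="",tKey=""
    for {
        set tKey=$order(%request.Data(tParam,tKey),1,tValue) quit:tKey=""
        // a parameter name can appear more than once in the URL,
        // so accumulate every value for this parameter
        set tParamList=tParamList_$lb(tValue,tParam)
    }
    do tMyApplicationRequest.Parameters.Insert(tParamList)
}
```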

Once your Request message is constructed you pass the request message to the Instantiated Business Service as follows:

Set tSC = tService.ProcessInput(.tMyApplicationRequest,.tMyApplicationResponse), and that will invoke the OnProcessInput(tRequest, .tResponse) method of your Business Service.

When it comes to CSP and REST, I suspect that CSP invokes similar logic to determine which Business Service in which namespace the request will be directed to.

CSP REST calls do not use the REST Dispatcher, though you still instantiate the correct Business Service by name. I see no reason why it couldn't use the dispatcher, but I would need to dig deep into the documentation to make sure.



Use the %Stream.FileBinary class. A simple example is as follows:

Set stream=##class(%Stream.FileBinary).%New()
Set sc=stream.LinkToFile("c:\myfile.txt")
While 'stream.AtEnd {
    Set line=stream.Read()
    ; Process the chunk here
}


Typically you would read each chunk from the file into your object 'stream', and once you have reached the end of the file (AtEnd=1) you would use the stream's CopyFrom() method to copy the stream object into your class property, e.g.

Property MyPicture As %BinaryStream

and likewise you can copy the MyPicture stream into a new %BinaryStream and then write that stream out to a file.

Of course you can also just read from file directly into your Property MyPicture and write it out to another file or another stream object.

You don't, strictly speaking, need the intermediary step of reading the file contents into an instance of %BinaryStream; you can read the file directly into your 'MyPicture' property. However, there may be cases where you want to analyse the stream object before writing it into your class.
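Put together, the read-analyse-copy pattern might be sketched as follows (the file path is illustrative, and MyPicture is assumed to be a binary stream property on the current object):

```objectscript
// Sketch: link an intermediary stream to the external file, then copy it
// into the persistent property and save.
set tmp=##class(%Stream.FileBinary).%New()
set tSC=tmp.LinkToFile("c:\myfile.jpg") quit:$$$ISERR(tSC)
// ... analyse tmp here if required (e.g. check tmp.Size) ...
set tSC=..MyPicture.CopyFrom(tmp)
if $$$ISOK(tSC) set tSC=..%Save()
```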

When you %Save() the object that contains the property 'MyPicture', the stream data is written into the 'S' global in the Cache default storage architecture. That is to say, suppose I have a class "Company.Staff" where, for each staff member, apart from their names, addresses and so on, you may have indices on certain properties and a property such as "StaffMemberPicture As %Stream.FileBinary".

By the way, the IRIS documentation on LinkToFile() for binary streams warns that if the picture is edited outside of IRIS, the version of the picture stored in IRIS will differ from the edited picture external to IRIS. Hence the reason you would read it into an intermediary %BinaryStream and then copy it into your class property. If you suspect that the external picture may have changed, then if you ever export the binary stream back to file you might want to write it to a different filename so that you won't overwrite the edited photo; you can then compare the file sizes to see if there is a difference, which will tell you whether the original has been edited. Or that's how I interpreted the documentation.

When IRIS creates the storage definition, the regular data fields go into the global ^Company.StaffD, the indices are created in the global ^Company.StaffI, and the stream data is stored in ^Company.StaffS.




Dimitry is correct in his reply that this is a memory issue. Every Cache connection or Ensemble production class, as well as all of the system processes, runs in an individual instance of cache.exe (or iris.exe in IRIS). Each of these processes is in effect an operating system process (or job), and when a new user process is created, Cache allocates a certain amount of memory to that process. The memory is divided into chunks: one chunk where the code being executed is stored, one where system variables and other process information are stored, and one that is used to store the variables created by the executing code, whether a simple variable [variable1="123"] or a complex structure such as an object (which is basically a whole load of variables and arrays related together as the attributes of an object instance).

If you are using Cache Objects, then the variables you create or objects you manipulate in a (class)method are killed when the method quits. Another feature of Cache Objects is that if you open an instance of a very large object with lots of properties, some of which are embedded objects, collections, streams and relationships, Cache does not load the entire object into memory. It just loads the basic object, and only when you reference properties that are serial objects, collections and so on does Cache pull that data into your variable memory area. In normal situations you can, generally speaking, create a lot of variables and open many objects and still have memory left over. However, there are a couple of things that can mess with this memory management:

1) Specifying variables used in a method as PUBLIC, which means that once they are created they remain in memory until you either kill them or use the NEW command on them.

2) Writing code that gets into a nested loop where each iteration creates more variables or opens more objects; eventually you will run out of memory and a <STORE> error is generated.

I did a quick check to see where %SYS.BINDSRV is referenced, and there is one line of code in the %Library.RegisteredObject class, in a method called %BindExport, which calls a method in %SYS.BINDSRV. The documentation for %BindExport says the following:

/// This method is used by Language Binding Engine to
/// send the whole object and all objects it referes to
/// to the client.

So my guess is that you have a Java, .Net or some other binding and when %BindExport is called it is trying to pass the contents of your object (and any directly linked objects) to the client and that is filling up your variable memory and generating the store error. 

I also see that the %Atelier class references %SYS.BINDSRV.

So, to investigate further: do you use Atelier, and/or are you using class bindings (Java etc.)?

If you are, then something you are doing with Atelier or in your application is periodically trying to manipulate a lot of objects all at once and exhausting your process memory. You can increase the amount of memory allocated to Cache processes, but bear in mind that if you increase the process memory allocation, that setting will be applied to all Cache processes. I suspect there may be a way of creating a Cache process with a larger memory allocation for just that process, but I have no idea if it is possible or how to do it.

It is quite likely that even if you increase the process memory it may not cure the problem in which case I would suggest that you contact WRC and log a call with them.



In one of my Zen applications I built a custom Login Screen and in the Web Application Definition I call that login csp page. In my case I maintain my own User/Role tables and when I add or modify a User I use the Ensemble security method calls to create or update the user in Ensemble.

The Zen Custom Login forms are designed to validate the user against the Ensemble Users Table.

Given that in your login page the User and Password fields can be accessed through the DOM, you can get the values from those fields by specifying an "onchange" (or equivalent) JavaScript event, and you can then write the values entered by the user into your own "Login Audit Table". Likewise, you can detect when the Zen session has closed, so you could trap that and update your "Login Audit Table" to indicate when the user logged out.

In the Zen documentation they give an example of a simple custom login form:

<page xmlns="http://www.intersystems.com/zen" title="">
  <loginForm id="loginForm" >
    <text name="CacheUserName" label="User:" />
    <password name="CachePassword" label="Password:" />
    <submit caption="Login" />
  </loginForm>
</page>

So you could trap the user name and password values (the password value will have been encrypted, so if you needed to see the actual password you would have to decrypt it).

The only problem I foresee is that once you have submitted the form, you won't know whether the user has logged in successfully until your Home page displays. You would therefore have to store those values in a temp global; then, if you reach your Home screen, you know the login succeeded and you can create your login audit record.
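A minimal sketch of that temp-global approach (the MyApp.LoginAudit class, its properties, and the global name are all hypothetical):

```objectscript
// Sketch: on the login page, record the attempted login keyed by session
set ^CacheTemp.LoginAttempt(%session.SessionId)=$lb(tUserName,$horolog)

// Later, when the Home page renders (e.g. in %OnAfterCreatePage), the login
// must have succeeded, so write the audit record and clean up
if $data(^CacheTemp.LoginAttempt(%session.SessionId),tData) {
    set tAudit=##class(MyApp.LoginAudit).%New()
    set tAudit.UserName=$lg(tData,1),tAudit.LoginTime=$lg(tData,2)
    do tAudit.%Save()
    kill ^CacheTemp.LoginAttempt(%session.SessionId)
}
```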

Given the way that the Cache/Ensemble/IRIS documentation is structured, you may well find a section elsewhere in the Zen documentation that tells you how to trap the outcome of the 'submit', but I have not attempted to test whether my theory returns any more useful information.



The way that I have dealt with this in the past is as follows:

1) Create a DTL that accepts the incoming HL7 message as its source and has a target class of EnsLib.HL7.Message with a message structure of 2.5.1:ACK. I am using HL7 2.5.1, but this will work with any version of HL7 from 2.3 upwards (probably earlier versions as well, but I have not worked with versions earlier than 2.3).

2) When you invoke the Transform() method of a DTL, you will notice that there are three parameters:

  1. pRequest (which is your source message)
  2. pResponse (which is the generated target message)
  3. aux

If you read the documentation: when the transform is invoked from a Business Rule, aux is an object that contains information about the Rule that invoked the transform, along with a couple of other properties. However, if you are invoking the transform from Cache ObjectScript, then aux can be an instance of a class you create.

The way that I use 'aux' is as a mechanism for getting information into the DTL that is not present in either the source or target objects. In this example I want to send in the ACKCode and the ACKMessage.

So my DTL looks like this. (I apologise for having to paste in the class code, but I have never found a way to attach classes to a Community reply.) So here goes.

My DTL Class reads as follows:

Class Example.Transformations.CreateNACKDTL Extends Ens.DataTransformDTL [ DependsOn = EnsLib.HL7.Message ]
{

Parameter REPORTERRORS = 1;

XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='EnsLib.HL7.Message' targetClass='EnsLib.HL7.Message' sourceDocType='2.5.1:ADT_A01' targetDocType='2.5.1:ACK' create='new' language='objectscript' >
<assign value='source.{MSH:FieldSeparator}' property='target.{MSH:FieldSeparator}' action='set' />
<assign value='source.{MSH:EncodingCharacters}' property='target.{MSH:EncodingCharacters}' action='set' />
<assign value='source.{MSH:SendingApplication.NamespaceID}' property='target.{MSH:SendingApplication.NamespaceID}' action='set' />
<assign value='source.{MSH:SendingApplication.UniversalID}' property='target.{MSH:SendingApplication.UniversalID}' action='set' />
<assign value='source.{MSH:SendingApplication.UniversalIDType}' property='target.{MSH:SendingApplication.UniversalIDType}' action='set' />
<assign value='source.{MSH:SendingFacility.NamespaceID}' property='target.{MSH:SendingFacility.NamespaceID}' action='set' />
<assign value='source.{MSH:SendingFacility.UniversalID}' property='target.{MSH:SendingFacility.UniversalID}' action='set' />
<assign value='source.{MSH:SendingFacility.UniversalIDType}' property='target.{MSH:SendingFacility.UniversalIDType}' action='set' />
<assign value='source.{MSH:ReceivingApplication.NamespaceID}' property='target.{MSH:ReceivingApplication.NamespaceID}' action='set' />
<assign value='source.{MSH:ReceivingApplication.UniversalID}' property='target.{MSH:ReceivingApplication.UniversalID}' action='set' />
<assign value='source.{MSH:ReceivingApplication.UniversalIDType}' property='target.{MSH:ReceivingApplication.UniversalIDType}' action='set' />
<assign value='source.{MSH:ReceivingFacility.NamespaceID}' property='target.{MSH:ReceivingFacility.NamespaceID}' action='set' />
<assign value='source.{MSH:ReceivingFacility.UniversalID}' property='target.{MSH:ReceivingFacility.UniversalID}' action='set' />
<assign value='source.{MSH:ReceivingFacility.UniversalIDType}' property='target.{MSH:ReceivingFacility.UniversalIDType}' action='set' />
<assign value='$tr($zdt($h,3),",: ","")' property='target.{MSH:DateTimeOfMessage}' action='set' />
<assign value='source.{MSH:Security}' property='target.{MSH:Security}' action='set' />
<assign value='source.{MSH:MessageControlID}' property='target.{MSH:MessageControlID}' action='set' />
<assign value='"ACK"' property='target.{MSH:MessageType.MessageCode}' action='set' />
<assign value='source.{MSH:MessageType.TriggerEvent}' property='target.{MSH:MessageType.TriggerEvent}' action='set' />
<assign value='"ACK"' property='target.{MSH:MessageType.MessageStructure}' action='set' />
<assign value='source.{MSH:ProcessingID}' property='target.{MSH:ProcessingID}' action='set' />
<assign value='source.{MSH:VersionID}' property='target.{MSH:VersionID}' action='set' />
<assign value='source.{MSH:SequenceNumber}' property='target.{MSH:SequenceNumber}' action='set' />
<assign value='source.{MSH:ContinuationPointer}' property='target.{MSH:ContinuationPointer}' action='set' />
<assign value='source.{MSH:AcceptAcknowledgmentType}' property='target.{MSH:AcceptAcknowledgmentType}' action='set' />
<assign value='source.{MSH:ApplicationAcknowledgmentTyp}' property='target.{MSH:ApplicationAcknowledgmentTyp}' action='set' />
<assign value='source.{MSH:CountryCode}' property='target.{MSH:CountryCode}' action='set' />
<assign value='source.{MSH:PrincipalLanguageOfMessage}' property='target.{MSH:PrincipalLanguageOfMessage}' action='set' />
<assign value='source.{MSH:AltCharsetHandlingScheme}' property='target.{MSH:AltCharsetHandlingScheme}' action='set' />
<assign value='aux.ACKCode' property='target.{MSA:AcknowledgmentCode}' action='set' />
<assign value='aux.ACKMessage' property='target.{MSA:TextMessage}' action='set' />
</transform>
}

}

My AUX class definition looks like this:

Class Example.Transformations.CreateNACKDTL.AUX Extends %Persistent
{

Property ACKCode As %String;

Property ACKMessage As %String(MAXLEN = 200);

}


To generate the HL7 ACK my code reads:

ClassMethod GenerateACKMessage(pRequest As EnsLib.HL7.Message, ByRef pResponse As EnsLib.HL7.Message, pACKCode As %String(VALUELIST=",CA,CE,CR,AA,AE,AR") = "AA", pACKMessage As %String(MAXLEN=500) = "") As %Status
{
    set tSC=$$$OK
    try {
        set aux=##class(Example.Transformations.CreateNACKDTL.AUX).%New()
        set aux.ACKCode=pACKCode,aux.ACKMessage=pACKMessage
        set tSC=##class(Example.Transformations.CreateNACKDTL).Transform(pRequest,.pResponse,.aux) if 'tSC quit
    }
    catch ex {set tSC=ex.AsStatus()}
    quit tSC
}


Notice that the DTL takes the SendingFacility, SendingApplication, ReceivingFacility and ReceivingApplication and swaps them around in the DTL.

You should keep the MessageControlID the same as the incoming HL7 message's MessageControlID so that the ACK can be linked back to the original HL7 request.

The message timestamp (MSH DateTimeOfMessage) can be updated to the current date/time.

The 'aux' mechanism is very useful. Unfortunately, the documentation has only one line of comment that says "if the transform is called from ObjectScript, aux can contain anything you want", or words to that effect.

So I tested it in a simple example like the one above and it does indeed work; I now use it in the 30+ DTLs I am working with at the moment.



The type of content most appreciated by members of the community is articles on some aspect of the technology (Cache, Ensemble, IRIS), or on a specific use of the technology that you have worked with, understand well, and have perhaps learnt a few tips and tricks about that are not covered by the core product documentation. This is especially true when you happen to make use of one of the more obscure features of the technology in a development project and maybe battled to get it to work: the documentation was lacking, there were few if any past community posts on the subject, and there were no examples in the various samples supplied in the Cache/Ensemble/IRIS "Samples" namespace or in the ../intersystems/..../dev directory.

To give you an example, some years back I was working on a project where we were building a prototype robot. I was writing the Ensemble component that would interface with the outside world and translate instructions into calls to a Java code base that controlled the motors, sensors and other mechanical components of the robot. The developer I was working with knew all of the Java stuff and I knew all the Ensemble stuff, and to make the two technologies talk to each other we had to make use of the Java Gateway. We read the documentation. It seemed straightforward enough. I had had a lot of experience working with most of the Ensemble adapters, so I was expecting things to go smoothly.

But they didn't. We re-read the documentation. We looked at the examples, we asked questions in the Developer Community, we contacted WRC, but still we could not get it to work. Eventually my colleague found a combination of bits and pieces of the Java Gateway that he merged together, and at last we got the interface working.

To this day I still don't understand why the gateway did not work the way the documentation said it should, and I don't exactly understand why the solution we eventually put in place did work.

At the time we were still experimenting with the Java Gateway and realised that the documentation only took us so far. It would have been great to find an article in the Developer Community written by someone who had used the Gateway, had found the specific things that needed to be set up correctly for it to work, and had included some code snippets. If I had found such an article and it had helped us get our Gateway working (we struggled with it for two months; it should have taken two days), I would have sent a bottle of our famous South African artisan gin and a large packet of the South African delicacy "biltong" (dried beef, kudu, springbok or ostrich meat) to that man as a thank you.

These days the focus is on IRIS and IRIS for Health. There is huge interest in FHIR and the various interoperability options for IHE, PIX, PDQ, HL7, DICOM, CDA and so on.

I have been quite active on the DC for many years, and since the Covid-19 lockdown I have had more time to answer questions. I too am thinking of articles, code solutions and the like that I can write up or package for the Open Exchange applications page. I have even got to the point where I have invested in some lighting gear and a tool called Doodly, which allows you to create animated videos to explain concepts or processes. I hope to start publishing some of these articles in the near future.

So I hope that these observations will encourage you to find good subject material to write up and publish.



There are system utilities that allow you to retrieve a list of globals based on a wildcard.

Here is some code that gets a list of Globals in a namespace. You can modify it to suit your needs:

Method GetGlobalList(ByRef globallist As %String(MAXLEN=3000) = "", globalnameprefix As %String) As %Status
{
    set $ztrap="Error",tSC=$$$OK,globallist=""
    set gbl=""
    for {
        set gbl=$o(^$GLOBAL(gbl)) quit:gbl=""
        if globalnameprefix[$p(gbl,".",1) set globallist=globallist_gbl_","
    }
    set globallist=$e(globallist,1,$l(globallist)-1)
End ;
    quit tSC
Error ;
    set $ztrap="",tSC=$$$ERROR(5001,"Code Error: "_$ze) goto End
}

This uses old-style $ZTRAP error handling and would be better written with TRY/CATCH.
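A sketch of the same method rewritten with TRY/CATCH (same logic, just modern error handling; untested):

```objectscript
Method GetGlobalList(ByRef globallist As %String(MAXLEN=3000) = "", globalnameprefix As %String) As %Status
{
    set tSC=$$$OK,globallist=""
    try {
        set gbl=""
        for {
            // ^$GLOBAL is the structured system variable listing globals
            set gbl=$order(^$GLOBAL(gbl)) quit:gbl=""
            if globalnameprefix[$piece(gbl,".",1) set globallist=globallist_gbl_","
        }
        // strip the trailing comma
        set globallist=$extract(globallist,1,*-1)
    }
    catch ex {
        set tSC=ex.AsStatus()
    }
    quit tSC
}
```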

I hope this helps




No, the OnInit() method is called when the Business Service starts up, and OnTearDown() is invoked when the Business Service stops. OnInit() is not aware of any request messages at that point. The ProcessInput() method, and more specifically OnProcessInput(), is the first time you get to see the incoming request message, and it is in OnProcessInput() that you decide what to do with the request HL7 message: whether you route it to a Business Process based on the Ensemble HL7 pass-through architecture, or pass it to a custom Business Process. However, I made the assumption that your Business Service is a conventional HL7 TCP inbound-adapter-based service.

If, however, it is an HTTP REST service, then that is another matter altogether. In that case, by default the request will be handled by the %CSP.REST dispatch class, whose basic function is to route HTTP requests to a Business Service. You can inherit from %CSP.REST in your own REST dispatcher class.

I have a REST Dispatcher class in Ensemble that emulates the IRIS FHIR End Point functionality.

I have 4 csp applications defined:





All 4 csp applications invoke my custom Rest.Dispatch class (which inherits from %CSP.REST) 

I have a Business Service Class named BusinessService.MyEndPointService

I create 4 Business Services in my production

End Point General (/csp/endpoint)

End Point A (/csp/endpoint/endpointA)

and so on

In the Rest.Dispatch class I look at the request URL, and based on the URL I invoke the OnRequest() method of the appropriate Business Service using the Production Service Name.

However, as I am writing this, there is something in the back of my mind telling me that if you are using the EnsLib.HL7.TCP adapter you can reroute an incoming message to another service, but I would have to go back to the documentation to remind myself of exactly what you can do and how it works.

The most common approach developers use is the EnsLib.HL7.MsgRouter architecture, where you create routing rules that test the MSH message structure and route the message to a specific Business Process for that specific message type. This is all handled through the Management Portal->Ensemble->Build set of functions, which allow you to create Business Processes, Business Rules, Transformations and so on.

If you are using HTTP REST and want more information on how I have implemented that, I can send you a far more detailed description of my implementation.


Historically, Cache and Ensemble did not support WebSockets, and so you could not have two processes using the same (incoming) port. If I remember correctly, IRIS does support WebSockets, and though I can't remember exactly how they work, something in the depths of my mind tells me that WebSockets were aimed at this specific requirement.

Check out the IRIS documentation on WebSockets.