One thing I always recommend is to get familiar with the API using a standalone tool before attempting to code the programmatic interface.  I like Postman, as it has a pretty good UI to work with.  If you are at the command line, you can use curl.


Postman can also generate the code for different programming environments.  Unfortunately not ObjectScript, but it is fairly easy to translate from the examples it shows you.
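For example, the Python that Postman generates for a simple GET looks roughly like the sketch below (the endpoint and token are placeholders, not a real API); each piece maps fairly directly onto an ObjectScript %Net.HttpRequest call.

```python
# Sketch of Postman-style generated code for a simple GET, using only the
# standard library. The endpoint and token are hypothetical placeholders.
import urllib.request

req = urllib.request.Request(
    "https://api.example.com/v1/widgets",        # hypothetical endpoint
    headers={
        "Accept": "application/json",
        "Authorization": "Bearer <your-token>",  # placeholder credential
    },
    method="GET",
)

# Inspect what would be sent before calling urllib.request.urlopen(req)
# against a real server.
print(req.get_method(), req.full_url)
for name, value in req.header_items():
    print(f"{name}: {value}")
```

In ObjectScript the same shape becomes a %Net.HttpRequest with SetHeader() calls followed by a Get() on the path.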


I was going through some unanswered questions and came across yours. If you DM your spec to me, I can take a look.

However, understand that the %Stream.Object probably contains the JSON payload that you need.  If so, you can get your dynamic object with the following command:

set dyObject = {}.%FromJSON(body)

Hope that helps.

First, the content type is not correct; it should be application/json.  Next, I would add the following statement:

do httprequest.SetHeader("Accept","application/json")

This tells the server what type of response your application will accept.  I believe the default is text/html, which the API cannot supply.
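Since the original request code is ObjectScript, here is the same two-header fix sketched in Python's urllib for comparison (the endpoint is a placeholder): Content-Type describes the JSON body you are sending, and Accept describes the response you want back.

```python
# Minimal sketch of a JSON POST carrying both headers. The endpoint is a
# hypothetical placeholder; nothing is actually sent here.
import json
import urllib.request

payload = json.dumps({"name": "test"}).encode("utf-8")
req = urllib.request.Request(
    "https://api.example.com/v1/items",      # hypothetical endpoint
    data=payload,
    headers={
        "Content-Type": "application/json",  # the body is JSON, not form data
        "Accept": "application/json",        # we want a JSON response back
    },
    method="POST",
)

# urllib stores header names in capitalized form
print(req.get_method(), req.get_header("Content-type"), req.get_header("Accept"))
```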

You could try %SYS.MONLBL.  This is intended as a line-by-line performance monitor, but it MAY work for this purpose too.  Have your application running in one session and get the process ID for that session.  Open a new session and run %SYS.MONLBL, choosing the options to monitor ALL routines and the specific process ID (PID) where your application is running.  Go back to the application session and perform the function you want to trace, then immediately return to %SYS.MONLBL and generate the report before doing anything else in the application.

NOTE: This might not work with deployed code, and even if it does, it will likely not provide any details of the deployed programs.  Hopefully it will at least show an entry for the deployed routine so you can see what is being called.

Good luck


The best way to approach this would be to engage with your sales engineer.  We are always available to help evaluate use cases for our technology and to assist you in understanding the implementation of such interfaces.

You can additionally begin your investigation with our documentation and learning resources.  Here are a couple of links to get you started.

Enabling Productions to Use Managed File Transfer Services

First Look: Managed File Transfer (MFT) with Interoperability Productions

Managed File Transfer video


The cause is the fact that Docker Desktop (Docker for Windows) is a bit misleading.  By default it is not really using Windows containers at all, though it can.  From what I understand, while the situation is improving, true Windows containers are still a bit problematic, and all of our images are Ubuntu based anyway.  What is really happening is that a small Linux (Moby) VM runs under Docker Desktop.  So when you share volumes from the Windows file system to the container, you go through two transitions: one from the container to the host Linux, then from the host Linux out to Windows.  While this works for plain Docker volumes, there are permission issues when trying to use Durable %SYS, as you surmised.  I have heard of people getting into the Moby Linux environment and adjusting the permissions, but that seems like too much trouble and too fragile for my tastes.

You do have some options though.

  1. Use WSL2 (Windows Subsystem for Linux).  This will enable you to run an Ubuntu environment within Windows.  You do need to be on WSL2 and not the first version, which was too limited.  Here are a couple of links: https://code.visualstudio.com/blogs/2020/03/02/docker-in-wsl2 and https://www.hanselman.com/blog/HowToSetUpDockerWithinWindowsSystemForLinuxWSL2OnWindows10.aspx
  2. You could just run a Linux VM.
  3. You could go with the nuclear option and simply switch to Linux as a desktop.  I took this route over a year ago, before WSL2 came out, and have not looked back.

Hope this helps.


The best option is to work with IRIS for Health Community Edition, which is free for development and education.  You can get it from Docker as a container to run on your own system, or on AWS, Azure, or GCP if you want to work in the cloud.  AWS, at least, has a free tier that is good for 750 hours a month for up to a year, which is more than adequate for education and simple development.  I used it for demos for a time.


If you follow the link in the top bar for 'Learning' you will find many education resources, including some quick start topics on IRIS.  And, of course, you can ask questions here.

Well, one thing is to be sure that the external database implements a FHIR server and that you have the necessary access information and credentials.  It has to be able to accept REST calls per the FHIR standard.  If this is not in place, all is not lost: you can still use other methods to access the external database, depending on what options it provides.  You just could not use FHIR.

BTW, if I understand you correctly Health Connect would be acting as a FHIR client in this usage, not a server.
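To make the client role concrete, a FHIR read is just an HTTP GET against the external server.  Here is a minimal Python sketch with a hypothetical base URL; real servers will also require whatever authentication they are configured for.

```python
# Sketch of a FHIR read interaction as a plain REST client:
# GET [base]/[type]/[id]. The server base URL and resource id are
# hypothetical placeholders; nothing is actually sent here.
import urllib.request

base = "https://fhir.example.org/r4"   # hypothetical external FHIR server
req = urllib.request.Request(
    f"{base}/Patient/example",         # read one Patient resource by id
    headers={"Accept": "application/fhir+json"},
)
print(req.full_url)
# urllib.request.urlopen(req) would perform the actual read
```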


You state that Request.JSON is a simple string.  The %ToJSON() method only exists on the dynamic object and dynamic array classes, so I am actually surprised this is not failing completely before you even send the request.  If your variable already holds well-formed JSON, you can just write that value into the EntityBody.

BTW, when you deal with JSON in the future, you may find it advantageous to use the dynamic object handling for JSON.  For example, take the following JSON: {"NAME":"RICH","STATE":"MA"}

If this is in a variable, say jsonStr, I can load this into a Dynamic Object using the following command:

set obj = {}.%FromJSON(jsonStr)

Now you can use object references to work with the JSON

Write obj.NAME   -> displays RICH

I can add new properties dynamically

set obj.ZIPCODE = "99999"

Finally, convert this back to a JSON string with:

write obj.%ToJSON()

which would display  {"NAME":"RICH","STATE":"MA","ZIPCODE":"99999"}

See the documentation at https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=GJSON_preface
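If it helps to see the same round-trip outside ObjectScript, here is the equivalent flow using Python's standard json module: parse, access, extend, serialize.

```python
# Same parse/access/extend/serialize round-trip shown above, in Python.
import json

json_str = '{"NAME":"RICH","STATE":"MA"}'
obj = json.loads(json_str)              # analogous to {}.%FromJSON(jsonStr)
print(obj["NAME"])                      # prints RICH
obj["ZIPCODE"] = "99999"                # add a new property dynamically
# compact separators match the ObjectScript %ToJSON() output
print(json.dumps(obj, separators=(",", ":")))
# prints {"NAME":"RICH","STATE":"MA","ZIPCODE":"99999"}
```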

I find the easiest way is to log into the CSP Gateway for the web server you are using.  If this is a development machine and you are using the stripped-down web server internal to Caché, you can access this from the Management Portal; the path is System Administration -> Configuration -> CSP Gateway Management.  If you are looking to do this against the traffic on an external web server, then you need the path to the Module.xxx.  On my Windows VM this is http://192.168.x.xxx/csp/bin/Systems/Module.cxw.

You will need the Web Gateway Management username and password; the user is typically CSPSystem.  Once logged in, look for 'View HTTP Trace' in the left-hand menu.

Click on that and you will see a screen with 'Trace OFF' and 'Trace ON' at the top of the left-hand menu.  You will also see options to refresh and clear.  Below that will appear any requests that have been traced; this is probably blank at this time.  Click 'Trace ON' (it should change to RED).  Now go make a request that you want to trace.  Once your request is complete, go back and turn off the trace so you don't collect a bunch of requests that have nothing to do with what you want to examine.  I did this and made a request for the System Management Portal.  Here is the list I get.

Note that I see two requests.  Only one is what I want to look at, which in my case is the first.  When you select a trace, you will see the request at the top followed by the response.  Note: if the body of the response is not readable, you likely have gzip compression on.  Go to the application settings in the Web Gateway and turn this off to be able to see the body.  Remember to turn it back on later, though.

Here are my results (truncated).  Hope this helps.