Let me elaborate a bit more on Dmitry's suggestion.  IRIS for Health has full FHIR server capability built in.  Rather than implementing the API yourself and having to keep up with changing FHIR versions, you could use that.  Where the data comes from is a separate issue.  For this you can use the interoperability capabilities of IRIS to reach out to your external systems to supply the data needed to complete the FHIR request.  This stays with your use case of IRIS as an ESB to the rest of your environment.  You can still use the InterSystems API Manager to provide access to the service and manage that interface.

Alexey,

I feel that this would be counterproductive.  Let me explain why.  There is a fundamental difference between the purpose of journaling and that of auditing.  Journals protect against data loss.  The developers are in a position to determine whether or not a particular update to the database is important to the integrity of the system.  Auditing is there to help protect the security of the data.  Giving a developer the opportunity to turn off an auditing event deemed important to capture rather defeats that purpose.

It might be worth looking into what this external program is.  Perhaps there is a native API that would accomplish this.  You could also take a look at our gateways to see if you could bring this external functionality in for use directly in Cache.

I'd also look at our IRIS product to see if a migration to that platform would provide the needed functionality or a better pathway to utilizing the external program.

Finally, look at why this external program is called so often.  Perhaps the calls can be optimized to reduce the audit events if this is a major issue.

The Community Edition uses a core-based license.  It appears that your instance is running successfully and that some routines do execute.  Therefore I do not believe that this is a license issue.  If you had exceeded the number of allowed cores then the instance would not start.

I would look at the routines that are not executing successfully in the background.  It is possible that they are using Cache syntax that is no longer supported or has changed names.   Try executing these routines in the foreground instead of as a background job.  Verify that you get the results you expect.  If that works, try jobbing them off from the terminal session to see if they will run in the background at all.
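
For example, from a Terminal session you could try something like the following (the routine name ^MyBackgroundTask is just a placeholder for one of your own routines):

// run the routine in the foreground first and verify the results
do ^MyBackgroundTask

// if that works, try starting it as a background job from the same session
job ^MyBackgroundTask
write "Started background job with process ID: ", $ZCHILD, !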

I would also examine the log files to see if you are getting any errors captured from the background execution.

The best way to approach this would be to engage with your sales engineer.  We are always available to help evaluate use cases for our technology and to assist you in understanding the implementation of such interfaces.

You can additionally begin your investigation with our documentation and learning resources.  Here are a couple of links to get you started.

Enabling Productions to Use Managed File Transfer Services

First Look: Managed File Transfer (MFT) with Interoperability Productions

Managed File Transfer video

Robert,

The cause is the fact that Docker Desktop (Docker for Windows) is a bit misleading.  By default this is not really using Windows containers at all, though it can.  From what I understand, while it is getting better, true Windows containers are still a bit problematic.  Also, all our images are Ubuntu based anyway.  What is really happening is that there is a small Linux (Moby) VM running under Docker Desktop.  So when you share volumes from the Windows file system to the container you are going through two transitions: one from the container to the host Linux, then from the host Linux out to Windows.  While this works for plain Docker volumes, there are issues around permissions when trying to do Durable %SYS, as you surmised.  I have heard of people getting into the Moby Linux environment and messing with the permissions, but this seems too much trouble and too fragile for my tastes.

You do have some options though.

  1. A better option might be to use WSL2 (Windows Subsystem for Linux).  This will enable you to run an Ubuntu environment within Windows.  You do need to be on WSL2 and not the first version, which was too limited.  Here are a couple of links: https://code.visualstudio.com/blogs/2020/03/02/docker-in-wsl2 and https://www.hanselman.com/blog/HowToSetUpDockerWithinWindowsSystemForLinuxWSL2OnWindows10.aspx
  2. You could just run a Linux VM.
  3. You could go with the nuclear option and just switch to Linux as a Desktop.  I took this route over a year ago, before WSL2 came out, and have not looked back.

Hope this helps.

Kevin,

The best option is to work with IRIS for Health Community Edition, which is free for development and education.  You can get this either from Docker as a container you can run on your own system, or on AWS, Azure, or GCP if you want to work in the cloud.  AWS, at least, has a free tier that is good for 750 hours a month for up to a year.  This is more than adequate for education and simple development.  I have used this for demos for some time.

https://hub.docker.com/_/intersystems-iris-for-health
https://aws.amazon.com/marketplace/pp/B07N87JLMW?qid=1587469562959&sr=0-3&ref_=srh_res_product_title
https://azuremarketplace.microsoft.com/en-us/marketplace/apps/intersystems.intersystems-iris-health-community?tab=Overview
https://console.cloud.google.com/marketplace/details/intersystems-launcher/intersystems-iris-health-community-edition?filter=category:database&filter=price:free&id=31edacf5-553a-4762-9efc-6a4272c5a13c&pli=1
 

If you follow the link in the top bar for 'Learning' you will find many education resources, including some quick start topics on IRIS.  And, of course, you can ask questions here.

Damiano,

Keep in mind that Studio is VERY namespace-centric.  A single running instance of CStudio can only talk to a single namespace at a time.  Even running multiple copies of CStudio can lead to issues related to this and to how projects track their information.

As Dmitriy Maslennikov has indicated, you can look at Visual Studio Code with the VSCode-ObjectScript plug-in as long as you are on Cache 2016.2+ or IRIS.  You can also use the Atelier plugin for Eclipse (Photon version only), which has much the same capabilities.

One last thought: why do you have two namespaces?  If this is just to separate the code from the application data, then you really don't need two namespaces.  You can configure a single namespace to reference the two databases, one for data and one for code.  I would review the documentation on namespaces to be sure you are on the right track.

https://cedocs.intersystems.com/ens201813/csp/docbook/DocBook.UI.Page.cls?KEY=GSA_config#GSA_config_namespace
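
As a side note, a namespace like this can also be defined programmatically from the %SYS namespace using the Config.Namespaces class.  Here is a rough sketch (the namespace and database names are just placeholders):

// run in the %SYS namespace
set Properties("Globals") = "APP-DATA"      // database holding the application data
set Properties("Routines") = "APP-CODE"     // database holding the code
set sc = ##class(Config.Namespaces).Create("MYAPP", .Properties)
if $System.Status.IsError(sc) do $System.Status.DisplayError(sc)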
 

I would also encourage you to engage with your sales engineer to review your architecture and development direction.

Well, one thing is to be sure that the external database implements a FHIR server and that you have the necessary access information and credentials.   It has to be able to accept REST calls per the FHIR standard.  If this is not in place, all is not lost.  You can still use other methods to access the external database; which method depends on what options are provided.  You just could not use FHIR.

BTW, if I understand you correctly Health Connect would be acting as a FHIR client in this usage, not a server.

Edrian,

You state that Request.JSON is a simple string.  The %ToJSON() method only exists on the %DynamicObject and %DynamicArray classes.  Actually, I am surprised that this is not failing completely before you even send the request because of this.  If your variable already has well-formed JSON as its value, then you can just write that into the EntityBody.
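
For example, assuming the request object is in a variable named req (a %Net.HttpRequest instance) and Request.JSON already holds the JSON string:

// write the JSON string directly into the body of the outgoing request
do req.EntityBody.Write(Request.JSON)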

BTW, when you deal with JSON in the future you may find an advantage in using the dynamic object handling for JSON.  For example, take the following JSON: {"NAME":"RICH","STATE":"MA"}

If this is in a variable, say jsonStr, I can load this into a Dynamic Object using the following command:

set obj = {}.%FromJSON(jsonStr)

Now you can use object references to work with the JSON

Write obj.NAME   -> displays RICH

I can add new properties dynamically

set obj.ZIPCODE = "99999"

Finally, convert this back to a JSON string with:

write obj.%ToJSON()

which would display  {"NAME":"RICH","STATE":"MA","ZIPCODE":"99999"}

See the documentation at https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=GJSON_preface

I find the easiest way is to log into the CSP Gateway for the web server you are using.  If this is a development machine and you are using the stripped-down web server internal to Cache, you can access this from the Management Portal.  The path is System Administration -> Configuration -> CSP Gateway Management.  If you are looking to do this against the traffic on an external web server, then you need the path to the Module.xxx.  On my Windows VM this is http://192.168.x.xxx/csp/bin/Systems/Module.cxw.

You will need the Web Gateway Management username and password.  The user is typically CSPSystem.  Once logged in, look for View HTTP Trace in the left-hand menu.

Click on that and you will see a screen with 'Trace OFF' and 'Trace ON' at the top of the left-hand menu.  You will also see options to refresh and clear.  Below that will appear any requests that have been traced.  This is probably blank at this time.  Click Trace ON (it should change to RED).  Now go make the request that you want to trace.  Once your request is complete, go back and turn off the trace so you don't get a bunch of requests that have nothing to do with what you want to examine.  I did this and made a request for the System Management Portal.  Here is the list I get.

Note that I see two requests.  Only one is what I want to look at, which in my case is the first.  When you select a trace you will see the request at the top followed by the response.  Note that if the body of the response is not readable, you likely have gzip compression on.  Go to the Application settings in the Web Gateway and turn this off to be able to see the body.  Remember to turn it back on later, though.

Here are my results (truncated).  Hope this helps you.

Leo,

I would go to the "Building Your First HL7 Production" learning path in the links I sent earlier.  There are several learning resources listed there.  If you are in a hurry you can skip the introduction components and go directly to the "Integration Architecture" course.  Then follow along with the other courses in order.  I would recommend at least the following:

  • HL7 I/O course
  • all three courses under the message router section
  • Data Transformation Basics
  • Practice building data transformations
  • the two courses under troubleshooting would be advisable too
  • Do the Final Exercise.

You can always go back  and review other courses as needed.   Also search our Learning Services area (Learning on the top bar) for other courses and presentations.

You can contact me directly (my email is in my profile) if you want to take this offline.

Leo,

First, can I ask which InterSystems product you are working with?  Ensemble, Health Connect, and IRIS for Health all provide tools that make handling HL7 much easier.  Further, these are all interoperability platforms that provide tools for building these types of integrations efficiently and quickly.

A further question is what you intend to do with this HL7 message.  Are you just putting the entire message file contents into SQL Server as a blob, or are you looking to pull out specific data to put into columns in a SQL table?  From your further comments I believe it is the latter, but clarity would help here.

For the moment I will assume that you are using either Ensemble or Health Connect and point you to some documentation that can help you.

Ensemble HL7 Version 2 Development Guide

Also some Learning Services links

Integration Architecture

Building Your First HL7 Production learning path

Hope this helps

You don't indicate which product you are using (IRIS, Cache, Ensemble), which may affect some details.  However, the basic code to load the form data and execute the Post call is going to be the same.

You need an instance of %Net.HttpRequest.  If you are in Ensemble then you would work through the adapter.  For Cache or IRIS, create an instance of this class and configure it for the server endpoint.

Add your form data to the request by calling the InsertFormData method.

ex. req.InsertFormData("NameFld", "NameValue")

To cause the request to be performed call the Post method

ex.  set status = req.Post(REST resource URL)

If you assign the Server and Port properties of the request, then do not include them in the URL here.

You can access the response in the HttpResponse property of the request.  This will be an instance of the %Net.HttpResponse class.
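
Putting the pieces together, a minimal sketch might look like this (the server, port, form field, and URL path are all placeholders):

set req = ##class(%Net.HttpRequest).%New()
set req.Server = "myserver.example.com"     // placeholder host
set req.Port = 8080                         // placeholder port
do req.InsertFormData("NameFld", "NameValue")
set status = req.Post("/api/resource")      // path only, since Server and Port are set above
if $System.Status.IsError(status) {
    do $System.Status.DisplayError(status)
} else {
    write req.HttpResponse.StatusCode, !
    write req.HttpResponse.Data.Read()      // read the response body stream
}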

You can read more on the %Net.HttpRequest class in the class reference at https://docs.intersystems.com/latest/csp/documatic/%25CSP.Documatic.cls

Let me add my experience to this comment.  I have been wading into the Docker ocean.  I am on Windows and really did not want to run a Linux VM to get Docker containers (it seemed a bit redundant to me), so Docker for Windows was the way to go.  So far this has worked extremely well for me.  I am running an Ubuntu container with Ensemble added in.   My Dockerfile is a simplistic version of the one earlier in these comments.   I am having only one issue, related to getting the SSH daemon to run when the container starts.

 I hope to have all my local instances moved into containers soon.

My feeling is that this will be great for demonstrations, local development, and proofs of concept.   I would agree that for any production use having a straight Linux environment with Docker would be a more robust and stable solution.

Below is a class method that returns the details of a Contact.  Note the WRITE line near the end of the method.  Basically, any output written from your REST class method becomes part of the body of the response, so the WRITE of the %ToJSON() output puts a JSON object in the response body.

ClassMethod ContactDetail(ContactID As %String) As %Status
{
    #dim %response as %CSP.Response
    #dim %request as %CSP.Request
    #dim ContactItem as WebREST.Data.Contact
    #dim ActionItem as WebREST.Data.Actions
    
    set tSC = $System.Status.OK()
    Try {
        if ContactID {
            set ObjSC = ""
            set ContactItem = ##Class(WebREST.Data.Contact).%OpenId(ContactID,,.ObjSC)
            if $System.Status.IsError(ObjSC) {
                // test for object not found error
                if $System.Status.GetErrorCodes(ObjSC) = "5809" {
                    set %response.Status = ..#HTTP404NOTFOUND
                } else {
                    throw ##class(%Exception.StatusException).CreateFromStatus(ObjSC)
                }
            } else {
                set ResponseObj = {}
                set ResponseObj.ContactID = ContactItem.ContactID
                set ResponseObj.FirstName = ContactItem.FirstName
                set ResponseObj.LastName = ContactItem.LastName
                set ResponseObj.Position = ContactItem.Position
                set ResponseObj.Company = ContactItem.Company.Name
                set ResponseObj.Phone = ContactItem.Phone
                set Address = {}
                set Address.Street = ContactItem.Address.Street
                set Address.City = ContactItem.Address.City
                set Address.State = ContactItem.Address.State
                set Address.PostalCode = ContactItem.Address.PostalCode
                set ResponseObj.Address = Address
                set ResponseObj.Agent = ContactItem.Agent.Name
                
                // now load each action into an array
                set Actions = []
                set ActionKey = ""
                do {
                    set ActionResponse = {}
                    set ActionItem = ContactItem.Actions.GetNext(.ActionKey)
                    if ActionItem '= "" {
                        set ActionResponse.ActionType = ActionItem.ActionType
                        set ActionResponse.DueDate = $zdate(ActionItem.DueDate,3)
                        set ActionResponse.Notes = ActionItem.Notes
                        set ActionResponse.Complete = ActionItem.Complete
                        set ActionResponse.Agent = ActionItem.Agent.Name
                        do Actions.%Push(ActionResponse)
                    }
                } while ActionKey '= ""
                set ResponseObj.Actions = Actions
                Write ResponseObj.%ToJSON()
            }
        } else {
            set %response.Status = ..#HTTP400BADREQUEST
        }
    } catch Exception {
        set tSC = Exception.AsStatus()
    }

    Quit tSC
}
 

In short, DeepSee is an analytics tool for incorporating actionable BI views of your data embedded within your application.  DeepSee sits right on top of your application data store, so there is no extract or transformation of data.  This allows the dashboards you create to be as up to date with transactional events as is appropriate, up to near real-time.  Users of these dashboards can then initiate action within the application directly from the analytical view, hence the "Actionable BI" label.

A great way to get started is to go to our online learning portal.  Here is a link to the learning resource guide for DeepSee: https://learning.intersystems.com/course/view.php?id=19