OK, so the data is basically a parsed report, and each daily run replaces the previous run with no delta or net-change processing.  So on error they could just rerun the process to get the data.

One suggestion on the alerting: use a distribution group as the address to send to, and make sure at least one of the business consumers of the data is on that list.  Alerting only a technical resource could lead the business side to use incomplete data, as they don't know that something went wrong.  Relying on the technical resource to pass on this information can be risky.  Not a dig at the customer, just reality.

So, back to your question: I would still lean towards a solution that separates the loading of the data from the export to MS SQL Server.  Since you want to avoid the use of messages for the data, I would still lean towards a global mapped to IRISTEMP (note: NOT a process-private global, as the business hosts will run in separate processes).  This is a repetitive process, so the fact that the database will grow should not be an issue; it will reach a size necessary to handle the load and stay there.  If needed, you can programmatically handle the mapping using the Config.MapGlobals API (https://docs.intersystems.com/irislatest/csp/documatic/%25CSP.Documatic.cls?&LIBRARY=%25SYS&CLASSNAME=Config.MapGlobals).

So this would look something like this:

  1. A Business Service accepts and processes the data into a non-journaled global on a schedule.  When the data has been consumed, a message is sent to the Business Operation to trigger the next phase.
  2. The Business Operation that writes to MS SQL Server reacts to receiving that message, which includes how to identify the dataset to load.  It consumes the data out of the global and writes it to SQL Server.  Once complete, the data in the global for this run is purged.
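The two steps above can be sketched as follows.  This is a minimal Python illustration only, not a real interoperability API: a plain dict stands in for the IRISTEMP-mapped global, and the trigger message is reduced to a direct function call; all names are made up.

```python
# Illustrative sketch only: a dict stands in for the IRISTEMP-mapped global,
# and the Business Service / Business Operation hand-off is a direct call.
staging = {}  # key: run id -> list of parsed rows

def business_service_load(run_id, parsed_rows):
    """Phase 1: stage the parsed report, then 'send' the trigger message."""
    staging[run_id] = list(parsed_rows)
    business_operation_export(run_id)  # in IRIS this would be a message send

def business_operation_export(run_id):
    """Phase 2: consume the staged data, write it out, then purge the run."""
    exported = []
    for row in staging.get(run_id, []):
        exported.append(row)  # stand-in for an INSERT to MS SQL Server
    del staging[run_id]       # purge the global node for this run
    return exported
```

The point of the split is that the staged data survives until the export phase explicitly purges it, so a failed export can simply be retried against the staged run.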

Anyway, that is my two cents on this.  I am sure others will have different ideas.

First, let's get the adapter concerns out of the way.  An adapter, in general, is only the "how do I connect" part of the solution.  What you need to do is write a Business Operation that has this set as its adapter.  Within your operation you can do anything you want with the payload that is returned.

If you are not familiar with how this works, I would start with our documentation.  Here are a couple of links to get you started:



There are also Learning Services resources you could use.

Now to your process.  First off, what is the reason you don't want messages or any tracing?  There are a number of good reasons to utilize the structure of IRIS interoperability, including the ability to see a trace if something goes wrong and queue persistence to provide resilience in delivery.

If messaging is still not viable for any reason, I would consider creating a temporary holding class/table for the data, then sending a message to another Business Operation, based on the SQL adapter, that will pick this data up, write it to MS SQL Server, and then remove it from the temp storage.  You can map this data to either a non-journaled database or to IRISTEMP, depending on your needs.  Also allow for the fact that there could be multiple batches in progress, so how you key the temporary storage will be important.
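As a rough illustration of that holding-table pattern, keyed by batch so multiple batches can be in flight at once (SQLite stands in here for both the temp storage and the MS SQL target; all table and column names are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TempHold (BatchId TEXT, Payload TEXT)")  # keyed by batch
conn.execute("CREATE TABLE Target (Payload TEXT)")  # stand-in for the MS SQL table

def stage(batch_id, rows):
    # The staging write; in IRIS this class would be mapped to a
    # non-journaled database or IRISTEMP.
    conn.executemany("INSERT INTO TempHold VALUES (?, ?)",
                     [(batch_id, r) for r in rows])

def export_and_purge(batch_id):
    # Pick up only this batch, write it to the target, then remove it.
    rows = conn.execute("SELECT Payload FROM TempHold WHERE BatchId = ?",
                        (batch_id,)).fetchall()
    conn.executemany("INSERT INTO Target VALUES (?)", rows)
    # Delete only this batch; other in-flight batches are untouched.
    conn.execute("DELETE FROM TempHold WHERE BatchId = ?", (batch_id,))
    return len(rows)
```

The batch key is what makes the purge safe when several runs overlap.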

Some important questions:

What happens to the data if the process aborts in the middle?

Would missing any of the data being pulled have a negative effect on the business?

Is there any concern over data lineage for security or external auditing?

Here is a quick example program I wrote a couple of months ago.  It uses JDBC and the JayDeBeApi library others have mentioned.  Note the credentials import provides a set of login credentials in the following format.

LocalCreds = {"user":"SuperUser", "password":"SYS"}

Here is the code:

import jaydebeapi
import credentials

def get_database_connection(inpDBInstance):
    IRIS_JARFILE = "/home/ritaylor/InterSystemsJDBC/intersystems-jdbc-3.2.0.jar"
    IRIS_DRIVER = "com.intersystems.jdbc.IRISDriver"
    AA_JARFILE = "/home/ritaylor/Downloads/AtScale/hive-jdbc-uber-"
    AA_DRIVER = "org.apache.hive.jdbc.HiveDriver"
    # note connecting to two data sources in the same program requires that you
    # reference the paths to BOTH jar files in a list in every call. Otherwise the
    # second connection attempt will fail.  It appears that the paths only get
    # added once within a process.
    # Database settings - these should be in a config file somewhere
    if inpDBInstance == "local":
        # The JDBC URL below is an example; adjust host, port, and namespace
        dbConn = jaydebeapi.connect(IRIS_DRIVER,
                                    "jdbc:IRIS://localhost:1972/USER",
                                    credentials.LocalCreds,
                                    [IRIS_JARFILE, AA_JARFILE])
    else:
        dbConn = None
    return dbConn

def run_database_query(inpQuery, inpDBInstance):
    resultSet = None
    dbConn = get_database_connection(inpDBInstance)
    if dbConn is not None:
        cursor = dbConn.cursor()
        cursor.execute(inpQuery)
        resultSet = cursor.fetchall()
        cursor.close()
        dbConn.close()
    return resultSet

def print_db_result_set(resultSet):
    if resultSet is not None:
        for row in resultSet:
            print(row)
    else:
        print("Input result set is empty")


One thing I always recommend is to get familiar with the API using a standalone tool before attempting to code the programmatic interface.  I like Postman as it has a pretty good UI to work with.  If you are at the command line you can use curl.


Postman can also give you the code for different programming environments.  Unfortunately not ObjectScript, but it is fairly easy to translate from the examples you can see.
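For example, a Postman-generated Python snippet for a JSON POST boils down to something like this.  The URL and payload here are placeholders, and I am using the standard library rather than the requests package Postman usually emits:

```python
import json
import urllib.request

# Placeholder endpoint and body -- substitute the API you tested in Postman.
req = urllib.request.Request(
    "https://example.com/api/items",
    data=json.dumps({"name": "widget"}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(req)   # would actually send the request
```

Once you have a working request in Postman, mapping the method, headers, and body over to code like this (or to %Net.HttpRequest in ObjectScript) is mostly mechanical.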


I was going through some unanswered questions and came across yours.  If you could share your spec or DM it to me, I can take a look.

However, understand that the %Stream.Object probably contains the JSON payload that you need.  As such, you can get your dynamic object with the following command:

set dyObject = {}.%FromJSON(body)
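Purely for comparison, if you ever handle the same payload in Python (e.g. embedded Python), the equivalent of %FromJSON is json.loads once you have the stream contents as a string.  The payload below is a made-up example:

```python
import json

# Hypothetical JSON payload; in practice this would be the stream contents.
body = '{"patientId": "123", "status": "active"}'
dyObject = json.loads(body)
print(dyObject["patientId"])  # prints 123
```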

Hope that helps.

One way to be sure whether the request is reaching IRIS is to go into the IRIS Web Gateway on the web server and use the trace utility to see if the request is coming in and what it looks like.  Turn on the trace, make a request, and then come back and turn off the trace.  Also, the default on the Web Gateway server definition is to use gzip compression, which will make the body unreadable.  You can temporarily turn this off while you run this test.

Hopefully you are doing this in a development environment so this will have no impact on production.

UPDATE:  One other thing is to check the audit logs for any security issues that you may be hitting.  This does not sound like the issue for you, but it's worth checking.

You should understand that while InterSystems employees are on the community, this is really a public forum and not an "official" support path.  Questions are answered by the community at large when they can.  For people in the forum to help, more information is needed.  You indicated you are working with HealthShare; however, this is really a family of solutions.  Which specific product are you referring to?  What part of that product are you trying to understand better?  The more specific you can be, the easier it is for the community to help.

If you have an immediate need I would suggest that you contact the Worldwide Response Center (referred to as the WRC) for immediate support.  Here is the contact information:

+44 (0) 844 854 2917
0800615658 (NZ Toll Free)
1800 628 181 (Aus Toll Free)


Finally, Learning Services (learning.intersystems.com) and documentation (docs.intersystems.com) can be of great help.  For HealthShare-specific areas you do need to be a registered HealthShare user.  If you are not, work with your organization and the WRC to get that updated.

Is it expected that this will be a single socket connection that is continually available for bi-directional communications?  I ask because my initial thought was that we have two completely separate interfaces here.  On the remote side (IP) there is a listener on the indicated port number.  You will be connecting to this IP+port to send your ADT.

A completely separate communication is initiated by the remote system to YOUR IP address, where you would have a listener on the same port.  This would be limited to accept communications only from the remote IP.  The remote system would send the A19 over this connection.  If this is the case, then you can simply use our built-in HL7 TCP operation and service to accomplish this.

If this is truly bi-directional communication over the same open TCP connection, then @Jeffrey Drumm is correct.  They would need to provide the custom protocol they use to manage the communications.

I think John is basically correct, but I don't see this as an issue.  When you are doing client-side editing, which is what I normally do, you need to export code to your project to edit it.  This is an action you need to perform deliberately, as not every definition you look at should become part of your project.  When you choose 'Go To Definition', the InterSystems extension checks whether the definition is local (meaning client-side) and opens that copy if it is.  If not, it opens the definition from the server, which is always read-only when working in client-side editing.  To edit it, export it and then open the local copy.

You could try %SYS.MONLBL.  This is intended as a performance trace, but it MAY work for this purpose too.  You would have your application running in one session and get the process ID for that session.  Open a new session and run %SYS.MONLBL.  Choose the options to monitor ALL routines and the specific process ID (PID) where you are running your application.  Go back to the application session and perform the function you want to trace.  Immediately go back to %SYS.MONLBL and generate the report before doing anything else in the application.

NOTE: This might not work with deployed code, and even if it does it will likely not provide any details of the deployed programs.  Hopefully it will at least show an entry for the deployed routine so you can see what is being called.

Good luck


I have had great luck with Udemy.  You do have to pay for the courses, but watch for the sales, which are constantly going on.  You can get courses for $12 to $50 US.  Here is the one I bought (currently $2.99).


I bought this one as it talked about having lots of projects to work on.  Some are interesting, but not all.  It did cover the language fairly well.  However, I found the instructor's style not quite to my taste.

Here are another couple of courses I have found.  I plan on getting one of these to improve my Python knowledge.



I am taking a course on Java with the instructor from the last link (Tim Buchalka) and so far I do like his style.

Good Luck!

I have not tried this, but if you enable LDAP security and select it for that web application, then passing a username and password may work.  You would set up LDAP per the documentation for Caché.

As I said, I am not sure that this would work in this context.  An alternative is to use Delegated Authentication.  Here is a link to a Global Summit presentation on dealing with LDAP in the Delegated Authentication zAuthenticate routine: https://community.intersystems.com/post/global-summit-2016-ldap-beyond-s...

This is a bit old, and it is focused on dealing with custom LDAP schemas, but it will help you understand how to work with LDAP in code.

Please expand a bit on what you are attempting to do.  Are you trying to execute queries FROM IRIS against the HP NonStop SQL environment?  Or are you trying to execute NonStop SQL syntax against IRIS?  If it is the latter, understand that IRIS implements ANSI-standard SQL with some IRIS-specific extensions.  NonStop SQL syntax may not work entirely against IRIS, as that environment has its own extensions and syntax, which may not be standard.

It would help if you could post the SQL you are attempting to run along with any error messages you are receiving.