Hi Prashanth,

I had a similar requirement once; here is how I handled it:

First, I set up a method in a CSP dispatch class, which responds to a REST endpoint, to invoke a Business Service in the production running in the current namespace:

ClassMethod SomeRestEndpointCallback(body As %DynamicArray) As %DynamicObject
{
    $$$TOE(st, ##class(Ens.Director).CreateBusinessService("BusinessServiceName", .service))
    $$$ThrowOnError(service.ProcessInput(body, .output))
    Return output
}

Then, I created an adapterless Business Service (https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls...) so that it doesn't poll for data but instead waits for an external trigger:

Class some.package.AdapterlessBS Extends Ens.BusinessService
{

/// Configuration item(s) to which to send file stream messages
Property TargetConfigNames As Ens.DataType.ConfigName(MAXLEN = 1000);

Parameter SETTINGS = "TargetConfigNames:Basic:selector?multiSelect=1&context={Ens.ContextSearch/ProductionItems?targets=1&productionName=@productionId}";

Method OnProcessInput(request As %RegisteredObject, Output response As %RegisteredObject) As %Status
{
    Set tSC = $$$OK
    Try {
        // Could be any message... adapt it to your needs
        Set tMsg = ##class(Ens.StringResponse).%New()
        Set tMsg.StringValue = "some value"
        // Send the message to the targets
        Set targets = $LFS(..TargetConfigNames)
        For i=1:1:$LL(targets) {
            Set target = $LG(targets, i)
            // Can be sync or async... it's up to you to decide
            //Set tSC = ..SendRequestSync(target, tMsg)
            Set tSC = ..SendRequestAsync(target, tMsg)
        }
    } Catch (ex) {
        Set tSC = ex.AsStatus()
    }
    Quit tSC
}

}

Now you can add this Business Service to an interoperability production and set the desired Business Process as its target. So, when your REST endpoint is accessed, it calls the BS, which in turn calls the BP.


Hi David.

Thank you for your interest in the project.

I didn't try it in GitHub Codespaces, but I just tested it on my local Windows PC and it worked:

As you can see, the command in README.md worked for me. I like to use these flags to get more information about possible issues.

However, I changed that command to just docker-compose up -d, since this command automatically builds the image if it does not exist.

Thank you for your feedback, really appreciate it!

Hi @Evgeny Shvarov / @Semion Makarov !

I just added an IRIS Interoperability Production showing how to use the code generated by IRIS-FHIRfy to convert a simple CSV into FHIR and persist it into IRIS for Health. Evidence can be found here, here, or here.

Could you add the points for Digital Health Interoperability bonus to IRIS-FHIRfy, please?

Thank you!


Maybe this example can help you:

ClassMethod ExecTestQuery(pParams)
{
	Set mdx = "WITH "
	Set mdx = mdx_"%PARM pSelectedDim as 'value:Trimestre' "
	Set mdx = mdx_"%PARM pSelectedYear as 'value:NOW' "
	Set mdx = mdx_"SELECT "
	Set mdx = mdx_"[Measures].[QtdAtendimento] ON 0, "
	Set mdx = mdx_"NON EMPTY [DataD].[H1].@pSelectedDim.Members ON 1 "
	Set mdx = mdx_"%FILTER [DATAD].[H1].[ANO].&[@pSelectedYear]"
	Set rs = ##class(%DeepSee.ResultSet).%New()
	Try {
		$$$TOE(st, rs.%PrepareMDX(mdx))
		Write "Parameters: "
		Write:($D(pParams) = 0) "(default)"
		Write !
		ZW pParams
		$$$TOE(st, rs.%Execute(.pParams))
		Do rs.%Print()
	} Catch(e) {
		Write e.DisplayString(),!
	}
}

ClassMethod TestDeepSeeResultSet()
{
	Write "Test 1", !
	Do ..ExecTestQuery()
	Write "------",!
	Write "Test 2", !
	Set params("pSelectedDim") = "MesAno"
	Set params("pSelectedYear") = "2022"
	Do ..ExecTestQuery(.params)
}

Do ##class(teste.NewClass1).TestDeepSeeResultSet()
Test 1
Parameters: (default)
                           Qtd Atendimento
Q1 2023                                          4
Test 2
                           Qtd Atendimento
1 Ago-2022                                         15
2 Set-2022                                         30
3 Out-2022                                         25
4 Nov-2022                                          9
5 Dez-2022                                          5

Some resources that may be useful:






I don't know if it's your case, but if you are able to generate the global data, you could use the $INCREMENT() function, which automatically stores the array length in the global's head node:

Set ^test($INCREMENT(^test)) = "aa"
Set ^test($INCREMENT(^test)) = "aa"
Set ^test($INCREMENT(^test)) = "aa"
Set ^test($INCREMENT(^test)) = "aa"

ZWrite ^test
; ^test=4
; ^test(1)="aa"
; ^test(2)="aa"
; ^test(3)="aa"
; ^test(4)="aa"

Write ^test
; 4
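For readers more at home in Python, the pattern above (keeping the count in the head node and letting the increment allocate the next subscript) can be modeled with a plain dict. This is just an analogy, not IRIS code:

```python
def increment(g):
    # Mimic $INCREMENT(^test): bump the counter kept at the "head"
    # of the array and return the new value
    g["head"] = g.get("head", 0) + 1
    return g["head"]

g = {}
for _ in range(4):
    g[increment(g)] = "aa"

print(g["head"])  # 4 -- the length, as Write ^test would show
```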



Hi @Ori Tsarfati!

Recently, I had a similar requirement in a personal project and found JSON2Persistent on Open Exchange, by @Michael Braam.

I don't know if this is exactly what you need, but with this tool you can transform an ordinary JSON document into a set of persistent IRIS classes that can be used in DTLs.

For instance, I took this FHIR resource example and saved it to a file.

JSON input

Then I extracted a set of persistent classes, organized in a package called tmp.FHIRObservationSchema, from that file using JSON2Persistent, like this:

$$$TOE(sc, ##class(ISC.SE.Tools.JSON).GenerateClasses("/tmp/file.json", "tmp", "FHIRObservationSchema", 0, 1, "crk", 1))

After that, I was able to create a DTL using the schema created from the FHIR resource JSON:

DTL (Code)

Then, I created a method to test it:

DTL test method

And got this output:




I grabbed some pieces of code from a previous project, in which I connected to Caché 2018.

PS: I didn't test this mashup.

import irisnative
import jaydebeapi
import pandas as pd

def create_conn(type, host, port, namespace, user, password):
    if type == "cache":
        # JDBC connection to Caché through jaydebeapi
        url = f"jdbc:Cache://{host}:{port}/{namespace}"
        driver = "com.intersys.jdbc.CacheDriver"
        jarfile = "C:/InterSystems/Cache2018/dev/java/lib/JDK18/cache-jdbc-2.0.0.jar"
        conn = jaydebeapi.connect(driver, url, [user, password], jarfile)
    else:
        # Native API connection to IRIS
        conn = irisnative.createConnection(host, port, namespace, user, password, sharedmemory=True)
    return conn

conn = create_conn("cache", "x.x.x.x", "56772", "namespace", "user", "password")
sql = "select ..."
df = pd.read_sql(sql, conn)
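As an aside, pd.read_sql accepts any DB-API 2.0 connection object, which is why either connection can be handed to it. Here is a self-contained sketch of that contract using the standard library's sqlite3 as a stand-in (no Caché/IRIS needed to run it; the table and data are made up for illustration):

```python
import sqlite3

# Any DB-API 2.0 connection exposes cursor()/execute()/fetchall(),
# which is the contract pandas.read_sql relies on.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patient (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO patient VALUES (?, ?)", [(1, "Ana"), (2, "Bob")])

rows = conn.execute("SELECT id, name FROM patient ORDER BY id").fetchall()
print(rows)  # [(1, 'Ana'), (2, 'Bob')]
```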



But the way this AI understands and generates text is impressive, no doubt. I think it's something we'll have to learn how to fit into our daily tasks.

As the ZDNet article says, Stack Overflow's ban is **temporary**, so it may be a matter of time until we get a hand from AI in our development tasks, through services like GitHub Copilot.

So, thank you for bringing this topic up for discussion!