Article
Kristina Lauer · Jul 29, 2024
Updated 2/27/25
Hi Community,
You can unlock the full potential of InterSystems IRIS—and help your team onboard—with the full range of InterSystems learning resources offered online and in person, for every role in your organization. Developers, system administrators, data analysts, and integrators can quickly get up to speed.
Onboarding Resources for Every Role
Developers
Online Learning Program: Getting Started with InterSystems IRIS for Coders (21h)
Classroom Training: Developing with InterSystems Objects and SQL (5 days)
System Administrators
Learning Path: InterSystems IRIS Management Basics (10h)
Classroom Training: Managing InterSystems Servers (5 days)
Data Analysts
Video: Introduction to Analytics with InterSystems (6m)
Learning Paths for every tool:
Analyzing Data with InterSystems IRIS BI
Delivering Data Visually with InterSystems Reports (1h 15m)
Build Data Models Using Adaptive Analytics (2h 15m)
Classroom Training: Using InterSystems Embedded Analytics (5 days)
Integrators
Learning Program: Getting Started with InterSystems IRIS for Health for Integrators (14h)
Classroom Training: Developing System Integrations and Building and Managing HL7 Integrations (5 days each)
Implementers
Learning Path: Deploying InterSystems IRIS in Containers and the Cloud (3h)
Learning Program: Getting Started with InterSystems IRIS for Implementers (26h)
Project managers
Watch product overview videos.
Read success stories to get inspired—see how others are using InterSystems products!
Other Resources from Learning Services
💻 Online Learning: Register for free at learning.intersystems.com to access self-paced courses, videos, and exercises. You can also complete task-based learning paths or role-based programs to advance your career.
👩🏫 Classroom Training: Check the schedule of live, in-person or virtual classroom training, or request a private course for your team. Find details at classroom.intersystems.com.
📘 InterSystems IRIS documentation: Comprehensive reference materials, guides, and how-to articles. Explore the documentation.
📧 Support: For technical support, email support@intersystems.com.
Certification Opportunities
Once you and your team members have gained enough training and experience, get certified according to your role!
Learn from the Community
💬 Engage in learning on the Developer Community: Chat with other developers, post questions, read articles, and stay updated with the latest announcements. See this post for tips on how to learn on the Developer Community.
With these learning resources, your team will be well equipped to maximize the capabilities of InterSystems IRIS, driving your organization’s growth and success. For additional assistance, post questions here or ask your dedicated Sales Engineer.
Updates: a new certification opportunity, InterSystems IRIS SQL Specialist, has been added to the list, and resources for implementers have been added.
Announcement
Anastasia Dyubaylo · Apr 11
Hi Community,
We're happy to announce that registration for the event of the year — InterSystems Ready 2025 — is now open. This is the Global Summit we all know and love, but with a new name!
➡️ InterSystems Ready 2025
🗓 Dates: June 22-25, 2025
📍 Location: Signia Hilton Bonnet Creek, Orlando, FL, USA
InterSystems READY 2025 is a friendly and informative environment for the InterSystems community to meet, interact, and exchange knowledge.
The READY 2025 event includes:
Sessions: 3 and a half days of sessions geared to the needs of software developers and managers. Sessions repeat so you don’t have to miss out as you build your schedule.
Inspiring keynotes: Presentations that challenge your assumptions and highlight new possibilities.
What’s next: In the keynotes and breakout sessions you’ll learn what’s on the InterSystems roadmap, so you’ll be ready to go when new tech is released.
Networking: Meet InterSystems executives, members of our global product and innovation teams, and peers from around the world to discuss what matters most to you.
Workshops and personal training: Dive into exactly what you need with an InterSystems expert, including one-on-ones.
Startup program: Demonstrate your tech, connect with potential buyers, and learn how InterSystems can help you accelerate growth of your business.
Partner Pavilion: Looking for a consultant, systems integrator, tools to simplify your work? It’s all in the pavilion.
Fun: Demos and Drinks, Tech Exchange, and other venues.
Learn more about pricing on the official website, and don't forget that the super early bird discount lapses on April 16th!
We look forward to seeing you at InterSystems Ready 2025!
Article
Guillaume Rongier · Feb 7, 2022
# 1. interoperability-embedded-python
This proof of concept aims to show how the **iris interoperability framework** can be used with **embedded python**.
## 1.1. Table of Contents
- [1. interoperability-embedded-python](#1-interoperability-embedded-python)
- [1.1. Table of Contents](#11-table-of-contents)
- [1.2. Example](#12-example)
- [1.3. Register a component](#13-register-a-component)
- [2. Demo](#2-demo)
- [3. Prerequisites](#3-prerequisites)
- [4. Installation](#4-installation)
- [4.1. With Docker](#41-with-docker)
- [4.2. Without Docker](#42-without-docker)
- [4.3. With ZPM](#43-with-zpm)
- [4.4. With PyPI](#44-with-pypi)
- [4.4.1. Known issues](#441-known-issues)
- [5. How to Run the Sample](#5-how-to-run-the-sample)
- [5.1. Docker containers](#51-docker-containers)
- [5.2. Management Portal and VSCode](#52-management-portal-and-vscode)
- [5.3. Open the production](#53-open-the-production)
- [6. What's inside the repository](#6-whats-inside-the-repository)
- [6.1. Dockerfile](#61-dockerfile)
- [6.2. .vscode/settings.json](#62-vscodesettingsjson)
- [6.3. .vscode/launch.json](#63-vscodelaunchjson)
- [6.4. .vscode/extensions.json](#64-vscodeextensionsjson)
- [6.5. src folder](#65-src-folder)
- [7. How it works](#7-how-it-works)
  - [7.1. The `__init__.py` file](#71-the-__init__py-file)
- [7.2. The `common` class](#72-the-common-class)
- [7.3. The `business_host` class](#73-the-business_host-class)
- [7.4. The `inbound_adapter` class](#74-the-inbound_adapter-class)
- [7.5. The `outbound_adapter` class](#75-the-outbound_adapter-class)
- [7.6. The `business_service` class](#76-the-business_service-class)
- [7.7. The `business_process` class](#77-the-business_process-class)
- [7.8. The `business_operation` class](#78-the-business_operation-class)
    - [7.8.1. The dispatch system](#781-the-dispatch-system)
- [7.8.2. The methods](#782-the-methods)
- [7.9. The `director` class](#79-the-director-class)
- [7.10. The `objects`](#710-the-objects)
- [7.11. The `messages`](#711-the-messages)
  - [7.12. How to register a component](#712-how-to-register-a-component)
- [7.12.1. register\_component](#7121-register_component)
- [7.12.2. register\_file](#7122-register_file)
- [7.12.3. register\_folder](#7123-register_folder)
- [7.12.4. migrate](#7124-migrate)
- [7.12.4.1. setting.py file](#71241-settingpy-file)
- [7.12.4.1.1. CLASSES section](#712411-classes-section)
- [7.12.4.1.2. Productions section](#712412-productions-section)
- [7.13. Direct use of Grongier.PEX](#713-direct-use-of-grongierpex)
- [8. Command line](#8-command-line)
- [8.1. help](#81-help)
- [8.2. default](#82-default)
- [8.3. lists](#83-lists)
- [8.4. start](#84-start)
- [8.5. kill](#85-kill)
- [8.6. stop](#86-stop)
- [8.7. restart](#87-restart)
- [8.8. migrate](#88-migrate)
- [8.9. export](#89-export)
- [8.10. status](#810-status)
- [8.11. version](#811-version)
- [8.12. log](#812-log)
- [9. Credits](#9-credits)
## 1.2. Example
```python
from dataclasses import dataclass
from grongier.pex import BusinessOperation, Message

class MyBusinessOperation(BusinessOperation):

    def on_init(self):
        # This method is called when the component is becoming active in the production
        self.log_info("[Python] ...MyBusinessOperation:on_init() is called")

    def on_teardown(self):
        # This method is called when the component is becoming inactive in the production
        self.log_info("[Python] ...MyBusinessOperation:on_teardown() is called")

    def on_message(self, message_input: 'MyRequest'):
        # Called from a service/process/operation; the message is of type MyRequest,
        # which has a request_string property
        self.log_info("[Python] ...MyBusinessOperation:on_message() is called with message: " + message_input.request_string)
        return MyResponse("...MyBusinessOperation:on_message() echos")

@dataclass
class MyRequest(Message):
    request_string: str = None

@dataclass
class MyResponse(Message):
    my_string: str = None
```
## 1.3. Register a component
Use the grongier.pex.Utils.register_component() method.
Start an embedded Python shell:
```sh
/usr/irissys/bin/irispython
```
Then use this class method to add a Python class to the component list for interoperability. (The placeholder argument names below are inferred from the example that follows.)
```python
from grongier.pex import Utils
Utils.register_component(<module_name>, <class_name>, <module_path>, <overwrite>, <iris_class_name>)
```
For example:
```python
from grongier.pex import Utils
Utils.register_component("MyCombinedBusinessOperation","MyCombinedBusinessOperation","/irisdev/app/src/python/demo/",1,"PEX.MyCombinedBusinessOperation")
```
This is a hack; it is not intended for production use.
# 2. Demo
The demo can be found inside `src/python/demo/reddit/` and is composed of:
- An `adapter.py` file that holds a `RedditInboundAdapter`, which, given a service, fetches recent Reddit posts.
- A `bs.py` file that holds three `services` that do the same thing: they call our `Process` and send it Reddit posts. One works on its own, one uses the `RedditInboundAdapter` mentioned above, and the last one uses a Reddit inbound adapter coded in ObjectScript.
- A `bp.py` file that holds a `FilterPostRoutingRule` process that analyzes our Reddit posts and sends them to our `operations` if they contain certain words.
- A `bo.py` file that holds:
  - Two **email operations** that send an email to the relevant company depending on the words found by the process; one works on its own and the other uses an OutboundAdapter.
  - Two **file operations** that write to a text file depending on the words found by the process; one works on its own and the other uses an OutboundAdapter.
*(Screenshot: new JSON trace for Python native messages.)*
# 3. Prerequisites
Make sure you have [git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) and [Docker desktop](https://www.docker.com/products/docker-desktop) installed.
# 4. Installation
## 4.1. With Docker
Clone/git pull the repo into any local directory
```sh
git clone https://github.com/grongierisc/interoperability-embedded-python
```
Open the terminal in this directory and run:
```sh
docker-compose build
```
Run the IRIS container with your project:
```sh
docker-compose up -d
```
## 4.2. Without Docker
Install the *grongier_pex-1.2.4-py3-none-any.whl* on your local IRIS instance:
```sh
/usr/irissys/bin/irispython -m pip install grongier_pex-1.2.4-py3-none-any.whl
```
Then load the ObjectScript classes:
```ObjectScript
do $System.OBJ.LoadDir("/opt/irisapp/src","cubk","*.cls",1)
```
## 4.3. With ZPM
```objectscript
zpm "install pex-embbeded-python"
```
## 4.4. With PyPI
```sh
pip3 install iris_pex_embedded_python
```
To import the ObjectScript classes, open an embedded Python shell and run:
```python
from grongier.pex import Utils
Utils.setup()
```
### 4.4.1. Known issues
If the module is not updated, make sure to remove the old version:
```sh
pip3 uninstall iris_pex_embedded_python
```
or manually remove the `grongier` folder in `/lib/python/`, or force the installation with pip:
```sh
pip3 install --upgrade iris_pex_embedded_python --target /lib/python/
```
# 5. How to Run the Sample
## 5.1. Docker containers
To gain access to the InterSystems images, go to http://container.intersystems.com. After logging in with your InterSystems credentials, you will get the password to use for the registry. In the Docker VS Code extension, on the Images tab, press Connect Registry and enter the same URL as before (http://container.intersystems.com) as a generic registry; you will then be asked for your credentials. The login is your usual one, but the password is the one you got from the website.
From there, we should be able to build and compose our containers (with the `docker-compose.yml` and `Dockerfile` files given).
## 5.2. Management Portal and VSCode
This repository is ready for [VS Code](https://code.visualstudio.com/).
Open the locally-cloned `interoperability-embedded-python` folder in VS Code.
If prompted (bottom right corner), install the recommended extensions.
**IMPORTANT**: When prompted, reopen the folder inside the container so you will be able to use the python components within it. The first time you do this it may take several minutes while the container is readied.
By opening the folder remotely, you enable VS Code, and any terminals you open within it, to use the Python components inside the container. Configure these to use `/usr/irissys/bin/irispython`.
## 5.3. Open the production
To open the production you can go to [production](http://localhost:52773/csp/irisapp/EnsPortal.ProductionConfig.zen?PRODUCTION=PEX.Production).
You can also click the `127.0.0.1:52773[IRISAPP]` button at the bottom and select `Open Management Portal`; then click the [Interoperability] and [Configure] menus, then [Productions] and [Go].
The production already contains some sample code.
Here we can see the production and our pure python services and operations:
*(Screenshot: new JSON trace for Python native messages.)*
# 6. What's inside the repository
## 6.1. Dockerfile
A Dockerfile that installs some Python dependencies (pip, venv) and sudo in the container for convenience.
It then creates the dev directory and copies this git repository into it.
It starts IRIS and activates **%Service_CallIn** for the **Python Shell**.
Use the related docker-compose.yml to easily set up additional parameters like the port number and where you map keys and host folders.
This Dockerfile ends with the installation of the required Python modules.
Use the .env file to adjust the Dockerfile being used in docker-compose.
## 6.2. .vscode/settings.json
Settings file to let you immediately code in VSCode with the [VSCode ObjectScript plugin](https://marketplace.visualstudio.com/items?itemName=daimor.vscode-objectscript)
## 6.3. .vscode/launch.json
Config file if you want to debug with VSCode ObjectScript
[Read about all the files in this article](https://community.intersystems.com/post/dockerfile-and-friends-or-how-run-and-collaborate-objectscript-projects-intersystems-iris)
## 6.4. .vscode/extensions.json
Recommendation file to add extensions if you want to run with VSCode in the container.
[More information here](https://code.visualstudio.com/docs/remote/containers)

This is very useful to work with embedded python.
## 6.5. src folder
```
src
├── Grongier
│   └── PEX                 // ObjectScript classes that wrap python code
│       ├── BusinessOperation.cls
│       ├── BusinessProcess.cls
│       ├── BusinessService.cls
│       ├── Common.cls
│       ├── Director.cls
│       ├── InboundAdapter.cls
│       ├── Message.cls
│       ├── OutboundAdapter.cls
│       ├── Python.cls
│       ├── Test.cls
│       └── _utils.cls
├── PEX                     // Some examples of wrapped classes
│   └── Production.cls
└── python
    ├── demo                // Actual python code to run this demo
    │   └── reddit
    │       ├── adapter.py
    │       ├── bo.py
    │       ├── bp.py
    │       ├── bs.py
    │       ├── message.py
    │       └── obj.py
    ├── dist                // Wheel used to implement python interoperability components
    │   └── grongier_pex-1.2.4-py3-none-any.whl
    ├── grongier
    │   └── pex             // Helper classes to implement interoperability components
    │       ├── _business_host.py
    │       ├── _business_operation.py
    │       ├── _business_process.py
    │       ├── _business_service.py
    │       ├── _common.py
    │       ├── _director.py
    │       ├── _inbound_adapter.py
    │       ├── _message.py
    │       ├── _outbound_adapter.py
    │       ├── __init__.py
    │       └── _utils.py
    └── setup.py            // setup to build the wheel
```
# 7. How it works
## 7.1. The `__init__.py` file
This file allows us to create the classes to import in the code.
It pulls the classes from the files seen earlier and makes them available as importable, callable classes.
That way, when you wish to create a business operation, for example, you can just do:
```python
from grongier.pex import BusinessOperation
```
## 7.2. The `common` class
The common class shouldn't be used directly; it is the base for almost all the other classes.
This class defines:
`on_init`: Called when the component is started. Use the on_init() method to initialize any structures needed by the component.
`on_tear_down`: Called before the component is terminated. Use it to free any structures.
`on_connected`: Called when the component is connected or reconnected after being disconnected. Use the on_connected() method to initialize any structures needed by the component.
`log_info`: Write a log entry of type "info". Log entries can be viewed in the Management Portal.
`log_alert`: Write a log entry of type "alert". Log entries can be viewed in the Management Portal.
`log_warning`: Write a log entry of type "warning". Log entries can be viewed in the Management Portal.
`log_error`: Write a log entry of type "error". Log entries can be viewed in the Management Portal.
## 7.3. The `business_host` class
The business host class shouldn't be used directly; it is the base class for all the business classes.
This class defines:
`send_request_sync`: Send the specified message to the target business process or business operation synchronously.
**Parameters**:
- **target**: a string that specifies the name of the business process or operation to receive the request.
The target is the name of the component as specified in the Item Name property in the production definition, not the class name of the component.
- **request**: specifies the message to send to the target. The request is either an instance of a class that is a subclass of Message class or of IRISObject class.
If the target is a built-in ObjectScript component, you should use the IRISObject class. The IRISObject class enables the PEX framework to convert the message to a class supported by the target.
- **timeout**: an optional integer that specifies the number of seconds to wait before treating the send request as a failure. The default value is -1, which means wait forever.
- **description**: an optional string parameter that sets a description property in the message header. The default is None.
**Returns**:
the response object from target.
**Raises**:
TypeError: if request is not of type Message or IRISObject.
`send_request_async`: Send the specified message to the target business process or business operation asynchronously.
**Parameters**:
- **target**: a string that specifies the name of the business process or operation to receive the request.
The target is the name of the component as specified in the Item Name property in the production definition, not the class name of the component.
- **request**: specifies the message to send to the target. The request is an instance of IRISObject or of a subclass of Message.
If the target is a built-in ObjectScript component, you should use the IRISObject class. The IRISObject class enables the PEX framework to convert the message to a class supported by the target.
- **description**: an optional string parameter that sets a description property in the message header. The default is None.
**Raises**:
TypeError: if request is not of type Message or IRISObject.
`get_adapter_type`: Returns the name of the registered adapter.
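To make the calling convention concrete, here is a standalone sketch. `StubBusinessHost`, `DemoMessage`, and `DemoRequest` are hypothetical names invented for this illustration; the stub only mimics the `send_request_sync` signature and the documented TypeError behavior, not the framework's real routing through the production engine.

```python
from dataclasses import dataclass

@dataclass
class DemoMessage:
    """Hypothetical stand-in for grongier.pex.Message."""
    pass

@dataclass
class DemoRequest(DemoMessage):
    request_string: str = None

class StubBusinessHost:
    """Mimics only the send_request_sync signature, not real routing."""
    def send_request_sync(self, target, request, timeout=-1, description=None):
        # As documented above, a request that is not a Message (or IRISObject)
        # is rejected with a TypeError.
        if not isinstance(request, DemoMessage):
            raise TypeError("request must be a Message (or IRISObject)")
        # A real host would queue the message for the named production item
        # and block until the response arrives (or the timeout expires).
        return f"{target} handled: {request.request_string}"

host = StubBusinessHost()
resp = host.send_request_sync("Python.FilterPostRoutingRule", DemoRequest("hello"))
print(resp)  # Python.FilterPostRoutingRule handled: hello
```

Note that `target` is the Item Name from the production definition, not a class name, which is why a plain string is passed here.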
## 7.4. The `inbound_adapter` class
Inbound adapters in Python subclass grongier.pex.InboundAdapter and inherit all the functions of the [common class](#72-the-common-class).
This class is responsible for receiving data from the external system, validating it, and sending it to the business service by calling the business host's process_input method.
This class defines:
`on_task`: Called by the production framework at intervals determined by the business service CallInterval property.
The message can have any structure agreed upon by the inbound adapter and the business service.
Example of an inbound adapter (from the src/python/demo/reddit/adapter.py file):
```python
from grongier.pex import InboundAdapter
import requests
import iris
import json

class RedditInboundAdapter(InboundAdapter):
    """
    This adapter uses requests to fetch self.limit posts as data from the
    reddit API before calling process_input for each post.
    """
    def on_init(self):
        if not hasattr(self, 'feed'):
            self.feed = "/new/"
        if self.limit is None:
            raise TypeError('no Limit field')
        self.last_post_name = ""
        return 1

    def on_task(self):
        self.log_info(f"LIMIT:{self.limit}")
        if self.feed == "":
            return 1
        tSC = 1
        # HTTP request to the reddit API
        try:
            server = "https://www.reddit.com"
            request_string = self.feed+".json?before="+self.last_post_name+"&limit="+str(self.limit)
            self.log_info(server+request_string)
            response = requests.get(server+request_string)
            response.raise_for_status()
            data = response.json()
            updateLast = 0
            for key, value in enumerate(data['data']['children']):
                if value['data']['selftext'] == "":
                    continue
                post = iris.cls('dc.Reddit.Post')._New()
                post._JSONImport(json.dumps(value['data']))
                post.OriginalJSON = json.dumps(value)
                if not updateLast:
                    self.last_post_name = value['data']['name']
                    updateLast = 1
                response = self.BusinessHost.ProcessInput(post)
        except requests.exceptions.HTTPError as err:
            if err.response.status_code == 429:
                self.log_warning(str(err))
            else:
                raise err
        except Exception as err:
            self.log_error(str(err))
            raise err
        return tSC
```
## 7.5. The `outbound_adapter` class
Outbound Adapter in Python are subclass from grongier.pex.OutboundAdapter in Python, that inherit from all the functions of the [common class](#72-the-common-class).
This class is responsible for sending the data to the external system.
The Outbound Adapter gives the Operation the possibility to have a heartbeat notion.
To activate this option, the CallInterval parameter of the adapter must be strictly greater than 0.
Example of an outbound adapter (from the src/python/demo/reddit/adapter.py file):
```python
class TestHeartBeat(OutboundAdapter):

    def on_keepalive(self):
        self.log_info('beep')

    def on_task(self):
        self.log_info('on_task')
```
## 7.6. The `business_service` class
This class is responsible for receiving data from an external system and sending it to business processes or business operations in the production.
The business service can use an adapter to access the external system; the adapter is specified by overriding the get_adapter_type method.
There are three ways of implementing a business service:
- Polling business service with an adapter - The production framework calls the adapter's OnTask() method at regular intervals. The adapter sends the incoming data to the business service's ProcessInput() method, which in turn calls the OnProcessInput() method containing your code.
- Polling business service that uses the default adapter - In this case, the framework calls the default adapter's OnTask() method with no data. The OnProcessInput() method then performs the role of the adapter and is responsible for accessing the external system and receiving the data.
- Nonpolling business service - The production framework does not initiate the business service. Instead, custom code in either a long-running process, or one that is started at regular intervals, initiates the business service by calling the Director.CreateBusinessService() method.
Business services in Python subclass grongier.pex.BusinessService and inherit all the functions of the [business host](#73-the-business_host-class).
This class defines:
`on_process_input`: Receives the message from the inbound adapter via the ProcessInput method and is responsible for forwarding it to target business processes or operations.
If the business service does not specify an adapter, then the default adapter calls this method with no message and the business service is responsible for receiving the data from the external system and validating it.
**Parameters**:
- **message_input**: an instance of IRISObject or subclass of Message containing the data that the inbound adapter passes in.
The message can have any structure agreed upon by the inbound adapter and the business service.
Example of a business service (from the src/python/demo/reddit/bs.py file):
```python
from grongier.pex import BusinessService
import iris
from message import PostMessage
from obj import PostClass

class RedditServiceWithPexAdapter(BusinessService):
    """
    This service uses our Python.RedditInboundAdapter to receive posts
    from reddit and calls the FilterPostRoutingRule process.
    """
    def get_adapter_type():
        """
        Name of the registered adapter
        """
        return "Python.RedditInboundAdapter"

    def on_process_input(self, message_input):
        msg = iris.cls("dc.Demo.PostMessage")._New()
        msg.Post = message_input
        return self.send_request_sync(self.target, msg)

    def on_init(self):
        if not hasattr(self, 'target'):
            self.target = "Python.FilterPostRoutingRule"
```
## 7.7. The `business_process` class
A business process typically contains most of the logic in a production.
A business process can receive messages from a business service, another business process, or a business operation.
It can modify the message, convert it to a different format, or route it based on the message contents.
The business process can route a message to a business operation or another business process.
Business processes in Python subclass grongier.pex.BusinessProcess and inherit all the functions of the [business host](#73-the-business_host-class).
This class defines:
`on_request`: Handles requests sent to the business process. A production calls this method whenever an initial request for a specific business process arrives on the appropriate queue and is assigned a job in which to execute.
**Parameters**:
- **request**: An instance of IRISObject or subclass of Message that contains the request message sent to the business process.
**Returns**:
An instance of IRISObject or subclass of Message that contains the response message that this business process can return
to the production component that sent the initial message.
`on_response`: Handles responses sent to the business process in response to messages that it sent to the target.
A production calls this method whenever a response for a specific business process arrives on the appropriate queue and is assigned a job in which to execute.
Typically this is a response to an asynchronous request made by the business process where the responseRequired parameter has a true value.
**Parameters**:
- **request**: An instance of IRISObject or subclass of Message that contains the initial request message sent to the business process.
- **response**: An instance of IRISObject or subclass of Message that contains the response message that this business process can return to the production component that sent the initial message.
- **callRequest**: An instance of IRISObject or subclass of Message that contains the request that the business process sent to its target.
- **callResponse**: An instance of IRISObject or subclass of Message that contains the incoming response.
- **completionKey**: A string that contains the completionKey specified in the completionKey parameter of the outgoing SendAsync() method.
**Returns**:
An instance of IRISObject or subclass of Message that contains the response message that this business process can return
to the production component that sent the initial message.
`on_complete`: Called after the business process has received and handled all responses to requests it has sent to targets.
**Parameters**:
- **request**: An instance of IRISObject or subclass of Message that contains the initial request message sent to the business process.
- **response**: An instance of IRISObject or subclass of Message that contains the response message that this business process can return to the production component that sent the initial message.
**Returns**:
An instance of IRISObject or subclass of Message that contains the response message that this business process can return to the production component that sent the initial message.
Example of a business process (from the src/python/demo/reddit/bp.py file):
```python
from grongier.pex import BusinessProcess
from message import PostMessage
from obj import PostClass

class FilterPostRoutingRule(BusinessProcess):
    """
    This process receives a PostMessage containing a reddit post.
    It then determines whether the post is about a dog, a cat, or neither,
    and fills in the right information inside the PostMessage before
    sending it to the FileOperation operation.
    """
    def on_init(self):
        if not hasattr(self, 'target'):
            self.target = "Python.FileOperation"

    def on_request(self, request):
        if 'dog'.upper() in request.post.selftext.upper():
            request.to_email_address = 'dog@company.com'
            request.found = 'Dog'
        if 'cat'.upper() in request.post.selftext.upper():
            request.to_email_address = 'cat@company.com'
            request.found = 'Cat'
        if request.found is not None:
            return self.send_request_sync(self.target, request)
```
## 7.8. The `business_operation` class
This class is responsible for sending data to an external system or a local system such as an IRIS database.
The business operation can optionally use an adapter to handle the outgoing message; the adapter is specified by overriding the get_adapter_type method.
If the business operation has an adapter, it uses the adapter to send the message to the external system.
The adapter can be a PEX adapter, an ObjectScript adapter, or a [python adapter](#75-the-outbound_adapter-class).
Business operations in Python subclass grongier.pex.BusinessOperation and inherit all the functions of the [business host](#73-the-business_host-class).
### 7.8.1. The dispatch system
In a business operation it is possible to create any number of functions [similar to the on_message method](#782-the-methods) that take as argument a [typed request](#711-the-messages), like this: `my_special_message_method(self, request: MySpecialMessage)`.
The dispatch system automatically analyzes any request arriving at the operation and dispatches requests depending on their type. If the type of a request is not recognized, or is not specified in any **on_message-like function**, the dispatch system sends it to the `on_message` function.
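To make the idea concrete, here is a standalone toy sketch of such a type-based dispatcher. This is an illustration only: `SketchOperation`, `MySpecialMessage`, and the `dispatch` helper are invented here; the framework does this routing for you and its actual implementation may differ.

```python
from dataclasses import dataclass
from typing import get_type_hints

@dataclass
class MySpecialMessage:
    text: str

class SketchOperation:
    """Toy stand-in for a business operation with one typed handler."""

    def my_special_message_method(self, request: MySpecialMessage):
        return "special:" + request.text

    def on_message(self, request):
        return "fallback"

    def dispatch(self, request):
        # Look for a handler whose annotated parameter matches the request's
        # type; fall back to on_message when none matches.
        for name in dir(self):
            if name.startswith('_') or name in ('dispatch', 'on_message'):
                continue
            method = getattr(self, name)
            if not callable(method):
                continue
            hints = get_type_hints(method)
            if any(hint is type(request)
                   for param, hint in hints.items() if param != 'return'):
                return method(request)
        return self.on_message(request)

op = SketchOperation()
print(op.dispatch(MySpecialMessage("hi")))  # special:hi
print(op.dispatch(42))                      # fallback
```

The key design point is that the message *type annotation* doubles as the routing key, so adding a new message type only requires adding one typed handler method.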
### 7.8.2. The methods
This class defines:
`on_message`: Called when the business operation receives a message from another production component [that cannot be dispatched to another function](#781-the-dispatch-system).
Typically, the operation will either send the message to the external system or forward it to a business process or another business operation.
If the operation has an adapter, it uses the Adapter.invoke() method to call the method on the adapter that sends the message to the external system.
If the operation is forwarding the message to another production component, it uses the SendRequestAsync() or the SendRequestSync() method.
**Parameters**:
- **request**: An instance of either a subclass of Message or of IRISObject containing the incoming message for the business operation.
**Returns**:
The response object
Example of a business operation (from the src/python/demo/reddit/bo.py file):
```python
from grongier.pex import BusinessOperation
from message import MyRequest, MyMessage
import iris
import os
import datetime
import smtplib
from email.mime.text import MIMEText

class EmailOperation(BusinessOperation):
    """
    This operation receives a PostMessage and sends an email with all the
    important information to the company concerned (dog or cat company).
    """
    def my_message(self, request: MyMessage):
        sender = 'admin@example.com'
        receivers = 'toto@example.com'
        port = 1025
        msg = MIMEText(request.toto)
        msg['Subject'] = 'MyMessage'
        msg['From'] = sender
        msg['To'] = receivers
        with smtplib.SMTP('localhost', port) as server:
            server.sendmail(sender, receivers, msg.as_string())
            print("Successfully sent email")

    def on_message(self, request):
        sender = 'admin@example.com'
        receivers = [request.to_email_address]
        port = 1025
        msg = MIMEText('This is test mail')
        msg['Subject'] = request.found + " found"
        msg['From'] = 'admin@example.com'
        msg['To'] = request.to_email_address
        with smtplib.SMTP('localhost', port) as server:
            # server.login('username', 'password')
            server.sendmail(sender, receivers, msg.as_string())
            print("Successfully sent email")
```
If this operation is called with a MyMessage message, the my_message function is invoked thanks to the dispatcher; otherwise, the on_message function is called.
## 7.9. The `director` class
The Director class is used for nonpolling business services, that is, business services that are not automatically called by the production framework (through the inbound adapter) at the call interval.
Instead, these business services are created by a custom application by calling the `Director.create_business_service()` method.
This class defines:
`create_business_service`: The create_business_service() method initiates the specified business service.
**Parameters**:
- **connection**: an IRISConnection object that specifies the connection to an IRIS instance for Java.
- **target**: a string that specifies the name of the business service in the production definition.
**Returns**:
an object that contains an instance of IRISBusinessService
`start_production`: The start_production() method starts the production.
**Parameters**:
- **production_name**: a string that specifies the name of the production to start.
`stop_production`: The stop_production() method stops the production.
**Parameters**:
- **production_name**: a string that specifies the name of the production to stop.
`restart_production`: The restart_production() method restarts the production.
**Parameters**:
- **production_name**: a string that specifies the name of the production to restart.
`list_productions`: The list_productions() method returns a dictionary of the names of the productions that are currently running.
## 7.10. The `objects`
We will use `dataclass` objects to hold information in our [messages](#711-the-messages), defined in an `obj.py` file.
Example of an object (located in the src/python/demo/reddit/obj.py file):
```python
from dataclasses import dataclass

@dataclass
class PostClass:
    title: str
    selftext: str
    author: str
    url: str
    created_utc: float = None
    original_json: str = None
```
## 7.11. The `messages`
The messages will contain one or more [objects](#710-the-objects), located in the `obj.py` file.
Messages, requests and responses all inherit from the `grongier.pex.Message` class.
These messages will allow us to transfer information between any business service/process/operation.
Example of a message (located in the src/python/demo/reddit/message.py file):
```python
from grongier.pex import Message
from dataclasses import dataclass
from obj import PostClass

@dataclass
class PostMessage(Message):
    post: PostClass = None
    to_email_address: str = None
    found: str = None
```
Note that type hints are required when you define an object or a message.
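The type hints matter because the framework reads the dataclass field annotations to serialize messages and to drive dispatch. A standalone sketch using plain dataclasses (without the grongier.pex base class, so it runs anywhere):

```python
from dataclasses import dataclass, fields, asdict

@dataclass
class PostClass:
    title: str
    author: str

@dataclass
class PostMessage:
    post: PostClass = None
    found: str = None

# The declared types are available for introspection...
print({f.name: f.type for f in fields(PostMessage)})

# ...and a fully typed message serializes cleanly to a dict
msg = PostMessage(post=PostClass(title="t", author="a"), found="dog")
print(asdict(msg))  # {'post': {'title': 't', 'author': 'a'}, 'found': 'dog'}
```

An untyped field would leave the framework with no annotation to inspect, which is why the types are mandatory.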
## 7.12. How to register a component
You can register a component to IRIS in several ways:
* A single component with `register_component`
* All the components in a file with `register_file`
* All the components in a folder with `register_folder`
### 7.12.1. register_component
Start an embedded Python shell:
```sh
/usr/irissys/bin/irispython
```
Then use this class method to add a single Python component to the component list for interoperability (the placeholder argument names below are inferred from the example that follows):
```python
from grongier.pex import Utils
Utils.register_component(<module_name>, <class_name>, <path>, <overwrite>, <iris_classname>)
```
For example:
```python
from grongier.pex import Utils
Utils.register_component("MyCombinedBusinessOperation","MyCombinedBusinessOperation","/irisdev/app/src/python/demo/",1,"PEX.MyCombinedBusinessOperation")
```
### 7.12.2. register_file
Start an embedded Python shell:
```sh
/usr/irissys/bin/irispython
```
Then use this class method to add all the components in a Python file to the component list for interoperability (the placeholder argument names below are inferred from the example that follows):
```python
from grongier.pex import Utils
Utils.register_file(<file_path>, <overwrite>, <iris_package_name>)
```
For example:
```python
from grongier.pex import Utils
Utils.register_file("/irisdev/app/src/python/demo/bo.py",1,"PEX")
```
### 7.12.3. register_folder
Start an embedded Python shell:
```sh
/usr/irissys/bin/irispython
```
Then use this class method to add all the components in a folder to the component list for interoperability (the placeholder argument names below are inferred from the example that follows):
```python
from grongier.pex import Utils
Utils.register_folder(<folder_path>, <overwrite>, <iris_package_name>)
```
For example:
```python
from grongier.pex import Utils
Utils.register_folder("/irisdev/app/src/python/demo/",1,"PEX")
```
### 7.12.4. migrate
Start an embedded Python shell:
```sh
/usr/irissys/bin/irispython
```
Then use this static method to migrate the settings file to the IRIS framework.
```python
from grongier.pex import Utils
Utils.migrate()
```
#### 7.12.4.1. The settings.py file
This file is used to store the settings of the interoperability components.
It has two sections:
* `CLASSES`: This section stores the classes of the interoperability components.
* `PRODUCTIONS`: This section stores the production definitions of the interoperability components.
For example:
```python
import bp
from bo import *
from bs import *
CLASSES = {
    'Python.RedditService': RedditService,
    'Python.FilterPostRoutingRule': bp.FilterPostRoutingRule,
    'Python.FileOperation': FileOperation,
    'Python.FileOperationWithIrisAdapter': FileOperationWithIrisAdapter,
}

PRODUCTIONS = [
    {
        'dc.Python.Production': {
            "@Name": "dc.Demo.Production",
            "@TestingEnabled": "true",
            "@LogGeneralTraceEvents": "false",
            "Description": "",
            "ActorPoolSize": "2",
            "Item": [
                {
                    "@Name": "Python.FileOperation",
                    "@Category": "",
                    "@ClassName": "Python.FileOperation",
                    "@PoolSize": "1",
                    "@Enabled": "true",
                    "@Foreground": "false",
                    "@Comment": "",
                    "@LogTraceEvents": "true",
                    "@Schedule": "",
                    "Setting": {
                        "@Target": "Host",
                        "@Name": "%settings",
                        "#text": "path=/tmp"
                    }
                },
                {
                    "@Name": "Python.RedditService",
                    "@Category": "",
                    "@ClassName": "Python.RedditService",
                    "@PoolSize": "1",
                    "@Enabled": "true",
                    "@Foreground": "false",
                    "@Comment": "",
                    "@LogTraceEvents": "false",
                    "@Schedule": "",
                    "Setting": {
                        "@Target": "Host",
                        "@Name": "%settings",
                        "#text": "limit=10"
                    }
                }
            ]
        }
    }
]
```
Article
sween · Oct 20, 2023

This article will cover turning over control of provisioning the InterSystems Kubernetes Operator, and starting your journey managing your own "Cloud" of InterSystems Solutions through Git Ops practices. This deployment pattern is also the fulfillment path for the [PID^TOO||](https://www.pidtoo.com) FHIR Breathing Identity Resolution Engine.
### Git Ops
I encourage you to do your own research or ask your favorite LLM about Git Ops, but I can paraphrase it here for you as we understand it. Git Ops is an alternative deployment paradigm, where the Kubernetes Cluster itself is "pulling" updates from manifests that reside in source control to manage the state of your solutions, making "Git" an integral part of the name.
### Prerequisites
* Provision a Kubernetes cluster; this has been tested on EKS, GKE, and MicroK8s clusters
* Provision a GitLab, GitHub, or other Git repo that is accessible by your Kubernetes cluster
### Argo CD
The star of our show here is [ArgoCD](https://argoproj.github.io/cd/), which provides a declarative approach to continuous delivery with a ridiculously well done UI. Getting the chart going on your cluster is a snap with just a couple of keystrokes.
```sh
kubectl create namespace argocd
kubectl apply -n argocd -f \
  https://raw.githubusercontent.com/argoproj/argo-cd/stable/manifests/install.yaml
```
Let's log into the Argo CD UI on your Kubernetes cluster. To do this, grab the secret that was created for the UI, and set up a port forward to make it accessible on your system.
**Grab Secret**
Decrypt it and put it on your clipboard.

**Port Forward**
Redirect port 4000 (or whatever) to your local host

**UI**
Navigate to https://0.0.0.0:4000 and supply the secret to the login screen and login.

### InterSystems Kubernetes Operator (IKO)
Instructions for obtaining the IKO Helm chart are in the [documentation ](https://docs.intersystems.com/components/csp/docbook/DocBook.UI.Page.cls?KEY=AIKO)itself; once you get it, check it in to your git repo in a feature branch. I would provide a sample repo for this, but unfortunately can't do so without violating redistribution terms, as the chart does not appear to be available in a public repository.
Create a feature branch in your git repository and unpack the IKO Helm chart into a single directory. As below, this is `iko/iris_operator_amd-3.5.48.100` off the root of the repo.
On `feature/iko` branch as an example:
```
├── iko
│   ├── AIKO.pdf
│   └── iris_operator_amd-3.5.48.100
│       ├── chart
│       │   └── iris-operator
│       │       ├── Chart.yaml
│       │       ├── templates
│       │       │   ├── apiregistration.yaml
│       │       │   ├── appcatalog-user-roles.yaml
│       │       │   ├── cleaner.yaml
│       │       │   ├── cluster-role-binding.yaml
│       │       │   ├── cluster-role.yaml
│       │       │   ├── deployment.yaml
│       │       │   ├── _helpers.tpl
│       │       │   ├── mutating-webhook.yaml
│       │       │   ├── service-account.yaml
│       │       │   ├── service.yaml
│       │       │   ├── user-roles.yaml
│       │       │   └── validating-webhook.yaml
│       │       └── values.yaml
```
**IKO Setup**
Create `isc` namespace, and add secret for `containers.intersystems.com` into it.
```sh
kubectl create ns isc
kubectl create secret docker-registry \
  pidtoo-pull-secret --namespace isc \
  --docker-server=https://containers.intersystems.com \
  --docker-username='ron@pidtoo.com' \
  --docker-password='12345'
```
This should conclude the setup for IKO, and enables delegating it entirely through Git Ops to Argo CD.
### Connect Git to Argo CD
This is a simple step in the Argo CD UI to connect the repo. This step ONLY "connects" the repo; further configuration will live in the repo itself.

### Declare Branch to Argo CD
Configure Kubernetes to poll the branch through Argo CD via `values.yml` in the Argo CD chart. Most of these locations in the git repo are really up to you, but the opinionated way to declare things in your repo is an "App of Apps" paradigm.
Consider creating the folder structure below, with the files to be created annotated as a table of contents:
```
├── argocd
│   ├── app-of-apps
│   │   ├── charts
│   │   │   └── iris-cluster-collection
│   │   │       ├── Chart.yaml                          ## Chart
│   │   │       ├── templates
│   │   │       │   └── iris-operator-application.yaml  ## IKO As Application
│   │   │       └── values.yaml                         ## Application Chart Values
│   │   └── cluster-seeds
│   │       └── seed.yaml                               ## Cluster Seed
```
**Chart**
```yaml
apiVersion: v1
description: 'pidtoo IRIS cluster'
name: iris-cluster-collection
version: 1.0.0
appVersion: 3.5.48.100
maintainers:
- name: intersystems
  email: support@intersystems.com
```
**IKO As Application**
```yaml
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: iko
  namespace: argocd
spec:
  destination:
    namespace: isc
    server: https://kubernetes.default.svc
  project: default
  source:
    path: iko/iris_operator_amd-3.5.48.100/chart/iris-operator
    repoURL: {{ .Values.repoURL }}
    targetRevision: {{ .Values.targetRevision }}
  syncPolicy:
    automated: {}
    syncOptions:
    - CreateNamespace=true
```
**IKO Application Chart Values**
```yaml
targetRevision: main
repoURL: https://github.com/pidtoo/gitops_iko.git
```
**Cluster Seed**
```yaml
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: gitops-iko-seed
  namespace: argocd
  labels:
    isAppOfApps: 'true'
spec:
  destination:
    namespace: isc
    server: https://kubernetes.default.svc
  project: default
  source:
    path: argocd/app-of-apps/charts/iris-cluster-collection
    repoURL: https://github.com/pidtoo/gitops_iko.git
    targetRevision: main
  syncPolicy:
    automated: {}
    syncOptions:
    - CreateNamespace=true
```
### Seed the Cluster!
This is the final step of interacting with your Argo CD/IKO cluster applications; **the rest is up to Git**!
```sh
kubectl apply -n argocd -f argocd/app-of-apps/cluster-seeds/seed.yaml
```
### Merge to Main
OK, this is where we see how we did in the UI; you should immediately start seeing Argo CD applications coming to life.
**The apps view:**

**InterSystems Kubernetes Operator View**


> Welcome to GitOps with the InterSystems Kubernetes Operator!
[Git Demos are the Best! - Live October 19, 2023](https://youtu.be/IKoadH_oOPU?feature=shared)
Ron Sweeney, Principal Architect Integration Required, LLC (PID^TOO)
Dan McCracken, COO, Devsoperative, INC
Ohh, so useful Ron & Dan, thanks for sharing your experience and tools.
Very insightful Ron, thanks.
💡 This article is considered as InterSystems Data Platform Best Practice.
Article
sween · Jul 23, 2024
A Quick Start to InterSystems Cloud SQL Data in Databricks
Getting up and running in Databricks against InterSystems Cloud SQL consists of four parts:
1. Obtaining the certificate and JDBC driver for InterSystems IRIS
2. Adding an init script and external library to your Databricks compute cluster
3. Getting data
4. Putting data
Download X.509 Certificate/JDBC Driver from Cloud SQL
Navigate to the overview page of your deployment; if you do not have external connections enabled, do so, then download your certificate and the JDBC driver from the overview page.
I have used intersystems-jdbc-3.8.4.jar and intersystems-jdbc-3.7.1.jar with success in Databricks from Driver Distribution.
Init Script for your Databricks Cluster
The easiest way to import one or more custom CA certificates to your Databricks cluster is to create an init script that adds the entire CA certificate chain to both the Linux SSL and Java default cert stores, and sets the REQUESTS_CA_BUNDLE property. Paste the contents of your downloaded X.509 certificate into the top block of the following script:
import_cloudsql_certficiate.sh
```sh
#!/bin/bash
cat << 'EOF' > /usr/local/share/ca-certificates/cloudsql.crt
-----BEGIN CERTIFICATE-----
<PASTE>
-----END CERTIFICATE-----
EOF
update-ca-certificates

PEM_FILE="/etc/ssl/certs/cloudsql.pem"
PASSWORD="changeit"
JAVA_HOME=$(readlink -f /usr/bin/java | sed "s:bin/java::")
KEYSTORE="$JAVA_HOME/lib/security/cacerts"
CERTS=$(grep 'END CERTIFICATE' $PEM_FILE | wc -l)

# To process multiple certs with keytool, you need to extract
# each one from the PEM file and import it into the Java KeyStore.
for N in $(seq 0 $(($CERTS - 1))); do
  ALIAS="$(basename $PEM_FILE)-$N"
  echo "Adding to keystore with alias:$ALIAS"
  cat $PEM_FILE |
    awk "n==$N { print }; /END CERTIFICATE/ { n++ }" |
    keytool -noprompt -import -trustcacerts \
      -alias $ALIAS -keystore $KEYSTORE -storepass $PASSWORD
done
echo "export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt" >> /databricks/spark/conf/spark-env.sh
echo "export SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt" >> /databricks/spark/conf/spark-env.sh
```
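The grep/awk/keytool loop above splits the PEM bundle into one certificate per keystore alias. The same splitting logic, sketched in Python for clarity (the two-certificate bundle below is a stand-in, not a real certificate chain):

```python
def split_pem(bundle: str) -> list:
    """Split a PEM bundle into individual certificates, mirroring the
    awk 'n==N {print}; /END CERTIFICATE/ {n++}' logic in the script above."""
    certs, current = [], []
    for line in bundle.splitlines():
        current.append(line)
        if "END CERTIFICATE" in line:
            certs.append("\n".join(current))
            current = []
    return certs

# Stand-in bundle containing two fake certificates
bundle = """-----BEGIN CERTIFICATE-----
AAAA
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
BBBB
-----END CERTIFICATE-----"""

for n, cert in enumerate(split_pem(bundle)):
    print(f"would import alias cloudsql.pem-{n}")
```

Each extracted certificate then gets its own `-alias` entry in the keystore, which is what the shell loop does with keytool.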
Now that you have the init script, upload the script to Unity Catalog to a Volume.
Once the script is on a volume, you can add the init script to the cluster from the volume in the Advanced Properties of your cluster.
Secondly, add the InterSystems JDBC driver/library to the cluster...
...and either start or restart your compute.
Databricks Station - Inbound InterSystems IRIS Cloud SQL
Create a Python Notebook in your workspace, attach it to your cluster and test dragging data inbound to Databricks. Under the hood, Databricks is going to be using pySpark, if that is not immediately obvious.
The following spark dataframe construction is all you should need, you can grab your connection info from the overview page as before.
```python
df = (spark.read
    .format("jdbc")
    .option("url", "jdbc:IRIS://k8s-05868f04-a4909631-ac5e3e28ef-6d9f5cd5b3f7f100.elb.us-east-1.amazonaws.com:443/USER")
    .option("driver", "com.intersystems.jdbc.IRISDriver")
    .option("dbtable", "(SELECT name,category,review_point FROM SQLUser.scotch_reviews) AS temp_table;")
    .option("user", "SQLAdmin")
    .option("password", "REDACTED")
    .option("connection security level", "10")
    .option("sslConnection", "true")
    .load())

df.show()
```
Illustrating the dataframe output from data in Cloud SQL... boom!
Databricks Station - Outbound InterSystems IRIS Cloud SQL
Let's now take what we read from IRIS and write it back with Databricks. If you recall, we read only 3 fields into our dataframe, so let's write that back immediately and specify an "overwrite" mode.
```python
df = (spark.read
    .format("jdbc")
    .option("url", "jdbc:IRIS://k8s-05868f04-a4909631-ac5e3e28ef-6d9f5cd5b3f7f100.elb.us-east-1.amazonaws.com:443/USER")
    .option("driver", "com.intersystems.jdbc.IRISDriver")
    .option("dbtable", "(SELECT TOP 3 name,category,review_point FROM SQLUser.scotch_reviews) AS temp_table;")
    .option("user", "SQLAdmin")
    .option("password", "REDACTED")
    .option("connection security level", "10")
    .option("sslConnection", "true")
    .load())

df.show()

mode = "overwrite"
properties = {
    "user": "SQLAdmin",
    "password": "REDACTED",
    "driver": "com.intersystems.jdbc.IRISDriver",
    "sslConnection": "true",
    "connection security level": "10",
}

df.write.jdbc(url="jdbc:IRIS://k8s-05868f04-a4909631-ac5e3e28ef-6d9f5cd5b3f7f100.elb.us-east-1.amazonaws.com:443/USER", table="databricks_scotch_reviews", mode=mode, properties=properties)
```
Executing the Notebook
Illustrating the data in InterSystems Cloud SQL!
Things to Consider
By default, PySpark writes data using multiple concurrent tasks, which can result in partial writes if one of the tasks fails.
To ensure that the write operation is atomic and consistent, you can configure PySpark to write data using a single task (i.e., set the number of partitions to 1) or use an IRIS-specific feature like transactions.
Additionally, you can use PySpark’s DataFrame API to perform filtering and aggregation operations before reading the data from the database, which can reduce the amount of data that needs to be transferred over the network.
Hello,
I have 2 questions, if you could help:
1) Do we need the ";" at the end, or is it not required?
`.option("dbtable", "(SELECT TOP 3 name,category,review_point FROM SQLUser.scotch_reviews) AS temp_table;")`
2) This JDBC works fine until I add one specific column in my query; when I add that column, I get the following error:
`[%msg: < Input (;) encountered after end of query`
Kindly help.

No, I would leave out the semicolon at the end of that query. It's typically used as a statement separator, but it isn't really part of query syntax itself. IRIS (as of 2023.2) will tolerate it at the end of a statement, but it doesn't seem that Spark really does anything with it as it wraps what you sent to dbtable with further queries, causing the error you saw.
You may also want to apply
`.option("pushDownLimit", false)`
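To see why the trailing semicolon fails, note that Spark never executes the dbtable value on its own; it embeds it in queries it generates itself (for example, a schema probe). The sketch below is a simplified illustration, not Spark's exact generated SQL:

```python
# The dbtable value from the example above, trailing semicolon included
dbtable = "(SELECT TOP 3 name,category,review_point FROM SQLUser.scotch_reviews) AS temp_table;"

# Spark wraps dbtable in an outer query along these lines (simplified)
wrapped = f"SELECT * FROM {dbtable} WHERE 1=0"
print(wrapped)
# The ';' now sits mid-statement, which is what triggers
# "Input (;) encountered after end of query"
```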
Announcement
Derek Robinson · Jun 12
Hi, Community!
⛅Need to connect your application to InterSystems Cloud Services? Get a high-level overview of the process:
Connecting to InterSystems Cloud Services
In this video, you will learn:
How to connect with Python, Java, C++, or .NET.
Key components for a connection and basic setup steps.
The importance of TLS encryption.
Article
Kurro Lopez · Jun 25
Previously, we trained our model using machine learning. However, the sample data we utilized was generated directly from insert statements.
Today, we will learn how to load this data straight from a file.
Dump Data
Before loading the data from your file, check which headers the fields have.
In this case, the file is called “Sleep_health_and_lifestyle_dataset.csv” and is located in the data/csv folder.
This file contains 374 records plus a header (375 lines).
The header includes the following names and positions:
Person ID
Gender
Age
Occupation
Sleep Duration
Quality of Sleep
Physical Activity Level
Stress Level
BMI Category
Systolic
Diastolic
Heart Rate
Daily Steps
Sleep Disorder
It is essential to know the names of column headers.
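You can confirm the header names programmatically before writing the LOAD DATA statement; here is a quick sketch with Python's csv module (the sample lines are an abbreviated stand-in for the real file):

```python
import csv
import io

# Abbreviated stand-in for the first lines of the CSV; in practice, open
# '/opt/irisbuild/data/csv/Sleep_health_and_lifestyle_dataset.csv' instead
sample = io.StringIO(
    "Person ID,Gender,Age,Occupation,Sleep Duration,Quality of Sleep\n"
    "1,Male,27,Software Engineer,6.1,6\n"
)
header = next(csv.reader(sample))
print(header)  # ['Person ID', 'Gender', 'Age', 'Occupation', 'Sleep Duration', 'Quality of Sleep']
```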
The class St.MLL.insomnia02 has different column names; therefore, when loading the data, we must map each header name in the file to the corresponding column in the table.
```sql
LOAD DATA FROM FILE '/opt/irisbuild/data/csv/Sleep_health_and_lifestyle_dataset.csv'
INTO St_MLL.insomnia02
    (Gender,Age,Occupation,SleepDuration,QualitySleep,PhysicalActivityLevel,
     StressLevel,BMICategory,Systolic,Diastolic,HeartRate,DailySteps,SleepDisorder)
VALUES ("Gender","Age","Occupation","Sleep Duration","Quality of Sleep","Physical Activity Level",
        "Stress Level","BMI Category","Systolic","Diastolic","Heart Rate","Daily Steps","Sleep Disorder")
USING {"from":{"file":{"header":true}}}
```
All the information makes sense, but… What is the last instruction?
```json
{
    "from": {
        "file": {
            "header": true
        }
    }
}
```
This is an instruction for the LOAD DATA command to determine what the file is (whether or not it has a header; whether the column separator is another character, etc).
You can find more information about the JSON options by checking out the following links:
LOAD DATA (SQL)
LOAD DATA jsonOptions
Since the columns of the file do not match those in the tables, it is necessary to indicate that the document has a line with the header, because by default, this value is “false”.
Now, we will train our model once more. With much more data in hand, it should be much more accurate this time.
```sql
TRAIN MODEL insomnia01AllModel FROM St_MLL.insomnia02

TRAIN MODEL insomnia01SleepModel FROM St_MLL.insomnia02

TRAIN MODEL insomnia01BMIModel FROM St_MLL.insomnia02
```
Populate the St_MLL.insomniaValidate02 table with 50% of St_MLL.insomnia02 rows:
```sql
INSERT INTO St_MLL.insomniaValidate02(
    Age, BMICategory, DailySteps, Diastolic, Gender, HeartRate, Occupation, PhysicalActivityLevel, QualitySleep, SleepDisorder, SleepDuration, StressLevel, Systolic)
SELECT TOP 187
    Age, BMICategory, DailySteps, Diastolic, Gender, HeartRate, Occupation, PhysicalActivityLevel, QualitySleep, SleepDisorder, SleepDuration, StressLevel, Systolic
FROM St_MLL.insomnia02
```
Then populate the test table St_MLL.insomniaTest02 in the same way:
```sql
INSERT INTO St_MLL.insomniaTest02(
    Age, BMICategory, DailySteps, Diastolic, Gender, HeartRate, Occupation, PhysicalActivityLevel, QualitySleep, SleepDisorder, SleepDuration, StressLevel, Systolic)
SELECT TOP 50
    Age, BMICategory, DailySteps, Diastolic, Gender, HeartRate, Occupation, PhysicalActivityLevel, QualitySleep, SleepDisorder, SleepDuration, StressLevel, Systolic
FROM St_MLL.insomnia02
```
Proceeding with our previous model (a nurse, 29-year-old, female), we can check what prediction our test table will make.
Note: The following queries will be focused exclusively on this type of person.
```sql
SELECT *, PREDICT(insomnia01AllModel) FROM St_MLL.insomnia02
WHERE age = 29 and Gender = 'Female' and Occupation = 'Nurse'
```
SURPRISE!!! The result is identical to the one with less data. We thought that training our model with more data would improve the outcome, but we were wrong.
For a change, I executed the probability query instead, and I got a pretty interesting result:
```sql
SELECT Gender, Age, SleepDuration, QualitySleep, SleepDisorder,
       PREDICT(insomnia01SleepModel) As SleepDisorderPrediction,
       PROBABILITY(insomnia01SleepModel FOR 'Insomnia') as ProbabilityInsomnia,
       PROBABILITY(insomnia01SleepModel FOR 'Sleep Apnea') as ProbabilityApnea
FROM St_MLL.insomniaTest02
WHERE age = 29 and Gender = 'Female' and Occupation = 'Nurse'
```
According to the data (sex, age, sleep quality, and sleep duration), the probability of having insomnia is only 46.02%, whereas the chance of having sleep apnea is 51.46%.
Our previous data training provided us with the following percentages: insomnia - 34.63%, and sleep apnea - 64.18%.
What does it mean? The more data we have, the more accurate results we obtain.
Time Is Money
Now, let's try another type of training, using the time series.
Following the same steps we took to build the insomnia table, I created a class called WeatherBase:
```objectscript
Class St.MLL.WeatherBase Extends %Persistent
{

/// Date and time of the weather observation in New York City
Property DatetimeNYC As %DateTime;

/// Measured temperature in degrees
Property Temperature As %Numeric(SCALE = 2);

/// Apparent ("feels like") temperature in degrees
Property ApparentTemperature As %Numeric(SCALE = 2);

/// Relative humidity (0 to 1)
Property Humidity As %Numeric(SCALE = 2);

/// Wind speed in appropriate units (e.g., km/h)
Property WindSpeed As %Numeric(SCALE = 2);

/// Wind direction in degrees
Property WindBearing As %Numeric(SCALE = 2);

/// Visibility distance in kilometers
Property Visibility As %Numeric(SCALE = 2);

/// Cloud cover fraction (0 to 1)
Property LoudCover As %Numeric(SCALE = 2);

/// Atmospheric pressure in appropriate units (e.g., hPa)
Property Pressure As %Numeric(SCALE = 2);

}
```
Then, I built two classes extending from WeatherBase (Weather and WeatherTest). It allowed me to have the same columns for both tables.
There is a file named “NYC_WeatherHistory.csv” in the csv folder. It contains the temperature, humidity, wind speed, and pressure for New York City in 2015. It is a treasure trove of data! For that reason, we will load the file into our table using what we just learned about loading data from a file.
```sql
LOAD DATA FROM FILE '/opt/irisbuild/data/csv/NYC_WeatherHistory.csv'
INTO St_MLL.Weather
    (DatetimeNYC,Temperature,ApparentTemperature,Humidity,WindSpeed,WindBearing,Visibility,LoudCover,Pressure)
VALUES ("DatetimeNYC","Temperature","ApparentTemperature","Humidity","WindSpeed","WindBearing","Visibility","LoudCover","Pressure")
USING {"from":{"file":{"header":true}}}
```
📣NOTE: The names of the columns and the fields in the table are the same; therefore, we can use the following statement instead.
```sql
LOAD DATA FROM FILE '/opt/irisbuild/data/csv/NYC_WeatherHistory.csv'
INTO St_MLL.Weather
USING {"from":{"file":{"header":true}}}
```
Now we will create our model, but we will do it in a particular way.
```sql
CREATE TIME SERIES MODEL WeatherForecast
PREDICTING (Temperature, Humidity, WindSpeed, Pressure)
BY (DatetimeNYC) FROM St_MLL.Weather
USING {"Forward":3}
```
If we wish to create a prediction series, we should take into account the recommendations below:
The date field must be datetime.
Try to sort the data chronologically.
📣NOTE: This advice comes from Luis Angel Perez, thanks to his great experience in Machine Learning.
The last clause, `USING {"Forward":3}`, sets the timesteps for the time series.
There are three related settings:
* **forward** specifies the number of timesteps in the future that you would like to predict, as a positive integer. Forecasted rows will appear after the latest time or date in the original dataset. You may specify both this and the backward setting simultaneously. Example: `USING {"Forward":3}`
* **backward** defines the number of timesteps in the past that you would like to predict, as a positive integer. Forecasted rows will appear before the earliest time or date in the original dataset. Remember that you can indicate both this and the forward setting at the same time. The AutoML provider ignores this parameter. Example: `USING {"backward":5}`
* **frequency** determines both the size and unit of the predicted timesteps, as a positive integer followed by a letter that denotes the unit of time. If this value is not specified, the most common timestep in the data is used. This parameter is case-insensitive. Example: `USING {"Frequency":"d"}`
The letter abbreviations for units of time are outlined in the following table:
| Abbreviation | Unit of Time |
|--------------|--------------|
| y            | year         |
| m            | month        |
| w            | week         |
| d            | day          |
| h            | hour         |
| t            | minute       |
| s            | second       |
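To illustrate how a frequency value is composed from the table above, here is a small parser; the parsing shown is our own illustration, not the AutoML provider's code:

```python
# Unit letters from the table above
UNITS = {"y": "year", "m": "month", "w": "week",
         "d": "day", "h": "hour", "t": "minute", "s": "second"}

def parse_frequency(value: str):
    """Split a frequency such as '3d' into (3, 'day'); case-insensitive,
    with the count defaulting to 1 when omitted."""
    value = value.strip().lower()
    count, unit = value[:-1] or "1", value[-1]
    return int(count), UNITS[unit]

print(parse_frequency("3D"))  # (3, 'day')
print(parse_frequency("h"))   # (1, 'hour')
```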
Now… training. You already know the command for that:
```sql
TRAIN MODEL WeatherForecast
```
Be patient! This training took 1391 seconds, which is approximately 23 minutes!
Now, populate the table St_MLL.WeatherTest with the Populate command:
```objectscript
Do ##class(St.MLL.WeatherTest).Populate()
```
It includes the first 5 days of January 2025. When completed, select the prediction using the model and the test table.
📣Remember: It is crucial to have at least three values to be able to make a prognosis.
```sql
SELECT WITH PREDICTIONS (WeatherForecast) * FROM St_MLL.WeatherTest
```
Well, it is showing us the forecast for the next 3 hours on January 2, 2025. This happens because we defined our model to forecast 3 records ahead. However, our data model has data for every hour of every day (00:00, 01:00, 02:00, etc.)
If we want to see the daily outlook, we should create another model trained to do so by the day.
Let's create the following model to see the 5-day forecast.
```sql
CREATE TIME SERIES MODEL WeatherForecastDaily
PREDICTING (Temperature, Humidity, WindSpeed, Pressure)
BY (DatetimeNYC) FROM St_MLL.Weather
USING {"Forward":5, "Frequency":"d"}
```
Now, repeat the same steps, training and displaying the forecast:
```sql
TRAIN MODEL WeatherForecastDaily

SELECT WITH PREDICTIONS (WeatherForecastDaily) * FROM St_MLL.WeatherTest
```
Wait! This time, it throws the following error:
```
[SQLCODE: <-400>:<Fatal error occurred>][%msg: <PREDICT execution error: ERROR #5002: ObjectScript error: <PYTHON EXCEPTION> *<class 'ValueError'>: forecast_length is too large for training data. What this means is you don't have enough history to support cross validation with your forecast_length. Various solutions include bringing in more data, alter min_allowed_train_percent to something smaller, and also setting a shorter forecast_length to class init for cross validation which you can then override with a longer value in .predict() This error is also often caused by errors in inputing of or preshaping the data. Check model.df_wide_numeric to make sure data was imported correctly. >]
```
What has happened?
As the error says, it is due to the lack of data to make a prediction. You might think that it needs more data in the Weather table and training, but it has 8760 records… so what is wrong?
If we want to forecast the weather for a large number of days, we need a lot of data in the model. Filling all the data into a table requires extensive training time and a very powerful PC. Therefore, since this is a basic tutorial, we will build a model for 3 days only. Don’t forget to remove the model WeatherForecastDaily before following the instructions.
```sql
DROP MODEL WeatherForecastDaily
```
I am not going to include all the images of those changes, but I will give you the instructions on what to do:
```sql
CREATE TIME SERIES MODEL WeatherForecastDaily
PREDICTING (Temperature, Humidity, WindSpeed, Pressure)
BY (DatetimeNYC) FROM St_MLL.Weather
USING {"Forward":3, "Frequency":"d"}

TRAIN MODEL WeatherForecastDaily

SELECT WITH PREDICTIONS (WeatherForecastDaily) * FROM St_MLL.WeatherTest
```
Important Note
The Docker container containers.intersystems.com/intersystems/iris-community-ml:latest-em is no longer available, so you have to use the iris-community container.
This container is not initialized with the AutoML configuration, so the following statement will need to be executed first:
```sh
pip install --index-url https://registry.intersystems.com/pypi/simple --no-cache-dir --target /usr/irissys/mgr/python intersystems-iris-automl
```
If you are using a Dockerfile to deploy your Docker image, remember to add the command below to the deployment instructions:
```dockerfile
ARG IMAGE=containers.intersystems.com/intersystems/iris-community:latest-em
FROM $IMAGE

USER root

WORKDIR /opt/irisbuild
RUN chown ${ISC_PACKAGE_MGRUSER}:${ISC_PACKAGE_IRISGROUP} /opt/irisbuild
RUN pip install --index-url https://registry.intersystems.com/pypi/simple --no-cache-dir --target /usr/irissys/mgr/python intersystems-iris-automl
```
For more information, please visit the website below:
https://docs.intersystems.com/iris20251/csp/docbook/DocBook.UI.Page.cls?KEY=GIML_Configuration_Providers#GIML_Configuration_Providers_AutoML_Install
Article
Irène Mykhailova · Jun 27
Hi Community,
While writing an article yesterday, I realized I was so busy with people who came to the Developer Community table at the Tech Exchange that I forgot to take photos for you. Luckily, I realized the error of my ways and corrected my behavior accordingly 😉
So, let's look at what happened on Tuesday at the InterSystems Ready 2025! It began with a speech by Scott Gnau about the approach and architecture of InterSystems Data Platform and how it is different from all other DBMSs:
Afterwards, @Tom.Woodfin and Peter Lesperance dove into the details of using the novelties of IRIS in Epic:
Then, @Gokhan.Uluderya talked about data in AI and how important it is to have good data to be able to apply GenAI or Vector Search to it:
@Jeffrey.Fried picked up this topic and went into more detail about InterSystems GenAI strategy:
Daniel Franko summed up the tools that are available to developers of IRIS for Health:
After lunch, most of the participants went on to the sessions or Tech Exchange. For example, @Raj.Singh5479 dropped by our table and we talked about the current Ideas Contest.
@Henrique, @henry, @Dean.Andrews2971 and @Guilherme.Silva came up to us as well:
@Lorenzo.Scalese, @Dean.Andrews2971 , @DKG
@Sergei.Shutov3787, @Anastasia.Dyubaylo, @Vishal.Pallerla
@Iryna.Mykhailova, @Anastasia.Dyubaylo, @Robert.Kuszewski
@Henrique, @Benjamin.DeBoe, @Anastasia.Dyubaylo, @Enrico.Parisi, @henry, @Iryna.Mykhailova, @José.Pereira
The "Musketeers" (@henry, @Henrique,@José.Pereira) with @Anastasia.Dyubaylo
@Dean.Andrews2971, Mariam Makhmutova, @Anastasia.Dyubaylo, @DKG
@Muhammad.Waseem, @Guillaume.Rongier7183, @Anastasia.Dyubaylo, @Oliver.Wilms
@DKG, @Anastasia.Dyubaylo, @Benjamin.Spead, @tomd
This year, to make it more interesting for our developer guests, we prepared a special challenge - a quiz from Global Masters!
So, here are @Derek.Robinson, @Myles.Collins and @Patrick.Sulin7198 trying to get all 5 answers correct:
In the next article, you will learn who beat the challenge!
While there was all this excitement at the Developer Community table, there were presentations on the big screen in the Tech Exchange, for example from @Brett.Saviano about VS Code:
And presentations at smaller tables, for example, from @Guillaume.Rongier7183:
Outside the Tech Exchange, the startups were making their presentations. For example, SerenityGPT, which created our wonderful DC AI Bot and DC AI Chat:
And in the evening we went to Universal CityWalk and were treated to a concert by Integrity Check, which was a blast!
After the concert, we had the pleasure of the company of the guitar player, aka @Randy.Pallotta
Afterwards, I went to roam and met up with @Dean.Andrews2971, @Adeline.Icard, @Anastasia.Dyubaylo, and @Guillaume.Rongier7183:
We finished the day with a rousing game of table shuffleboard - ladies (@Adeline.Icard, @Anastasia.Dyubaylo, and me) vs gentlemen (@Guillaume.Rongier7183, @Jeffrey.Fried, @Eduard.Lebedyuk). Guess in the comments who won 😁
All in all, we had a wonderful time at the Universal, which was a great end to a great day.
Article
Irène Mykhailova · Jun 25
Hi Community!
I'm super excited to be your on-the-ground reporter for the biggest developer event of the year - InterSystems Ready 2025!
As you may know from previous years, our global summits are always exciting, exhilarating, and packed with valuable knowledge, innovative ideas, and exciting news from InterSystems. This year is no different. But let's not get ahead of ourselves and start from the beginning.
Pre-summit day was, as usual, filled with fun and educational experiences. Those who enjoy playing golf (I among them) got up at the crack of dawn to tee off before the sun got too high up. Here's our dream team in action:
@sween, @Mark.Bolinsky, @Anzelem.Sanyatwe, @Iryna.Mykhailova
If you're interested, here are the results (but to save you the suspense, we didn't win 😭):
The other group of sports enthusiasts went to play football (AKA soccer). And those who are differently inclined attended the different workshops planned for Sunday:
AI-enabling your applications with InterSystems IRIS
Discovering InterSystems products: a high-level overview
Get ready to build with FHIR in InterSystems: visualizing data as FHIR resources
From FHIR to insights: analytics with FHIRPath, SQL Builder, and Pandas
Ready Startup Forum: insights, innovations & investment with InterSystems
Yet another exciting yearly pre-summit event was a Women's meet-up and reception. Unfortunately, after playing 18 hot and humid holes, I didn't have enough time to make myself presentable before the beginning.
Anyway, everyone was ready to begin the InterSystems Ready 2025 with a bang and turned up at the Welcome reception on time!
Let me share a secret - it's always a highlight of the event to meet friends and colleagues after a long pause.
@Iryna.Mykhailova, @Johan.Jacob7942, @Lorenzo.Scalese, @Adeline.Icard, @Guillaume.Rongier7183
And on Monday, the main event began with the keynote presentation from Terry Ragon, CEO & Founder of InterSystems, with a warm welcome, highlighting InterSystems' dedication to creating technology that truly matters during a time of fast change. He discussed the great promise of AI and data platforms to enhance healthcare and emphasized the importance of making a tangible difference, rather than merely following trends.
Later on, there was a panel discussion moderated by Jennifer Eaton between @Donald.Woodlock, Scott Gnau, and Tim Ferris on the future of healthcare.
Right before lunch was the best presentation of the day! And it was the best because it mentioned the Developer Community. And to share the excitement of it with you, here's a short clip from it:
And to make your day, here are a couple of photos of one of the presenters, @Randy.Pallotta
The AI did a good job, or did it 😁
Anyway, after lunch, our Developer Community booth at the Tech Exchange was ready to roll.
All our cool prizes and games were out and ready to amaze and entertain our guests!
And they soon came.
At the same time, in the hallway outside the Tech Exchange, the startups were doing their presentations. Here's a photo from the SerenityGPT presentation about their software, which utilizes IRIS Vector search to maximize the potential of clinical data.
And all the while, there were interesting presentations and use-cases of InterSystems technology from InterSystems colleagues and guests:
Moreover, there's a big screen for presentations in Tech Exchange, so don't miss it!
This very long and exciting day ended on a really high note - the Ready Games at the Demos and Drinks! There were many great demos from which the guests had to choose the winners — two runners-up in each category and two winners, for Most Innovative and Most Likely to Use.
Btw, the winners of the Most Likely to Use category are from Lead North, who brought with them the coolest stickers ever:
So, if you're at the Ready 2025 and haven't yet picked up a cute sticker, don't miss your chance to get one (or more) and to talk to @Andre and his colleagues! Swing by the Partner Pavilion (which starts outside the Tech Exchange) and you will definitely find something you like.
So this is it about the first 1.5 days of the Ready 2025. Look out for a new recap tomorrow of the rest of it. And let me tell you, it is unforgettable!
Article
Ashok Kumar T · Jun 30
Overview
Fast Healthcare Interoperability Resources (FHIR) is a standardized framework developed by HL7 International to facilitate the exchange of healthcare data in a flexible, developer-friendly, and modern way. It leverages contemporary web technologies to ensure seamless integration and communication across healthcare systems.
Key FHIR Technologies
RESTful APIs for resource interaction
JSON and XML for data representation
OAuth2 for secure authorization and authentication
FHIR is structured around modular components called resources, each representing specific healthcare concepts, including the following:
Patient – Demographics and identifiers
Observation – Clinical measurements (e.g., vitals, labs)
Encounter – Patient-provider interactions
Medication, AllergyIntolerance, Condition, etc.
Resources are individually defined and can reference other resources to form a comprehensive data model.
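For instance, a hypothetical Observation resource might reference the Patient it was recorded for, showing how resources link together (the identifiers and values here are purely illustrative):

```json
{
  "resourceType": "Observation",
  "status": "final",
  "code": { "text": "Heart rate" },
  "subject": { "reference": "Patient/123" },
  "valueQuantity": { "value": 72, "unit": "beats/minute" }
}
```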
InterSystems IRIS for Health: FHIR Support
InterSystems IRIS for Health is a unified data platform designed specifically for healthcare. It includes native HL7 FHIR support and provides built-in tools and services enabling storage, retrieval, transformation, and exchange of FHIR resources. IRIS enhances system interoperability with three major FHIR-handling components:
1. FHIR Repository Server
IRIS enables rapid deployment of FHIR-compliant servers, with support for the following:
The complete FHIR paradigm
Implementation of FHIR RESTful APIs, including search and query parameters
Importing and utilizing FHIR packages and structure definitions
Working with FHIR Profiles
Native CRUD operations on FHIR resources
Retrieval of FHIR data in JSON or XML formats
Support for multiple FHIR versions
FHIR SQL builder and bulk FHIR handling capabilities
2. FHIR Facade Layer
The FHIR facade layer is a software architecture pattern used to expose a FHIR-compliant API on top of an existing (often non-FHIR) healthcare data system, such as an electronic health record (EHR), a legacy database, or an HL7 v2 message store, without migrating all the data into a FHIR-native system.
This implementation specifically centers around the FHIR Interoperability Adapter.
3. FHIR Interoperability Adapter
InterSystems IRIS for Health offers high flexibility and fine-grained control for transforming such healthcare message standards as HL7 V2.x and C-CDA into FHIR, and vice versa (see the Message Conversion Diagram). However, not all FHIR implementations require a dedicated FHIR repository server. To support such scenarios, IRIS for Health includes an interoperability adapter toolkit that enables detailed message conversion without the need for a FHIR server.
This adapter can handle a variety of external requests (e.g., REST or SOAP APIs) from external systems, transform them into FHIR format, and route them to downstream systems, without necessarily persisting the data to a database.
Alternatively, if needed, the adapter can transform and store the data in the database.
It effectively provides an external interface layer that allows a non-FHIR database to behave as if it were a FHIR server, enabling seamless interoperability.
Message conversion
SDA: Summary Document Architecture
The Summary Document Architecture (SDA) is InterSystems’ intermediary XML-based format used to represent patient data internally within IRIS and HealthShare products. This powerful native data structure enables you to access discrete data and easily convert between multiple data formats, including HL7 V2, CCDA, C32, HL7 FHIR, and others.
SDA Structure
The SDA is primarily divided into two main components:
Container – Top-level structure containing one or more sections
Sections – Representation of specific healthcare elements (e.g., Patient, Encounter, AllergyIntolerance)
Container
The container is the top level of the SDA standard, and it includes multiple sections (e.g., patient, encounter, allergyIntolerance and others).
Let's explore the internal structure of the SDA and its components.
Class definition of Container
The HS.SDA3.Container class serves as the primary definition for representing an SDA document. Various sections, such as patient and encounter, are defined as objects and included as properties within this class.
Sections
A section is a discrete part of the container, represented as an IRIS class definition with the relevant data elements.
Patient – HS.SDA3.Patient
Encounter – HS.SDA3.Encounter
Allergy - HS.SDA3.Allergy
SDA Container Structure
The XML structure below represents an entire SDA container.
<Container>
<Patient/>
<Encounters/>
<Encounters/>
<AdvanceDirectives/>
</Container>
SDA Data Types
The FHIR data type formats differ from the standard IRIS data types. SDA therefore defines custom data type classes that handle the properties in sections more effectively than standard types such as %String, %Integer, and %Stream. However, standard properties are also used in SDA sections.
Those data type classes are also defined inside the HS.SDA3* package:
HS.SDA3.Name
HS.SDA3.CodeTableDetail.Allergy
HS.SDA3.PatientNumber
HS.SDA3.TimeStamp
SDA Extension
In most cases, the SDA has sufficient properties to manage and generate all the data coming through the system to develop a resource. However, if you need to accommodate additional data as a part of your implementation, IRIS provides a straightforward way to extend it into the SDA extension classes effortlessly.
For example, HS.Local.SDA3.AllergyExtension class definition is the extension class for the HS.SDA3.Allergy. You can add the necessary data elements to this extension class, simplifying the access and manipulation throughout your implementation.
The next step is to create a container object.
Create a container object
ClassMethod CreateSDAContainer()
{
set SDAContainer = ##class(HS.SDA3.Container).%New()
#; create patient object
set patientSDA = ##class(HS.SDA3.Patient).%New()
set patientSDA.Name.FamilyName = "stood"
set patientSDA.Name.GivenName = "test"
set patientSDA.Gender.Code="male"
set patientSDA.Gender.Description="birth gender"
#; create Encounter 1
set encounterSDA = ##class(HS.SDA3.Encounter).%New()
set encounterSDA.AccountNumber = 12109979
set encounterSDA.ActionCode ="E"
set encounterSDA.AdmitReason.Code ="Health Concern"
set encounterSDA.AdmitReason.Description = "general health concern"
#; create Encounter 2
set encounterSDA1 = ##class(HS.SDA3.Encounter).%New()
set encounterSDA1.AccountNumber = 95856584
set encounterSDA1.ActionCode ="D"
set encounterSDA1.AdmitReason.Code ="regular checkup"
set encounterSDA1.AdmitReason.Description = "general health checkup"
#; set the patientSDA into the container.
set SDAContainer.Patient = patientSDA
#; set multiple encounters into the container SDA
do SDAContainer.Encounters.Insert(encounterSDA)
do SDAContainer.Encounters.Insert(encounterSDA1)
#; convert the SDA object into an XML string.
do SDAContainer.XMLExportToString(.containerString)
write containerString
}
SDA – XML Document Output
<Container>
<Patient>
<Name>
<FamilyName>stood</FamilyName>
<GivenName>test</GivenName>
</Name>
<Gender>
<Code>male</Code>
<Description>birth gender</Description>
</Gender>
</Patient>
<Encounters>
<Encounter>
<AccountNumber>12109979</AccountNumber>
<AdmitReason>
<Code>Health Concern</Code>
<Description>general health concern</Description>
</AdmitReason>
<ActionCode>E</ActionCode>
</Encounter>
<Encounter>
<AccountNumber>95856584</AccountNumber>
<AdmitReason>
<Code>regular checkup</Code>
<Description>general health checkup</Description>
</AdmitReason>
<ActionCode>D</ActionCode>
</Encounter>
</Encounters>
<UpdateECRDemographics>true</UpdateECRDemographics>
</Container>
In the previous section, we discussed the SDA and its components. We also learned how to generate the SDA via ObjectScript.
Next, we will generate a FHIR resource or Bundle using Interoperability production (formerly known as Ensemble).
Let’s briefly read about interoperability production before creating a FHIR resource.
Interoperability Production with FHIR Adaptor
An interoperability production is an integration framework for connecting systems and developing interoperability applications with ease. It is typically divided into three major components:
Business service – It connects to the external system and receives the request from it.
Business process – It receives a request from the other business hosts, processes the request based on your defined business logic, and converts the relevant data. Multiple components are used to convert the data:
BPL – Business Process Language
DTL – Data Transformation Language
BR – Business Rules
Record Mapping
Business operation – It connects with the external system and sends the response to it.
Let’s begin the process of constructing a FHIR message.
Create FHIR Resource
There are two types of systems: FHIR servers and non-FHIR servers. In our case, we aim to make a non-FHIR InterSystems IRIS database appear as a FHIR-compliant system by generating FHIR resources using the FHIR Interoperability Adapters.
In this section, we will demonstrate how to generate FHIR resources from custom data stored in the IRIS database with the help of the InterSystems IRIS for Health Interoperability Toolkit with FHIR adapters.
As a part of this implementation, we will create the following types of FHIR resources:
Standard FHIR Resource – It utilizes the built-in FHIR classes with minimal or no modifications.
Custom FHIR Resource – It involves adding extensions to the SDA model and creating a custom Data Transformation (DTL) for the FHIR resource.
Each implementation will be initiated through dedicated business hosts.
Business Service
The RESTful business host is responsible for receiving requests from external systems. You may configure the appropriate adapter based on your specific integration requirements (e.g., HTTP, SOAP, or other supported protocols).
Upon receiving a request from the external system, the workflow will generate a corresponding FHIR resource using data persisted in the custom or legacy database.
FHIR Business Process
The FHIR message generation process involves two primary steps:
Transform custom/proprietary data into SDA (e.g., HL7 v2.x to SDA, CCDA to SDA).
Add data elements to the SDA and, if required, create a custom DTL. These steps are optional and depend on specific implementation needs, e.g., custom FHIR resource generation.
Then, convert the generated SDA into a FHIR Resource with the help of the IRIS built-in process.
The Summary Document Architecture (SDA) format serves as an intermediary, enabling flexible data transformation. Once the data is available in SDA format, it can be easily mapped to FHIR or other healthcare data standards.
Converting Custom/Proprietary Data to SDA Format
In this approach, begin by creating a persistent or interoperability request class to facilitate the transformation into SDA. It involves defining a custom patient class that maps data from your legacy or custom database structure into SDA-compliant objects.
Utilizing a custom patient class provides significant flexibility:
It simplifies object handling and manipulation.
It enables clean mapping in Data Transformation Language (DTL).
It allows an effortless reuse of the object in other transformation or business logic layers.
A request class for the external layer, used to convert to SDA:
Class Samples.FHIRAdapt.CustomStorage.Patient Extends (Ens.Request,%JSON.Adaptor)
{
Property Name As %String;
Property BirthDate As %String;
Property Citizenship As %String;
Property Religion As %String;
Property PrimaryLanguage As %String;
Property Married As %String;
Property MRN As %String;
}
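As a quick sketch (the field values are purely illustrative), an instance of this request class could be populated from a legacy record and serialized via the %JSON.Adaptor superclass:

```objectscript
#; populate the request object (values are illustrative)
set request = ##class(Samples.FHIRAdapt.CustomStorage.Patient).%New()
set request.Name = "stood,test"
set request.BirthDate = "1997-09-08"
set request.Married = 1
set request.MRN = "MRN-0001"
#; %JSONExportToString() comes from %JSON.Adaptor and writes the object as JSON
do request.%JSONExportToString(.json)
write json
```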
This request class serves as the external interface layer, initiating the conversion process from your database format into SDA. Once the SDA object is created, it can be seamlessly transformed into the desired FHIR resource via standard or custom DTL mappings:
Add the Samples.FHIRAdapt.CustomStorage.Patient (use your class definition) class as the source class for the transformation.
Identify and select the appropriate SDA target class for mapping. In this case, HS.SDA3.Patient is a suitable class for transforming custom data into the SDA format.
Sample DTL conversion
Class Samples.FHIRAdapt.DTL.CustomDataToPatientSDA Extends Ens.DataTransformDTL [ DependsOn = (Samples.FHIRAdapt.CustomStorage.Patient, HS.SDA3.Patient) ]
{
Parameter IGNOREMISSINGSOURCE = 1;
Parameter REPORTERRORS = 1;
Parameter TREATEMPTYREPEATINGFIELDASNULL = 0;
XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='Samples.FHIRAdapt.CustomStorage.Patient' targetClass='HS.SDA3.Patient' create='new' language='objectscript' >
<assign value='$Piece($Piece(source.Name,",",2)," ",1)' property='target.Name.GivenName' action='set' />
<assign value='$Piece(source.Name,",")' property='target.Name.FamilyName' action='set' />
<assign value='$Piece($Piece(source.Name,",",2)," ",2)' property='target.Name.MiddleName' action='set' />
<assign value='source.Citizenship' property='target.Citizenship' action='set' />
<assign value='"fullname"' property='target.Name.Type' action='set' />
<assign value='$Select(source.Married=1:"married",1:"single")' property='target.MaritalStatus.Code' action='set' />
</transform>
}
}
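To unit-test the mapping in a terminal session, the DTL can be invoked directly through the Transform class method inherited from Ens.DataTransform. This is a sketch with illustrative input values:

```objectscript
#; build an illustrative source message
set source = ##class(Samples.FHIRAdapt.CustomStorage.Patient).%New()
set source.Name = "stood,test middle"
set source.Married = 1
#; Transform() runs the DTL and returns the new target object by reference
set tSC = ##class(Samples.FHIRAdapt.DTL.CustomDataToPatientSDA).Transform(source,.target)
if $System.Status.IsOK(tSC) {
    write target.Name.FamilyName,!,target.MaritalStatus.Code
}
```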
At this stage, the data has been successfully transformed into an SDA document and is ready for conversion into a FHIR resource.
Before generating the FHIR resource, additional supporting FHIR resources should be created as part of this response. In addition, the custom fields need to be included in the FHIR output. To support these custom elements, the corresponding properties must be incorporated into the SDA structure.
This can be accomplished with SDA extensions, which enable the inclusion of custom data elements required for accurate and complete FHIR resource generation.
SDA Extension
FHIR follows the 80/20 rule, where the core FHIR specification covers approximately 80% of common healthcare use cases, while the remaining 20% are addressed through custom constraints and extensions.
To illustrate this, we will create an AllergyIntolerance resource with custom extensions.
There are two key steps for the proper implementation of extension data elements in InterSystems IRIS:
An extension class of the form HS.Local.SDA3.<Section>Extension is used to add extra data elements to each SDA section. For example, the class HS.Local.SDA3.AllergyExtension extends HS.SDA3.Allergy by defining the required custom properties.
Since the pre-built DTL mappings do not include these custom extensions, you must create a custom DTL to handle the transformation accordingly.
Allergy Extension Class
To build the required fields in the HS.Local.SDA3.AllergyExtension class for creating the required allergy resource, use the following lines of code:
Class HS.Local.SDA3.AllergyExtension Extends HS.SDA3.DataType
{
Parameter STREAMLETCLASS = "HS.SDA3.Streamlet.Allergy";
/// Mapped this property due to not being available in the SDA to FHIR conversion
Property Criticality As %String;
/// Mapped this property due to not being available in the SDA to FHIR conversion
Property Type As %String(MAXLEN = "");
Storage Default
{
<Data name="AllergyExtensionState">
<Subscript>"AllergyExtension"</Subscript>
<Value name="1">
<Value>Criticality</Value>
</Value>
<Value name="2">
<Value>Type</Value>
</Value>
</Data>
<State>AllergyExtensionState</State>
<Type>%Storage.Serial</Type>
}
}
Creating an extension is only half of the process, because the standard DTL does not include mappings for the extension fields. Now, we have to construct a custom DTL to transform the FHIR response properly.
Custom DTL Creation
Before customizing DTL classes, you need to define a dedicated package for all of your custom DTL implementations. To do that, InterSystems recommends using the package called HS.Local.FHIR.DTL.
To build a custom DTL for Allergy, start with the existing data transformation class HS.FHIR.DTL.SDA3.vR4.Allergy.AllergyIntolerance, which handles the conversion from SDA to FHIR resources.
First, make a copy of this class into your custom package as
HS.Local.FHIR.DTL.SDA3.vR4.Allergy.AllergyIntolerance. Then, extend it by mapping your custom extensions into the FHIR resource generation process.
For instance, the sample class HS.Local.FHIR.DTL.FromSDA.Allergy demonstrates how to map Allergy extension fields for convenience, while inheriting all other mappings from the base class HS.FHIR.DTL.SDA3.vR4.Allergy.AllergyIntolerance.
Sample custom DTL mapping is illustrated below:
/// Transforms SDA3 HS.SDA3.Allergy to vR4 AllergyIntolerance
Class HS.Local.FHIR.DTL.SDA3.vR4.Allergy.AllergyIntolerance Extends Ens.DataTransformDTL [ DependsOn = (HS.SDA3.Allergy, HS.FHIR.DTL.vR4.Model.Resource.AllergyIntolerance), ProcedureBlock ]
{
XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='HS.SDA3.Allergy' targetClass='HS.FHIR.DTL.vR4.Model.Resource.AllergyIntolerance' create='existing' language='objectscript' >
<assign value='source.Extension.Criticality' property='target.criticality' action='set' />
<assign value='source.Extension.Type' property='target.type' action='set' >
<annotation>11/07/2023; ak; Added this set to populate type in AllergyIntolerance resource</annotation>
</assign>
</transform>
}
}
Once you have created your custom DTL package (if it does not already exist), you must register it so it is used in future FHIR data transformations:
set status = ##class(HS.FHIR.DTL.Util.API.ExecDefinition).SetCustomDTLPackage("HS.Local.FHIR.DTL")
Furthermore, you can obtain the custom DTL package details (if already defined) by calling the class method.
Write ##class(HS.FHIR.DTL.Util.API.ExecDefinition).GetCustomDTLPackage()
Stream Container Class for Request Message
The setup of the SDA and its optional SDA extension, along with the optional creation of a custom DTL for building the SDA, is now complete. However, the SDA object must now be converted into a standardized Ens.StreamContainer, used specifically in the SDA-to-FHIR conversion business process.
Here are the simple steps to convert the SDA object to Ens.StreamContainer.
ClassMethod CreateEnsStreamContainer()
{
set ensStreamCntr=""
try {
#; refer the CreateSDAContainer() method above
#dim SDAContainer As HS.SDA3.Container = ..CreateSDAContainer()
do SDAContainer.XMLExportToStream(.stream)
#; Ens.StreamContainer is the default message format for the SDA-to-FHIR business process
Set ensStreamCntr = ##class(Ens.StreamContainer).%New(stream)
}
catch ex {
Write ex.DisplayString()
set ensStreamCntr=""
}
return ensStreamCntr
}
The first phase of SDA creation is concluded. The second phase, generating the FHIR resource, is already handled by InterSystems IRIS.
The following section will demonstrate how to convert an SDA document into a FHIR resource.
SDA to FHIR Transformation
Configure Interoperability Business Hosts for FHIR Creation
Business logic for FHIR generation is finalized. Now, let’s configure the Interoperability production setup:
Set up your inbound service to receive requests from the external system.
Business process – the crucial step in which the FHIR resource is created.
Business Process Implementation
This business process focuses on the SDA-to-FHIR transformation. InterSystems IRIS includes a comprehensive built-in business process, HS.FHIR.DTL.Util.HC.SDA3.FHIR.Process, that facilitates the transformation of the SDA to the FHIR message. By sending the generated SDA document to this business process, you receive a FHIR resource as a JSON response.
The process supports two types of FHIR responses based on the SDA input.
Bundle – when an entire SDA container object is sent as an Ens.StreamContainer, the process returns a FHIR bundle with all resources.
Resource – when an individual SDA section (e.g., patient, encounter, allergy) is sent as an Ens.StreamContainer, it returns the corresponding single FHIR resource as a bundle.
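Illustratively, the returned bundle has roughly the following shape (the entries and values here are hypothetical, not actual process output):

```json
{
  "resourceType": "Bundle",
  "type": "transaction",
  "entry": [
    {
      "request": { "method": "POST", "url": "Patient" },
      "resource": { "resourceType": "Patient", "gender": "male" }
    },
    {
      "request": { "method": "POST", "url": "Encounter" },
      "resource": { "resourceType": "Encounter", "status": "finished" }
    }
  ]
}
```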
Business Operation
The FHIR Bundle is now ready to be returned to the requester or sent to an external system.
Production settings:
Business Service Class
The business service class handles incoming requests from the external system to generate the FHIR resource.
Upon receiving the request, it creates the SDA using existing logic.
The SDA is then converted into a stream object.
This stream is transformed into the format expected by the standard business process.
Finally, the processed input is sent to the Business Process.
Class Samples.Interop.BS.GenerateFHIRService Extends Ens.BusinessService
{
Parameter ADAPTER = "Ens.InboundAdapter";
Property TargetConfigName As Ens.DataType.ConfigName [ InitialExpression = "HS.FHIR.DTL.Util.HC.SDA3.FHIR.Process" ];
Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
{
#; create your SDA container object and export to stream
do ..CreateSDAContainer().XMLExportToStream(.sdaStream)
#; convert to the standard Ens.StreamContainer message format
set ensStreamCtnr = ##class(Ens.StreamContainer).%New(sdaStream)
#; send to the Business process
do ..SendRequestSync(..TargetConfigName,ensStreamCtnr,.pOutput)
Quit $$$OK
}
ClassMethod CreateSDAContainer() As HS.SDA3.Container
{
set SDAContainer = ##class(HS.SDA3.Container).%New()
#; create patient object
set patientSDA = ##class(HS.SDA3.Patient).%New()
set patientSDA.Name.FamilyName = "stood"
set patientSDA.Name.GivenName = "test"
set patientSDA.Gender.Code="male"
set patientSDA.Gender.Description="birth gender"
#; create Encounter 1
set encounterSDA = ##class(HS.SDA3.Encounter).%New()
set encounterSDA.AccountNumber = 12109979
set encounterSDA.ActionCode ="E"
set encounterSDA.AdmitReason.Code ="Health Concern"
set encounterSDA.AdmitReason.Description = "general health concern"
#; set the patientSDA into the container.
set SDAContainer.Patient = patientSDA
#; set encounters into the container SDA
do SDAContainer.Encounters.Insert(encounterSDA)
return SDAContainer
}
}
Creating SDA to FHIR Using ObjectScript
In the previous example, the FHIR resource was generated from SDA with the help of the Interoperability framework. In this section, we will build a FHIR bundle directly utilizing ObjectScript.
Creating a FHIR Bundle from an SDA Container
The CreateSDAContainer method returns an object of type HS.SDA3.Container (we referred to it above). This SDA container must be converted to a stream before being passed to the TransformStream method. The TransformStream method then processes the stream and returns a FHIR bundle as a %DynamicObject in tTransformObj.bundle.
ClassMethod CreateBundle(fhirVersion As %String = "R4") As %DynamicObject
{
try {
Set SDAContainer = ..CreateSDAContainer()
Do SDAContainer.XMLExportToStream(.stream)
#; Should pass stream, not a container object
Set tTransformObj = ##class(HS.FHIR.DTL.Util.API.Transform.SDA3ToFHIR).TransformStream( stream, "HS.SDA3.Container", fhirVersion)
return tTransformObj.bundle
}
catch ex {
write ex.DisplayString()
}
return ""
}
Creating a FHIR Bundle Using an SDA Section
In this approach, the patientSDA is declared directly within ObjectScript. This SDA object is then passed to the TransformObject method, which processes it and returns a FHIR bundle as a %DynamicObject.
ClassMethod CreatePatientResourceDirectSet()
{
try {
#; convert your custom dataset into SDA with your DTL
set patientSDA = ##class(HS.SDA3.Patient).%New()
set patientSDA.Name.FamilyName = "stood"
set patientSDA.Name.GivenName = "test"
set patientSDA.Gender.Code="male"
set patientSDA.Gender.Description="birth gender"
#dim tTransformObj As HS.FHIR.DTL.Util.API.Transform.SDA3ToFHIR = ##class(HS.FHIR.DTL.Util.API.Transform.SDA3ToFHIR).TransformObject(patientSDA,"R4")
set patientBundle = tTransformObj.bundle
}
catch ex {
write ex.DisplayString()
}
return patientBundle
}
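Since the returned bundle is a %DynamicObject, it can be serialized straight to JSON for inspection. A minimal sketch, assuming it is called from within the same class:

```objectscript
#; fetch the bundle produced by the method above
set bundle = ..CreatePatientResourceDirectSet()
#; %ToJSON() writes the dynamic object as a JSON string
write bundle.%ToJSON()
```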
Creating an Allergy Resource with a Custom FHIR DTL and Allergy Extension
Populate all required fields, including custom extension fields, directly within the SDA object.
You should specify the FHIR version as the second parameter of the TransformObject method (“R4” refers to FHIR Release 4).
Pass the completed SDA object to the FHIR transformation class to generate the AllergyIntolerance FHIR bundle.
Note: The custom extension for the allergy resource has already been defined, and the custom DTL mapping has been registered.
ClassMethod CreateAllergyWithDTL()
{
#; "HS.Local.FHIR.DTL.SDA3.vR4.Allergy.AllergyIntolerance" is already registered for extension mapping
#; fetch the data from the table/global and set it into the allergy SDA directly.
set allergySDA = ##class(HS.SDA3.Allergy).%New()
set allergySDA.Extension.Criticality = "critical"
set allergySDA.Extension.Type = "t1"
set allergySDA.Comments = "testing allergies"
set allergySDA.AllergyCategory.Code="food"
set allergySDA.AllergyCategory.Description="sea food"
#; Set the required and additional properties in SDA, depending on your requirements.
#; create a FHIR resource from the allergySDA with extension fields that uses a custom "HS.Local.FHIR.*" DTL
#dim tTransformObj As HS.FHIR.DTL.Util.API.Transform.SDA3ToFHIR = ##class(HS.FHIR.DTL.Util.API.Transform.SDA3ToFHIR).TransformObject(allergySDA,"R4")
Set allergyBundle = tTransformObj.bundle
}
FHIR to SDA Conversion
In the previous sections, custom data, HL7 v2.x, or CCDA messages were converted into FHIR. The next implementation involves converting a FHIR Bundle or resource into SDA format, which can then be stored in the database or transformed into CCDA or HL7 v2.x formats.
A JSON or XML-formatted FHIR resource is received from an external system. Upon receipt, the resource must be converted into the internal data structure and stored in the IRIS database.
Business Service
Requests can be received via HTTP/REST or any other inbound adapters based on the requirements.
Business Process - FHIR To SDA Transformation
For handling the incoming FHIR request message, InterSystems IRIS provides an extensive built-in business process (HS.FHIR.DTL.Util.HC.FHIR.SDA3.Process). This business process takes a FHIR resource or Bundle as input; the input can only be of the configured FHIR version. It transforms the FHIR data into SDA3, forwards the SDA3 stream to a specified business host, receives the response from that host, and returns a FHIR response.
Please note that you cannot send the received request to this business process directly.
The request must be one of the following types:
“HS.FHIRServer.Interop.Request” – for an Interoperability production.
“HS.Message.FHIR.Request” – for the FHIR repository server.
This means you must convert the incoming request to one of these formats before sending it.
Creating Interop.Request
ClassMethod CreateReqObjForFHIRToSDA(pFHIRResource As %DynamicObject) As HS.FHIRServer.Interop.Request
{
#; sample message
set pFHIRResource = {"resourceType":"Patient","name":[{"use":"official","family":"ashok te","given":["Sidharth"]}],"gender":"male","birthDate":"1997-09-08","telecom":[{"system":"phone","value":"1234566890","use":"mobile"},{"system":"email","value":"tornado1212@gmail.com"}],"address":[{"line":["Some street"],"city":"Manipal1","state":"Karnataka1","postalCode":"1234561"}]}
set stream = ##class(%Stream.GlobalCharacter).%New()
do stream.Write(pFHIRResource.%ToJSON())
#; create Quick stream
set inputQuickStream = ##class(HS.SDA3.QuickStream).%New()
set inputQuickStreamId = inputQuickStream.%Id()
$$$ThrowOnError( inputQuickStream.CopyFrom(stream) )
#dim ensRequest as HS.FHIRServer.Interop.Request = ##class(HS.FHIRServer.Interop.Request).%New()
set ensRequest.QuickStreamId = inputQuickStreamId
return ensRequest
}
Once the HS.FHIRServer.Interop.Request message is created, send it to the Business process to convert the FHIR resource to an SDA bundle.
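For quick testing outside a full HTTP flow, the request can also be pushed to the process with the interoperability testing service. This is a minimal sketch, assuming the production is running, testing is enabled, and the process is configured under its class name:

```objectscript
#; ensRequest is an HS.FHIRServer.Interop.Request built as shown above.
#; SendTestRequest delivers it to the target host and waits for the reply.
set tSC = ##class(EnsLib.Testing.Service).SendTestRequest("HS.FHIR.DTL.Util.HC.FHIR.SDA3.Process", ensRequest, .tResponse, .tSessionId, 1)
if $$$ISERR(tSC) { write $system.Status.GetErrorText(tSC) }
```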
Production settings:
Business Service Class
This class receives the stream of a FHIR resource via an HTTP request, converts the stream into the HS.FHIRServer.Interop.Request format expected by the process, and then calls the FHIR adapter process class to generate the SDA.
Class Samples.Interop.BS.FHIRReceiver Extends Ens.BusinessService
{
Parameter ADAPTER = "EnsLib.HTTP.InboundAdapter";
Property TargetConfigName As Ens.DataType.ConfigName [ InitialExpression = "HS.FHIR.DTL.Util.HC.FHIR.SDA3.Process" ];
Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %Stream.Object) As %Status
{
set inputQuickStream = ##class(HS.SDA3.QuickStream).%New()
set inputQuickStreamId = inputQuickStream.%Id()
$$$ThrowOnError( inputQuickStream.CopyFrom(pInput) )
#dim ensRequest As HS.FHIRServer.Interop.Request = ##class(HS.FHIRServer.Interop.Request).%New()
set ensRequest.QuickStreamId = inputQuickStreamId
#; return the status so errors from the process are propagated to the caller
return ..SendRequestSync(..TargetConfigName, ensRequest, .pOutput)
}
}
Creating SDA from the FHIR Resource Using ObjectScript
In the previous example, the SDA document was generated from FHIR with the help of the Interoperability framework. In this section, we will generate the SDA from FHIR directly using ObjectScript.
Once IRIS has received the FHIR resource or Bundle as a request, convert the FHIR JSON to an SDA container:
Convert the InterSystems %DynamicObject (the parsed JSON) into a %Stream object.
Execute the TransformStream method from the HS.FHIR.DTL.Util.API.Transform.FHIRToSDA3 class, which returns the SDA container object as a response.
/// Simple, straightforward FHIR JSON resource to SDA conversion
ClassMethod CreateSDAFromFHIRJSON()
{
try {
; have to send as a stream, not a %DynamicObject
set patientStream = ##Class(%Stream.GlobalCharacter).%New()
do patientStream.Write({"resourceType":"Patient","name":[{"use":"official","family":"ashok te","given":["Sidharth"]}],"gender":"male","birthDate":"1997-09-08","telecom":[{"system":"phone","value":"1234566890","use":"mobile"},{"system":"email","value":"tornado1212@gmail.com"}],"address":[{"line":["Some street"],"city":"Manipal1","state":"Karnataka1","postalCode":"1234561"}]}.%ToJSON())
#dim SDAObj As HS.FHIR.DTL.Util.API.Transform.FHIRToSDA3 = ##class(HS.FHIR.DTL.Util.API.Transform.FHIRToSDA3).TransformStream(patientStream,"R4","JSON")
set SDAContainer = SDAObj.container
; XML-based SDA output
write SDAContainer.XMLExport()
}
catch ex {
write ex.DisplayString()
}
}
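Once the container is returned, the individual SDA objects can be read back from it. A minimal sketch, assuming the standard HS.SDA3.Container layout in which a single Patient object is carried per container:

```objectscript
#; Read the patient back out of the SDA container (assumes one Patient per container)
set patient = SDAContainer.Patient
if $isobject(patient) {
    write !,"Family name: ",patient.Name.FamilyName
    write !,"Given name: ",patient.Name.GivenName
}
```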
Converting a FHIR XML resource to an SDA container:
Convert the XML into %Stream object.
Execute the TransformStream method from the HS.FHIR.DTL.Util.API.Transform.FHIRToSDA3 class, which returns the SDA container object as a response.
/// Simple, straightforward FHIR XML resource to SDA conversion
ClassMethod CreateSDAFromFHIRXML()
{
try {
set patientXML = "<Patient xmlns=""http://hl7.org/fhir""><id value=""example""/><text><status value=""generated""/><div xmlns=""http://www.w3.org/1999/xhtml""><p>John Doe</p></div></text><identifier><use value=""usual""/><type><coding><system value=""http://terminology.hl7.org/CodeSystem/v2-0203""/><code value=""MR""/></coding></type><system value=""http://hospital.smarthealth.org""/><value value=""123456""/></identifier><name><use value=""official""/><family value=""Doe""/><given value=""John""/></name><gender value=""male""/><birthDate value=""1980-01-01""/></Patient>"
set patientStream = ##Class(%Stream.GlobalCharacter).%New()
do patientStream.Write(patientXML)
#dim SDAObj As HS.FHIR.DTL.Util.API.Transform.FHIRToSDA3 = ##class(HS.FHIR.DTL.Util.API.Transform.FHIRToSDA3).TransformStream(patientStream,"R4","XML")
set SDAContainer = SDAObj.container
; XML-based SDA output
write SDAContainer.XMLExport()
}
catch ex {
write ex.DisplayString()
}
}
By following the steps detailed above, you can seamlessly transform data to or from a FHIR resource.
The built-in FHIR repository and FHIR facade options are also valuable tools for exposing a FHIR-compliant system and for handling and storing FHIR resources efficiently.
Article
Irène Mykhailova · Jun 28
Hey Community!
Here's the recap of the final half-day of the InterSystems Ready 2025! It was the last chance to see everyone and say farewell until next time.
It was a warm and energetic closing, with great conversations, smiles, and unforgettable memories!
The final Ready 2025 moment with our amazing team!
And, of course, let’s say a huge THANK YOU to the godmother of Ready 2025, @Maureen.Flaherty! You and your team are the best! Here we are together with @Enrico.Parisi.
@Patrick.Sulin7198 dropped by the Developer Community table:
And @Yuri.Gomes
Caught @Scott.Roth outside the Tech Exchange
And @Sergei.Shutov3787
My golf buddy @Anzelem.Sanyatwe also came to spin the wheel of fortune. And Luc Chatty dropped by.
We went to visit the source of great ribbons. Here are @Iryna.Mykhailova, @Macey.Minor3011, @Andre, @Anastasia.Dyubaylo
It was also time for the winners of the AI Programming Contest to present their AI agentic applications!
@Sergei.Shutov3787 talked about AI Agents as First-Class Citizens in InterSystems IRIS:
@Eric.Fortenberry presented "A Minimalist View of AI: Exploring Embeddings and Vector Search with EasyBot":
@Yuri.Gomes spoke about Natural Language Control of IRIS:
@Muhammad.Waseem talked about Next generation of autonomous AI Agentic Applications:
@henry, @Henrique, and @José.Pereira were hidden by all the people who came to listen to the "Command the Crew - create an AI crew to automate your work" presentation:
@Victor.Naroditskiy explained how Developer Community AI works:
Also, at the other tables, people gave their own presentations. For example, @Guillaume.Rongier7183 talked about Python:
Let's leave Tech Exchange and see what was going on at the DC sessions. @Benjamin.Spead, @Hannah.Sullivan, @Victor.Naroditskiy, and @Dean.Andrews2971 talked about using SerenityGPT to build GenAI middleware:
And, of course, the main session of the Ready 2025 - InterSystems Developer Ecosystem: new resources and tools you need to know. @Dean.Andrews2971 and @Anastasia.Dyubaylo gave an overview of all the updates to the DC Ecosystem:
Afterwards, @David.Reche checked how attentively everyone was listening by leading the Kahoot! game. Please welcome the winners: @Vishal.Pallerla, @Rochael.Ribeiro and @Jason.Morgan. Congratulations! We hope you enjoy your prize!
@Juliana.MatsuzakiModesto, @DKG, @Rochael.Ribeiro, @Katia.Neves, @Anastasia.Dyubaylo, @Dean.Andrews2971, @Enrico.Parisi, @Vishal.Pallerla, @Eduard.Lebedyuk
On this happy note, I promised last time to tell you who was the only verified person who answered all the quiz questions correctly at the DC table. And it was @Asaf.Sinay! Congratulations! @Olga.Zavrazhnova2637 and the whole Global Masters team are happy that so many people came and tried to master it. If you're interested in taking the quiz, here's the link. And if you want to answer more quiz questions, you can find them on Global Masters!
Talking about Global Masters and quizzes, you can't skip the most popular reward 😁 No summit goes by without someone showing me their Developer Community socks 🤣
As you can see, the Brazilian DC team is very happy: @Rochael.Ribeiro, @Juliana.MatsuzakiModesto. @Danusa.Ferreira and @Heloisa.Paiva, we really missed you - with you, the Portuguese Developer Community team would've been complete!
This was almost the end of the Ready 2025 and it's the end of my story.
The rumor is, the next summit will take place in April in Washington, D.C. Put it in your calendar so you don't double-book; you know you want to!
See you next year!
Great event, fantastic DC activities, and I'm very happy to have met the people who make our community. Thanks @Irène.Mykhailova for sharing.
This year’s event was an incredible opportunity to meet community heroes, explore cutting-edge technologies, forge meaningful connections, and exchange ideas with industry leaders.
It has been very interesting and, as usual, keeping in touch with old fellows as well as meeting new ones is the most valuable thing about the conference.
And, HEY! Only 10 months to the next conference! Washington, D.C., from the 27th to the 29th of April!
Please note that the next conference will start with the Welcome Reception on Monday (it is usually on Sunday).
I'm looking forward to meeting you all in D.C.! 😊
Great team!!! I've missed you all so much this year; I hope to see you all next year. What an incredible event! 🙌
It was so great to see such a strong presence of our amazing Developer Community — so many familiar faces and so many new ones too!
Thanks to everyone who made this experience so special. Already looking forward to next year! 😊
It was very good to see you all!!!!!
It was so great to see you and to be your neighbors again at the Tech Exchange pavilion!
It was a great event full of interesting info and people! So nice! Thanks for the reminder 😉
Announcement
Anastasia Dyubaylo · Jul 8
Hi Developers,
We are happy to announce the new InterSystems online programming contest dedicated to creating useful tools to make your fellow developers' lives easier:
🏆 InterSystems Developer Tools Contest 🏆
Duration: July 14 - August 3, 2025
Prize pool: $12,000
The topic
Develop any application that improves the developer experience with IRIS: one that helps developers build faster, write higher-quality code, or test, deploy, support, or monitor their solutions with InterSystems IRIS.
General Requirements:
An application or library must be fully functional. It should not be an import of, or a direct interface to, an already existing library in another language (except for C++, where creating an interface for IRIS genuinely requires a lot of work). It should not be a copy-paste of an existing application or library.
Accepted applications: apps that are new to Open Exchange, or existing ones with a significant improvement. Our team will review all applications before approving them for the contest.
The application should work either on IRIS Community Edition or IRIS for Health Community Edition. Both can be downloaded as host (Mac, Windows) versions from the Evaluation site, or used in the form of containers pulled from InterSystems Container Registry or Community Containers: intersystemsdc/iris-community:latest or intersystemsdc/irishealth-community:latest.
The application should be Open Source and published on GitHub or GitLab.
The README file for the application should be in English, contain the installation steps, and include a video demo and/or a description of how the application works.
Only 3 submissions from one developer are allowed.
NB. Our experts will have the final say in whether the application is approved for the contest or not based on the criteria of complexity and usefulness. Their decision is final and not subject to appeal.
Prizes
1. Experts Nomination - a specially selected jury will determine winners:
🥇 1st place - $5,000
🥈 2nd place - $2,500
🥉 3rd place - $1,000
🏅 4th place - $500
🏅 5th place - $300
🌟 6-10th places - $100
2. Community winners - applications that will receive the most votes in total:
🥇 1st place - $1,000
🥈 2nd place - $600
🥉 3rd place - $300
🏅 4th place - $200
🏅 5th place - $100
❗ If several participants score the same number of votes, they are all considered winners, and the prize money is shared among them.
❗ Cash prizes are awarded only to those who can verify their identity. If there are any doubts, the organizers will reach out and request additional information about the participant(s).
Who can participate?
Any Developer Community member, except for InterSystems employees (ISC contractors allowed). Create an account!
Developers can team up to create a collaborative application. 2 to 5 developers are allowed in one team.
Do not forget to highlight your team members in the README of your application – DC user profiles.
Important Deadlines:
🛠 Application development and registration phase:
July 14, 2025 (00:00 EST): Contest begins.
July 27, 2025 (23:59 EST): Deadline for submissions.
✅ Voting period:
July 28, 2025 (00:00 EST): Voting begins.
August 3, 2025 (23:59 EST): Voting ends.
Note: Developers can improve their apps throughout the entire registration and voting period.
Helpful Resources:
✓ Example applications:
webterminal - an emulation for IRIS terminal as a web application
git-source-control - git tool to manage changes for shared dev environments and IRIS UI dev editors by @Timothy Leavitt
iris-rad-studio - RAD for UI
cmPurgeBackup - backup tool
errors-global-analytics - errors visualization
objectscript-openapi-definition - open API generator
Test Coverage Tool - test coverage helper
iris-bi-utils - a toolset for IRIS BI
and many more.
✓ Templates we suggest to start from:
iris-dev-template
Interoperability-python
rest-api-contest-template
native-api-contest-template
iris-fhir-template
iris-fullstack-template
iris-interoperability-template
iris-analytics-template
✓ For beginners with IRIS:
Build a Server-Side Application with InterSystems IRIS
Learning Path for beginners
✓ For beginners with ObjectScript Package Manager (IPM):
How to Build, Test and Publish IPM Package with REST Application for InterSystems IRIS
Package First Development Approach with InterSystems IRIS and IPM
✓ How to submit your app to the contest:
How to publish an application on Open Exchange
How to submit an application for the contest
Need Help?
Join the contest channel on InterSystems' Discord server or talk with us in the comments to this post.
We're waiting for YOUR project – join our coding marathon to win!
By participating in this contest, you agree to the competition terms laid out here. Please read them carefully before proceeding.
Announcement
Anastasia Dyubaylo · Jun 5
Hello Community,
We're thrilled to invite all our Developer Community members (both InterSystems employees and not) to participate in our next contest!
💡 The 4th InterSystems Ideas Contest 💡
We're looking for your innovative ideas to enhance InterSystems IRIS and related products and services. We encourage suggestions based on real-life use cases, highlighting the tangible benefits your idea will bring to other users and how it will enhance developers' experiences with InterSystems technology.
📅 Duration: June 9 - July 20, 2025
🏆 Prizes for the best ideas and a random draw!
🎁 Gifts for everyone: A special gift will be given to each author whose idea is accepted in the contest.
>> SUBMIT AN IDEA <<
Accepted ideas should:
be created during the Ideas Contest period by a user registered on the InterSystems Ideas portal (you can log in via InterSystems SSO);
not be part of other already existing ideas - only new ideas are allowed;
not describe the existing functionality of InterSystems IRIS and related Products or Services;
be posted in English;
be written by a person, not generated by AI;
be accepted as meaningful by InterSystems experts;
❗adhere to the structure below:
1️⃣ Description of the idea
2️⃣ Who is the target audience?
3️⃣ What problem does it solve?
4️⃣ How does this impact the efficiency, stability, reliability, etc., of the product?
5️⃣ Provide a specific use case or scenario that illustrates how this idea could be used in practice.
All ideas are subject to moderation. We may request to clarify the submitted idea. Ideas that meet the requirements will receive a special "Ideas Contest" status.
Who can participate?
We invite EVERYONE to join our new Ideas Contest. Both InterSystems employees and non-employees are welcome to participate and submit their ideas.
Prizes
1. Participation gift - authors of all accepted ideas will get:
🎁 Aluminum Media Stand
2. Expert award - InterSystems experts will select the best ideas. Winners will get:
🥇 1st place - Stilosa Barista Espresso Machine & Cappuccino Maker
🥈 2nd place - Osmo Mobile 7
🥉 3rd place - Smart Mini Projector XGODY Gimbal 3
3. Random award - a participating idea author chosen at random will get:
🏅 Smart Mini Projector XGODY Gimbal 3
Note: InterSystems employees are eligible to receive only the participation gift. Expert and Random awards can only be won by Developer Community members who are not InterSystems employees.
Important dates:
⚠️ Idea Submission: June 9 - July 13
✅ Voting for ideas: July 14 - July 20
🎉 Winners announcement: July 21
Good luck! 🍀
Note: All prizes are subject to availability and shipping options. Some items may not be available for international shipping to specific countries; in this case, an equivalent alternative will be provided. We will let you know if a prize is not available and offer a possible replacement. Prizes cannot be delivered to residents of Crimea, Russia, Belarus, Iran, North Korea, Syria, or other US-embargoed countries.
Can interns join, and do we count as employees who can get the participation gift? 😊
Interns are absolutely welcome to join — we’d love to have you involved! 😊 And they are considered employees for this contest.
While the Expert and Random awards are reserved for non-employees, taking part is still a great opportunity. As an intern, you have a unique, hands-on perspective that can contribute to a deeper understanding of what the product truly needs. Thanks for clarification. I'll be thinking of ideas! As I'm diving into Angular right now, mine is to add a projection to a TypeScript interface.
Is there a specific tag we're supposed to use for the contest this year? Hey Developers!
The fantastic prizes for the 4th InterSystems Ideas Contest were chosen, and here they are:
Expert Award 🏆
InterSystems experts will select the best ideas, with amazing prizes awaiting the winners:
🥇 1st place: Stilosa Barista Espresso Machine & Cappuccino Maker
🥈 2nd place: Osmo Mobile 7
🥉 3rd place: Smart Mini Projector XGODY Gimbal 3
Random Award
One lucky idea chosen at random will win:
🏅 Smart Mini Projector XGODY Gimbal 3
Reminder: To qualify for the contest, ensure your idea submission follows this required structure:
1. Description of the idea2. Who is the target audience?3. What problem does it solve?4. How does this impact the efficiency, stability, reliability, etc., of the product?5. Provide a specific use case or scenario that illustrates how this idea could be used in practice.
Good luck, everyone! 🚀✨
There is no need to choose a specific tag. All ideas that pass our experts' moderation will be added to the Contest.
Hi Community!
We have an update on the dates of the contest - it's extended until July 20. During InterSystems Ready 2025, we received numerous requests to do this, as many of you were focused on your presentations or other commitments related to the event and didn't have the opportunity to submit your ideas.
You asked, we listened!
Don't forget, for the idea to take part in the contest, it has to follow the structure:
Description of the idea
Who is the target audience?
What problem does it solve?
How does this impact the efficiency, stability, reliability, etc., of the product?
Provide a specific use case or scenario that illustrates how this idea could be used in practice.
At this point, we have a lot of interesting ideas, but they don't adhere to the terms, so they aren't considered for the contest. @Mark.OReilly, @Andre.LarsenBarbosa, @Abdul.Manan, @Marykutty.George1462, @Jeffrey.Drumm, @Robert.Barbiaux, @Sylvain.Guilbaud, @Ashok.Kumar. Hey Community!
During the contest period, 43 ideas were submitted to the Ideas Portal, thank you for your contributions! 🙌Out of those, 24 ideas have been accepted into the Contest so far.
If your idea wasn't accepted yet (one of the remaining 19), there's still time! To be included in the Contest, please make sure your idea follows this required structure:
1️⃣ Description of the idea2️⃣ Who is the target audience?3️⃣ What problem does it solve?4️⃣ How does this improve the efficiency, stability, reliability, etc., of the product?5️⃣ Provide a specific use case or scenario that shows how your idea could be used in practice
🕒 You have until the end of day Sunday to update your submissions. Don’t miss your chance to participate!
Good luck! 🍀 Hey Community!
The submission period is over, and now it's voting time! While the judges are hard at work, let's look at the ideas that are participating in the contest:
Author
Idea
@Yuri.Gomes
Extending an open source LLM to support efficient code generation in intersystems technology
@David.Hockenbroch
Add Typescript Interface Projection
@Enrico.Parisi
Make DICOM interoperability adapter usable in Mirror configuration/environment
@Marykutty.George1462
Ability to abort a specific message from message viewer or visual trace page
@Enrico.Parisi
Do not include table statistics when exporting Production for deployment
@Ashok.Kumar
recursive search in Abstract Set Query
@Ashok.Kumar
TTL(Time To Live) Parameter in %Persistent Class
@Ashok.Kumar
Programmatic Conversion from SDA to HL7 v2
@Ashok.Kumar
Streaming JSON Parsing Support
@Ashok.Kumar
Differentiating System-Defined vs. User-Defined Web Applications in IRIS
@Ashok.Kumar
Need for Application-Specific HTTP Tracing in Web Gateway
@Ashok.Kumar
Add Validation for Dispatch Class in Web Application Settings
@Ashok.Kumar
Encoding in SQL functions
@Ashok.Kumar
Compression in SQL Functions
@Alexey.Maslov
Universal Global Exchange Utility
@Ashok.Kumar
Automatically Expose Interactive API Documentation
@Vishal.Pallerla
Dark Mode for Management Portal
@Ashok.Kumar
IRIS Native JSON Schema Validator
@Ashok.Kumar
Enable Schema Validation for REST APIs Using Swagger Definitions
@diba
Auto-Scaling for Embedded Python Workloads in IRIS
@Dmitry.Maslennikov
Integrate InterSystems IRIS with SQLancer for Automated SQL Testing and Validation
@Dmitry.Maslennikov
Bring IRIS to the JavaScript ORM World
@Ashok.Kumar
HTML Report for UnitTest Results
@Andre.LarsenBarbosa
AI Suggestions for Deprecated Items
@Mark.OReilly
Add a field onto Oauth Client to allow alerting expiry dates alert
@Mark.OReilly
Expose "Reply To" as default on EnsLib.EMail.AlertOperation
I have updated the structure of my 2 ideas. I wasn't aware of the contest and was just creating ideas.
Ah, it's closed now. No matter; hopefully they're in a clearer format for actioning now anyway.
Since it's the beginning of the voting period, it was decided to add your ideas to the contest. Good luck!
Announcement
Anastasia Dyubaylo · 11 hr ago
Hi Community!
We’re pleased to announce that several Early Access Programs (EAPs) are now open for registration. These programs provide developers with the opportunity to explore upcoming features and technologies from InterSystems before their general release.
By joining an EAP, you can:
Evaluate and test new functionality
Provide direct feedback to product teams
Help shape the future direction of InterSystems platforms
If you're interested in contributing to the evolution of our tools and getting early insight into what's coming next, we encourage you to review the following available programs and sign up:
Name
Description
FHIR Application Training Course
InterSystems is developing a comprehensive FHIR application development course. The course will consist of 40 hours of recorded video content, numerous GitHub repositories, interactive quizzes, and supplementary material. The course will be targeted at FHIR application developers and will use the InterSystems IRIS for Health Community Edition with application development in Python.
Health Connect - AI Copilot for DTL Explanations
Uses generative AI to provide human-readable summaries for Data Transformation Language (DTL) logic. This reduces onboarding time for new interface developers, accelerates troubleshooting, and makes it easier for non-experts to understand and maintain complex data transformations.
Health Connect - DTL Assistant Copilot for HL7 Transformations
Introduces AI-powered mapping assistance that suggests transformations, auto-generates mappings, and helps enforce standards between HL7 message formats. This assistant shortens development cycles, reduces errors, and increases team productivity by lowering the technical barrier to working with HL7.
Health Connect - CDA Validation
Enables robust validation of Clinical Document Architecture (CDA) files against standards and rules, improving data integrity and compliance. This tool helps organizations detect structural and content-related issues early, ensuring smoother data exchange and regulatory readiness.
Health Data De-Identifier
Health Data De-identifier is a configurable framework to de-identify structured clinical data. It incorporates HIPAA Safe Harbor rules for the US but provides hooks to be adjusted for regional requirements.
InterSystems Data Fabric Studio Virtual Assistant module
InterSystems Data Fabric Studio (IDFS) simplifies data management and exploration. With its Virtual Assistant module, users gain access to interactive assistants that help them understand, navigate, and leverage their data more effectively. Built on an agentic framework, IDFS supports reasoning, diagnostics, evaluation, and governance capabilities. Users can also design custom assistants and agents that are enriched with specialized knowledge and memory for context-aware interaction.
InterSystems Data Fabric Studio with Health Module
Provides a fully managed, self-service platform for provisioning trusted healthcare datasets, data models for analytics, AI models, and operational reporting. It includes out-of-the-box business connectors, pipeline building, and data catalog modeling capabilities.
InterSystems IRIS Security Database
This new feature provides greater security by moving sensitive data from IRISSYS to a new database: IRISSECURITY. An additional security role has been added to restrict access to this data.
OAuth2 Authentication/Authorization
InterSystems is making it easier for customers to use and configure OAuth2.
Online Backup
With Online Backup, InterSystems IRIS automatically tracks which blocks change, such that it can easily create incremental backups of your data without freezing or otherwise interrupting regular operations. In InterSystems IRIS 2024.1, the first part of a significant overhaul of this capability is now available for your evaluation, bringing orders-of-magnitude faster backup and restore operations compared to the original version. If you are using or considering Online Backup today, please join the Early Access Program to receive updates and share your feedback.
Table Partitioning
Table Partitioning helps users manage large tables efficiently by enabling them to split the data across multiple databases based on a logical scheme. This enables, for example, moving older data to a database mounted on a cheaper tier of storage, while keeping the current data that is accessed frequently on premium storage. The data structure for partitioned tables also brings several operational and performance benefits when tables get very large (> 1B rows).
Participation may be limited, so we recommend registering early if you're interested. If you have any questions, please don't hesitate to ask them in the comments or send an email to EarlyAccess@InterSystems.com.
>> REQUEST EARLY ACCESS HERE <<