
Question
· Jun 18

Custom Description Popup Text for Interoperability Production Business Hosts

I seem to remember making this work before, but I'm not having any luck digging up examples.

I've defined some custom properties for a business operation that would definitely benefit from having popup descriptions available in the Production Configuration. I have triple-slash comments before each property in the source, and I thought those provided the text for the popup descriptions shown when clicking on the property name, but apparently not.
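
For reference, here's the shape of what I have (the class, property, and setting names below are just placeholders):

    Class Demo.MyOperation Extends Ens.BusinessOperation
    {

    /// Folder the operation writes its output to. This is the text
    /// I expected to show up in the popup description.
    Property OutputFolder As %String;

    Parameter SETTINGS = "OutputFolder:Basic";

    }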

Any thoughts?

Article
· Jun 18 2m read

Options for Python Devs + Poll!

I am writing this post primarily to gather an informal consensus on how developers are using Python in conjunction with IRIS, so please respond to the poll at the end of this article! In the body of the article, I'll give some background on each choice provided, as well as the advantages for each, but feel free to skim over it and just respond to the poll.

 

As most of you are probably aware, Python is currently the most popular programming language among developers around the world - and for good reason. Python is intuitive, easy to read, has a vast ecosystem of libraries and packages, and offers resources for solving almost any problem. For these reasons and more, Python is a natural first choice for developers.

So, when a developer first starts working with InterSystems IRIS, a common question might come up: "Can I use Python while still getting the most out of IRIS?" The answer? "Yes!" This article will introduce a few of the most common ways Python can be used in conjunction with InterSystems IRIS.

Options for Using Python with IRIS:

1. Embedded Python:
Introduced in 2021, Embedded Python allows you to write and execute Python code directly within ObjectScript, side by side in the same process, so developers can integrate Python and IRIS closely. This is a great choice for those who want to use Python while never leaving the IRIS environment (a minimal sketch follows after this list).

2. ODBC Connectivity, via pyODBC:
You can also connect external Python apps to IRIS using ODBC, just like in a traditional client-server architecture. This is probably the most widely used integration method, since it offers Python developers a familiar SQL-based workflow. The pyODBC package supports the DB API 2.0 specification and lets you execute SQL queries against IRIS databases.

3. InterSystems Native API and External Language Server (ELS):
These two options give you more flexible integration between Python and IRIS:
    a. Native API for Python: this gives direct access to core IRIS features like persistent classes and global structures. It's best when you want to work with IRIS-specific constructs programmatically from Python.
    b. External Language Server: this lets IRIS call out to Python code running in a separate process, or even on a separate server. It's especially useful when the Python code is resource-intensive (high CPU or memory usage) or when you want isolation from the IRIS kernel for stability and scalability.
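
As promised under option 1, here is a minimal sketch of Embedded Python (the class and method names are invented for illustration): an ObjectScript class can hold a method whose body is plain Python.

    Class Demo.EmbeddedPy Extends %RegisteredObject
    {

    /// Tidy up a name using ordinary Python string methods
    ClassMethod Normalize(name As %String) As %String [ Language = python ]
    {
        # plain Python; the "iris" module is also available here for calling back into IRIS
        return name.strip().title()
    }

    }

From ObjectScript it is called like any other method: Write ##class(Demo.EmbeddedPy).Normalize("  jane DOE  ") prints "Jane Doe".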

In Summary:
Python and InterSystems IRIS complement each other well. Whether you're a developer with limited ObjectScript experience, someone looking to take advantage of Python's ecosystem, or someone who simply prefers writing code in Python, there are several integration paths available to you. Each option has different benefits depending on your architecture and use case.

Thank you for reading this brief overview of the ways you can use Python with InterSystems IRIS!

Question
· Jun 18

SSL/TLS unsupported protocol error

I'm using a %Net.HttpRequest which had been successful in the past, but it started failing at some point with an SSL/TLS protocol error.

ERROR #6085: Unable to write to socket with SSL/TLS configuration 'groundca', error reported 'SSL/TLS error in SSL_connect(), SSL_ERROR_SSL: protocol error, error:14077102:SSL routines:SSL23_GET_SERVER_HELLO:unsupported protocol'

The SSL/TLS configuration:

The request's SSLConfig is set to the "groundca" config when making the request.
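
For reference, the request is made roughly like this (simplified; authentication headers are omitted):

    Set req = ##class(%Net.HttpRequest).%New()
    Set req.Server = "osrd.atlassian.net"
    Set req.Https = 1
    Set req.Port = 443
    Set req.SSLConfiguration = "groundca"
    // ... Authorization header with the API key is set here ...
    Set sc = req.Get("/rest/api/2/issue/<issue-name>")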

A request using the same URL, API key, and CA file through curl receives the desired response from the API at "https://osrd.atlassian.net/rest/api/2/issue/<issue-name>", so I believe the issue isn't with the OS, networking, or API server. curl reports that it is using TLSv1.3.

This thread makes me think an older version of SSL is being used in the Caché request instead of TLS, even though the SSL/TLS config is set to use TLS: the post shows the same "SSL23" string in the error, and it suggests the cause was the OP's config being set to use 'SSL23' rather than 'tls12': https://github.com/lefcha/imapfilter/issues/140#issuecomment-259671735

Another InterSystems thread shows a similar issue that was worked around. They believed a lack of SNI was the cause, though that was a handshake error rather than a protocol error: https://community.intersystems.com/post/how-do-not-use-sslv3-force-tls-variant-httprequest-aws-api-gateway

 

Edit - adding ^REDEBUG log output:

06/19/25-09:52:07:258 (11524) 0 tpopen for host osrd.atlassian.net device number 25334188 port 443  mode 0x8848 tcpmode 0x24 terminators  ibfsz 8192 obfsz 8192 queuesize 5 timeout 30 tcpsbuf=0 tcprbuf=0, XY=off, BINDTO=
06/19/25-09:52:07:275 (11524) 0 TCPConnect: SNDBUF sys size=65536, dev size=0
06/19/25-09:52:07:276 (11524) 0 TCPConnect: RCVBUF sys size=65536, dev size=0
06/19/25-09:52:07:281 (11524) 0 
TCP connected to site 13.227.180.4 port 443
06/19/25-09:52:07:282 (11524) 0 StreamInit: SNDBUF sys size=65536, dev size=0
06/19/25-09:52:07:283 (11524) 0 StreamInit: RCVBUF sys size=65536, dev size=0
06/19/25-09:52:07:284 (11524) 0 
SSL/TLS configuration: groundca
06/19/25-09:52:07:285 (11524) 0 
Cipher list: ALL:!aNULL:!eNULL:!EXP:!SSLv2
06/19/25-09:52:07:285 (11524) 0 
Trusted certificate file: c:\Users\dwp\downloads\cacert.pem
06/19/25-09:52:07:296 (11524) 0 

Peer verification option = 1, certificate depth = 9
06/19/25-09:52:07:297 (11524) 0 
SSL/TLS client requested.
06/19/25-09:52:07:300 (11524) 0 
SSL/TLS error return from SSL_connect().
06/19/25-09:52:07:301 (11524) 0 
SSL_ERROR_SSL: protocol error
06/19/25-09:52:07:302 (11524) 0 
error:14077102:SSL routines:SSL23_GET_SERVER_HELLO:unsupported protocol
06/19/25-09:52:07:304 (11524) 0 
TPXMIT saw TCP device fail

 

Edit 2:

We're working around it by calling out to curl with $ZF(-1) and having curl write the response to a file so it can be read in that way. I'm still wondering whether the wrong version of SSL is being used, given the other post mentioning the same error for another tool, but I doubt we'll look into it further for now.
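
Roughly, the workaround looks like this (the output path and curl options are illustrative, and authentication is omitted):

    // build and run the curl command, sending output to a temp file
    Set url = "https://osrd.atlassian.net/rest/api/2/issue/<issue-name>"
    Set out = "c:\temp\curlresponse.json"
    Set rc = $ZF(-1, "curl -s --cacert c:\Users\dwp\downloads\cacert.pem -o "_out_" "_url)
    // read the response back in from the file
    Set stream = ##class(%Stream.FileCharacter).%New()
    Do stream.LinkToFile(out)
    Set json = stream.Read(stream.Size)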

Announcement
· Jun 18

[Video] Chaining LLMs for Better Results using Agentic AI

Hey Community!

We're happy to share the next video in the "Code to Care" series on our InterSystems Developers YouTube:

⏯  Chaining LLMs for Better Results using Agentic AI

In this video, you will see a practical example of Agentic AI using a compound workflow of multiple large language models (LLMs) to improve the quality of output. The scenario involves generating a marketing plan for a new product: one LLM drafts the plan, another critiques it, and a third produces a final, improved version based on the critique. The video also walks through the process in code, demonstrating how chaining LLMs can enhance results for complex tasks. It also discusses the trade-offs involved - higher cost and longer execution time - in exchange for more accurate, policy-aligned, and refined outputs.

🗣 Presenter: @Don Woodlock, Head of Global Healthcare Solutions, InterSystems

Enjoy watching, and subscribe for more videos! 👍

Article
· Jun 18 3m read

How to create your own search table for HL7 messaging

My intention is to show how simple it is to generate a search table based on the information received in our HL7 messaging. The HL7 message search table provided by IRIS is certainly sufficient for most of the searches we want to perform, but we always have that special field in our HIS, LIS, RIS, etc. that we'd like to search by, and it sits in a segment outside of that search table. That field forces us to build a specific search using the expanded search criteria, and since we'll likely have many messages, we'll also have to filter by date and time so the search doesn't time out.
 

 

How do we solve this?

By generating our own search table.

And how do we generate our search table?

As we non-geniuses have always done: COPYING! In our lives we will meet three or four geniuses. They will be the ones who invent, create, and see beyond the rest of us. If you are one of them, you already know how to do this: you'll be in the top 100 of Global Masters, you'll have more than 500,000 points, and a veritable trove of InterSystems goodies in your home. The rest of us will do as we have done all these years before ChatGPT: copy, without any shame and with our heads held high. So let's begin.

 

First Step

We will create our class in Visual Studio Code, open the EnsLib.HL7.SearchTable class that IRIS gives us, and copy the entire content of the class (CTRL+C). Then we go to our new class and paste it (CTRL+V).
 

 

Super important: we also copy the Extends, the ClassType, and the Inheritance keywords. We are not leaving anything out except the Copyright; that doesn't interest us, and we don't have enough ego to claim their great work as our own.
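
After pasting and renaming, the class declaration keeps the same superclasses and keywords as the original. From memory it looks roughly like this; the package and class name are whatever you choose, and the exact superclasses and keywords are whatever your copy of EnsLib.HL7.SearchTable contains, so trust your copy over this sketch:

    Class MiHospital.HL7.SearchTable Extends (%Persistent, Ens.VDoc.SearchTable) [ ClassType = persistent, Inheritance = right ]
    {
    // ... everything copied from EnsLib.HL7.SearchTable (parameters, properties, indexes, SearchSpec) stays here ...
    }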

 

 

If you're old school, this screen will seem very familiar to you 😉

 

Second Step

 

Now we do the magic: we add our fields, remove the ones we aren't interested in, and we can even translate the names into Spanish to make it look better.
 

 

 

In this case, I've added the message source, destination, event, service, and patient episode; this is where your own needs come in, so add any field you want. As you can see, you can specify the segment, field, and component numerically. For me, that's easier than using the English name, but keep in mind that we can only guarantee that segment 1 will be the MSH; the rest depends on each message. So, in the case of segments, it's better to use the code (PID, PV1).
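
As a sketch, the customized SearchSpec inside MiHospital.HL7.SearchTable might end up like this. The property names are my own (translated into Spanish, as promised), and the segment:field paths are the usual HL7 locations for these values; adjust them to your own messages:

    XData SearchSpec [ XMLNamespace = "http://www.intersystems.com/EnsSearchTable" ]
    {
    <Items>
    <!-- sending and receiving application: MSH-3 and MSH-5 -->
    <Item DocType="" PropName="Origen">[MSH:3.1]</Item>
    <Item DocType="" PropName="Destino">[MSH:5.1]</Item>
    <!-- trigger event: MSH-9.2 -->
    <Item DocType="" PropName="Evento">[MSH:9.2]</Item>
    <!-- hospital service and patient episode (visit number): PV1-10 and PV1-19 -->
    <Item DocType="" PropName="Servicio">[PV1:10]</Item>
    <Item DocType="" PropName="Episodio">[PV1:19.1]</Item>
    </Items>
    }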

After this arduous task, we compile, and we now have our search table. We just need to assign it to the components in our production, obviously the HL7 components. Let's get to it.

 

 

Third Step

 

We open our production, look for the HL7 components, and go to the additional settings, where we will find the search table setting. We select our class and apply the changes. From that moment on, every message that enters through that component will be indexed in our search table. If we later add more fields to the search table, they will only start being populated from the moment the class is recompiled; for earlier messages those fields will be blank.
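
If you prefer to see it in the production class rather than the portal, the setting ends up stored with the component roughly like this (the item name and service class here are just examples):

    <Item Name="HL7.ADT.In" ClassName="EnsLib.HL7.Service.TCPService" PoolSize="1" Enabled="true">
      <Setting Target="Host" Name="SearchTableClass">MiHospital.HL7.SearchTable</Setting>
    </Item>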

 

 

This applies whether messages come in or go out: HL7 operations also have the search table available. Normally nothing is activated by default on the outbound components, but if we are interested in tracking the output as well, the option is there.

 

 

With these simple steps we have our production ready to save the data we are interested in. We just need to generate the queries, and to do that we go to the message viewer.

 

Demonstration

 

In the Message Viewer, within the search criteria, when we select the search tables we will find the one we just created. Once we select it, we can start playing with our fields.

 

 

Below I'll give several examples, but your imagination is the limit. Well, I'm getting carried away: the fields in the search table will set the limits for you.


1. Search with no condition. We only want to see the input message event and message ID.

 

 

2. Search by event showing patient and episode codes

 


3. Search by episode showing the service, patient name and HL7 message type

 

 


From here on it's up to you, enjoy 😋
