Article
· Oct 5 5m read

The Wait Is Over: Welcome GoLang Support for InterSystems IRIS

Introduction

The InterSystems IRIS Data Platform has long been known for its performance, interoperability, and flexibility across programming languages. For years, developers could use IRIS with Python, Java, JavaScript, and .NET — but Go (or Golang) developers were left waiting.

That wait is finally over.

The new go-irisnative driver brings GoLang support to InterSystems IRIS, implementing the standard database/sql API. This means Go developers can now use familiar database tooling, connection pooling, and query interfaces to build applications powered by IRIS.


Why GoLang Support Matters

GoLang is a language designed for simplicity, concurrency, and performance — ideal for cloud-native and microservices-based architectures. It powers some of the world’s most scalable systems, including Kubernetes, Docker, and Terraform.

Bringing IRIS into the Go ecosystem enables:

  • Lightweight, high-performance services using IRIS as the backend.
  • Native concurrency for parallel query execution or background processing.
  • Seamless integration with containerized and distributed systems.
  • Idiomatic database access through Go’s database/sql interface.

This integration makes IRIS a perfect fit for modern, cloud-ready Go applications.


Getting Started

1. Installation

go get github.com/caretdev/go-irisnative

2. Connecting to IRIS

Here’s how to connect using the standard database/sql API:

import (
    "database/sql"
    "fmt"
    "log"
    _ "github.com/caretdev/go-irisnative"
)

func main() {
    db, err := sql.Open("iris", "iris://_SYSTEM:SYS@localhost:1972/USER")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    // Simple ping to test connection
    if err := db.Ping(); err != nil {
        log.Fatal("Failed to connect:", err)
    }

    fmt.Println("Connected to InterSystems IRIS!")
}

3. Creating a Table

Let’s create a simple demo table:

_, err = db.Exec(`CREATE TABLE IF NOT EXISTS demo (
    id INT PRIMARY KEY,
    name VARCHAR(50)
)`)
if err != nil {
    log.Fatal(err)
}
fmt.Println("Table created.")

4. Inserting Data

At this time, multi-row inserts are not supported — insert one row per call:

_, err = db.Exec(`INSERT INTO demo (id, name) VALUES (?, ?)`, 1, "Alice")
if err != nil {
    log.Fatal(err)
}

_, err = db.Exec(`INSERT INTO demo (id, name) VALUES (?, ?)`, 2, "Bob")
if err != nil {
    log.Fatal(err)
}

fmt.Println("Data inserted.")

5. Querying Data

Querying is straightforward using the database/sql interface:

rows, err := db.Query(`SELECT id, name FROM demo`)
if err != nil {
    log.Fatal(err)
}
defer rows.Close()

for rows.Next() {
    var id int
    var name string
    if err := rows.Scan(&id, &name); err != nil {
        log.Fatal(err)
    }
    fmt.Printf("ID: %d, Name: %s\n", id, name)
}

That’s all you need to perform basic SQL operations from Go.
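
And because a *sql.DB handle is safe for concurrent use, Go's goroutines make parallel query execution straightforward. A small sketch against the demo table above (add "sync" to the imports):

var wg sync.WaitGroup
for _, query := range []string{
    `SELECT COUNT(*) FROM demo`,
    `SELECT MAX(id) FROM demo`,
} {
    wg.Add(1)
    go func(q string) {
        defer wg.Done()
        var result int
        // Each goroutine borrows its own connection from db's pool.
        if err := db.QueryRow(q).Scan(&result); err != nil {
            log.Println(q, "failed:", err)
            return
        }
        fmt.Println(q, "=>", result)
    }(query)
}
wg.Wait()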


How It Works

Under the hood, the go-irisnative driver uses the IRIS Native API for efficient, low-level communication with the database. The driver implements Go’s standard database/sql/driver interfaces, making it compatible with existing Go tools such as:

  • sqlx
  • gorm (with a custom dialect)
  • Standard Go migration tools

This gives developers a familiar API with the power and performance of native IRIS access.
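
For example, sqlx layers struct scanning on top of any database/sql driver. A hedged sketch, assuming the driver is registered under the "iris" name used in the connection example above:

package main

import (
    "log"

    _ "github.com/caretdev/go-irisnative"
    "github.com/jmoiron/sqlx"
)

type Person struct {
    ID   int    `db:"id"`
    Name string `db:"name"`
}

func main() {
    // sqlx.Connect opens the database and pings it in one call.
    db, err := sqlx.Connect("iris", "iris://_SYSTEM:SYS@localhost:1972/USER")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    // Select scans all result rows directly into the slice of structs.
    var people []Person
    if err := db.Select(&people, `SELECT id, name FROM demo`); err != nil {
        log.Fatal(err)
    }
    log.Printf("%+v", people)
}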


Example Use Cases

  • Microservices — lightweight Go services connecting directly to IRIS.
  • Data APIs — expose REST or gRPC endpoints backed by IRIS.
  • Integration tools — bridge IRIS data with other systems in Go-based pipelines.
  • Cloud-native IRIS apps — deploy IRIS-backed Go apps on Kubernetes or Docker.

Testing with Testcontainers

If you want to run automated tests without managing a live IRIS instance, you can use testcontainers-iris-go.
It launches a temporary IRIS container for integration testing.

Example test setup:

import (
    "context"
    "database/sql"
    "flag"
    "log"
    "os"
    "testing"
    iriscontainer "github.com/caretdev/testcontainers-iris-go"
    "github.com/stretchr/testify/require"
    "github.com/testcontainers/testcontainers-go"
)

var connectionString = "iris://_SYSTEM:SYS@localhost:1972/USER"
var container *iriscontainer.IRISContainer

func TestMain(m *testing.M) {
    var (
        useContainer   bool
        containerImage string
    )
    flag.BoolVar(&useContainer, "container", true, "Use container image.")
    flag.StringVar(&containerImage, "container-image", "", "Container image.")
    flag.Parse()
    var err error
    ctx := context.Background()
    if useContainer || containerImage != "" {
        options := []testcontainers.ContainerCustomizer{
            iriscontainer.WithNamespace("TEST"),
            iriscontainer.WithUsername("testuser"),
            iriscontainer.WithPassword("testpassword"),
        }
        if containerImage != "" {
            container, err = iriscontainer.Run(ctx, containerImage, options...)
        } else {
            // or use default docker image
            container, err = iriscontainer.RunContainer(ctx, options...)
        }
        if err != nil {
            log.Println("Failed to start container:", err)
            os.Exit(1)
        }
        // No defer here: os.Exit at the end of TestMain would skip
        // deferred calls, so the container is terminated explicitly
        // after m.Run below.
        connectionString = container.MustConnectionString(ctx)
        log.Println("Container started successfully", connectionString)
    }

    exitCode := m.Run()

    if container != nil {
        container.Terminate(ctx)
    }
    os.Exit(exitCode)
}

func openDbWrapper[T require.TestingT](t T, dsn string) *sql.DB {
    db, err := sql.Open(`intersystems`, dsn)
    require.NoError(t, err)
    require.NoError(t, db.Ping())
    return db
}

func closeDbWrapper[T require.TestingT](t T, db *sql.DB) {
    if db == nil {
        return
    }
    require.NoError(t, db.Close())
}

func TestConnect(t *testing.T) {
    db := openDbWrapper(t, connectionString)
    defer closeDbWrapper(t, db)

    var (
        namespace string
        username  string
    )
    res := db.QueryRow(`SELECT $namespace, $username`)
    require.NoError(t, res.Scan(&namespace, &username))
    require.Equal(t, "TEST", namespace)
    require.Equal(t, "testuser", username)
}

This is ideal for CI/CD pipelines or unit tests, ensuring your Go application works seamlessly with IRIS in isolation.


Conclusion

GoLang support for InterSystems IRIS is here — and it’s a game-changer.
With go-irisnative, you can now build scalable, concurrent, and cloud-native applications that tap directly into the power of IRIS.

Whether you’re building microservices, APIs, or integration tools, Go gives you simplicity and performance, while IRIS gives you reliability and rich data capabilities.

👉 Try it out:

Article
· Oct 5 3m read

IRIS Audio Query - Development

IRIS Audio Query is a full-stack application that transforms audio into a searchable knowledge base.

Project Structure

community/
├── app/                   # FastAPI backend application
├── baml_client/           # Generated BAML client code
├── baml_src/              # BAML configuration files
├── interop/               # IRIS interoperability components
├── iris/                  # IRIS class definitions
├── models/                # Data models and schemas
├── twelvelabs_client/     # TwelveLabs API client
├── ui/                    # React frontend application
├── main.py                # FastAPI application entry point
└── settings.py            # IRIS interoperability entry point

Required Installations and Setup

- Python 3.8+ - For embedded language development and backend application
- Node.js & npm - For frontend application development
- Docker - For containerization and running the IRIS database

TwelveLabs API

The TwelveLabs API is used for generating embeddings for uploaded audio files and query text.

To get your TwelveLabs API key:

1. Go to https://playground.twelvelabs.io and create an account (or log in).
2. Once logged in, navigate to the API Keys section under Settings.
3. Click Create API Keys to create a new key, and copy the generated key.

OpenAI API

The OpenAI API is used for generating answers to queries using audio files as context.

Note: Any API supported by BAML can be used in place of OpenAI. Check the BAML docs for the list of supported APIs. 

To get your OpenAI API key:

1. Go to https://platform.openai.com and create an account (or log in).
2. Once logged in, go to the Billing page and add payment details.
3. Next, go to the API Keys page.
4. Click Create new secret key to create a new key, and copy the generated key.

Installation

1. Clone the repository

git clone
cd iris-audio-query

2. Create a virtual environment

python3 -m venv .venv
source .venv/bin/activate

3. Install the requirements

pip install -r requirements.txt
npm --prefix community/ui/ install

4. Configure environment variables
    1. Copy the template .env.example to .env.
    2. Configure the environment variables as appropriate.
5. Run the docker-compose file

docker-compose up

6. Import the Audio class in IRIS
   1. Access the IRIS Management Portal by going to http://localhost:53795/csp/sys/UtilHome.csp
   2. Sign in using username superuser and password SYS, or otherwise as specified in .env.
   3. Navigate to System Explorer > Classes.
   4. Select the IRISAPP namespace, or otherwise as specified in .env.
   5. Click Import and specify that the import file resides on My Local Machine, and choose the file community/iris/IrisAudioQuery.Audio.cls.
   6. Click Next then Import to import the Audio class.
7. Start the FastAPI backend.

docker exec -it iris-audio-query-iris-1 bash

   Then from within the container,

python3 community/main.py

8. Start the React frontend.

npm --prefix community/ui/ run dev

9. Access the application at http://localhost:5173.

Article
· Oct 5 2m read

IRIS Audio Query - Query Audio with Text using InterSystems IRIS

With the rapid adoption of telemedicine, remote consultations, and digital dictation, healthcare professionals are communicating more through voice than ever before. Patients engaging in virtual conversations generate vast amounts of unstructured audio data, so how can clinicians or administrators search and extract information from hours of voice recordings?

Enter IRIS Audio Query - a full-stack application that transforms audio into a searchable knowledge base. With it, you can:

  • Upload and store clinical conversations, consultation recordings, or dictations
  • Perform natural language queries (e.g., "What did the patient report about symptoms of fatigue?")
  • Receive a concise answer generated using Large Language Models

At its core, this application is powered by InterSystems IRIS for robust data handling and vector search, and it is built on the InterSystems Interoperability framework, all developed using the Python Native SDK.

User Interface

Uploading an audio file: (screenshot)

Performing a query: (screenshot)

Tech Stack

  • InterSystems IRIS – Persistent object store & vector search foundation
  • Python (FastAPI) – Backend APIs and business logic
  • React – UI for upload and querying
  • TwelveLabs API – Generate embeddings from audio and text
  • OpenAI API – Generate text responses using audio content as context
  • Docker – Containerization 

Architecture

The uploaded audio files are stored in IRIS as persistent objects, and they are also embedded and stored as vectors. To perform a query, the query text is first embedded; a vector search then finds the most relevant audio embeddings; the corresponding audio files are retrieved; and finally the answer is generated from the query text with the audio files as context.

The upload and query operations are built as Business Operations using the IRIS Native Python SDK. The FastAPI backend provides a REST API for external applications to interact with this system, while the React frontend provides a UI to interact with the backend.

[ React Frontend ]
        ↓
[ FastAPI Backend (REST API) ]
        ↓
[ IRIS Business Operations (Python SDK) ]
        ↓                      ↘
[ Store Audio in IRIS ]     [ Embed via TwelveLabs → Store vectors ]
                                ↓
                      [ Vector Search on Query Text ]
                                ↓
          [ Retrieve Relevant Audio → Answer using OpenAI ]
Article
· Oct 3 8m read

Why does InterSystems have no out-of-the-box ESB solution? Let’s try to fix it!

I was really surprised that such a flexible integration platform, with a rich toolset specifically for connecting applications, has no out-of-the-box Enterprise Service Bus solution like Apache ServiceMix, Mule ESB, SAP PI/PO, and others. What's the reason? What do you think? Has this pattern completely lost its relevance nowadays? Has everybody moved to message brokers, maybe?

Wiki time: An enterprise service bus (ESB) implements a communication system between mutually interacting software applications in a service-oriented architecture (SOA) ... Its primary use is in enterprise application integration (EAI) of heterogeneous and complex service landscapes.

Anyway, I googled "IRIS ESB" and found this topic in the documentation. But it looks a little weird to me, as if ESB meant only a Service Registry and pass-through services. And while the Service Registry is a really good feature (and not only for ESB), pass-through services, in my opinion, are not about ESB at all. Yes, we can use a bus for pass-through data flows for some reasons, at least for centralized logging. But pass-through is contrary to the main sense of an ESB: to centralize the integration implementation. Just as an ERP application fuses accounting, order management, manufacturing, HR, and so on, an ESB is the way to put the integration code together on one platform. Even though this can't be done 100%, using this pattern gives real advantages in heterogeneous IT environments, especially if the connected applications are hard or expensive to extend.

And I'm not even talking about a bunch of other great features, such as monitoring everything that happens between apps, reusing data from data flows and code between integrations, guaranteed delivery for all flows, simple investigation of data transfer issues in a one-window UI, fast and cheap replacement of ecosystem members thanks to high standardization of the integration code, and a ready-to-use platform for an enterprise API, since the bus already connects to every app in your landscape. I'm a big fan of the ESB pattern, as you might guess!

So, I want to show you one of my pet projects - IRIS ESB. It's an attempt to implement some typical ESB features on the InterSystems IRIS Data Platform, such as:

  1. Centralize integration code in one place
  2. Message management (message broker) with guaranteed delivery based on Pub/Sub architecture
  3. Message validation against the Data Model (Data Schema)
  4. Flexible API to receive any message types (using payload container)
  5. Centralized monitoring and alerting control

This project contains three main modules. Let us take a look at them.

Message Broker (Broker.*)

Message Broker is designed to keep messages and to create separate message consumers, each of which can be independently subscribed to a message queue. This means every consumer effectively has its own inbound queue per message type. Messages have statuses: NEW, PENDING (processing in progress), ERROR, and OK (message successfully processed). The main function of this Message Broker is guaranteed delivery of messages. A message will be resent again and again until one of two events happens: successful message processing or the end of the message's lifetime (message expired).

IRIS ESB uses a slightly improved version of the Kafka algorithm. Kafka maintains the offset of the last processed message to move forward through the message queue. Here, we keep all processed message IDs instead, which allows us not to stop consuming when there are some "troubled" messages in the queue. So, IRIS ESB can restore data flows after the temporary unavailability of external systems (or after receiving some "bad data") without manual intervention.
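
The difference is easy to sketch outside of IRIS. A toy illustration in plain Go (hypothetical types, not the project's actual code): instead of a single offset, the consumer tracks the set of processed IDs, so one failing message is retried on later passes without blocking the rest of the queue.

package main

import "fmt"

// Message is a hypothetical queue entry.
type Message struct {
    ID      int
    Payload string
}

// consume makes one pass over the queue: messages whose IDs are
// already in processed are skipped, failed ones stay unprocessed
// and will be retried on the next pass (until they expire).
func consume(queue []Message, processed map[int]bool, handle func(Message) error) {
    for _, m := range queue {
        if processed[m.ID] {
            continue
        }
        if err := handle(m); err != nil {
            fmt.Println("will retry:", m.ID, err)
            continue
        }
        processed[m.ID] = true
    }
}

func main() {
    queue := []Message{{1, "ok"}, {2, "bad"}, {3, "ok"}}
    processed := map[int]bool{}
    handle := func(m Message) error {
        if m.Payload == "bad" {
            return fmt.Errorf("bad data")
        }
        return nil
    }
    consume(queue, processed, handle) // 1 and 3 succeed; 2 fails but does not block 3
    consume(queue, processed, handle) // only 2 is retried
}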

I have not used an external broker such as Kafka itself, so as not to lose the coolest IRIS feature: the ability to see everything that happens to the messages in visual traces. Also, Kafka does not have guaranteed message delivery out of the box (usually it is built on retries).

How to add data flow in Message Broker?

First of all, your Production must have a Broker.Process.MessageRouter business host. It is responsible for routing messages to handlers and setting message statuses. Just add MessageRouter to the Production; no additional settings are needed here. It is common to all data flows.

Next, you need a handler for the message that extends Broker.Process.MessageHandler. It is a place for your custom code for message processing: mapping and transforming to other message formats, sending to external systems via business operations, and so on.

Finally, create a consumer: a business service instance of the Broker.Service.InboxReader class. It will read messages from the queue and transfer them to the handler. Set up its Settings, where:

  • MessageHandler - Your handler from above
  • MessageType - The kind of message to subscribe to; a direct analogue of a Kafka topic
  • MessageLifetime - When the message expires; it can be different for each consumer

Inbox REST API (Inbox.*)

Each ESB should have a universal way to receive messages from external systems. Here it's a REST API. Universal means you can send any JSON payload to this API. The received JSON text will be deserialized into a Caché class and placed in the Inbox queue. IRIS ESB works with class objects rather than, for example, %DynamicObject, because message validation is one more important feature of the ESB pattern, and importing JSON text into a class, I believe, is the best way to do it.

So, to add a new custom message type, you need to create a class (or import it from some schema) that extends Inbox.Message.Inbound and describes the structure of your message (see samples in the Sample.Message.* package). When you send a message to the Inbox API, set the name of this class as the import_to parameter.

Inbox API testing

There are two endpoints for this API:

  • GET http://localhost:9092/csp/rest/healthcheck - just a simple health check; it should return 200 OK if everything is set up correctly
  • POST http://localhost:9092/csp/rest/v1/inbox - put a new message into ESB

To put a new "Customer Order" sample into the ESB, make the following request via cURL or Postman:

curl --location 'http://localhost:9092/csp/rest/v1/inbox?import_to=Sample.Message.CustomerOrder.Order' \
--header 'Content-Type: application/json' \
--data '{
    "CreatedAt": "2021-01-01T00:00:00.000Z",
    "OrderId": 1,
    "OrderStatus": "NEW",
    "Customer": {
        "FirstName": "John",
        "LastName": "Doe"
    },
    "Items": [
        {
            "ProductId": 1,
            "ProductName": "Product 1",
            "Quantity": 2
        },
        {
            "ProductId": 2,
            "ProductName": "Product 2",
            "Quantity": 1
        }
    ]
}'

And one more sample, for the "Array of Strings" message type:

curl --location 'http://localhost:9092/csp/rest/v1/inbox?import_to=Sample.Message.SomeArray.Root' \
--header 'Content-Type: application/json' \
--data '[
    "111",
    "222",
    "333"
]'

Visual traces for these requests can be seen in the messages of the Inbox.Service.API business service. Check: Interoperability > Production Configuration - (Production.Main).

Two test consumers are configured in the Production: one for the "Customer Order" message type and the other for "Array of Strings". After messages are received by the Inbox API, you can see that they were processed in the Sample.Service.CustomerOrderConsumer or Sample.Service.StringArrayConsumer services.

Monitoring and Alerting (Alert.*)

In IRIS ESB, we have a flexible alerting module to set up subscriptions and ways to deliver alerts when something goes wrong in our data flows.

How alerting works

You should create a process in the Production based on the Alert.Process.Router class and call it Ens.Alert. A process with this name will automatically collect all alerts from the Production items for which the Alert on Error flag is raised. This is the default way to create an alert processor, described in the documentation here.

Next, you need to fill in Lookup Tables named after the notifier types. For example, table names can be Alert.Operation.EmailNotifier, Alert.Operation.SMSNotifier, and so on (you can add your own notifier implementations to the Alert.Operation.* package). These must be the names of Operations in your Production. I strongly recommend using class names for Production config item names whenever possible.

For each of these tables, the Key is the source of the exception (the name of a Production business host), and the Value is the contact ID (an e-mail address for EmailNotifier, for example). The Value can be empty when the notifier does not forward the alert to a specific address.
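
For example, an Alert.Operation.EmailNotifier lookup table might contain entries like these (hypothetical values, in "Key - Value" form):

  • Sample.Service.CustomerOrderConsumer - integration-alerts@example.com
  • Broker.Process.MessageRouter - esb-admin@example.com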

To test alerting, you can just raise the ThrowError flag in one of the test handlers. A LogFileNotifier is already set up in the Production, so alerts will be written to the /tmp/alerts.log file.

Metrics

During message processing, IRIS ESB collects various metrics, including performance sensors such as the minimum, maximum, and average message processing time (per consumer). It also collects statistics by message status: OK, ERROR, and PENDING counters.

These metrics are published via an API (see the GET http://localhost:9092/api/monitor/metrics endpoint), collected by Prometheus, and visualised in Grafana. The web UIs for these applications are available at:

  • http://localhost:9090 - Prometheus
  • http://localhost:3000 - Grafana

The custom metrics added by IRIS ESB have the tag esb_broker.

Try it

You must have Docker Desktop and Git installed on your local PC. Clone the repository and run the Docker containers:

git clone https://github.com/ogurecapps/iris-esb.git
cd iris-esb
docker-compose up -d

The Interoperability Production will be available at the following URL (log in with the default credentials _SYSTEM / SYS): http://localhost:9092/csp/esb/EnsPortal.ProductionConfig.zen?$NAMESPACE=ESB

Send test messages as described in the Inbox API testing section above. You can see traces of received messages in Inbox.Service.API and traces of processed messages in Sample.Service.CustomerOrderConsumer.

Open Grafana at http://localhost:3000 (default credentials are admin admin).

  1. Add a data source: choose Prometheus as the data source type and enter http://host.docker.internal:9090 as the Server URL
  2. Add a dashboard: select New > Import and take a ready-to-use dashboard JSON config file, for example from this Developer Community article

Enjoy! Now you have an ESB with an API and monitoring. All that remains is to add your own message types and data flow implementations.

What about real use cases?

Yeah, I have one. I built probably the biggest ESB solution on the IRIS Data Platform using algorithms similar to those described above. I don't want to reveal the company name (those who know, know), but I can share some numbers. My IRIS ESB instance has around 800 data flows. By data flow, I mean a sync/async point-to-point message flow with protocol and format transformations, and optionally data enrichment. That's 50 or more connected systems, such as ERP, DWH, CRM, POS software, mobile, and e-commerce solutions. As for protocols, historically it is mostly SOAP, but the REST share is growing fast. Also, we have many OData flows (just local specifics). The system does not receive a high volume of inbound requests, with a maximum of 300-350 RPS (Requests Per Second), but it transfers around 3 TB of messages per week. I believe that is not so little. The server has several namespaces with a total of 2041 business hosts across its Interoperability Productions. Yep, I counted them precisely.

Conclusion

That's all, folks! Thanks for your attention. Forgive my mistakes; it's my first article for the Dev Community portal. Feel free to fork, rate my repo, and ask any questions.

Announcement
· Oct 3

[Video] Python Interoperability Productions

Hey Community!

We're happy to share a new video from our InterSystems Developers YouTube:

⏯  Python Interoperability Productions @ Ready 2025

This presentation introduces Python Interoperability Productions in InterSystems IRIS, a framework that lets developers build full interoperability productions entirely in Python, with no ObjectScript required. It covers the key components of a production architecture, including message passing, callbacks, and persistence. The session also demonstrates how a production works end to end: how messages move from service to process to operation and back, and how developers can customize persistence, serialization, and UI display formats.

🗣 Presenter: @Geet Kalra, Senior Systems Developer, InterSystems

Enjoy watching, and subscribe for more videos! 👍
