Article
· Nov 24, 2023

A framework, yes, but the right framework

How can IRIS productions be deployed more quickly and with greater peace of mind?

The aim of interoperability productions is to enable you to connect systems in order to transform and route messages between them. To connect systems, you develop, configure, deploy and manage productions that integrate several software systems.

That’s what the InterSystems documentation on its reference website tells us, but what do you actually have to do to deploy a production?

Go for it!

Productions connect external systems to the IRIS Data Platform. To do this, you need to create an environment specific to each production, including the following components:

  • a Business service 📨
  • a Business process (optional) 📘
  • a Business operation 💉
  • table definition schemas (.cls; classes) 📅
  • a namespace initialization file (.cpf) 📋
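To make the components above more concrete, here is a minimal Python sketch of a message class such a production might carry between a business service and a business operation. The class name, fields, and serialization helper are purely illustrative assumptions, not the framework's actual output:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class PatientMessage:
    """Hypothetical message exchanged between a business service and operation."""
    patient_id: str
    name: str
    admitted: str  # ISO-8601 date string

    def to_json(self) -> str:
        # Serialize the payload so the message can be traced and replayed.
        return json.dumps(asdict(self))

msg = PatientMessage(patient_id="P001", name="Alice", admitted="2023-11-24")
print(msg.to_json())
```

A serializable payload like this is what makes message-by-message traceability possible in the first place.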

Of course, the value of using productions to process messages lies in traceability: each message can be traced back whenever an undesirable event occurs.

And what if I told you that you could deploy your productions using our IRIS interoperability framework with the wave of a magic wand? 🪄

Wait, what?

Explanations

The approach our framework is built on means that InterSystems IRIS® productions can be deployed quickly, without having to recreate all the components manually.

Using the framework also lets us add an interesting feature: an outgoing API (RestForms2) that reads data from the tables deployed with the production!

Sounds good :)

➡️ Data can be queried and returned in JSON format.

The framework generates all the components from a functional specification file, filled out in agreement with the business and our project manager (whose role is to ensure that all the necessary information finds its place).

The script works in two stages: building the ETL flow and the data drop point. 📦🪂

Once filled in as required, the functional specification file is used in two ways: first, to generate the message serialization file (data classes; obj.py), the data structure file for each message (msg.py), the message generation file (bs.py), and the message ingestion file for the corresponding database (bo.py); second, to create or drop tables (if they exist) in the database via a SQL script made of DDL (Data Definition Language) instructions.
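The DDL-generation stage can be pictured with a small sketch. The table name, column names, and SQL types below are hypothetical examples, not the framework's real specification format:

```python
def build_ddl(table: str, columns: dict[str, str]) -> str:
    """Generate a DROP-then-CREATE DDL script for one table.

    `columns` maps column names to SQL types; both are illustrative here.
    """
    cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in columns.items())
    return (
        f"DROP TABLE IF EXISTS {table};\n"
        f"CREATE TABLE {table} (\n  {cols}\n);"
    )

ddl = build_ddl("Hospital.Patient",
                {"PatientId": "VARCHAR(32)", "Name": "VARCHAR(128)"})
print(ddl)
```

Dropping before creating is what lets the script be re-run safely when the specification changes.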

A real time-saver! ⌚

And best of all, the framework can be easily deployed from a Docker container! 🐳
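As a rough illustration, a container setup could look like the following docker-compose sketch. The image tag and port mappings are assumptions to adapt to your own environment, not the framework's shipped configuration:

```yaml
services:
  iris:
    image: intersystemsdc/iris-community:latest  # community image; swap in your licensed image
    ports:
      - "1972:1972"    # SuperServer port
      - "52773:52773"  # Management Portal / REST endpoints
```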

Xssshhh

Benefits

Still not convinced? How could using this framework save 80% of your time? ⏱️

What if I told you that the code deployed by the framework is validated by the vendor, InterSystems®; that it enables your team to work on standardized code; that during maintenance campaigns this makes you more efficient when updating code or hunting for bugs; and that it lets you interact with your data through a REST API mechanism (available from the repository of InterSystems IRIS-compatible packages for all versions)? Does that make sense to you? 👌

What do we mean by “the code is validated by the vendor”? ✅
Simply that it respects Python standards and the vendor’s own standards in terms of architecture and calls to internal InterSystems IRIS® mechanisms, and that it can interface with the ObjectScript language and vice versa.

Bye ObjectScript

Next

If this article resonates with your needs, or if you’re simply curious to see how this framework could revolutionize the way you work with InterSystems IRIS®, please visit our website and/or ask to join our Discord server to speak to one of our experts.

You can also follow us on our LinkedIn profile.

In the next issue, you’ll see a case study of the framework in an operational environment! 😉
