Announcement
Anastasia Dyubaylo · Oct 24, 2019
Dear Community,
We're pleased to invite you to the InterSystems DACH Symposium 2019, which will take place from November 12th to 14th 2019 at The Westin Grand Hotel in Frankfurt am Main!
This year, we will focus on best practices and use cases for AI, ML as well as API management, microservices and the use of containers for DevOps.
You’ll experience exciting lectures, interactive sessions and hands-on coding exercises, which are suitable for both InterSystems experts and beginners. See for yourself how the new API Manager simplifies the orchestration of microservice architectures and how QuickML and InterSystems IRIS can help you to implement ML projects quickly and efficiently.
So, remember:
⏱ Time: November 12-14, 2019
📍Venue: The Westin Grand Hotel, Frankfurt am Main, Germany
✅ Registration: SAVE YOUR SEAT
See you in Frankfurt!
Dear Community,
We warmly invite you to the InterSystems DACH Symposium 2019, which this year will take place from November 12th to 14th at The Westin Grand Hotel in Frankfurt am Main.
Our thematic focus is on best practices and use cases for AI and ML, as well as API management, microservices, and the use of containers for DevOps. Register right away for this free event: https://intersystems-dach-symposium.de/register/
Exciting talks, interactive sessions, and hands-on coding exercises await you, suitable for both InterSystems experts and newcomers. Learn first-hand how the new API Manager simplifies the orchestration of microservice architectures and how QuickML and InterSystems IRIS help you implement machine learning projects quickly and efficiently.
And remember:
⏱ Time: November 12-14, 2019
📍Venue: The Westin Grand Hotel, Frankfurt am Main, Germany
✅ Registration: Secure your seat
See you in Frankfurt! Any plans for a DC Meetup at this event? Hi John! Do you plan to come to DACH Summit 2019? Maybe. It's being considered. Will a DC meetup influence the decision? ) John, we are considering it. There is a possibility if we have enough support and requests from developers. It's confirmed that I will be attending. The organizers have generously added me to the early evening "Beer & Bytes" demo sessions on Tuesday and Wednesday. I will demonstrate editing and debugging ObjectScript with Visual Studio Code using Serenji.
Announcement
Anastasia Dyubaylo · Oct 24, 2019
Hi Community,
As you may know, we successfully held the InterSystems Developers Meetup in Boston at Global Summit 2019. And now it's time to find out which solutions built on InterSystems IRIS were discussed!
Please welcome the Meetup video recording on InterSystems Developers YouTube Channel:
⏯ InterSystems Developers Meetup - Global Summit 2019
So, please give a warm welcome to these sessions from InterSystems developers:
| № | Topic | Speaker |
|---|-------|---------|
| 1 | Using Python Gateway with InterSystems IRIS | @Eduard Lebedyuk, InterSystems |
| 2 | VSCode ObjectScript - your IDE for InterSystems IRIS development | @Dmitriy Maslennikov, CaretDev |
| 3 | Using Package Manager for deployment InterSystems Solutions | @Evgeny Shvarov, InterSystems |
| 4 | Falling in love with Visual Studio Code and Serenji | @John Murray, George James Software |
| 5 | InterSystems IRIS and the Cloud | @Anton Umnikov, InterSystems |
| 6 | How Your Code Matters! | @Dmitriy Maslennikov, CaretDev |
P.S. Don't forget to check out the useful resources on the Open Exchange Marketplace.
Big applause for these speakers! And thanks to all participants of the Meetup! 👏🏼
Please find more details in this post.
Enjoy watching the video!
And...
What about the next events? Would you like our community to hold more such meetups for InterSystems developers?
Your feedback is very welcome!
Please leave your suggestions in the comments of this post.
Announcement
Tony Coffman · Nov 20, 2019
Hello InterSystems Community,
We're excited to announce that we've completed our first Open Exchange submission for InterSystems platforms.
BridgeWorks VDM is an ad hoc reporting and graphical SQL query builder application designed for any user who needs access to SQL projections in InterSystems Caché, InterSystems IRIS, and InterSystems IRIS for Health databases, as well as to InterSystems DeepSee and InterSystems IRIS BI cubes, with minimal SQL scripting experience.
VDM features:
Drag-and-drop query builder
Interactive Data and Pivot grids
Charting and Gauges
Data Visualizations
Banded Report Writer
Report Scheduling and Batching
Web reporting capabilities
Data export to PDF, Excel, HTML, Word, CSV, Text Files, Image and more
VDM leverages the Managed Provider for .Net to communicate with InterSystems databases, and no ODBC drivers are needed to access the SQL projections for InterSystems Caché, InterSystems IRIS and InterSystems IRIS for Health. VDM accesses DeepSee.RESTClient.cls for building MDX queries and communication with InterSystems DeepSee and InterSystems IRIS BI.
Video walk through:
Notes:
While VDM is not a SQL syntax editor, the graphical query builder can be bypassed and SQL scripts can be pasted directly into the Advanced Query form for execution against your InterSystems database.
The InterSystems DeepSee and InterSystems IRIS BI features are currently EAP.
Community feedback on our project is appreciated.
Announcement
Anastasia Dyubaylo · Dec 13, 2019
Hi Everyone,
Please welcome the new Global Summit 2019 video on InterSystems Developers YouTube Channel:
⏯ Using IoT in InterSystems IRIS for Health
In this video, we will demonstrate connecting a heart rate sensor to InterSystems IRIS for Health via an MQTT/IoT interface. We will show how the sensor reading becomes a well-formatted HL7 message, suitable for sharing and analysis.
Takeaway: InterSystems IRIS for Health enables you to capture and use the latest wearable data from healthcare sensors.
Presenter: @Anton.Umnikov, Senior Cloud Solution Architect, InterSystems
Learn more in this InterSystems Online Learning Course.
Enjoy watching the video! 👍🏼
Announcement
Olga Zavrazhnova · Jan 9, 2020
Hi Community!
Thank you for being a part of the InterSystems Open Exchange! We want to know what you like about Open Exchange and how we can make it better in 2020.
Could you please take this short survey to let us know what you think?
➡️ Open Exchange Survey 2019 (2 minutes, 7 questions)
Your answers are very important to us!
Sincerely,
Your InterSystems Developer Community Team
Article
Mark Bolinsky · Mar 6, 2020
Introduction
InterSystems has recently completed a performance and scalability benchmark of IRIS for Health 2020.1, focusing on HL7 version 2 interoperability. This article describes the observed throughput for various workloads, and also provides general configuration and sizing guidelines for systems where IRIS for Health is used as an interoperability engine for HL7v2 messaging.
The benchmark simulates workloads that have been designed to closely match live environments. The details of the simulation are described in the Workload Description and Methodology section. The tested workloads comprised HL7v2 Patient Administration (ADT) and Observation Result (ORU) payloads and included transformations and re-routing.
The 2020.1 version of IRIS for Health has demonstrated a sustained throughput of over 2.3 billion (total inbound and outbound) messages per day with a commodity server using the 2nd Generation Intel® Xeon® Scalable Processors and Intel® Optane™ SSD DC P4800X Series SSD storage. These results have more than doubled the scalability from the prior Ensemble 2017.1 HL7v2 throughput benchmarking.
Throughout these tests, IRIS for Health was configured to preserve first-in/first-out (FIFO) ordering, and to fully persist messages and queues for each inbound and outbound message. By persisting the queues and messages, IRIS for Health provides data protection in the event of a system crash, and full search and resend capabilities for historic messages.
Further, configuration guidelines are discussed in the sections below, which will assist you in choosing an appropriate configuration and deployment to adequately meet your workload’s performance and scalability requirements.
The results demonstrate that IRIS for Health is capable of satisfying extreme messaging throughput on commodity hardware, and in most cases allowing a single small server to provide HL7 interoperability for an entire organization.
Overview of Results
Three workloads were used to represent different aspects of HL7 interoperability activity:
T1 workload: uses simple pass-through of HL7 messages, with one outbound message for each inbound message. The messages were passed directly from the Ensemble Business Service to the Ensemble Business Operation, without a routing engine. No routing rules were used and no transformations were executed. One HL7 message instance was created in the database per inbound message.
T2 workload: uses a routing engine to modify an average of 4 segments of the inbound message and route it to a single outbound interface (1-to-1 with a transform). For each inbound message, one data transformation was executed and two HL7 message objects were created in the database.
T4 workload: uses a routing engine to route separately modified messages to each of four outbound interfaces. On average, 4 segments of the inbound message were modified in each transformation (1 inbound to 4 outbound with 4 transforms). For each inbound message four data transformations were executed, four messages were sent outbound, and five HL7 message objects were created in the database.
The three workloads were run on a physical 48-core system with two Intel® Scalable Gold 6252 processors with two 750GB Intel® Optane™ SSD DC P4800X SSD drives running Red Hat Enterprise Linux 8. The data is presented as the number of messages per second (and per hour) inbound, the number per second (and per hour) outbound, as well as the total messages (inbound plus outbound) in a 10-hour day. Additionally, CPU utilization is presented as a measure of available system resources at a given level of throughput.
Scalability Results
Table-1: Summary of throughput of the four workloads on this tested hardware configuration:
* Combined workload with 25% of T1 / 25% of T2 / 50% T4 workload mix
Workload Description and Methodology
The tested workloads included HL7v2 Patient Administration (ADT) and Observation Result (ORU) messages, which had an average size of 1.2KB and an average of 14 segments. Roughly 4 segments were modified by the transformations (for T2 and T4 workloads). The tests represent 48 to 128 inbound and 48 to 128 outbound interfaces receiving and sending messages over TCP/IP.
The T1 workload used four separate namespaces, each with 16 interfaces; the T2 workload used three namespaces, each with 16 interfaces; the T4 workload used four namespaces, each with 32 interfaces; and the final "mixed" workload used three namespaces, with 16 interfaces for the T1 workload, 16 for the T2 workload, and 32 for the T4 workload.
The scalability was measured by gradually increasing traffic on each interface to find the highest throughput that met the acceptance criteria. For performance to be acceptable, messages must be processed at a sustained rate with no queuing and no measurable delays in delivery, and average CPU usage must remain below 80%.
Previous testing has demonstrated that the type of HL7 message used is not significant to the performance or scalability of Ensemble; the significant factors are the number of inbound messages, the size of inbound and outbound messages, the number of new messages created in the routing engine, and the number of segments modified.
Additionally, previous testing has shown that processing individual fields of an HL7 message in a data transformation is not usually significant to performance. The transformations in these tests used fairly straightforward assignments to create new messages. Note that complex processing (such as use of extensive SQL queries in a data transformation) may cause results to vary.
Previous testing has also verified that rules processing is not usually significant. The routing rule sets used in these tests averaged 32 rules, with all rules being simple. Note that extremely large or complex rule sets may cause results to vary.
Hardware
Server Configuration
The tests utilized a server with 2nd Generation Intel® Scalable Gold 6252 "Cascade Lake" processors providing 48 cores @ 2.1GHz on a 2-socket system (24 cores per socket), 192 GB DDR4-2933 DRAM, and a 10Gb Ethernet network interface. Red Hat Enterprise Linux 8 was used as the operating system for this test, running InterSystems IRIS for Health 2020.1.
Disk Configuration
Messages passing through IRIS for Health are fully persisted to disk. For this test, two 750GB Intel® Optane™ SSD DC P4800X drives internal to the system were used, with the databases on one drive and the journals on the other. In addition, to ensure a real-world comparison, synchronous commit was enabled on the journals to force data durability. For the T4 workload described previously, each inbound HL7 message generates roughly 50KB of data, which can be broken down as shown in Table 2. Transaction journals are typically kept online for less time than message data or logs, and this should be taken into account when calculating the total disk space required.
Table 2: Disk Requirement per inbound HL7 T4 Message
| Contributor | Data Requirement |
|---|---|
| Segment Data | 4.5 KB |
| HL7 Message Object | 2 KB |
| Message Header | 1.0 KB |
| Routing Rule Log | 0.5 KB |
| Transaction Journals | 42 KB |
| Total | 50 KB |
Recall that the T4 workload used a routing engine to route separate modified messages to each of four outbound interfaces. On average, 4 segments of the inbound message were modified in each transformation (1 inbound to 4 outbound with 4 transforms). For each inbound message four data transformations were executed, four messages were sent outbound, and five HL7 message objects were created in the database.
When configuring systems for production utilization, net requirements should be calculated by considering the daily inbound volumes as well as the purging schedule for HL7 messages and the retention policy for journal files. Additionally, appropriate journal file space should be configured on the system so as to prevent the journal disk volumes from filling up. The journal files should reside on physically separate disks from the database files, for both performance and reliability reasons.
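As a rough illustration of that calculation, here is a small back-of-the-envelope sketch in Python using the ~50 KB per inbound T4 message from Table 2; the daily volume and retention periods below are made-up example values, not figures from the benchmark:

```python
# Back-of-the-envelope storage estimate for a T4-style workload,
# using the ~50 KB per inbound message figure from Table 2.
# All inputs below are illustrative -- substitute your own volumes and policies.

KB_PER_INBOUND_MSG = 50          # Table 2 total (message data + journal)
JOURNAL_KB_PER_MSG = 42          # journal portion, typically retained for fewer days
MESSAGE_KB_PER_MSG = KB_PER_INBOUND_MSG - JOURNAL_KB_PER_MSG

inbound_per_day = 50_000_000     # example: 50 million inbound HL7 messages per day
message_retention_days = 30      # example purge schedule for HL7 message data
journal_retention_days = 3       # example journal file retention policy

message_tb = inbound_per_day * MESSAGE_KB_PER_MSG * message_retention_days / 1024**3
journal_tb = inbound_per_day * JOURNAL_KB_PER_MSG * journal_retention_days / 1024**3

print(f"Message/database space: ~{message_tb:.1f} TB")
print(f"Journal space:          ~{journal_tb:.1f} TB")
```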
Conclusion
The InterSystems IRIS for Health HL7v2 message throughput demonstrated in these tests illustrates that a modest 2-socket commodity server configuration can support the most demanding message workloads of any organization. Additionally, InterSystems is committed to continually improving performance and scalability from version to version, while taking advantage of the latest server and cloud technologies.
The following graph provides an overview and comparison of the increase in throughput from the previous Ensemble 2015.1 and Ensemble 2017.1 benchmarks on Intel® E5-2600 v3 (Haswell) processors, and the Ensemble 2017.1 benchmark on 1st Generation Intel® Scalable Platinum Series (Skylake) processors, to the latest results with the 2nd Generation Intel® Scalable Gold Series (Cascade Lake) processors running IRIS for Health 2020.1.
Graph-1: Message throughput (in millions) per 10-hour day on a single server
InterSystems IRIS for Health continues to raise the bar on interoperability throughput from version to version, along with offering flexibility in connectivity capabilities. As the graph above shows, message throughput has increased significantly: for T2 workloads it has doubled since 2017 and more than tripled since 2015 in the same 10-hour window, while sustaining a total rate of over 2.3 billion messages per 24 hours.
Another key indicator of the advancements of IRIS for Health is the throughput improvement in the more complex T2 and T4 workloads, which incorporate transformations and routing rules, as opposed to the pure pass-through operation of the T1 workload.
InterSystems is available to discuss solutions for your organization's interoperability needs. Hi Mark!
These are impressive results! Do you have a link to the previous Ensemble 2017.1 HL7v2 throughput benchmarking results? They used to be available on our website, but have since been removed because the results were from 3 years ago. The summary results from 2015 and 2017 have been included in Graph-1 above in this new report for comparison. Thanks. Hi Mark Bolinsky,
I am a student, and I need the HL7 benchmark to test the process that I implemented. Please, can you help me get it?
Announcement
Anastasia Dyubaylo · Sep 6, 2019
Hey Developers!
Please join the upcoming InterSystems Developers Meetup in Boston which will be held on September 25th, 2019!
It will take place at the Boston Marriott Copley Place from 5:30 p.m. to 9 p.m.
We meet to discuss solution development on InterSystems IRIS. Come to tell your stories and share your experience with InterSystems data platforms, and for networking and developer conversations. Drinks and snacks will be provided.
The format is the usual one: 15 minutes per session, 5 minutes for Q&A.
Here is the current agenda for the event:
| Time | Topic | Speaker |
|---|---|---|
| 5:30 p.m. | Welcome and Introduction | |
| 6:00 p.m. | Using Python Gateway with InterSystems IRIS | @Eduard.Lebedyuk, InterSystems |
| 6:20 p.m. | VSCode ObjectScript - your IDE for InterSystems IRIS development | @Dmitry.Maslennikov, CaretDev |
| 6:40 p.m. | Using Package Manager for deployment InterSystems Solutions | @Evgeny.Shvarov, InterSystems |
| 7:20 p.m. | Coffee break | |
| 7:40 p.m. | Falling in love with Visual Studio Code | @John.Murray, George James Software |
| 8:00 p.m. | InterSystems IRIS and the Cloud | @Anton.Umnikov, InterSystems |
| 8:20 p.m. | How Your Code Matters! | @Dmitry.Maslennikov, CaretDev |
| 8:40 p.m. | Coffee break | |
Note: If you want to present your story, please contact @Evgeny.Shvarov or @Anastasia.Dyubaylo in Direct Messages.
Your stories are very welcome!
So! Don't miss an excellent opportunity to meet and discuss new solutions with like-minded peers and to find out what's new in InterSystems technology.
⏱ Time: September 25, 2019 from 5:30 p.m. to 9 p.m.
📍Place: Champions Restaurant (Left Field Room), located on the 2nd floor of the Boston Marriott Copley Place
✅ Registration: RSVP for Boston InterSystems Developers Meetup*
We look forward to seeing you!
---
*Space is limited, so register today to secure your place. Admission is free; registration is mandatory for attendees. Hi folks! If you happen to visit Global Summit 2019 this year, please join us at the InterSystems Developers meetup on the last day of the summit, just after the DevOps and AI symposium in the same building. Come to Champions, Copley at 5:30 p.m. to chat and discuss your dev experience with InterSystems IRIS in a relaxed atmosphere. RSVP here ;) See you in Boston! Will this presentation be recorded? I cannot attend due to distance but would like to view this presentation. Thanks, GeoV. Hi George! Yes, we plan to have a livestream. Stay tuned with the topic - we'll publish the link. This will be on the Developers Video channel. Agenda update for the meetup:
"InterSystems IRIS and the Cloud" session by @Anton.Umnikov
“In this session we'll talk about how IRIS can interoperate with various components of the public cloud ecosystem, such as S3 storage, SageMaker machine learning framework, Elastic Map Reduce and others.”
Space is limited, RSVP today! A bit more about my session "Falling in love with Visual Studio Code":
According to the 2019 StackOverflow Developer Survey, Visual Studio Code is number 1. More than 50% of those surveyed said they use it. Why is it so popular? Why do I like it? Why do you? This evening at Global Summit 2019 promises to be very interesting and full of developer talks! Don't miss it!
Don't forget to RSVP here. Thanks to everyone who came! We had a pretty nice event - hope to see you soon at the next InterSystems Dev Meetup in Boston! Hey Developers,
Thanks to all participants of InterSystems Developers Meetup at InterSystems Global Summit 2019! Great speakers, great sessions!
Here are some photos from the meetup:
Big applause for @Evgeny.Shvarov, @Eduard.Lebedyuk, @Dmitry.Maslennikov, @Anton.Umnikov and @John.Murray!
You can find more photos from Global Summit 2019 on InterSystems Developers Twitter or by following our official hashtag #GlobalSummit19.
Your feedback is very welcome!
Article
Evgeny Shvarov · Sep 14, 2019
Hi Developers!
Often I find questions on how to install IRIS, connect to IRIS from an IDE, set up the environment, compile, debug, and maintain the repository. Below is possibly the shortest way to set up the whole environment and start development with ObjectScript on InterSystems IRIS.
Prerequisites
Make sure you have Git, Docker, and VSCode installed. Install the Docker and ObjectScript extensions into VSCode. Sign in or create an account on GitHub.
Here we go! To start development you do the following:
1. Use the template repository and create your own repository on GitHub.
2. Clone your repository with git onto your desktop in a terminal.
3. Open the repository in VSCode.
4. Use docker-compose.yml to build the container with InterSystems IRIS and import all the ObjectScript from the /src folder into the USER namespace of the IRIS container.
5. Open a terminal to IRIS and call the imported ObjectScript code.
6. Make a change to the ObjectScript code and compile it.
7. Commit and push the changes to your GitHub repository.
Check the screencast below, or check the long video with all the explanations.
What's next? Start learning ObjectScript with Online Learning and Documentation. Also check the Beginner posts, Best Practices, and the ObjectScript code guidelines.
Happy coding!
Noticed some good stuff in @Dmitry.Maslennikov's iris-template repo and updated my foundation template for development with IRIS Community Edition in ObjectScript.
It's much easier now to run ObjectScript instructions in a Dockerfile. Check the basic Dockerfile:
ARG IMAGE=intersystems/iris:2019.1.0S.111.0
ARG IMAGE=store/intersystems/irishealth:2019.3.0.308.0-community
ARG IMAGE=store/intersystems/iris-community:2019.3.0.309.0
FROM $IMAGE
USER root
WORKDIR /opt/irisapp
RUN chown ${ISC_PACKAGE_MGRUSER}:${ISC_PACKAGE_IRISGROUP} /opt/irisapp
USER irisowner
COPY Installer.cls .
COPY src src
COPY irissession.sh /
SHELL ["/irissession.sh"]
RUN \
do $SYSTEM.OBJ.Load("Installer.cls", "ck") \
set sc = ##class(App.Installer).setup()
# bringing the standard shell back
SHELL ["/bin/bash", "-c"]
CMD [ "-l", "/usr/irissys/mgr/messages.log" ]
And another which installs ZPM and Webterminal:
ARG IMAGE=intersystems/iris:2019.1.0S.111.0
ARG IMAGE=store/intersystems/iris-community:2019.3.0.309.0
FROM $IMAGE
USER root
WORKDIR /opt/irisapp
RUN chown ${ISC_PACKAGE_MGRUSER}:${ISC_PACKAGE_IRISGROUP} /opt/irisapp
USER irisowner
RUN mkdir -p /tmp/deps \
&& cd /tmp/deps \
&& wget -q https://pm.community.intersystems.com/packages/zpm/latest/installer -O zpm.xml
COPY Installer.cls .
COPY src src
COPY irissession.sh /
# running IRIS and opening an IRIS terminal in the USER namespace
SHELL ["/irissession.sh"]
# below is objectscript executed in terminal
# each row is what you type in terminal and Enter
RUN \
do $SYSTEM.OBJ.Load("Installer.cls", "ck") \
set sc = ##class(App.Installer).setup() \
Do $system.OBJ.Load("/tmp/deps/zpm.xml", "ck") \
zn "IRISAPP" \
zpm "install webterminal"
# bringing the standard shell back
SHELL ["/bin/bash", "-c"]
CMD [ "-l", "/usr/irissys/mgr/messages.log" ]
First, thanks for this. It got me up and running pretty fast (as the title says!). A couple of things:
- The notes/documentation say that code will be loaded into the USER namespace, however it's actually being loaded into IRISAPP (as configured in the dockerfiles).
- The json config is pointing to the USER namespace, so any new files and changes to existing ones will actually be loaded into USER instead of IRISAPP
- Make sure it's all consistent
- The webapp (irisweb) is missing a config for the directory where to store files. I fixed this by modifying the app in the Management Portal. Need to address the installation file/dockerfile
- Haven't been able to make CSPs flow to the container the same as classes. I'm sure I'm missing something but haven't figured out what yet. Any tips? Maybe I'm placing files in the wrong location? Right now I created a csp/irisweb folder under the src folder.
- The notes/documentation say that code will be loaded into the USER namespace, however it's actually being loaded into IRISAPP (as configured in the dockerfiles).
Because I made an update to the code recently) And not to the documentation) A PR is welcome, or I'll change it myself soon. Or add an issue!
- The json config is pointing to the USER namespace, so any new files and changes to existing ones will actually be loaded into USER instead of IRISAPP
Yes, it's a bug from the previous version. Needs to be fixed, thanks!
- The webapp (irisweb) is missing a config for the directory where to store files. I fixed this by modifying the app in the Management Portal. Need to address the installation file/dockerfile
Cool!
Do you want to make a PR?
- Haven't been able to make CSPs flow to the container the same as classes. I'm sure I'm missing something but haven't figured out what yet. Any tips? Maybe I'm placing files in the wrong location? Right now I created a csp/irisweb folder under the src folder.
You need to COPY these files from /csp in sources to /usr/irissys/mgr/csp/yourwebapp in the Dockerfile. OK, I'll give it a try and create a PR once I have a new config file. Thanks! Hi @warlin.Garcia !
I pushed a commit into the repo which fixes the Namespace and README issue, thanks! Thank you. I'll check on it.
Do you have any tips to make CSP changes flow in real time the same way classes do? I've modified the dockerfile to copy the contents of my csp directory into the container, however my edits to CSPs are not flowing in real time, which forces me to rebuild the container every time to get my updates. Sure!
Just add a volume mapping in the docker-compose.yml file which maps the /csp folder in sources into the /csp directory in docker. Got it all working. Thank you! Updated the Installer.cls file - now you can set up the namespace and path in the variables "Namespace" and "app" respectively at the top of Installer.cls.
<Default Name="Namespace" Value="IRISAPP"/> <Default Name="app" Value="irisapp" />
This is convenient if you need to install the app to the namespace with your name.
Notice that if you want to instantly edit and compile the code of your project with VSCode, don't forget to change the Namespace parameter in settings.json too.
Announcement
Evgeny Shvarov · Jun 10, 2023
Hi colleagues!
InterSystems Grand Prix 2023 unites all the key features of InterSystems IRIS Data Platform!
Thus we invite you to use the following features and collect additional technical bonuses that will help you to win the prize!
Here we go!
LLM AI or LangChain usage: Chat GPT, Bard and others - 6
InterSystems FHIR SQL Builder- 5
InterSystems FHIR - 3
IntegratedML - 4
Native API - 3
Embedded Python - 4
Interoperability - 3
Production EXtension(PEX) - 2
Adaptive Analytics (AtScale) Cubes usage - 3
Tableau, PowerBI, Logi usage - 3
InterSystems IRIS BI - 3
Columnar Index Usage - 1
Docker container usage - 2
ZPM Package deployment - 2
Online Demo - 2
Unit Testing - 2
Implement InterSystems Community Idea - 4
First Article on Developer Community - 2
Second Article On DC - 1
Code Quality pass - 1
First Time Contribution - 3
Video on YouTube - 3
LLM AI or LangChain usage: Chat GPT, Bard and others - 6 points
Collect 6 bonus expert points for building a solution that uses LangChain libs or Large Language Models (LLM) such as ChatGPT, Bard and other AI engines like PaLM, LLaMA and more. AutoGPT usage counts too.
A few examples can already be found on Open Exchange: iris-openai, chatGPT telegram bot.
Here is an article with langchain usage example.
InterSystems FHIR SQL Builder - 5 points
InterSystems FHIR SQL Builder is a feature of InterSystems IRIS for Health that helps to map FHIR resources to SQL tables and consume them via SQL queries in your application.
Learn more in the documentation.
Online course. Here is an example on Open Exchange.
NB: If you implement InterSystems FHIR SQL Builder, the 3-point bonus for InterSystems FHIR as a Service and IRIS For Health is not awarded in addition.
InterSystems FHIR as a Service and IRIS For Health - 3 points
We invite all developers to build new or test existing applications using InterSystems FHIR Server (FHIRaaS). Sign in to the portal, make the deployment and start using your InterSystems FHIR server on AWS in your application for the programming contest.
You can also build an FHIR application using InterSystems IRIS for Health, docker version. You can take the IRIS-FHIR-Template, which prepares the FHIR server during the docker image build. The documentation for FHIR API 4.0.1 can be found here. Learn more in the InterSystems IRIS for Health documentation.
IntegratedML usage - 4 points
1. Use InterSystems IntegratedML in your AI/ML solution. Here is the template that uses it.
InterSystems IntegratedML template
2. Data import tools:
Data Import Wizard
CSVGEN - CSV import util
CSVGEN-UI - the web UI for CSVGEN
3. Documentation:
Using IntegratedML
4. Online courses & videos:
Learn IntegratedML in InterSystems IRIS
Preparing Your Data for Machine Learning
Predictive Modeling with the Machine Learning Toolkit
IntegratedML Resource Guide
Getting Started with IntegratedML
Machine Learning with IntegratedML & Data Robot
InterSystems Native API usage - 3 points
You get this bonus if you access the data in your Full-Stack application using any of the InterSystems Native API options: .NET, Java, Python, Node.js. Learn more here.
Embedded Python - 4 points
Use Embedded Python in your application and collect 4 extra points. You'll need at least InterSystems IRIS 2021.2 for it.
NB: If you also use the Native API for Python, only the Embedded Python bonus counts.
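For orientation, here is a minimal sketch of what Embedded Python usage can look like from an `irispython` session, reusing the `iris` module calls shown in other posts in this digest; the `Sample.Person` table name is purely illustrative:

```python
# Run with <install-dir>/bin/irispython inside an IRIS 2021.2+ instance
import iris

# Run SQL directly against the current namespace (table name is illustrative)
rs = iris.sql.exec("SELECT TOP 3 Name FROM Sample.Person")
for row in rs:
    print(row[0])

# Call an ObjectScript class method from Python
print(iris.cls("%SYSTEM.Version").GetVersion())
```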
Interoperability Productions with BPL or DTL - 3 points
One of the key features of IRIS Interoperability Productions is a business process, which could be described by BPL (Business Process Language).
Learn more about Business Processes in the documentation.
Business Rule is a no-code/low-code approach to managing the processing logic of an interoperability production. In InterSystems IRIS you can create a business rule either visually or via its ObjectScript representation.
You can collect the Business Process/Business Rule bonus if you create and use the business process or business rule in your interoperability production.
Business Rule Example
Learn more on Business Rules in the documentation
Production EXtension (PEX) Usage - 2 points
PEX is a Python, Java or .NET extension of Interoperability productions.
You get this bonus if you use PEX with Python, JAVA or .NET in your interoperability production.
PEX Demo.
Learn more on PEX in Documentation.
InterSystems IRIS has a Python PEX module that provides the option to develop InterSystems Interoperability productions from Python. Use it and collect 3 extra points for your application. It's also OK to use the alternative python.pex wheel introduced by Guillaume Rongier.
You can also use Python Interoperability, which is a PEX add-on module for InterSystems IRIS on Python provided by @Guillaume.Rongier7183 that gives you the opportunity to develop InterSystems IRIS interoperability solutions in pure Python.
Here is an article on using PEX for Hugging Face, with an example.
Adaptive Analytics (AtScale) Cubes usage - 3 points
InterSystems Adaptive Analytics provides the option to create and use AtScale cubes for analytics solutions.
You can use the AtScale server we set up for the contest (URL and credentials can be collected in the Discord Channel) to use cubes or create a new one and connect to your IRIS server via JDBC.
The visualization layer for your Analytics solution with AtScale can be crafted with Tableau, PowerBI, Excel, or Logi.
Documentation, AtScale documentation
Training
Tableau, PowerBI, Logi usage - 3 points
Collect 3 points for the visualization you made with Tableau, PowerBI, or Logi - 3 points per each.
The visualization can be built against a direct IRIS BI server connection or via a connection with AtScale.
Logi is available as part of the InterSystems Reports solution - you can download the composer from InterSystems WRC. A temporary license can be collected in the Discord channel.
Documentation
Training
InterSystems IRIS BI - 3 points
InterSystems IRIS Business Intelligence is a feature of IRIS which gives you the option to create BI cubes and pivots against persistent data in IRIS and then deliver this information to users via interactive dashboards.
Learn more
The basic iris-analytics-template contains examples of an IRIS BI cube, pivot, and a dashboard.
Here is the set of examples of IRIS BI solutions:
Samples BI
Covid19 analytics
Analyze This
Game of Throne Analytics
Pivot Subscriptions
Error Globals Analytics
Creating InterSystems IRIS BI Solutions Using Docker & VSCode (video)
The Freedom of Visualization Choice: InterSystems BI (video)
InterSystems BI(DeepSee) Overview (online course)
InterSystems BI(DeepSee) Analyzer Basics (online course)
Columnar Index Usage - 1 point
The Columnar Index feature can significantly improve the performance of analytics queries. Use columnar indexes in your solution's persistent data model and collect 1 extra bonus point. Learn more about Columnar Indexes.
Docker container usage - 2 points
The application gets a 'Docker container' bonus if it uses InterSystems IRIS running in a docker container. Here is the simplest template to start from.
ZPM Package deployment - 2 points
You can collect the bonus if you build and publish the ZPM(InterSystems Package Manager) package for your Full-Stack application so it could be deployed with:
zpm "install your-multi-model-solution"
command on IRIS with ZPM client installed.
ZPM client. Documentation.
Online Demo of your project - 2 points
Collect 2 more bonus points if you provision your project to the cloud as an online demo. You can do it on your own or you can use this template - here is an Example. Here is the video on how to use it.
Unit Testing - 2 points
Applications that have Unit Testing for the InterSystems IRIS code will collect the bonus.
Learn more about ObjectScript Unit Testing in Documentation and on Developer Community.
Implement Community Opportunity Idea - 4 points
Implement any idea from the InterSystems Community Ideas portal which has the "Community Opportunity" status. This will give you 4 additional bonus points.
Article on Developer Community - 2 points
Post an article on Developer Community that describes the features of your project and collect 2 points for the article.
The Second article on Developer Community - 1 point
You can collect one more bonus point for a second article or a translation related to the application. A third article and beyond will not bring more points, but the attention will all be yours.
Code quality pass with zero bugs - 1 point
Include the code quality GitHub action for static code analysis and make it show 0 bugs for ObjectScript.
First Time Contribution - 3 points
Collect 3 bonus points if you participate in InterSystems Open Exchange contests for the first time!
Video on YouTube - 3 points
Make a YouTube video that demonstrates your product in action and collect 3 bonus points for each one.
The list of bonuses is subject to change. Stay tuned! The bonus set has been updated. Two bonuses were added:
4 points for the community opportunity implementation.
1 point for the columnar index usage.
@Evgeny.Shvarov - I submitted my application for the contest and I'm really excited! Could you please let me know how I can claim the bonus points? Thanks in advance. Hi @Ikram.Shah3431 !
Tomorrow we'll publish the bonus table for all the applications. If something is not accurate, comment here or in Discord.
May I ask if this score is an expert score or a community score?
Article
Lucas Enard · Aug 17, 2022
In [this GitHub repository](https://github.com/grongierisc/iris-python-flask-api-template), based on [this InterSystems community REST API template](https://github.com/intersystems-community/iris-rest-api-template), Guillaume and I have created an example of all the important CRUD operations, using ONLY Python on IRIS, with Flask.
It uses either the IRIS ORM or plain SQL requests; both methods are shown in the GitHub repository.
# 1. intersystems-iris-docker-rest-template
This is a template of a REST API application built in Python on InterSystems IRIS.
It also has an OpenAPI spec and can be developed with Docker and VSCode.
- [1. intersystems-iris-docker-rest-template](#1-intersystems-iris-docker-rest-template)
- [2. Prerequisites](#2-prerequisites)
- [3. Installation](#3-installation)
- [3.1. Installation for development](#31-installation-for-development)
- [3.2. Management Portal and VSCode](#32-management-portal-and-vscode)
- [3.3. Having the folder open inside the container](#33-having-the-folder-open-inside-the-container)
- [4. How it works](#4-how-it-works)
- [5. How to Work With it](#5-how-to-work-with-it)
- [5.1. POST request](#51-post-request)
- [5.1.1. Testing POST request](#511-testing-post-request)
- [5.1.2. How POST request works](#512-how-post-request-works)
- [5.2. GET requests](#52-get-requests)
- [5.2.1. Testing GET request](#521-testing-get-request)
- [5.2.2. How GET request works](#522-how-get-request-works)
- [5.3. PUT request](#53-put-request)
- [5.3.1. Testing PUT request](#531-testing-put-request)
- [5.3.2. How PUT request works](#532-how-put-request-works)
- [5.4. DELETE request](#54-delete-request)
- [5.4.1. Testing DELETE request](#541-testing-delete-request)
- [5.4.2. How DELETE request works](#542-how-delete-request-works)
- [6. How to start coding](#6-how-to-start-coding)
- [7. What's inside the repo](#7-whats-inside-the-repo)
- [7.1. Dockerfile](#71-dockerfile)
- [7.2. .vscode/settings.json](#72-vscodesettingsjson)
- [7.3. .vscode/launch.json](#73-vscodelaunchjson)
# 2. Prerequisites
Make sure you have [git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git) and [Docker desktop](https://www.docker.com/products/docker-desktop) installed.
Note that the table Sample.Person was already created in advance for the demo, using the SQL tab in the Management Portal:
```sql
CREATE TABLE Sample.Person (
Company VARCHAR(50),
DOB DATE,
Name VARCHAR(4096),
Phone VARCHAR(4096),
Title VARCHAR(50)
)
```
# 3. Installation
## 3.1. Installation for development
Clone/git pull the repo into any local directory e.g. like it is shown below:
```
$ git clone https://github.com/grongierisc/iris-python-flask-api-template.git
```
Open the terminal in this directory and run:
```
$ DOCKER_BUILDKIT=1 docker-compose up -d --build
```
## 3.2. Management Portal and VSCode
This repository is ready for [VS Code](https://code.visualstudio.com/).
Open the locally-cloned repository folder in VS Code.
If prompted (bottom right corner), install the recommended extensions.
## 3.3. Having the folder open inside the container
**It is really important** to be *inside* the container before coding.
For this, docker must be on before opening VSCode.
Then, inside VSCode, when prompted (in the right bottom corner), reopen the folder inside the container so you will be able to use the python components within it.
The first time you do this it may take several minutes while the container is readied.
[More information here](https://code.visualstudio.com/docs/remote/containers)

By opening the folder remotely, you enable VS Code and any terminals you open within it to use the Python components within the container. Configure these to use `/usr/irissys/bin/irispython`.
# 4. How it works
The `app.py`, once launched (inside the container), will handle CRUD requests.
Depending on the type of the request, the right message will be created and sent to the `FlaskService`; this service will call the CrudPerson operation, which, depending on the type of message sent to it from the service, dispatches the information needed to perform the requested action.
For more details you can check the `How it works` part of [this fully documented demo](https://github.com/grongierisc/interoperability-embedded-python).
# 5. How to Work With it
This template creates a /crud REST web application on IRIS which implements 4 types of communication: GET, POST, PUT and DELETE, aka CRUD operations.
This interface works with a sample persistent class `Person` found in `src/python/person/obj.py`.
First of all, you need to start `app.py`, located in `src/python/person/app.py`, using Flask.
To do this, open the `app.py` file, go to the `Run and Debug` window in VSCode, select `Python: Flask`, and run.
This will run the app.
## 5.1. POST request
### 5.1.1. Testing POST request
Create a POST request, for example in Postman or in RESTer for Mozilla, with raw JSON data like:
```json
{"name":"Elon Musk","title":"CEO","company":"Tesla","phone":"123-123-1233","dob":"1982-01-19"}
```
Using `Content-Type` as `application/json`
Adjust the authorisation if needed - it is Basic auth with the default login and password for the IRIS Community Edition container.
Send the POST request to `localhost:5000/persons/`
This will create a record in the table Sample.Person of IRIS and return the `id` of the newly added `Person`
Screenshot: the POST request adding `Elon Musk` to the table.
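If you prefer to script the test instead of using Postman, here is a minimal sketch using the Python `requests` library (any HTTP client would do), pointed at the `localhost:5000/persons/` endpoint described above:

```python
import requests

person = {
    "name": "Elon Musk",
    "title": "CEO",
    "company": "Tesla",
    "phone": "123-123-1233",
    "dob": "1982-01-19",
}

# POST to the /persons/ endpoint exposed by app.py (port 5000, as above)
resp = requests.post("http://localhost:5000/persons/", json=person)
print(resp.status_code, resp.text)  # the body should contain the id of the new Person
```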
### 5.1.2. How POST request works
```python
def create_person(self,request:CreatePersonRequest):
"""
> Create a new person in the database and return the new person's ID
:param request: The request object that was passed in from the client
:type request: CreatePersonRequest
:return: The ID of the newly created person.
"""
# sqlInsert = 'insert into Sample.Person values (?,?,?,?,?)'
# iris.sql.exec(sqlInsert,request.person.company,dob,request.person.name,request.person.phone,request.person.title)
# IRIS ORM
person = iris.cls('Sample.Person')._New()
if (v:=request.person.company) is not None: person.Company = v
if (v:=request.person.name) is not None: person.Name = v
if (v:=request.person.phone) is not None: person.Phone = v
if (v:=request.person.title) is not None: person.Title = v
if (v:=request.person.dob) is not None: person.DOB = v
Utils.raise_on_error(person._Save())
return CreatePersonResponse(person._Id())
```
Using IRIS ORM we can create a new `Person` and save into our database.
## 5.2. GET requests
### 5.2.1. Testing GET request
To test GET you need to have some data. You can create it with a [POST request](#511-testing-post-request).
This REST API exposes two GET requests: all the data and one record.
To get all the data in JSON call:
```
localhost:5000/persons/all
```
To request the data for a particular record, provide the id in the GET request like `localhost:5000/persons/id`; here is an example:
```
localhost:5000/persons/1
```
This will return JSON data for the person with ID=1, something like this:
```json
{"name":"Elon Musk","title":"CEO","company":"Tesla","phone":"123-123-1233","dob":"1982-01-19"}
```
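The same GET calls can be scripted with `requests` as well; a small sketch against the endpoints listed above, assuming the responses are JSON as shown:

```python
import requests

BASE = "http://localhost:5000/persons"  # Flask app from app.py, as above

# All records
print(requests.get(f"{BASE}/all").json())

# A single record by id, e.g. the Person created by the POST above
print(requests.get(f"{BASE}/1").json())
```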
### 5.2.2. How GET request works
```python
def get_person(self,request:GetPersonRequest):
"""
> The function takes a `GetPersonRequest` object, executes a SQL query, and returns a
`GetPersonResponse` object
:param request: The request object that is passed in
:type request: GetPersonRequest
:return: A GetPersonResponse object
"""
sql_select = """
SELECT
Company, DOB, Name, Phone, Title
FROM Sample.Person
where ID = ?
"""
rs = iris.sql.exec(sql_select,request.id)
response = GetPersonResponse()
for person in rs:
response.person= Person(company=person[0],dob=person[1],name=person[2],phone=person[3],title=person[4])
return response
def get_all_person(self,request:GetAllPersonRequest):
"""
> This function returns a list of all the people in the Person table
:param request: The request object that is passed to the service
:type request: GetAllPersonRequest
:return: A list of Person objects
"""
sql_select = """
SELECT
Company, DOB, Name, Phone, Title
FROM Sample.Person
"""
rs = iris.sql.exec(sql_select)
response = GetAllPersonResponse()
response.persons = list()
for person in rs:
response.persons.append(Person(company=person[0],dob=person[1],name=person[2],phone=person[3],title=person[4]))
return response
```
This time, using the `iris python` `sql.exec` function, we can directly run SQL code inside the IRIS database, gather the information needed and send it back to the API and to the user.
## 5.3. PUT request
### 5.3.1. Testing PUT request
PUT requests can be used to update records. You need to send JSON similar to the POST request above, supplying the id of the record to update in the URL.
For example, to change the record with id=5, prepare raw JSON like the following:
```
{"name":"Jeff Besos","title":"CEO","company":"Amazon","phone":"123-123-1233","dob":"1982-01-19"}
```
and send the put request to:
```
localhost:5000/persons/5
```
### 5.3.2. How PUT request works
```python
def update_person(self,request:UpdatePersonRequest):
"""
> Update a person in the database
:param request: The request object that will be passed to the service
:type request: UpdatePersonRequest
:return: UpdatePersonResponse()
"""
# IRIS ORM
if iris.cls('Sample.Person')._ExistsId(request.id):
person = iris.cls('Sample.Person')._OpenId(request.id)
if (v:=request.person.company) is not None: person.Company = v
if (v:=request.person.name) is not None: person.Name = v
if (v:=request.person.phone) is not None: person.Phone = v
if (v:=request.person.title) is not None: person.Title = v
if (v:=request.person.dob) is not None: person.DOB = v
Utils.raise_on_error(person._Save())
return UpdatePersonResponse()
```
Using the IRIS ORM we can check whether the id corresponds to a `Person`; if it does, we update it with the new information and save it to our database.
## 5.4. DELETE request
### 5.4.1. Testing DELETE request
For a DELETE request, this REST API expects only the id of the record to delete. E.g., if the id=5, the following DELETE call will delete the record:
```
localhost:5000/persons/5
```
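To complete the CRUD round trip from a script, a tiny sketch with `requests` (record id 5 is just the example used above):

```python
import requests

# Delete the record with id=5; the endpoint mirrors the GET/PUT URL shown above
print(requests.delete("http://localhost:5000/persons/5").status_code)
```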
### 5.4.2. How DELETE request works
```python
def delete_person(self,request:DeletePersonRequest):
"""
> Delete a person from the database
:param request: The request object that is passed to the service
:type request: DeletePersonRequest
:return: The response is being returned.
"""
sql_select = """
DELETE FROM Sample.Person as Pers
WHERE Pers.id = ?
"""
rs = iris.sql.exec(sql_select,request.id)
response = DeletePersonResponse()
return response
```
This time, using the `iris python` `sql.exec` function, we can directly run SQL code inside the IRIS database and delete the person.
# 6. How to start coding
This repository is ready to code in VSCode with InterSystems plugins.
Open `/src/python/person/app.py` to change anything on the API.
Open `/src/python/person/bo.py` to change things related to the internal requests; this is where you can use SQL - it will be compiled in the running IRIS docker container.
# 7. What's inside the repo
## 7.1. Dockerfile
The simplest dockerfile to start IRIS.
Use the related docker-compose.yml to easily set up additional parameters like the port number and where you map keys and host folders.
## 7.2. .vscode/settings.json
Settings file to let you immediately code in VSCode with the [VSCode ObjectScript plugin](https://marketplace.visualstudio.com/items?itemName=daimor.vscode-objectscript)
## 7.3. .vscode/launch.json
Config file if you want to debug with VSCode ObjectScript
If you are interested, check out the training course I made on the tools used in this GitHub repository:
https://github.com/LucasEnard/formation-template-python As Issues are disabled in the GitHub repo, I'll place my issue here.
Container starts fine
Postman fails: POST http://localhost:5000/persons/
Error: connect ECONNREFUSED 127.0.0.1:5000
Network
agent: "Desktop Agent"
Request Headers
Content-Type: application/json
User-Agent: PostmanRuntime/7.29.2
Accept: */*
Cache-Control: no-cache
Postman-Token: d40ec7c2-5b24-4944-8a76-c4cbf2685bf7
Host: localhost:5000
Accept-Encoding: gzip, deflate, br
Connection: keep-alive
Request Body
{"name":"Elon Musk","title":"CEO","company":"Tesla","phone":"123-123-1233","dob":"1982-01-19"}
As port 5000 is not mapped in docker-compose.yml, this might be related???
DONE! I found my solution and placed it in a pull request. It was slightly more than just a missing port. Thanks Robert for the pull request and for making sure of the quality of the demos/articles.
Sometimes it's hard to build and test at the same time; this is why in medium to large projects the development teams are different from the testers.
Thanks again :) You're welcome. Those things keep my mind in speed. I failed in testing. Where can I find the pull request and how do I use it? My PR contains an update to docker-compose.yml and a bash script startflask.sh, all available here: https://github.com/rcemper/iris-python-flask-api-template
Sep. 23: all merged. I'm getting 403 when I do a POST request on localhost:5000/persons/. What am I doing wrong? Yes, the Flask app runs on 8080 and is mapped in docker-compose to 4040.
I updated the GitHub repo to make Flask run on 5000 and map it to 5000. @Guillaume.Rongier7183 , just curious, why do you need the CallIn service enabled?
Still have Forbidden after update and rebuild:
Because underneath, Embedded Python uses the CallIn service of IRIS.
And by default this service is off. So, every time I need to use Embedded Python with IRIS I need to turn CallIn on?
Announcement
Anastasia Dyubaylo · Mar 3, 2023
Hi Community,
Watch this video to explore common security pitfalls within the industry and how to avoid them when building applications on InterSystems IRIS:
⏯ The OWASP Top 10 & InterSystems IRIS Application Development @ Global Summit 2022
Presenters:
🗣 @Timothy.Leavitt, Application Services Development Manager
🗣 @Pravin.Barton, Developer, Application Services
🗣 @Wangyi.Huang, Technical Specialist, Application Services
Subscribe to our Youtube channel InterSystems Developers to stay up to date!
Announcement
Anastasia Dyubaylo · Feb 19, 2023
Hey Developers,
Enjoy watching the new video on InterSystems Developers YouTube:
⏯ Understanding your InterSystems Login Account & Where to Use It @ Global Summit 2022
Learn about your InterSystems Login Account, how to use it to get access to InterSystems Services like the Developer Community, Evaluation Service, Open Exchange, Online Learning, WRC and others. This will also cover the new features for controlling your personal communication preferences.
Presenters:
🗣 @Timothy.Leavitt, AppServices Development Manager, InterSystems
🗣 @Pravin.Barton, Internal Application Developer, InterSystems
Hope you like it and stay tuned! 👍 Correction on this post - I was originally supposed to present but unfortunately was unable to attend Global Summit due to testing positive for COVID the day before :( Call out to @Timothy.Leavitt who stepped in and presented in my place and did a great job.
Watch the video! I got way too much air time last Summit. Thanks for noticing, guys! Fixed ;)
Article
Eduard Lebedyuk · Feb 10, 2023
In this article, we will establish an encrypted JDBC connection between Tableau Desktop and InterSystems IRIS database using a JDBC driver.
While [documentation on configuring TLS with Java clients](https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=GTLS_javacli) covers all possible topics on establishing an encrypted JDBC connection, configuring it with Tableau might be a little bit tricky, so I decided to write it down.
# Securing SuperServer
Before we start with client connections, you need to configure the SuperServer (which by default runs on port `1972` and is responsible for xDBC traffic) to accept encrypted connections. This topic is described in the [documentation](https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=GTLS_superserver), but to summarise it, you need to do three things:
1. Generate CA and server cert/key. I use [easyRSA](https://github.com/OpenVPN/easy-rsa):
```bash
mkdir /tls
cd /easyrsa3
./easyrsa init-pki
# Prep a vars file https://www.sbarjatiya.com/notes_wiki/index.php/Easy-rsa#Initialize_pki_infrastructure
# cp /tls/vars /opt/install/easy-rsa/easyrsa3/pki/vars
./easyrsa build-ca
./easyrsa gen-req IRIS nopass
./easyrsa sign-req server IRIS
cp pki/issued/* /tls/
cp pki/private/* /tls/
cp pki/ca.crt /tls/
sudo chown irisusr:irissys /tls/*
sudo chmod 440 /tls/*
```
Optionally, generate client cert/key for mutual verification. I recommend doing it after establishing an initial encrypted connection.
2. Create `%SuperServer` SSL Configuration, which uses server cert/key from (1).
```objectscript
set p("CertificateFile")="/tls/IRIS.crt"
set p("Ciphersuites")="TLS_AES_256_GCM_SHA384:TLS_CHACHA20_POLY1305_SHA256:TLS_AES_128_GCM_SHA256"
set p("Description")="Autogenerated SuperServer Configuration"
set p("Enabled")=1
set p("PrivateKeyFile")="/tls/IRIS.key"
set p("PrivateKeyPassword")=""
set p("PrivateKeyType")=2
set p("TLSMaxVersion")=32
set p("TLSMinVersion")=16 // Set TLSMinVersion to 32 to stick with TLSv1.3
set p("Type")=1
set sc = ##class(Security.SSLConfigs).Create("%SuperServer", .p)
kill p
```
3. Enable (or Require) SuperServer SSL/TLS support:
```
set p("SSLSuperServer")=1
set sc = ##class(Security.System).Modify("SYSTEM", .p)
kill p
```
Where: 0 - disabled, 1 - enabled, 2 - required.
Before you Require SSL/TLS connections, remember to enable SSL/TLS connections to the SuperServer for [WebGateway](https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=GTLS_WEBGATEWAY) and [Studio](https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=GTLS_studio). I recommend first enabling SSL/TLS connections, then verifying that all clients (xDBC, WebGateway, Studio, NativeAPI, etc.) use encrypted SSL/TLS connections and requiring SSL/TLS connections only after that.
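Before involving Tableau at all, one quick way to sanity-check that the SuperServer now completes a TLS handshake is a few lines of Python using only the standard library; the host name and CA file path below are assumptions to adapt to your environment:

```python
import socket
import ssl

HOST = "iris.example.com"   # hypothetical IRIS host - replace with yours
PORT = 1972                 # default SuperServer port
CA_FILE = "/tls/ca.crt"     # the CA certificate generated with easyRSA above

# Trust only our own CA; the easyRSA server cert CN ("IRIS") may not match the
# host name, so skip host-name checking while still requiring a valid certificate.
context = ssl.create_default_context(cafile=CA_FILE)
context.check_hostname = False
context.minimum_version = ssl.TLSVersion.TLSv1_3

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        print("Negotiated protocol:", tls.version())   # expect 'TLSv1.3'
        print("Cipher suite:", tls.cipher()[0])
        print("Server cert subject:", tls.getpeercert().get("subject"))
```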
Now we are ready to establish an encrypted JDBC connection from Tableau Desktop.
# Configuring encrypted JDBC connection from Tableau Desktop
1. [Download JDBC drivers](https://intersystems-community.github.io/iris-driver-distribution/).
2. Place the JDBC driver into the correct folder:
- Windows: `C:\Program Files\Tableau\Drivers`
- Mac: `~/Library/Tableau/Drivers`
- Linux: `/opt/tableau/tableau_driver/jdbc`
3. Create `truststore.jks` from the SuperServer CA certificate: `keytool -import -file CA.cer -alias CA -keystore truststore.jks -storepass 123456 -noprompt`
4. Create `SSLConfig.properties` file:
```
debug = false
logFile = javatls.log
protocol = TLSv1.3
cipherSuites = TLS_AES_256_GCM_SHA384
trustStoreType = JKS
trustStore = truststore.jks
trustStorePassword = 123456
trustStoreRecoveryPassword = 123456
```
5. Place `SSLConfig.properties` and `truststore.jks` in a working directory of your java program. Note that Tableau spawns several processes. You need a working directory for the Java process (which is not a root process or one of the QT processes). Here's how to find it on Windows with [ProcessExplorer](https://learn.microsoft.com/en-us/sysinternals/downloads/process-explorer):

and go to the java process properties and get the current directory:

So for Windows, for me it's `C:\Users\elebedyu\AppData\Local\Temp\TableauTemp`, or in general `%LOCALAPPDATA%\Temp\TableauTemp`.
On Mac and Linux, you can figure it out using `lsof -d cwd`. Please write in the comments if you have determined the directory for Mac/Linux.
6. Create `.properties` file. This file contains JDBC customizations. Name it `IRIS.properties` and place it:
- On Windows: `C:\Users\\Documents\My Tableau Repository`
- On Mac and Linux: `~/Documents/My Tableau Repository`
If you are using the Tableau version before 2020.3, you should use Java 8 for JDBC connections. Java 8 properties files require ISO-8859-1 encoding. As long as your properties file contains only ASCII characters, you can also save it in UTF-8. However, ensure that you don't save with a BOM (Byte order mark). Starting in Tableau version 2020.3, you can use UTF-8 encoded properties files, which are standard with Java 9+.
In this file, specify your JDBC connection properties:
```
host=
port=
user=
password=
connection\ security\ level=10
```
Note that Tableau uses `JDBCDriverManager`; all properties are [listed here](https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=BJAVA_refapi#BJAVA_refapi_connparms). As per the `.properties` [file spec](https://en.wikipedia.org/wiki/.properties), you must escape whitespaces in keys with a `\`.
7. Open Tableau and create a new JDBC connection, specifying the properties file. It should connect.

# Debugging connections
First try establishing an unencrypted connection. If something does not work, go to `My Tableau Repository\Logs` and open `jprotocolserver.log`. In there, search for:
```
Connecting to jdbc:IRIS://host:port/DATABASE
Connection properties {password=*******, connection security level=*******, port=*******, host=*******, user=_SYSTEM}
Connected using driver {com.intersystems.jdbc.IRISDriver} from isolatedDriver.
```
If logged connection properties do not contain the properties you expect, Tableau is not seeing your `IRIS.properties` file.
# Verifying that connection is encrypted
1. Close everything except for Tableau, which might communicate with IRIS.
2. Open WireShark and start capturing on the correct network interface. Set Capture Filter to: `host `.

3. In Tableau, try to connect to IRIS, change the schema, or perform any other action requiring server communication.
4. If everything is properly configured, you should see packets with protocol `TLSv1.3`, and following a TCP stream should show only ciphertext.

# Conclusion
Use the best security practices for xDBC connections.
# Links
- [SuperServer TLS](https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=GTLS_superserver)
- [WebGateway TLS](https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=GTLS_WEBGATEWAY)
- [Studio TLS](https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=GTLS_studio)
- [Java TLS](https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=GTLS_javacli)
- [Java Connection Properties](https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=BJAVA_refapi#BJAVA_refapi_connparms)
- [easyRSA](https://github.com/OpenVPN/easy-rsa) This helped me a lot, thanks!! I just want to note that in the section "Securing SuperServer" there is a typo where it says:
set sc = ##class(Security.SSLConfig).Create("%SuperServer", .p)
should say:
set sc = ##class(Security.SSLConfigs).Create("%SuperServer", .p)
Because it threw me a "Class Does Not Exists" error.
Regards,
Mauro
Thanks! Fixed.
Are you on Mac by any chance? No, Windows as the client and Linux as the server
Announcement
Laurel James (GJS) · Feb 15, 2023
How source control integrates with your system is critical to ensuring it works seamlessly behind the scenes without interruption.
Deltanji source control understands the internal workings of InterSystems IRIS and provides a solution that can seamlessly handle its unique needs, with client integrations for VS Code, Studio, Management Portal, and Management Portal Productions.
You can find out more about the benefits of using a source control solution tailored for InterSystems IRIS at this webinar.
This demo will show how Deltanji goes beyond the traditional CI/CD pipeline, automating the project lifecycle from development through to deployment, making it the perfect source control companion for organizations with continually evolving systems.
🗓 Thursday, February 23rd
⏰ 4 pm GMT | 5 pm CET | 11 am ET
Sign up here > http://bit.ly/40JOaxo
Announcement
Anastasia Dyubaylo · Mar 24, 2023
Hey Community!
Here are the bonuses for the participants' articles taking part in the Tech Article Contest: InterSystems IRIS Tutorials:
Bonus categories: Topic bonus, Video bonus, Discussion bonus, Translation bonus, New member bonus.

| No | Article | Bonuses awarded | Total points |
|---|---|---|---|
| 1 | Quick sample database tutorial | + + | 4 |
| 2 | Tutorial - Working with %Query #1 | + + + | 9 |
| 3 | Tutorial - Working with %Query #2 | + + | 8 |
| 4 | Tutorial - Working with %Query #3 | + + | 8 |
| 5 | Tutorial - Streams in Pieces | + + | 8 |
| 6 | SQLAlchemy - the easiest way to use Python and SQL with IRIS's databases | + + + | 9 |
| 7 | Creating an ODBC connection - Step to Step | + + + | 9 |
| 8 | Tutorial - Develop IRIS using SSH | + + + | 9 |
| 9 | InterSystems Embedded Python in glance | + | 5 |
| 10 | Query as %Query or Query based on ObjectScript | + + + + | 10 |
| 11 | Setting up VS Code to work with InterSystems technologies | + + | 4 |
| 12 | Tutorial: Improving code quality with the visual debug tool's color-coded logs | + | 3 |
| 13 | Kinds of properties in IRIS | | 0 |
| 14 | Backup and rebuilding procedure for the IRIS server | + + | 4 |
| 15 | Stored Procedures the Swiss army knife of SQL | + + | 4 |
| 16 | Tutorial how to analyze requests and responses received and processed in webgateway pods | | 0 |
| 17 | InterSystems's Embedded Python with Pandas | + + | 8 |
| 18 | Tutorial for Middle/Senior Level Developer: General Query Solution | + + + | 9 |
| 19 | Tutorial - Creating a HL7 TCP Operation for Granular Error Handling | | 0 |
| 20 | Tutorial from Real Practice in China Hosipital Infomatics Construction: How to autobackup your code/ auto excute code when you are not allowed to use Git? | + + | 4 |
| 21 | SQL IRIS Editor and IRIS JAVA CONNECTION | + + | 8 |
| 22 | Perceived gaps to GPT assisted COS development automation | + + | 4 |
| 23 | Set up an IRIS docker image on a Raspberry Pi 4 | + | 3 |
| 24 | Using JSON in IRIS | + | 5 |
Bonuses are subject to change as the table is updated.
Please claim your bonuses here in the comments below!
We've updated the bonuses!
This time our expert decided that 3 articles will collect our "Discussion Bonus" for the most useful discussion in the post.
P.S. Only one day left to enter the competition and collect all our bonuses. Good luck to all participants!