Purpose

Most CloudFormation articles are Linux-based (no wonder), but there seems to be demand for automation on Windows as well. Based on this original article by Anton, I implemented an example of deploying a mirror cluster to Windows servers using CloudFormation. I also added a simple walkthrough.
The complete source code can be found here.

Update (March 1, 2021): I added a way to connect to a Windows shell with public key authentication via a bastion host as a one-liner.


Welcome, community members, to a new article! This time we are going to test the interoperability capabilities of IRIS for Health working with DICOM files.

Let's configure a short workshop using Docker. At the end of the article you'll find the GitHub URL in case you want to run it on your own computer.

Before any configuration, let's explain what DICOM is:


Written in reply to the community post asking whether Python can create HL7 messages dynamically.

Pre-requisites and setup

Use an interoperability-enabled namespace.
Note: the USER namespace is not interoperability-enabled by default.
If you are following along, it is suggested you create a new interoperability namespace to explore the functionality.

# Switch to the interoperability namespace
ZN "[Interoperability Namespace Name]"

# Launch an interactive Python shell:
Do $SYSTEM.Python.Shell()
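
Once inside the Python shell, an HL7 message can be built and manipulated dynamically. The snippet below is a minimal sketch, assuming the standard Python projection of ObjectScript class methods and the EnsLib.HL7.Message API; the sample segment values are purely illustrative:

# A minimal sketch; run inside the embedded Python shell
import iris

# Parse a raw HL7 string into an EnsLib.HL7.Message object
raw = "MSH|^~\\&|SENDER|FAC|RECEIVER|FAC|20230101||ADT^A01|1|P|2.5"
msg = iris.cls("EnsLib.HL7.Message").ImportFromString(raw)

# Read and modify fields dynamically by segment:field path
print(msg.GetValueAt("MSH:9"))     # -> ADT^A01
msg.SetValueAt("NEWAPP", "MSH:3")  # change the sending application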

Article
· Jul 7, 2023 8m read
Iris FHIR Python Strategy

Description

With InterSystems IRIS FHIR Server you can build a Strategy to customize the behavior of the server (see documentation for more details).


This repository contains a Python Strategy that can be used as a starting point to build your own Strategy in Python.

This demo strategy provides the following features:

  • Update the capability statement to remove the Account resource (see the sketch below)
  • Simulate a consent management system to allow or deny access to the Observation resource
    • If the user has sufficient rights, the Observation resource is returned
    • Otherwise, the Observation resource is not returned
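
For illustration, removing a resource from a capability statement boils down to filtering the rest.resource array of the CapabilityStatement JSON. This is a hypothetical standalone helper, not the repository's actual Strategy code:

# Hypothetical helper: drop one resource type from a FHIR CapabilityStatement
def remove_resource(capability_statement: dict, resource_type: str) -> dict:
    for rest in capability_statement.get("rest", []):
        # keep every resource entry except the one being hidden
        rest["resource"] = [r for r in rest.get("resource", [])
                            if r.get("type") != resource_type]
    return capability_statement

# e.g. remove_resource(statement, "Account")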

In this article, I am demonstrating how to create a table column (formerly known as a property) with your own custom datatype classes by using user-defined DDL. Properties are a crucial member of the persistent class definition. Datatypes are essential to define the types of values that are stored in a table column. In general, SQL datatype names differ from InterSystems datatypes; for example, VARCHAR corresponds to %String.
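
As a minimal sketch of the idea: a custom datatype is simply a class extending an existing datatype such as %Library.String, with validation overridden in IsValid. The class names and the phone-number rule below are illustrative, not taken from the article:

/// Illustrative custom datatype: accepts only NNN-NNN-NNNN phone numbers
Class User.DataType.Phone Extends %Library.String
{

ClassMethod IsValid(%val As %RawString) As %Status
{
    // ObjectScript pattern match: 3 digits, "-", 3 digits, "-", 4 digits
    Quit $Select(%val?3N1"-"3N1"-"4N:$$$OK,1:$$$ERROR($$$GeneralError,"Invalid phone: "_%val))
}

}

/// Illustrative persistent class using the custom datatype as a column
Class Demo.Person Extends %Persistent
{
Property Name As %String;
Property Phone As User.DataType.Phone;
}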

Scenario

IRIS provides SQL inbound adapters, such as EnsLib.SQL.InboundAdapter, for repeatedly querying SQL Gateway connections. A scenario came up where we wanted to query an internal database for some data, but there did not seem to be an out-of-the-box service for this.

Desired Approach

Have a generic service that can poll internal SQL and pass the results to downstream components.
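
One way to approach this is a custom business service on the plain Ens.InboundAdapter (which simply calls OnProcessInput on a schedule) combined with %SQL.Statement against the local database. A minimal sketch; the table, query, and message class are chosen only for illustration:

Class Demo.Service.InternalSQLPoller Extends Ens.BusinessService
{

/// Plain scheduler adapter: OnProcessInput runs every CallInterval seconds
Parameter ADAPTER = "Ens.InboundAdapter";

Property TargetConfigName As Ens.DataType.ConfigName;

Parameter SETTINGS = "TargetConfigName";

Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
{
    // Query the internal database directly; no SQL Gateway involved
    Set tStatement = ##class(%SQL.Statement).%New()
    Set tSC = tStatement.%Prepare("SELECT ID, Name FROM Demo_Data.Person WHERE Processed = 0")
    Quit:$$$ISERR(tSC) tSC
    Set tResult = tStatement.%Execute()
    While tResult.%Next() {
        // Forward each row downstream; Ens.StringRequest keeps the sketch simple
        Set tRequest = ##class(Ens.StringRequest).%New()
        Set tRequest.StringValue = tResult.%Get("Name")
        Set tSC = ..SendRequestAsync(..TargetConfigName, tRequest)
        Quit:$$$ISERR(tSC)
    }
    Quit tSC
}

}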

Article
· Jan 28 3m read
Fhir-HepatitisC-Predict

Processing FHIR resources with FHIR SQL BUILDER to predict the probability of developing hepatitis C disease

As technology develops, the medical industry advances with it, and people pay ever more attention to their own health.
By processing datasets with computers, diseases can be predicted.

Precondition: ability to use FHIR and ML.
First, our dataset is obtained from Kaggle and transformed into FHIR resources based on patient gender, age, and ALP or ALT values, then imported into the FHIR resource repository.
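
To make the transformation step concrete, here is a hypothetical sketch of mapping one dataset row to FHIR Patient and Observation resources; the row keys and the choice of LOINC code are illustrative, not taken from the article:

# Hypothetical row-to-FHIR mapping; adapt keys to the actual Kaggle columns
def row_to_fhir(row: dict) -> list[dict]:
    patient = {
        "resourceType": "Patient",
        "id": str(row["id"]),
        "gender": row["sex"],  # "male" / "female"
    }
    observation = {
        "resourceType": "Observation",
        "status": "final",
        # LOINC 1742-6 = alanine aminotransferase (ALT) in serum or plasma
        "code": {"coding": [{"system": "http://loinc.org", "code": "1742-6"}]},
        "subject": {"reference": f"Patient/{row['id']}"},
        "valueQuantity": {"value": row["alt"], "unit": "U/L"},
    }
    return [patient, observation]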


This is more for my own memory than anything else, but I thought I'd share it because it often comes up in comments yet is not in the InterSystems documentation.

There is a wonderful utility called ^REDEBUG that increases the level of logging going into mgr\cconsole.log.

You activate it as follows:

a) start Terminal and log in

b) zn "%SYS"

c) do ^REDEBUG


Hi Developers!

Another way to start using InterSystems ObjectScript Package Manager is to use prebuilt container images of InterSystems IRIS Community Edition and InterSystems IRIS for Health Community Edition.

We deploy these IRIS images on DockerHub, and you can run one with the following command:

docker run --rm -p 52773:52773 --init --name my-iris -d intersystemsdc/iris-community:2019.4.0.383.0-zpm

Launch a terminal with:

docker exec -it my-iris iris session IRIS

And install zpm-module as:

USER>zpm

zpm: USER>install objectscript-math

[objectscript-math] Reload START
[objectscript-math] Reload SUCCESS
[objectscript-math] Module object refreshed.
[objectscript-math] Validate START
[objectscript-math] Validate SUCCESS
[objectscript-math] Compile START
[objectscript-math] Compile SUCCESS
[objectscript-math] Activate START
[objectscript-math] Configure START
[objectscript-math] Configure SUCCESS
[objectscript-math] Activate SUCCESS

zpm: USER>

And use the same commands for InterSystems IRIS for Health with the tag intersystemsdc/irishealth-community:2019.4.0.383.0-zpm.

The images are published in the IRIS Community Edition and IRIS for Health Community Edition repositories on Docker Hub.

We will update tags with every new release of IRIS and ZPM.

Happy coding!


Hello, developers!

In this series, I will not show you how to use IRIS for Health, but rather how to use SUSHI, a tool for creating FHIR profiles, as an associated technology.

With the right tools, the profile information (specifications, limitations, extensions, etc.) of a FHIR project can be well organized and published.

Before we begin, what is SUSHI? I will briefly explain it.


Hi developers!

Maybe you have to implement scenarios that don't require a FHIR repository, but instead forward FHIR requests, manage the responses, and perhaps run transformations or extract some values in between. Here you will find some examples that can be implemented using InterSystems IRIS for Health or HealthShare Health Connect.
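
To give a flavor of such a scenario, here is a minimal sketch of a custom business process that inspects a FHIR interoperability request and forwards it. It assumes the standard HS.FHIRServer.Interop message classes; the target name "ToExternalFHIR" is hypothetical:

Class Demo.FHIR.ForwardProcess Extends Ens.BusinessProcess
{

Method OnRequest(pRequest As HS.FHIRServer.Interop.Request, Output pResponse As HS.FHIRServer.Interop.Response) As %Status
{
    // Log what is passing through; the inner Request carries method and path
    $$$TRACE("Forwarding "_pRequest.Request.RequestMethod_" "_pRequest.Request.RequestPath)

    // Transformations or value extraction would happen here, then the
    // (possibly modified) request is forwarded to the configured target
    Quit ..SendRequestSync("ToExternalFHIR", pRequest, .pResponse)
}

}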


Hi, I would like to tell you how easy it is to spin up an IRIS for Health Docker container on a Compute Engine VM (VPS) in Google Cloud.

I know that running IRIS for Health on AWS is pretty simple and straightforward, but I wanted to try whether it is just as easy in the GCP environment.

Create a VM instance; 2 GB of RAM is more than enough.

I used Debian 11 as the Linux distro.

A standard persistent disk is cheaper.
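
For reference, the setup can be scripted; below is a hedged sketch of the equivalent gcloud and docker commands (the instance name, machine type, and image tag are just examples — e2-small provides the 2 GB of RAM mentioned above):

gcloud compute instances create iris-health-demo \
    --machine-type=e2-small \
    --image-family=debian-11 \
    --image-project=debian-cloud \
    --boot-disk-type=pd-standard

# then, on the VM, after installing Docker:
docker run --name iris4h -d -p 52773:52773 intersystemsdc/irishealth-community:2019.4.0.383.0-zpm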


Dear community members!

A very common requirement from our users is to use an external database as a data source in an IRIS production. As many of you already know, we have two ways to connect directly to an external database: the first is using an ODBC connection, the second is using JDBC.

In our example we are going to create a connection using JDBC, and we are going to build a simple Docker project, so that you can modify the example as you wish.
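
Once the JDBC connection is defined as a SQL Gateway, a business operation can use it through EnsLib.SQL.OutboundAdapter. A minimal sketch, with the message classes and the query invented for illustration:

Class Demo.Operation.ExternalDB Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.SQL.OutboundAdapter";

Parameter INVOCATION = "Queue";

Method OnMessage(pRequest As Ens.StringRequest, Output pResponse As Ens.StringResponse) As %Status
{
    // Run a query over the JDBC SQL Gateway connection configured
    // in the adapter settings (DSN, credentials, JDBC driver)
    Set tSC = ..Adapter.ExecuteQuery(.tResultSet, "SELECT name FROM patients WHERE id = ?", pRequest.StringValue)
    Quit:$$$ISERR(tSC) tSC

    Set pResponse = ##class(Ens.StringResponse).%New()
    If tResultSet.Next() {
        Set pResponse.StringValue = tResultSet.Get("name")
    }
    Quit tSC
}

}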


InterSystems FAQ rubric

If the journal file is too large to be searched or filtered using the Management Portal, you can examine it using the following two methods.

① Using the ^JRNDUMP utility
② Referencing it from a program
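
As a quick illustration of both: ^JRNDUMP is run from the %SYS namespace, and journal records can be walked programmatically with the %SYS.Journal classes. A minimal sketch, assuming the documented FirstRecord/Next navigation; the file path is a placeholder:

// ① the interactive utility, from the %SYS namespace
ZN "%SYS"
Do ^JRNDUMP

// ② walking records from code
Set jrnFile = ##class(%SYS.Journal.File).%OpenId("/path/to/journal/file")
Set rec = jrnFile.FirstRecord
While $IsObject(rec) {
    Write rec.TypeName, !
    Set rec = rec.Next
}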



Setting up Management Portal Help Pages for Full WebServers

On each page of the System Management Portal, there is a “Help” button. This takes users to an article in the documentation that describes the page's functionality and use.

Caché provides local documentation for all of these articles.

InterSystems IRIS does not provide local documentation. Instead, the Help button will redirect users to the articles in the online documentation at docs.intersystems.com.


For those who, at some point, need to test what ECP means for horizontal scalability (computing power and/or user and process concurrency), but are too lazy or don't have much time to build the environment, configure the server nodes, etc., I've just published on Open Exchange the app/sample OPNEx-ECP Deployment.

Article
· Feb 28, 2023 2m read
DataPipe: a data ingestion framework

Hi all!

I'm sharing a tool for data ingestion that we have used in some projects.

DataPipe is a flexible interoperability framework for data ingestion in InterSystems IRIS. It allows you to receive data from external sources, normalize and validate the information, and finally perform whatever operation you need on your data.
