The Amazon Web Services (AWS) Cloud provides a broad set of infrastructure services, such as compute resources, storage options, and networking that are delivered as a utility: on-demand, available in seconds, with pay-as-you-go pricing. New services can be provisioned quickly, without upfront capital expense. This allows enterprises, start-ups, small and medium-sized businesses, and customers in the public sector to access the building blocks they need to respond quickly to changing business requirements.

Updated: 10-Jan, 2023


Released with no formal announcement in IRIS preview release 2019.4 is the /api/monitor service, which exposes IRIS metrics in Prometheus format. This is big news for anyone wanting to use IRIS metrics as part of their monitoring and alerting solution. The API is a component of the new IRIS System Alerting and Monitoring (SAM) solution that will be released in an upcoming version of IRIS.


The following steps show you how to display a sample list of metrics available from the /api/monitor service.

In the last post, I gave an overview of the service that exposes IRIS metrics in Prometheus format. This post shows how to set up and run IRIS preview release 2019.4 in a container and then list the metrics.


This post assumes you have Docker installed. If not, go and do that now for your platform :)
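
To make that concrete, here is a minimal sketch of querying the endpoint from an IRIS terminal once the preview container is up; it assumes the container publishes the default web server port 52773 on localhost (curl against the same URL works just as well):

  // Fetch the Prometheus-format metrics from the local instance
  set req = ##class(%Net.HttpRequest).%New()
  set req.Server = "localhost"
  set req.Port = 52773
  set sc = req.Get("/api/monitor/metrics")
  if 'sc do $system.Status.DisplayError(sc)
  // Dump the Prometheus text exposition (metric names are prefixed with iris_)
  if sc do req.HttpResponse.OutputToDevice()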

IRIS/Ensemble as an ETL (Article · Apr 9, 2019 · 3m read)

IRIS and Ensemble are designed to act as an ESB/EAI. This means they are built to process lots of small messages.

But sometimes, in real life, we have to use them as an ETL. The downside is not that they can't do it, but that processing millions of rows at once can take a long time.

To improve performance, I have created a new SQL outbound adapter that works only over JDBC.

BatchSqlOutboundAdapter

Extends EnsLib.SQL.OutboundAdapter to add batch and fetch support on a JDBC connection.
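
For contrast, this is roughly what the row-at-a-time pattern looks like with the stock EnsLib.SQL.OutboundAdapter, i.e. the case the batch adapter is meant to speed up; the class, message, and table names below are illustrative only and are not taken from the project:

  Class Demo.ETL.RowInsertOperation Extends Ens.BusinessOperation
  {

  Parameter ADAPTER = "EnsLib.SQL.OutboundAdapter";

  Method InsertRow(pRequest As Demo.ETL.RowMessage, Output pResponse As Ens.Response) As %Status
  {
      // One INSERT per message: fine for small messages, slow for millions of rows
      set tSQL = "INSERT INTO Staging.Person (Name, DOB) VALUES (?, ?)"
      quit ..Adapter.ExecuteUpdate(.tRows, tSQL, pRequest.Name, pRequest.DOB)
  }

  XData MessageMap
  {
  <MapItems>
    <MapItem MessageType="Demo.ETL.RowMessage">
      <Method>InsertRow</Method>
    </MapItem>
  </MapItems>
  }

  }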


Loading your IRIS data into your Google Cloud BigQuery data warehouse and keeping it current can be a hassle with bulky commercial off-the-shelf ETL platforms, but it is dead simple using the iris2bq utility.

Let's say IRIS is contributing to the workload for a hospital system: routing DICOM images, ingesting HL7 messages, posting FHIR resources, or pushing CCDAs to the next provider in a transition of care. Natively, IRIS persists these objects at various stages of the pipeline through the nature of the business processes and anything you included along the way. Let's send that up to Google BigQuery to augment and complement the rest of our data warehouse, and ETL (Extract Transform Load) or ELT (Extract Load Transform) to our hearts' desire.

A reference architecture diagram may be worth a thousand words, but 3 bullet points may work out a little bit better:

  • It exports the data from IRIS into DataFrames.
  • It saves them to GCS as .avro files to keep the schema along with the data; this avoids having to specify or create the BigQuery table schema beforehand.
  • It starts BigQuery jobs to import those .avro files into the BigQuery tables you specify.


Hi everybody!

As you are likely aware, the new version of InterSystems IRIS for Health (I4H) is already available on Docker Hub. It's the Community version, free and fully functional. There have been comments about it in other articles and posts, so today I won't add anything about features. Here I want to explore "the mystery of the disappearance, or rather the absence, of our persistent data when we run a container with the durable option" (I couldn't find a terrifying font to emphasize the thriller... the post editor is not great for styling).
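
As a first check when the data seems to vanish, you can ask the running instance where Durable %SYS is pointing; a minimal sketch, assuming the container was started with the ISC_DATA_DIRECTORY environment variable and a mounted volume:

  // Run in an IRIS terminal inside the container
  set dir = $system.Util.GetEnviron("ISC_DATA_DIRECTORY")
  write "Durable %SYS directory: ",$select(dir="":"<not set>",1:dir),!
  // If it is not set, databases and journals live in the container layer and vanish with it
  if dir'="" write "Directory exists on the volume: ",##class(%File).DirectoryExists(dir),!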


Some time ago I developed an application that tackled a familiar problem faced by many developers: updating multiple UAT or PRODUCTION sites with the latest software patches that have been developed and tested on your DEV server and now need to be deployed to every site running that software.

In principle the solution works as follows:

1) Prepare an XML export of the affected classes, routines, CSP pages, HL7 definitions, et al.

2) Optionally, create a global export of any new globals or changes to existing globals (a sketch of both steps follows).
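
A hedged sketch of what steps 1 and 2 can look like with the standard $system.OBJ export API (item names and file paths are placeholders):

  // Step 1: export the affected code items to a single XML file
  set items = "MyApp.Patient.cls,MyApp.Utils.mac"
  set sc = $system.OBJ.Export(items,"/deploy/patch.xml")
  if 'sc do $system.Status.DisplayError(sc)

  // Step 2 (optional): globals can be exported through the same API as .GBL items
  set sc = $system.OBJ.Export("MyAppConfig.GBL","/deploy/patch-globals.xml")
  if 'sc do $system.Status.DisplayError(sc)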

Unit Tests for Data Transforms (Article · Oct 23, 2019 · 2m read)

Would you like to be sure your data transforms work as expected with a single command? And what about writing unit tests for your data transforms in a quick and simple way?

When talking about interoperability, there are usually a lot of data transforms involved. Those data transforms convert data between the different systems or applications in your code, so they are doing a very important job.
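
To make that concrete, here is a minimal sketch of a %UnitTest test case that calls a DTL class directly through its generated Transform() classmethod; the transform name, sample file, and field path are invented for illustration:

  Class MyApp.Test.ADTTransformTest Extends %UnitTest.TestCase
  {

  Method TestPatientNameIsMapped()
  {
      // Load a sample source message from disk (path is illustrative)
      set source = ##class(EnsLib.HL7.Message).ImportFromFile("/tests/samples/adt_a01.hl7",.sc)
      do $$$AssertStatusOK(sc,"sample message loaded")

      // Every compiled DTL class exposes a Transform() classmethod
      set sc = ##class(MyApp.DTL.ADTv23ToADTv25).Transform(source,.target)
      do $$$AssertStatusOK(sc,"transform ran without error")

      do $$$AssertEquals(target.GetValueAt("PID:5.1"),"DOE","family name mapped to PID:5.1")
  }

  }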


Every developer has made the mistake of accidentally leaving temporary debug code in place when they meant to remove it after debugging is complete. The great thing about writing in ObjectScript is that there is a way to make temporary code truly temporary and automatically self-destruct! This can also be done in such a way that the code has no chance of making it into your source control stream, which can be helpful as well.

Install EnsDemo on IRIS (Article · Jul 4, 2019 · 1m read)

As you may know, the EnsDemo samples from Ensemble are no longer available in IRIS.

This is a good thing: IRIS is cloud oriented, so it must be light and fast. The new way of sharing samples or modules is through git, continuous integration, and Open Exchange.

But in some cases you want to go back to your good old EnsDemo samples for inspiration or best practices.

Good news, there is a git repository for that:
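
Once that repository is cloned, one way to pull the sources into an IRIS namespace is the standard loader; the directory path below is only an example:

  // Load and compile everything under the cloned repo's source folder ("ck" = compile, keep source)
  do $system.OBJ.LoadDir("/home/irisowner/ensdemo/src","ck",.errors,1)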


This is more for my own memory than anything else, but I thought I'd share it because it often comes up in comments and is not in the InterSystems documentation.

There is a wonderful utility called ^REDEBUG that increases the level of logging going into mgr\cconsole.log.

You activate it as follows:

a) start terminal/login

b) zn "%SYS"

c) do ^REDEBUG
