Hi Community!

I think everyone keeps the source code of their projects in a repository nowadays: GitHub, GitLab, Bitbucket, etc. The same goes for InterSystems IRIS projects; check any of them on Open Exchange.

What do we do every time we start or continue working with a repository based on InterSystems Data Platform?

We need a local InterSystems IRIS instance, with the environment for the project set up and the source code imported.

So every developer performs the following:

  1. Check out the code from the repo
  2. Install/run a local IRIS instance
  3. Create a new namespace/database for the project
  4. Import the code into this new namespace
  5. Set up the rest of the environment
  6. Start/continue coding the project

If you dockerize your repository, this list can be shortened to three steps:

  1. Check out the code from the repo
  2. Run docker-compose build
  3. Start/continue coding the project

Profit: no hands-on work for steps 3-5, which could take minutes and sometimes bring headaches.

You can dockerize (almost) any of your InterSystems repos with the few steps that follow. Let’s go!
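
To give a taste of the result, here is a minimal docker-compose.yml sketch for such a repository (the service name, base-image assumption, and port mapping are illustrative, not taken from any particular repo):

    # minimal sketch; assumes the repo contains a Dockerfile that derives
    # from an InterSystems IRIS image and imports the src/ folder
    version: '3.6'
    services:
      iris:
        build: .
        ports:
          - "52773:52773"   # IRIS web server / Management Portal port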


Hi Developers!

Often I find questions on how to install IRIS, connect to IRIS from an IDE, set up the environment, compile, debug, and maintain the repository.

Below is possibly the shortest way to set up the whole environment and start developing in ObjectScript on InterSystems IRIS.

Prerequisites

Make sure you have Git, Docker, and VSCode installed

Install Docker and ObjectScript extensions into VSCode

Sign in or create an account on GitHub

Here we go!


Here is an ObjectScript snippet which creates a database, a namespace, and a web application for InterSystems IRIS:

    set currentNS = $namespace

    zn "%SYS"

    write "Create DB ...",!
    set dbName="testDB"
    set dbProperties("Directory") = "/InterSystems/IRIS/mgr/testDB"
    set status=##Class(Config.Databases).Create(dbName,.dbProperties)
    write:'status $system.Status.DisplayError(status)
    write "DB """_dbName_""" was created!",!!


    write "Create namespace ...",!
    set nsName="testNS"
    //DB for globals
    set nsProperties("Globals") = dbName
    //DB for routines
    set nsProperties("Routines") = dbName
    set status=##Class(Config.Namespaces).Create(nsName,.nsProperties)
    write:'status $system.Status.DisplayError(status)
    write "Namespace """_nsName_""" was created!",!!


    write "Create web application ...",!
    set webName = "/csp/testApplication"
    set webProperties("NameSpace") = nsName
    set webProperties("Enabled") = $$$YES
    set webProperties("IsNameSpaceDefault") = $$$YES
    set webProperties("CSPZENEnabled") = $$$YES
    set webProperties("DeepSeeEnabled") = $$$YES
    set webProperties("AutheEnabled") = $$$AutheCache
    set status = ##class(Security.Applications).Create(webName, .webProperties)
    write:'status $system.Status.DisplayError(status)
    write "Web application """_webName_""" was created!",!

    zn currentNS

Article
· Feb 25, 2019 4m read
Using Grafana directly from IRIS

There have been some very helpful articles in the community that show how to use Grafana with IRIS (or Cache/Ensemble) by using an intermediate database.

But I wanted to get at IRIS structures directly. In particular, I wanted to access the Caché History monitor data that is accessible by SQL, as described here

https://community.intersystems.com/post/apm-using-cach%C3%A9-history-mon...

and didn't want anything between me and the data.

Article
· Apr 9, 2019 3m read
IRIS/Ensemble as an ETL

IRIS and Ensemble are designed to act as an ESB/EAI. This means they are built to process lots of small messages.

But sometimes, in real life, we have to use them as an ETL. The downside is not that they can't do it, but that it can take a long time to process millions of rows at once.

To improve performance, I have created a new SQLOutboundAdapter that works only with JDBC.

BatchSqlOutboundAdapter

Extends EnsLib.SQL.OutboundAdapter to add batch and fetch support on a JDBC connection.
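
As a rough sketch, the subclassing looks something like this (the package name, the BatchSize setting, and its default value are illustrative, not the actual implementation):

    Class Custom.BatchSqlOutboundAdapter Extends EnsLib.SQL.OutboundAdapter
    {

    /// Illustrative setting: how many rows to accumulate per JDBC batch
    Property BatchSize As %Integer(MINVAL = 1) [ InitialExpression = 1000 ];

    /// Expose the setting on the production configuration page
    Parameter SETTINGS = "BatchSize:Basic";

    }

The real adapter then accumulates the parameter rows and flushes them to the JDBC driver in batches of BatchSize instead of executing one statement per row.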


There's an easy new way to add certificate authority (CA) certificates to your SSL/TLS configurations on InterSystems IRIS 2019.1 (and 2018.1.2) on Windows and Mac. You can ask IRIS to use the operating system's certificate store by entering:

%OSCertificateStore

in the field for "File containing Trusted Certificate Authority X.509 certificate(s)" on the SSL/TLS configuration page of the Management Portal.
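
If you'd rather script this than click through the portal, something like the following should work (the configuration name is illustrative; CAFile is, I believe, the property behind that portal field):

    // run in the %SYS namespace
    zn "%SYS"
    set props("CAFile") = "%OSCertificateStore"
    set status = ##class(Security.SSLConfigs).Create("MySSLConfig", .props)
    write:'status $system.Status.DisplayError(status)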


Are you all ready for something you wish you knew ages ago (or, in my case, a DECADE ago)? Open up a portal in your favorite instance and go to:

System Administration->Configuration->Additional Settings->Startup

Scroll down to "Terminal Prompt" and click 'Edit'. This allows you to edit what you see on your terminal prompt. You can change that to my current setting: 8,3,2

What does this do? It adds your instance name to your prompt. So now your prompt can look like:

DEVELOPMENT:USER>
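
If you'd rather script the change than use the portal, the setting appears to be exposed through Config.Startup (an assumption; verify the property name on your version):

    // run in the %SYS namespace
    zn "%SYS"
    set props("TerminalPrompt") = "8,3,2"
    set status = ##class(Config.Startup).Modify(.props)
    write:'status $system.Status.DisplayError(status)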


Some time ago, InterSystems introduced the concept of %DynamicObjects.
This feature is a powerful tool that makes it very easy to convert any string of JSON text to objects and vice versa.
However, in the work that J2 Interactive is doing for our customers, there are a couple of things that "need some tweaking".
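
For context, the basic round trip that makes this feature so handy looks like this (a minimal sketch):

    // parse a JSON string into a dynamic object
    set obj = {}.%FromJSON("{""name"":""J2"",""active"":true}")
    write obj.name,!       // -> J2
    set obj.count = 42     // add a property on the fly
    // serialize back to a JSON string
    write obj.%ToJSON(),!  // -> {"name":"J2","active":true,"count":42}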


Hi Developers!

Recently we launched the InterSystems Package Manager - ZPM. One of the intentions of ZPM is to let you package your solution and submit it to the ZPM registry, making its deployment as simple as an "install your-package" command.

To do that, you need to introduce a module.xml file into your repository which describes what your InterSystems IRIS package consists of.

This article describes the different parts of module.xml and will help you craft your own.

I will start with the samples-objectscript package, which installs the Sample ObjectScript application into IRIS and can be installed with:

zpm: USER>install samples-objectscript

It is probably the simplest package ever, and here is the module.xml which describes it:

<?xml version="1.0" encoding="UTF-8"?>
<Export generator="Cache" version="25">
  <Document name="samples-objectscript.ZPM">
    <Module>
      <Name>samples-objectscript</Name>
      <Version>1.0.0</Version>
      <Packaging>module</Packaging>
      <SourcesRoot>src</SourcesRoot>
      <Resource Name="ObjectScript.PKG"/>
    </Module>
  </Document>
</Export>


Keywords: Anaconda, Jupyter Notebook, Tensorflow GPU, Deep Learning, Python 3 and HealthShare

1. Purpose and Objectives

This "Part I" is a quick record of how to set up a "simple" but popular deep learning demo environment step-by-step, with a Python 3 binding to a HealthShare 2017.2.1 instance. I used a Win10 laptop at hand, but the approach works the same on macOS and Linux.


How can you allow computers to trust one another in your absence while maintaining security and privacy?

“A Dry Martini”, he said. “One. In a deep champagne goblet.”
“Oui, monsieur.”
“Just a moment. Three measures of Gordons, one of vodka, half a measure of Kina Lillet. Shake it very well until it’s ice-cold, then add a large thin slice of lemon peel. Got it?”
"Certainly, monsieur." The barman seemed pleased with the idea.
Casino Royale, Ian Fleming, 1953


OAuth helps to separate services with user credentials from “working” databases, both physically and geographically. It thereby strengthens the protection of identification data and, if necessary, helps you comply with the requirements of countries' data protection laws.

With OAuth, you can provide the user with the ability to work safely from multiple devices at once, while "exposing" personal data to various services and applications as little as possible. You can also avoid taking on "excess" data about users of your services (i.e. you can process data in a depersonalized form).


ObjectScript offers at least three different ways to process errors (status codes, exceptions, SQLCODE, etc.). Most system code uses statuses, but for a range of reasons exceptions are more convenient to work with. When dealing with legacy code, you spend some time translating between the various techniques. I keep several of these snippets for reference. I hope they will help others too.
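
For example, here is a minimal sketch of the most common translation, status to exception and back (it assumes the code is compiled as a class method so the $$$ macros are available):

    // build an error status (5001 is the generic "general error" code)
    set sc = $system.Status.Error(5001, "something went wrong")
    try {
        // status -> exception
        $$$ThrowOnError(sc)
    } catch ex {
        // exception -> status
        set sc2 = ex.AsStatus()
        write $system.Status.GetErrorText(sc2),!
    }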


The following steps show you how to display a sample list of metrics available from the /api/monitor service.

In the last post, I gave an overview of the service that exposes IRIS metrics in Prometheus format. This post shows how to set up and run the IRIS 2019.4 preview release in a container and then list the metrics.


This post assumes you have Docker installed. If not, go and do that now for your platform :)
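
Once the container is running, listing the metrics is just an HTTP GET; here is a minimal ObjectScript sketch (localhost and port 52773 are assumptions about how you mapped the container's ports):

    // fetch the metrics in Prometheus text format
    set req = ##class(%Net.HttpRequest).%New()
    set req.Server = "localhost"
    set req.Port = 52773
    set sc = req.Get("/api/monitor/metrics")
    if sc { do req.HttpResponse.OutputToDevice() }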


This is more for my own memory than anything else, but I thought I'd share it because it often comes up in comments yet is not in the InterSystems documentation.

There is a wonderful utility called ^REDEBUG that increases the level of logging going into mgr\cconsole.log.

You activate it by

a) start terminal/login

b) zn "%SYS"

c) do ^REDEBUG


In this article, we will explore the development of an IRIS client for consuming RESTful API services that have been developed to the OData API standard.

We will be exploring a number of built-in IRIS libraries for making HTTP requests, reading and writing to JSON payloads, and seeing how we can use them in combination to build a generic client adaptor for OData. We will also explore the new JSON adapter for deserializing JSON into persistent objects.
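
As a taste of that last piece, a persistent class only needs to extend %JSON.Adaptor to gain %JSONImport/%JSONExport methods (Demo.Person and its properties are illustrative):

    Class Demo.Person Extends (%Persistent, %JSON.Adaptor)
    {

    Property Name As %String;

    Property Email As %String;

    }

With that in place, set person = ##class(Demo.Person).%New(), and person.%JSONImport(json) populates the object directly from a JSON string or stream.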

Article
· Jul 4, 2019 1m read
Install EnsDemo on IRIS

As you may know, the EnsDemo samples from Ensemble are not available anymore on IRIS.

This is a good thing: IRIS is cloud-oriented, so it must be light and fast. The new way of sharing samples or modules is through Git, continuous integration, and Open Exchange.

But, in some cases you want to go back to your good old samples from EnsDemo to get inspiration or best practices.

Good news: there is a Git repository for that:

Article
· Jan 11, 2019 4m read
SQL Performance Resources

There are three things most important to any SQL performance conversation: Indices, TuneTable, and Show Plan. The attached PDFs include historical presentations on these topics that cover the basics of all three in one place. Our documentation provides more detail on these and other SQL performance topics in the links below. The eLearning options reinforce several of these topics. In addition, there are several Developer Community articles which touch on SQL performance, and the relevant links are also listed.

There is a fair amount of repetition in the information listed below. The most important aspects of SQL performance to consider are:

  1. The types of indices available
  2. When to use one index type over another
  3. The information TuneTable gathers for a table and what it means to the Optimizer
  4. How to read a Show Plan to better understand if a query is good or bad
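
For quick reference, TuneTable can be kicked off from a terminal roughly like this (Sample.Person is an illustrative table name; check the %SYSTEM.SQL class reference for the exact signature on your version):

    // gather table statistics for the optimizer; the second argument displays the results
    do $SYSTEM.SQL.TuneTable("Sample.Person", 1)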

Loading your IRIS data into your Google Cloud BigQuery data warehouse and keeping it current can be a hassle with bulky commercial third-party off-the-shelf ETL platforms, but it is dead simple using the iris2bq utility.

Let's say IRIS is contributing to the workload for a hospital system, routing DICOM images, ingesting HL7 messages, posting FHIR resources, or pushing CCDAs to the next provider in a transition of care. Natively, IRIS persists these objects at various stages of the pipeline through the nature of the business processes and anything you included along the way. Let's send that up to Google BigQuery to augment and complement the rest of our data warehouse data, and ETL (Extract Transform Load) or ELT (Extract Load Transform) to our hearts' desire.

A reference architecture diagram may be worth a thousand words, but 3 bullet points may work out a little bit better:

  • It exports the data from IRIS into DataFrames
  • It saves them into GCS as .avro files to keep the schema along with the data: this avoids having to specify/create the BigQuery table schema beforehand.
  • It starts BigQuery jobs to import those .avro files into the respective BigQuery tables you specify.


The last time I created a playground for experimenting with machine learning using Apache Spark and an InterSystems data platform (see Machine Learning with Spark and Caché), I installed and configured everything directly on my laptop: Caché, Python, Apache Spark, Java, some Hadoop libraries, to name a few. It required some effort, but eventually it worked.

Article
· Jun 10, 2019 1m read
More useful Object Dump

While testing your code, you are often confronted with the need to examine
the actual content of an object. Either using ZWRITE or $system.OBJ.Dump()
you get a picture of simple properties as "--- attribute values ---"
while "--- swizzled references ---" are more confusing than informative
and with "--- calculated references ---" you are just left in the lurch.

This small helper class allows you to dump an object to the terminal or,
e.g. in the background, to some stream for later review.
By default, you see just the properties with content,
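
For comparison, the built-in options mentioned above are invoked like this (Sample.Person and the ID are illustrative):

    set obj = ##class(Sample.Person).%OpenId(1)
    zwrite obj                  // attribute values plus the references mentioned above
    do $system.OBJ.Dump(obj)    // the classic dump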

Article
· Dec 7, 2019 1m read
About %objlasterror

%objlasterror is a useful reference to the last error.

Every time $$$ERROR is called, %objlasterror is set to the result of that call.

It's important in cases where you want to convert an exception to a status:

Try {
   //  quality code
} Catch ex {
   // read %objlasterror first: it may hold a status from an earlier $$$ERROR call
   Set sc = $g(%objlasterror, $$$OK)
   // AsStatus() itself calls $$$ERROR, which overwrites %objlasterror
   Set sc = $$$ADDSC(sc, ex.AsStatus())
}

Because AsStatus calls $$$ERROR under the hood, the order is important: first you need to get %objlasterror, and only then convert the exception.


Keywords: Jupyter Notebook, Tensorflow GPU, Keras, Deep Learning, MLP, and HealthShare

1. Purpose and Objectives

In the previous "Part I" we set up a deep learning demo environment. In this "Part II" we will test what we can do with it.

Many people my age started with the classic MLP (Multi-Layer Perceptron) model. It is intuitive and hence conceptually easier to start with.
