Announcement
James Schultz · Jun 14, 2018
Hi Community!

Come join us at DeveloperWeek NYC on June 18-20! InterSystems has signed on for a high-level sponsorship and exhibitor space at this year's DeveloperWeek, billed as "New York City’s Largest Developer Conference & Expo". This is the first time we have participated in the event, which organizers expect will draw more than 3,000 developers from June 18th to 20th.

“The world is changing rapidly, and our target audience is far more diverse in terms of roles and interests than it used to be... DeveloperWeek NYC is a gathering of people who create applications for a living, and we want developers to see the power and capabilities of InterSystems. We need to know them, and they need to know us, as our software can be a foundation for their success,” says Marketing Vice President Jim Rose.

The main feature at InterSystems booth 812 is the new sandbox experience for the InterSystems IRIS Data Platform™. Meanwhile, Director of Data Platforms Product Management @Iran.Hutchinson is delivering two presentations on the conference agenda. The first, "GraalVM: What it is. Why it’s important. Experiments you should try today", will be on the Main Stage on June 19 between 11:00 a.m. and 11:20 a.m. GraalVM is an open-source set of projects driven by Oracle Labs, the Institute for Software at Kepler University Linz, Austria, and a community of contributors.

On the following day, Hutchinson will lead a follow-on presentation to Frictionless Systems founder Carter Jernigan's productivity-boosting "Power Up with Flow: Get “In the Zone” to Get Things Done", which runs from 11:00 a.m. to 11:45 a.m. on Workshop Stage 2. In "Show and Tell Your Tips and Techniques – and Win in Powers of 2!" he leads an open exchange of productivity ideas, tips, and innovations, culminating in prizes to the "Power of 2" for the best ideas. If you are attending, it takes place between 11:45 a.m. and 12:30 p.m., also on Workshop Stage 2.

Don't forget to check these useful links:
- All details about DeveloperWeek NYC
- The original agenda

Register now and see you in New York!

Very cool!

Thanks! We're very excited to be participating!
Article
Gevorg Arutiunian · Sep 4, 2018

I already talked about [GraphQL](https://community.intersystems.com/post/graphql-intersystems-data-platforms) and the ways of using it in this article. Now I am going to tell you about the tasks I was facing and the results that I managed to achieve in the process of implementing GraphQL for InterSystems platforms.
## What this article is about
- Generation of an [AST](https://en.wikipedia.org/wiki/Abstract_syntax_tree) for a GraphQL request and its validation
- Generation of documentation
- Generation of a response in the JSON format
Let’s take a look at the entire process from sending a request to receiving a response using this simple flow as an example:

The client can send requests of two types to the server:
- A schema request.
The server generates a schema and returns it to the client. We’ll cover this process later in this article.
- A request to fetch/modify a particular data set.
In this case, the server generates an AST, validates it, and returns a response.
## AST generation
The first task that we had to solve was parsing the received GraphQL request. Initially, I wanted to find an external library and use it to parse the request and generate an AST. However, I discarded the idea for a number of reasons: it would be yet another black box, and you also have to keep the issue of long callbacks in mind.
That’s how I ended up deciding to write my own parser, but where would I get its description? Things got better when I realized that [GraphQL](http://facebook.github.io/graphql/October2016/) was an open-source project with a pretty good specification from Facebook. I also found multiple examples of parsers written in other languages.
You can find a description of an AST [here](http://facebook.github.io/graphql/October2016/#Document).
Let’s take a look at a sample request and tree:
```
{
  Sample_Company(id: 15) {
    Name
  }
}
```
**AST**
```
{
"Kind": "Document",
"Location": {
"Start": 1,
"End": 45
},
"Definitions": [
{
"Kind": "OperationDefinition",
"Location": {
"Start": 1,
"End": 45
},
"Directives": [],
"VariableDefinitions": [],
"Name": null,
"Operation": "Query",
"SelectionSet": {
"Kind": "SelectionSet",
"Location": {
"Start": 1,
"End": 45
},
"Selections": [
{
"Kind": "FieldSelection",
"Location": {
"Start": 5,
"End": 44
},
"Name": {
"Kind": "Name",
"Location": {
"Start": 5,
"End": 20
},
"Value": "Sample_Company"
},
"Alias": null,
"Arguments": [
{
"Kind": "Argument",
"Location": {
"Start": 26,
"End": 27
},
"Name": {
"Kind": "Name",
"Location": {
"Start": 20,
"End": 23
},
"Value": "id"
},
"Value": {
"Kind": "ScalarValue",
"Location": {
"Start": 24,
"End": 27
},
"KindField": 11,
"Value": 15
}
}
],
"Directives": [],
"SelectionSet": {
"Kind": "SelectionSet",
"Location": {
"Start": 28,
"End": 44
},
"Selections": [
{
"Kind": "FieldSelection",
"Location": {
"Start": 34,
"End": 42
},
"Name": {
"Kind": "Name",
"Location": {
"Start": 34,
"End": 42
},
"Value": "Name"
},
"Alias": null,
"Arguments": [],
"Directives": [],
"SelectionSet": null
}
]
}
}
]
}
}
]
}
```
## Validation
Once we receive a tree, we’ll need to check if it has classes, properties, arguments and their types on the server – that is, we’ll need to validate it. Let’s traverse the tree recursively and check whether its elements match the ones on the server. [Here’s](https://github.com/intersystems-ru/GraphQL/blob/master/cls/GraphQL/Query/Validation.cls) how a class looks.
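Here is a minimal sketch of that recursive walk in ObjectScript. This is not the actual class from the repository; the node properties follow the AST sample above, and `ValidateField` is a hypothetical helper that checks a field name against the class definition on the server:
```
ClassMethod ValidateSelectionSet(node As %DynamicObject, className As %String = "") As %Status
{
    set sc = $$$OK
    // walk every selection in this selection set
    set iter = node.Selections.%GetIterator()
    while iter.%GetNext(.key, .selection) {
        continue:selection.Kind'="FieldSelection"
        // hypothetical helper: does className expose a property/class named selection.Name.Value?
        set sc = ..ValidateField(className, selection.Name.Value)
        quit:$$$ISERR(sc)
        // recurse into nested selection sets, e.g. Sample_Company -> Name
        if $isobject(selection.SelectionSet) {
            set sc = ..ValidateSelectionSet(selection.SelectionSet, selection.Name.Value)
            quit:$$$ISERR(sc)
        }
    }
    quit sc
}
```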
## Schema generation
A **schema** is a type of documentation for available classes and properties, as well as a description of property types in these classes.
GraphQL implementations in other languages or technologies use resolvers to generate schemas. A resolver is a description of the types of data available on the server.
**Examples of resolvers, requests, and responses**
```
type Query {
  human(id: ID!): Human
}

type Human {
  name: String
  appearsIn: [Episode]
  starships: [Starship]
}

enum Episode {
  NEWHOPE
  EMPIRE
  JEDI
}

type Starship {
  name: String
}
```
```
{
  human(id: 1002) {
    name
    appearsIn
    starships {
      name
    }
  }
}
```
```json
{
"data": {
"human": {
"name": "Han Solo",
"appearsIn": [
"NEWHOPE",
"EMPIRE",
"JEDI"
],
"starships": [
{
"name": "Millenium Falcon"
},
{
"name": "Imperial shuttle"
}
]
}
}
}
```
However, before we can generate a schema, we need to understand its structure and find a description or, even better, examples. The first thing I did was try to find an existing schema to learn from. Since GitHub has its own [GraphQL API](https://developer.github.com/v4/explorer/), it was easy to get one from there. But the problem was that its server side is so huge that the schema itself occupied 64 thousand lines. I really hated the idea of delving into all that and started looking for other ways of obtaining a schema.
Since our platforms are based on a DBMS, my plan for the next step was to build and start GraphQL for PostgreSQL and SQLite. With PostgreSQL, I managed to fit the schema into just 22 thousand lines, and SQLite gave me an even better result with 18 thousand lines. It was better than the starting point, but still not enough, so I kept on looking.
I ended up choosing a [NodeJS](https://graphql.org/graphql-js/) implementation, made a build, wrote a simple resolver, and got a solution with just 1800 lines, which was way better!
Once I had wrapped my head around the schema, I decided to generate it automatically without creating resolvers on the server in advance, since getting meta information about classes and their relationships is really easy.
To generate your own schema, you need to understand a few simple things:
- You don’t need to generate it from scratch – take one from NodeJS, remove the unnecessary stuff and add the things that you do need.
- The root of the schema has a queryType type. You need to initialize its “name” field with some value. We are not interested in the other two types since they are still being implemented at this point.
- You need to add all the available classes and their properties to the **types** array.
```
{
"data": {
"__schema": {
"queryType": {
"name": "Query"
},
"mutationType": null,
"subscriptionType": null,
"types":[...
],
"directives":[...
]
}
}
}
```
- First of all, you need to describe the **Query** root element and add all the classes, their arguments, and class types to the **fields** array. This way, they will be accessible from the root element.
**Let’s take a look at two sample classes, Example_City and Example_Country**
```
{
"kind": "OBJECT",
"name": "Query",
"description": "The query root of InterSystems GraphQL interface.",
"fields": [
{
"name": "Example_City",
"description": null,
"args": [
{
"name": "id",
"description": "ID of the object",
"type": {
"kind": "SCALAR",
"name": "ID",
"ofType": null
},
"defaultValue": null
},
{
"name": "Name",
"description": "",
"type": {
"kind": "SCALAR",
"name": "String",
"ofType": null
},
"defaultValue": null
}
],
"type": {
"kind": "LIST",
"name": null,
"ofType": {
"kind": "OBJECT",
"name": "Example_City",
"ofType": null
}
},
"isDeprecated": false,
"deprecationReason": null
},
{
"name": "Example_Country",
"description": null,
"args": [
{
"name": "id",
"description": "ID of the object",
"type": {
"kind": "SCALAR",
"name": "ID",
"ofType": null
},
"defaultValue": null
},
{
"name": "Name",
"description": "",
"type": {
"kind": "SCALAR",
"name": "String",
"ofType": null
},
"defaultValue": null
}
],
"type": {
"kind": "LIST",
"name": null,
"ofType": {
"kind": "OBJECT",
"name": "Example_Country",
"ofType": null
}
},
"isDeprecated": false,
"deprecationReason": null
}
],
"inputFields": null,
"interfaces": [],
"enumValues": null,
"possibleTypes": null
}
```
- Our second step is to go one level higher and extend the **types** array with the classes that have already been described in the **Query** object with all of the properties, types, and relationships with other classes.
**Descriptions of classes**
```
{
"kind": "OBJECT",
"name": "Example_City",
"description": "",
"fields": [
{
"name": "id",
"description": "ID of the object",
"args": [],
"type": {
"kind": "SCALAR",
"name": "ID",
"ofType": null
},
"isDeprecated": false,
"deprecationReason": null
},
{
"name": "Country",
"description": "",
"args": [],
"type": {
"kind": "OBJECT",
"name": "Example_Country",
"ofType": null
},
"isDeprecated": false,
"deprecationReason": null
},
{
"name": "Name",
"description": "",
"args": [],
"type": {
"kind": "SCALAR",
"name": "String",
"ofType": null
},
"isDeprecated": false,
"deprecationReason": null
}
],
"inputFields": null,
"interfaces": [],
"enumValues": null,
"possibleTypes": null
},
{
"kind": "OBJECT",
"name": "Example_Country",
"description": "",
"fields": [
{
"name": "id",
"description": "ID of the object",
"args": [],
"type": {
"kind": "SCALAR",
"name": "ID",
"ofType": null
},
"isDeprecated": false,
"deprecationReason": null
},
{
"name": "City",
"description": "",
"args": [],
"type": {
"kind": "LIST",
"name": null,
"ofType": {
"kind": "OBJECT",
"name": "Example_City",
"ofType": null
}
},
"isDeprecated": false,
"deprecationReason": null
},
{
"name": "Name",
"description": "",
"args": [],
"type": {
"kind": "SCALAR",
"name": "String",
"ofType": null
},
"isDeprecated": false,
"deprecationReason": null
}
],
"inputFields": null,
"interfaces": [],
"enumValues": null,
"possibleTypes": null
}
```
- The third point is that the “types” array contains descriptions of all popular scalar types, such as int, string, etc. We’ll add our own scalar types there, too.
## Response generation
Here is the most complex and exciting part. A request should generate a response. At the same time, the response should be in the JSON format and match the request structure.
For each new GraphQL request, the server has to generate a class containing the logic for obtaining the requested data. The class is not considered new if the values of arguments changed – that is, if we get a particular dataset for Moscow and then the same set for London, no new class will be generated, it’s just going to be new values. In the end, this class will contain an SQL query and the resulting dataset will be saved in the JSON format with its structure matching the GraphQL request.
**An example of a request and a generated class**
```
{
  Sample_Company(id: 15) {
    Name
  }
}
```
```
Class gqlcq.qsmytrXzYZmD4dvgwVIIA [ Not ProcedureBlock ]
{

ClassMethod Execute(arg1) As %DynamicObject
{
    set result = {"data":{}}
    set query1 = []
    #SQLCOMPILE SELECT=ODBC
    &sql(DECLARE C1 CURSOR FOR
        SELECT Name
        INTO :f1
        FROM Sample.Company
        WHERE id = :arg1
    )
    &sql(OPEN C1)
    &sql(FETCH C1)
    While (SQLCODE = 0) {
        do query1.%Push({"Name":(f1)})
        &sql(FETCH C1)
    }
    &sql(CLOSE C1)
    set result.data."Sample_Company" = query1
    quit result
}

ClassMethod IsUpToDate() As %Boolean
{
    quit:$$$comClassKeyGet("Sample.Company",$$$cCLASShash)'="3B5DBWmwgoE" $$$NO
    quit $$$YES
}

}
```
Here is how this process looks as a diagram:

For now, the response is generated for the following requests:
- Basic
- Embedded objects
- Only the many-to-one relationship
- List of simple types
- List of objects
Below is a diagram showing the types of relationships that are yet to be implemented:

## Summary
- **Response** — at the moment, you can get a set of data for relatively simple requests.
- **Automatically generated schema** — the schema is generated for stored classes accessible to the client, but not for pre-defined resolvers.
- **A fully functional parser** – the parser is fully implemented; you can get a tree for a request of any complexity.
→ [Link to the project repository](https://github.com/intersystems-community/GraphQL)
→ [Link to the demo server](http://37.139.6.217:57773/graphiql/index.html)
Demo server doesn't work.

Thank you, I fixed this issue!
Announcement
Michelle Spisak · Apr 30, 2018
When you hear people talk about moving their applications to the cloud, are you unsure of what exactly they mean? Do you want a solution for migrating your local, physical servers to a flexible, efficient cloud infrastructure? Join Luca Ravazzolo for Introducing InterSystems Cloud Manager (May 17th, 2:00 p.m. EDT). In this webinar, Luca — Product Manager for InterSystems Cloud Manager — will explain cloud technology and how you can move your InterSystems IRIS infrastructure to the cloud in an operationally agile fashion. He will also be able to answer your questions about this great new product from InterSystems following the webinar!

Thanks, Michelle! I'm happy to answer any questions anybody may have about the webinar, where I presented InterSystems Cloud Manager, and generally about the improvements an organization can achieve in its software factory with the newly available technologies from InterSystems.

This webinar recording has been posted: https://learning.intersystems.com/course/view.php?name=IntroICMwebinar

And now this webinar recording is available on the InterSystems Developers YouTube Channel. Please welcome!
Announcement
Evgeny Shvarov · Aug 7, 2018
Hi, Community!
Just want to let you know that InterSystems IRIS is available on Google Cloud Marketplace.
Start here to get your InterSystems IRIS VM on GCP.
You can request a license for InterSystems IRIS here.
Learn more in the official press release.

Create an IRIS instance in 5 minutes video... https://youtu.be/f_uVe_Q5X-c
Announcement
Sourabh Sethi · Jul 29, 2019
A SOLID Design in Cache Object
In this session, we will discuss the SOLID principles of programming and implement them in an example. I have used the Caché Object programming language for the examples. We will go step by step to understand the requirements, the common mistakes we tend to make while designing, and each principle, and then complete the design with its implementation via Caché Objects.
If you have any questions or suggestions, please write to me - sethisourabh.hit@gmail.com
CodeSet - https://github.com/sethisourabh/SolidPrinciplesTraining

Thanks for sharing this knowledge on the ObjectScript language. I hadn't heard of the SOLID principles before; I'll apply them in my next code. BTW: can you share your slides for an easier walkthrough?

Thank you for your response. I don't see any way to attach documents here. Send me your email ID and I will send them over. My email ID: sethisourabh.hit@gmail.com. Regards, Sourabh

You could use https://www.slideshare.net/ or add the document to the GitHub repo. There is a way to post documents on the InterSystems Community under Edit Post -> Change Additional Settings, which I documented here, but it's not user friendly and I didn't automatically see links to attached documents within the post, so I had to manually add the links. Community feedback suggests they may turn this feature off at some point, so I'd recommend any of the above options instead.

Thanks, @Stephen.Wilson! Yes, we plan to turn off the attachments feature. As you mention, there are a lot of better ways to expose presentations and code. And as you see, @Sourabh.Sethi6829 posted the recent package for his recent video on Open Exchange.

Do I need the code set for this session in Open Exchange?

Would be great – it's even more presence, and developers can collaborate.

DONE
Question
Evgeny Shvarov · Jul 23, 2019
Hi guys! What is the IRIS analog for Ensemble.INC? I tried to compile the class in IRIS and it says:
Error compiling routine: Util.LogQueueCounts. Errors: Util.LogQueueCounts.cls : Util.LogQueueCounts.1(7) : MPP5635 : No include file 'Ensemble'
You just have to enable Ensemble in the installer
<Namespace Name="${NAMESPACE}" Code="${DBNAME}-CODE" Data="${DBNAME}-DATA" Create="yes" Ensemble="1">
That helped! Thank you!

What do you mean? Ensemble.inc is still there.
Article
Evgeny Shvarov · Sep 19, 2019
Hi Developers!
Recently we launched InterSystems Package Manager - ZPM. One of the intentions of ZPM is to let you package your solution and submit it to the ZPM registry to make its deployment as simple as an "install your-package" command.
To do that, you need to introduce a module.xml file into your repository which describes what your InterSystems IRIS package consists of.
This article describes the different parts of module.xml and will help you craft your own.
I will start with the samples-objectscript package, which installs the Sample ObjectScript application into IRIS and can be installed with:
zpm: USER>install samples-objectscript
It is probably the simplest package ever, and here is the module.xml which describes the package:
<?xml version="1.0" encoding="UTF-8"?>
<Export generator="Cache" version="25">
  <Document name="samples-objectscript.ZPM">
    <Module>
      <Name>samples-objectscript</Name>
      <Version>1.0.0</Version>
      <Packaging>module</Packaging>
      <SourcesRoot>src</SourcesRoot>
      <Resource Name="ObjectScript.PKG"/>
    </Module>
  </Document>
</Export>
Let's go line-by-line through the document.
<Export generator="Cache" version="25">
module.xml belongs to the family of Caché/IRIS XML documents, so this line states that relation so the internal libraries can recognize the document.
The next section is <Document>:
<Document name="samples-objectscript.ZPM">
Your package should have a name. The name can contain lowercase letters and the "-" sign, e.g. samples-objectscript in this case. Put the name of your package in the name attribute of the Document tag, with the .ZPM extension.
Inner elements of the Document are:
<Name> - the name of your package. In this case:
<Name>samples-objectscript</Name>
<Version> - the version of the package. In this case:
<Version>1.0.0</Version>
<Packaging>module</Packaging> - the type of packaging. Put the module parameter here.
<Packaging>module</Packaging>
<SourcesRoot> - the folder where ZPM will look for ObjectScript to import.
In this case, we tell it to look for ObjectScript in the /src folder:
<SourcesRoot>src</SourcesRoot>
<Resource Name> - elements of ObjectScript to import. This could be packages, classes, includes, globals, dfi, etc.
The structure under the SourcesRoot folder should be the following:
/cls - all the ObjectScript classes in Folder=Package, Class=file.cls form. Subpackages are subfolders
/inc - all the include files in file.inc form.
/mac - all the mac routines.
/int - all the "intermediate" routines (AKA "other" code: the result of compiling mac code, or ObjectScript without classes and macros).
/gbl - all the globals in xml form of export.
/dfi - all the DFI files in xml form of export. Each pivot comes in pivot.dfi file, each dashboard comes in dashboard.dfi file.
E.g., here we import the ObjectScript package. This tells ZPM to look in the /src/cls/ObjectScript folder and import all the classes from it:
<Resource Name="ObjectScript.PKG"/>
So! To prepare your solution for packaging, put the ObjectScript classes into the /cls folder of your repository and place all packages and classes in package=folder, class=file.cls form.
If you store classes in your repo differently and don't want to manually prepare the proper folder structure for ObjectScript, there are plenty of tools which do the work: Atelier and VSCode ObjectScript export classes this way, and there is also the isc-dev utility which exports all the artifacts from a namespace ready for packaging.
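For example, a hypothetical package with one class package, an include file, and a mac routine could be laid out like this (all names are purely illustrative):

module.xml
src/
  cls/
    MyPackage/
      MyClass.cls (class MyPackage.MyClass)
      SubPackage/
        Helper.cls (class MyPackage.SubPackage.Helper)
  inc/
    MyMacros.inc
  mac/
    MyRoutine.mac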
Packaging mac routines
This is very similar to classes. Just put routines under /mac folder. Example.
<?xml version="1.0" encoding="UTF-8"?>
<Export generator="Cache" version="25">
  <Document name="DeepSeeButtons.ZPM">
    <Module>
      <Name>DeepSeeButtons</Name>
      <Version>0.1.7</Version>
      <Packaging>module</Packaging>
      <SourcesRoot>src</SourcesRoot>
      <Resource Name="DeepSeeButtons.mac"/>
    </Module>
  </Document>
</Export>
Some other elements
There are also optional elements, like <Author>, which can contain <Organization> and <CopyrightDate> elements.
Example:
<Author>
  <Organization>InterSystems</Organization>
  <CopyrightDate>2019</CopyrightDate>
</Author>
Packaging CSP/Web applications
ZPM can deploy web applications too.
To make it work, introduce a CSPApplication element with the attributes of the CSP application.
For example, take a look at the CSPApplication tag of the DeepSeeWeb module.xml:
<CSPApplication
Url="/dsw"
DeployPath="/build"
SourcePath="${cspdir}/dsw"
ServeFiles="1"
Recurse="1"
CookiePath="/dsw"
/>
This setting will create a web application with the name /dsw and copy all the files from the /build folder of the repository into the ${cspdir}/dsw folder, which is a folder under the IRIS CSP directory.
REST API application
If this is a REST API application, the CSPApplication element will contain a dispatch class and could look like this one from the MDX2JSON module.xml:
<CSPApplication
Path="/MDX2JSON"
Url="/MDX2JSON"
CookiePath="/MDX2JSON/"
PasswordAuthEnabled="1"
UnauthenticatedEnabled="1"
DispatchClass="MDX2JSON.REST"
/>
Dependencies
Your module may expect another module to be installed on the target system. This is described by a <Dependencies> element inside the <Document> element, which can contain several <ModuleReference> elements, each with a <Name> and a <Version>, stating which other modules (and which versions) should be installed before yours. This will cause ZPM to check whether those modules are installed and, if not, perform the installation.
Here is an example of the DSW module's dependency on the MDX2JSON module:
<Dependencies>
  <ModuleReference>
    <Name>MDX2JSON</Name>
    <Version>2.2.0</Version>
  </ModuleReference>
</Dependencies>
Another example, where ThirdPartyPortlets depends on Samples BI (holefoods):
<Dependencies>
  <ModuleReference>
    <Name>holefoods</Name>
    <Version>0.1.0</Version>
  </ModuleReference>
</Dependencies>
There are also options to run arbitrary code to set up data and the environment; we will talk about that in the next articles.
How to build your own package
OK! Once you have a module.xml, you can try to build the package and test whether its structure is accurate.
You can test via the zpm client. Install zpm on an IRIS system and load the package code with the load command:
zpm: NAMESPACE>load path-to-the-project
The path points to the folder which contains the resources for the package and has module.xml in the root folder.
E.g., you can test package building with this project. Check it out and build a container with docker-compose-zpm.yml.
Open a terminal in the SAMPLES namespace and call ZPM:
zpm: SAMPLES>
zpm: SAMPLES>load /iris/app
[samples-objectscript] Reload START
[samples-objectscript] Reload SUCCESS
[samples-objectscript] Module object refreshed.
[samples-objectscript] Validate START
[samples-objectscript] Validate SUCCESS
[samples-objectscript] Compile START
[samples-objectscript] Compile SUCCESS
[samples-objectscript] Activate START
[samples-objectscript] Configure START
[samples-objectscript] Configure SUCCESS
[samples-objectscript] Activate SUCCESS
The path is "/iris/app" cause we tell in docker-compose-zpm.yml that we map the root of the project to /iris/app folder in the container. So we can use this path to tell zpm where to load the project from.
So! The load performed successfully. And this means that module.xml could be used to submit a package to the developers' community repository.
Now you know how to make a proper module.xml for your application.
How to submit the application to InterSystems Community repository
As of today, there are two requirements:
1. Your application should be listed on Open Exchange
2. Request it from me in a direct message or in the comments to this post if you want your application to be submitted to the Community Package Manager repository.
And you should have a working module.xml!)

Updated: module documents changed from .MODULE to .ZPM.

Hi @Evgeny.Shvarov! I'm creating a module.xml for iris-history-monitor, and during the process a question came up. When you run docker-compose up in my project, the Installer has an invoke tag to execute a class method.
But how can I make this work in ZPM?

Here is the objectscript package template, which has an example module.xml with almost everything that could happen in a package.
Take a look at the invoke tag:
<Invokes>
  <Invoke Class="community.objectscript.PersistentClass" Method="CreateRecord"></Invoke>
  <Invoke Class="community.objectscript.ClassExample" Method="SetToTheGlobal">
    <Arg>42</Arg>
    <Arg>Text Data</Arg>
  </Invoke>
</Invokes>
Place <Invoke> elements inside the <Invokes> tag. You can pass parameters if you need to. This article describes all the details about <Invoke> elements.

Perfect! Thanks, @Evgeny.Shvarov!

ZPM forces you to use categories in the folder structure... Perhaps, to make it easier, the VS Code ObjectScript extension should be configured with that option by default... just an idea.
Also, is there any place with full documentation about module.xml? The articles are full of really useful info, but having to navigate through all of them is a bit confusing.

Hi Salva!
Thanks, but not anymore. Check the article.
> Also, is there any place with full documentation about module.xml? The articles are full of really useful info, but having to navigate through all of them is a bit confusing.

Sure. Here is the ZPM documentation.

Oh my... I didn't see the Wiki...
Thanks!

Aggggghhh... OK... back to the previous structure. At least I can confirm that it was a good idea... but I was not the first one.

Yes. We first introduced that one because it is exactly how the Atelier API exports ObjectScript artifacts by default, but IMHO the simplified one is much better.
And that's why we have it in the ZPM basic template.

What is the rule for VERSION in Dependencies? Is it an EQUAL or a MINIMUM version?
E.g. <Version>0.0.0</Version> would mean any version:
<Dependencies>
  <ModuleReference>
    <Name>holefoods</Name>
    <Version>0.1.0</Version>
  </ModuleReference>
</Dependencies>
Thanks!

I used the ZPM "generate" function to create a module.xml for a Caché Server Pages application. It worked great! However, I cannot figure out how to get the classes and CSP files for this application physically out of IRIS and into the "src" folder with the module.xml, so I can install the application, via ZPM, onto another server. Can anyone help me with this part?

Hi Jason!

If you are on VSCode, you can leverage the InterSystems plugin and export classes.

For CSP files - if something works for you as a CSP web application, you don't have CSP files but rather CSP classes; you can simply export them via VSCode as well.
Thanks Evgeny! I don't use VS Code much, but I can get back to it and try. Are there any other options?

Hi Jason! I asked DC-AI and it provided an answer!
You can do this:
do $SYSTEM.OBJ.Export("*.cls", "/path/to/export/directory/", "ck")
Article
Henry Pereira · Sep 16, 2019
In an ever-changing world, companies must innovate to stay competitive. This ensures that they’ll make decisions with agility and safety, aiming for future results with greater accuracy.

Business Intelligence (BI) tools help companies make intelligent decisions instead of relying on trial and error. These intelligent decisions can make the difference between success and failure in the marketplace.

Microsoft Power BI is one of the industry’s leading business intelligence tools. With just a few clicks, Power BI makes it easy for managers and analysts to explore a company’s data. This is important because when data is easy to access and visualize, it’s much more likely it’ll be used to make business decisions.
Power BI includes a wide variety of graphs, charts, tables, and maps. As a result, you can always find visualizations that are a good fit for your data.
BI tools are only as useful as the data that backs them, however. Power BI supports many data sources, and InterSystems IRIS is a recent addition to those sources. Since Power BI provides an exciting new way to explore data stored in IRIS, we’ll be exploring how to use these two amazing tools together.
This article will explain how to use IRIS Tables and Power BI together on real data. In a follow-up article, we’ll walk through using Power BI with IRIS Cubes.
Project Prerequisites and Setup
You will need the following to get started:
InterSystems IRIS Data Platform
Microsoft Power BI Desktop (April 2019 release or more recent)
InterSystems Sample-BI data
We'll be using the InterSystems IRIS Data Platform, so you’ll need access to an IRIS install to proceed. You can download a trial version from the InterSystems website if necessary.
There are two ways to install Microsoft Power BI Desktop: you can download an installer, or install it through the Microsoft Store. Note that if you are running Power BI on a different machine than where you installed InterSystems IRIS, you will need to install the InterSystems IRIS ODBC drivers on that machine separately.
To create a dashboard on Power BI we'll need some data. We'll be using the HoleFoods dataset provided by InterSystems here on GitHub. To proceed, either clone or download the repository.
In IRIS, I've created a namespace called SamplesBI. This is not required, but if you want to create a new namespace, in the IRIS Management Portal, go to System Administration > Configuration > System Configuration > Namespace and click on New Namespace. Enter a name, then create a data file or use an existing one.
On InterSystems IRIS Terminal, enter the namespace that you want to import the data into. In this case, SamplesBI:
Execute $System.OBJ.Load() with the full path of buildsample/Build.SampleBI.cls and the "ck" compile flags:
Execute the Build method of the Build.SampleBI class, passing the full path of the directory that contains the sample files:
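In the terminal, those two calls look roughly like this (the path is a placeholder for wherever you cloned the Samples-BI repository):

SAMPLESBI> do $System.OBJ.Load("/path/to/Samples-BI/buildsample/Build.SampleBI.cls", "ck")

SAMPLESBI> do ##class(Build.SampleBI).Build("/path/to/Samples-BI/")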
Connecting Power BI with IRIS
Now it's time to connect Power BI with IRIS. Open Power BI and click on "Get Data". Choose "Database", and you will see the InterSystems IRIS connector:
Enter the host address. The host address is the IP address of the host for your InterSystems IRIS instance (localhost in my case), the Port is the instance’s superserver port (IRIS default is 57773), and the Namespace is where your HoleFoods data is located.
Under Data Connectivity mode, choose "DirectQuery", which ensures you’re always viewing current data.
Next, enter the username and password to connect to IRIS. The defaults are "_SYSTEM" and "SYS".
You can import both tables and cubes that you've created in IRIS. Let's start by importing some tables.
Under Tables and HoleFoods, check:
Country
Outlet
Product
Region
SalesTransaction
We're almost there! To tell Power BI about the relationship between our tables, click on "Manage Relationships".
Then, click on New.
Let's make two relationships: "SalesTransaction" and "Product relationship".
On top, select the "SalesTransaction" table and click on the "Product" column. Next, select the "Product" table and click on the "ID" column. You'll see that the Cardinality changes automatically to "Many to One (*:1)".
Repeat this step for the following:
"SalesTransaction(Outlet)" with "Outlet(ID)"
"Outlet(Country)" with "Country(ID)"
"Country(Region)" with "Region(ID)":
Note that these relationships are imported automatically if they are expressed as Foreign Keys.
Power BI also has a Relationships schema viewer. If you click the button on the left side of the application, it will show our data model.
Creating a Dashboard
We now have everything we need to create a dashboard.
Start by clicking the button on the left to switch from schema view back to Report view. On the Home tab under the Insert Group, click the TextBox to add a Title.
The Insert Group includes static elements like Text, Shapes, and Images we can use to enhance our reports.
It's time to add our first visualization! In the Fields pane, check "Name" on "Product" and "UnitsSold" on "SalesTransaction".
Next, go to Style and select "Bold Header".
Now it's time to do some data transformation. Click on the ellipsis next to "SalesTransaction" in the Field pane.
Then, click on "Edit Query". It will open the "Power Query Editor".
Select the "DateOfSale" column and click on "Duplicate Column".
Rename this new column to "Year", and click on "Date" and select "Year".
Apply these changes. Next, select the new column and, on the "Modeling" tab, change "Default Summarization" to "Don't Summarize".
Add a "Line Chart" visualization, then drag Year to Axis, drag "Name" from "Region" to Legend, and drag "AmountOfSale" from "SalesTransaction" to Values.
Imagine that the HoleFoods sales team has a target of selling 2000 units. How can we tell if the team is meeting its goal?
To answer, let's add a visual for metrics and targets.
On "SalesTransaction" in the Field pane, check "UnitsSold", then click Gauge Chart. Under the Style properties, set Max to 3000 and Target to 2000.
KPIs (Key Performance Indicators) are helpful decision-making tools, and Power BI has a convenient KPI visual we can use.
To add it, under "SalesTransaction", check "AmountOfSale" and choose KPI under “Visualizations”. Then, drag "Year" to "Trend axis".
To align all charts and visuals, simply click and drag a visual, and when an edge or center is close to aligning with the edge or center of another visual or set of visuals, red dashed lines appear.
You also can go to the View tab and enable "Show GridLines" and "Snap Objects to Grid".
We’ll finish up by adding a map that shows HoleFoods global presence. Set Longitude and Latitude on "Outlet" to "Don't Summarize" on the Modeling tab.
You can find the map tool in the Visualizations pane. After adding it, drag the Latitude and Longitude fields from Outlet to respective properties on the map. Also from SalesTransaction, drag the AmountOfSale to Size property and UnitsSold to ToolTips.
And our dashboard is finally complete.
You can share your dashboard by publishing it to the Power BI Service. To do this, you’ll have to sign up for a Power BI account.
Conclusion
In just a few minutes, we were able to connect Power BI to InterSystems IRIS and then create amazing interactive visualizations.
As developers, this is great. Instead of spending hours or days developing dashboards for managers, we can get the job done in minutes. Even better, we can show managers how to quickly and easily create reports for themselves.
Although developing visualizations is often part of a developer’s job, our time is usually better spent developing mission-critical architecture and applications. Using IRIS and Power BI together ensures that developer time is used effectively and that managers are able to access and visualize data immediately — without waiting weeks for dashboards to be developed, tested, and deployed to production.
Perfect!

Great! Thanks Henry. A few queries:

1. Does Power BI offer an advantage over InterSystems' own analytics? If yes, what are those? In general, I believe visualization is way better in Power BI, and data modelling would be much easier. In addition, Power BI lets its users leverage Microsoft's cognitive services. Did you notice any performance issues?

2. I believe the connector is free to avail; can you confirm if this is true?

Thanks,
SS.
3. Tagging @Carmen.Logue to provide more details.

That's right. There is no charge for the Power BI connector, but you do need licenses for Microsoft Power BI. The connector is available with Power BI starting in version 2019.2. See this article in the product documentation.

💡 This article is considered an InterSystems Data Platform Best Practice.

Nice article! While testing, trying to load the tables (which I can select), I got the following errors:

I am using IRIS installed locally and Power BI Desktop.
Any suggestions?

Access Denied errors can stem from a variety of reasons. As a sanity check, running the Windows ODBC connection manager's test function never hurts to rule out connectivity issues. In any case, you can consult with the WRC on support issues like this.
Announcement
Michelle Spisak · Oct 17, 2019
New from InterSystems Online Learning: two new exercises that help you get hands-on with InterSystems IRIS to see how easy it is to use to solve your problems!
Using Multi-Model with Python and Node.js: This exercise takes you through the steps of using the InterSystems IRIS multi-model capability to create a Node.js application that sends JSON data straight to your database instance without any parsing or mapping.
Build a Smart Ticketing System: In this exercise, you will build on the Red Light Violation application used in the Interoperability QuickStart. Here, you will add another route to identify at-risk intersections based on data from the Chicago traffic system. This involves:
Building a simple interface to consume data from a file and store data to a file.
Adding in logic to only list intersections that have been deemed high risk based on the number of red light violations.
Adding in routing to consume additional data about populations using REST from public APIs.
Modifying the data to be in the right format for the downstream system.
Get started with InterSystems IRIS today!
Article
Murray Oldfield · Nov 14, 2019
Released with no formal announcement in [IRIS preview release 2019.4](https://community.intersystems.com/post/intersystems-iris-and-iris-health-20194-preview-published "IRIS preview release 2019.1.4") is the /api/monitor service exposing IRIS metrics in **_Prometheus_** format. Big news for anyone wanting to use IRIS metrics as part of their monitoring and alerting solution. The API is a component of the new IRIS _System Alerting and Monitoring (SAM)_ solution that will be released in an upcoming version of IRIS.
>However, you do not have to wait for SAM to start planning and trialling this API to monitor your IRIS instances. In future posts, I will dig deeper into the metrics available and what they mean and provide example interactive dashboards. But first, let me start with some background and a few questions and answers.
IRIS (and Caché) is always collecting dozens of metrics about itself and the platform it is running on. There have always been [multiple ways to collect these metrics to monitor Caché and IRIS](https://docs.intersystems.com/irislatest/csp/docbook/Doc.View.cls?KEY=GCM "multiple ways to collect these metrics to monitor Caché and IRIS"). I have found that few installations use IRIS and Caché built-in solutions. For example, History Monitor has been available for a long time as a historical database of performance and system usage metrics. However, there was no obvious way to surface these metrics and instrument systems in real-time.
IRIS platform solutions (along with the rest of the world) are moving from single monolithic applications running on a few on-premises instances to distributed solutions deployed 'anywhere'. For many use cases, existing IRIS monitoring options do not fit these new paradigms. Rather than completely reinvent the wheel, InterSystems looked to popular and proven open source solutions for monitoring and alerting.
## Prometheus?
Prometheus is a well known and widely deployed open source monitoring system based on proven technology. It has a wide variety of plugins. It is designed to work well within the cloud environment, but also is just as useful for on-premises. Plugins include operating systems, web servers such as Apache and many other applications. Prometheus is often used with a front end client, for example, _Grafana_, which provides a great UI/UX experience that is extremely customisable.
## Grafana?
Grafana is also open source. As this series of posts progresses, I will provide sample templates of monitoring dashboards for common scenarios. You can use the samples as a base to design dashboards for what you care about. The real power comes when you combine IRIS metrics in context with metrics from your whole solution stack. From the platform components, operating system, IRIS and especially when you add instrumentation from your applications.
## Haven't I seen this before?
Monitoring IRIS and Caché with Prometheus and Grafana is not new. I have been using these applications for several years to monitor my development and test systems. If you search the Developer Community for "Prometheus", you will find other posts ([for example, some excellent posts by Mikhail Khomenko](https://community.intersystems.com/post/making-prometheus-monitoring-intersystems-cach%C3%A9 "or example, this one by Mikhail Khomenko")) that show how to expose Caché metrics for use by Prometheus.
>The difference now is that the /api/monitor API is included and enabled by default. There is no requirement to code your own classes to expose metrics.
# Prometheus Primer
Here is a quick orientation to Prometheus and some terminology. I want to give you the high-level view, lay some groundwork, and open the door to how you might think about visualising or consuming the metrics provided by IRIS or other sources.
Prometheus works by _scraping_ or pulling time series data exposed from applications as HTTP endpoints (APIs such as IRIS /api/monitor). _Exporters_ and client libraries exist for many languages, frameworks, and open-source applications — for example, for web servers like Apache, operating systems, docker, Kubernetes, databases, and now IRIS.
Exporters are used to instrument applications and services and to expose relevant metrics on an endpoint for scraping. Standard components such as web servers, databases, and the like - are supported by core exporters. Many other exporters are available open-source from the Prometheus community.
## Prometheus Terminology
A few key terms are useful to know:
- **Targets** are where the services are that you care about, like a host or application or services like Apache or IRIS or your own application.
- Prometheus **scrapes** targets over HTTP collecting metrics as time-series data.
- **Time-series data** is exposed by applications, for example, IRIS or via exporters.
- **Exporters** are available for things you don't control like Linux kernel metrics.
- The resulting time-series data is stored locally on the Prometheus server in a database \*\*.
- The time-series database can be queried using an optimised **query language (PromQL)**. For example, to create alerts or by client applications such as Grafana, to display the metrics in a dashboard.
>\*\* **Spoiler Alert;** For security, scaling, high availability and some other operational efficiency reasons, for the new SAM solution the database used for Prometheus time-series data is IRIS! However, access to the Prometheus database -- on IRIS -- is transparent, and applications such as Grafana do not know or care.
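As a small taste of PromQL, a query like the one below (it uses the journal free-space metric shown as an example in the next section) would return the lowest free space recorded over the last hour. This is just an illustrative query, not something SAM requires:
```
min_over_time(iris_jrn_free_space{id="WIJ"}[1h])
```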
### Prometheus Data Model
Metrics returned by the API are in Prometheus format. Prometheus uses a simple text-based format with one metric per line; each line has the form:

metric_name{label_name="label value", ...} metric_value
Metrics can have labels as (key, value) pairs. Labels are a powerful way to filter metrics as dimensions. As an example, examine a single metric returned for IRIS /api/monitor. In this case journal free space:
iris_jrn_free_space{id="WIJ",dir=”/fast/wij/"} 401562.83
The identifier tells you what the metric is and where it came from:
iris_jrn_free_space
Multiple labels can be used to decorate the metrics, and then used to filter and query. In this example, you can see the WIJ and the directory where the WIJ is stored:
id="WIJ",dir="/fast/wij/"
And a value: `401562.83` (MB).
## What IRIS metrics are available?
The [preview documentation](https://irisdocs.intersystems.com/iris20194/csp/docbook/Doc.View.cls?KEY=GCM_rest "Will be subject to changes") has a list of metrics. However, be aware there may be changes. You can also simply query the `/api/monitor/metrics` endpoint and see the list. I use [Postman](https://www.getpostman.com "Postman") which I will demonstrate in the next community post.
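For example, a quick manual check with curl might look like this (host and port are placeholders; the web server port and authentication requirements depend on your instance configuration):
```
# ask the IRIS REST endpoint for all current metrics in Prometheus text format
curl http://localhost:52773/api/monitor/metrics

# each returned line is one metric, for example:
# iris_jrn_free_space{id="WIJ",dir="/fast/wij/"} 401562.83
```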
# What should I monitor?
Keep these points in mind as you think about how you will monitor your systems and applications.
- When you can, instrument key metrics that affect users.
- - Users don't care that one of your machines is short of CPU.
- - Users care if the service is slow or having errors.
- - For your primary dashboards focus on high-level metrics that directly impact users.
- For your dashboards avoid a wall of graphs.
- - Humans can't deal with too much data at once.
- - For example, have a dashboard per service.
- Think about services, not machines.
- - Once you have isolated a problem to one service, then you can drill down and see if one machine is the problem.
# References
**Documentation and downloads** for: [Prometheus](https://prometheus.io "Prometheus") and [Grafana](https://grafana.com "Grafana")
I presented a pre-release overview of SAM (including Prometheus and Grafana) at **InterSystems Global Summit 2019** you can find [the link at InterSystems learning services](https://learning.intersystems.com/mod/page/view.php?id=5599 "Learning Services"). If the direct link does not work go to the [InterSystems learning services web site](https://learning.intersystems.com "Learning Services") and search for: "System Alerting and Monitoring Made Easy"
Search here on the community for "Prometheus" and "Grafana".
Please include the node_exporter setup. What gets put into isc_prometheus.yml?

This is what the doc says to put in isc_prometheus.yml:

  - job_name: NODE
    metrics_path: /metrics
    scheme: http
    static_configs:
      - labels:
          cluster: "2"
          group: node
        targets:
          - csc2cxn00020924.cloud.kp.org:9100
          - csc2cxn00021271.cloud.kp.org:9100

It does not work.
The node_exporter is installed and running.
From what I can see the values returned are updated very quickly - maybe every second? I'm unclear as to how to contextualize the metrics for a periodic collection. Specifically, if I call the API every minute I may get a value for global references that is very low or very high - but it may not be indicative of the value over time. Is there any information on how the metrics are calculated internally that might help? Single points in time may be very deceptive.
Article
Evgeny Shvarov · Nov 19, 2019
Hi developers!
I just want to share with you the knowledge aka experience which could save you a few hours someday.
If you are building a REST API with IRIS that contains more than one level of "/", e.g. '/patients/all', don't forget to add the parameter Recurse="1" to your deployment script in %Installer; otherwise, all the second- and higher-level entries won't work, while the level-1 entries will.
/patients
- will work, but
/patients/all
- won't.
Here is an example of a CSPApplication section which fixes the issue and which you may want to use in your %Installer class:
<CSPApplication Url="${CSPAPP}"
Recurse="1"
Directory="${CSPAPPDIR}"
Grant="${RESOURCE},%SQL"
AuthenticationMethods="96"
/>
Article
Eduard Lebedyuk · Nov 22, 2019
This quick guide shows how to serve HTTPS requests with InterSystems API Management. The advantage here is that you keep your certificates on one separate server, and you don't need to configure each backend web server separately.
Here's how:
1. Buy the domain name.
2. Adjust DNS records from your domain to the IAM IP address.
3. Generate HTTPS certificate and private key. I use Let's Encrypt - it's free.
4. Start IAM if you didn't already.
5. Send this request to IAM:
POST http://host:8001/certificates/
{
  "cert": "-----BEGIN CERTIFICATE-----...",
  "key": "-----BEGIN PRIVATE KEY-----...",
  "snis": [
    "host"
  ]
}
Note: replace newlines in cert and key with \n.
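If you prefer the command line, a hypothetical curl equivalent of this request looks like the following (host and certificate contents are placeholders; note the literal \n escapes in place of newlines):

curl -X POST http://host:8001/certificates/ \
     -H "Content-Type: application/json" \
     -d '{
           "cert": "-----BEGIN CERTIFICATE-----\n...\n-----END CERTIFICATE-----\n",
           "key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
           "snis": ["host"]
         }'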
You'll get a response; save the id value from it.
6. Go to your IAM workspace, open SNIs, and create a new SNI with the name set to your host and ssl_certificate_id set to the id from the previous step.
7. Update your routes to use https protocol (leave only https to force secure connection, or specify http, https to allow both protocols)
8. Test HTTPS requests by sending them to https://host:8443/<your route> - that's where IAM listens for HTTPS connections by default.

Eduard, thank you for a very good webinar.
You mentioned that IAM can be helpful even if there is a "service mix": some services are IRIS-based, others are not. How can IAM help with non-IRIS services? Can any Target Object be non-IRIS-based?
Can any Target Object be non-IRIS-based?
Absolutely. The services you offer via IAM can be sourced anywhere. Both from InterSystems IRIS and not.
How can IAM help with non-IRIS services?
All the benefits you get from using IAM (ease of administration, control, analytics) are available for both InterSystems IRIS-based and non InterSystems IRIS-based services
Announcement
Anastasia Dyubaylo · Sep 3, 2019
Hi Community!
We are super excited to announce the Boston FHIR @ InterSystems Meetup on the 10th of September at the InterSystems meeting space!
There will be two talks with Q&A and networking.
Doors open at 5:30pm, we should start the first talk around 6pm. We will have a short break between talks for announcements, including job opportunities.
Please check the details below.
#1 We are in the middle of changes in healthcare technology that affect the strategies of companies and organizations across the globe, including many startups right here in Massachusetts. Micky Tripathi from the Massachusetts eHealth Collaborative is going to talk to us about the opportunities and consequences of API-based healthcare.

By Micky Tripathi - MAeHC

#2 FHIR Analytics

The establishment of FHIR as a new healthcare data format creates new opportunities and challenges. Health professionals would like to acquire patient data from Electronic Health Records (EHR) with FHIR and use it for population health management and research.

FHIR provides resources and foundations based on XML and JSON data structures. However, traditional analytic tools are difficult to use with these structures. We created a prototype application to ingest FHIR bundles and save the Patient and Observation resources as objects/tables in InterSystems IRIS for Health. Developers can then easily create derived "fact tables" that de-normalize these tables for exploration and analytics. We will demo this application and our analytics tools using the InterSystems IRIS for Health platform.

By Patrick Jamieson, M.D., Product Manager for InterSystems IRIS for Health, and Carmen Logue, Product Manager - Analytics and AI
So, remember!
Date and time: Tuesday, 10 September 2019 5:30 pm to 7:30 pm
Venue: 1 Memorial Dr, Cambridge, MA 02142, USA
Event webpage: Boston FHIR @ InterSystems Meetup
Article
Evgeny Shvarov · Sep 6, 2019
Hi Developers!
InterSystems Package Manager (ZPM) is a great thing, but it is even better if you don't need to install it but can use immediately.
There are several ways to do this, and here is one approach: building an IRIS container with ZPM using a dockerfile.
I've prepared a repository with a few lines in the dockerfile which download and install the latest version of ZPM.
Add these lines to your standard dockerfile for IRIS community edition and you will have ZPM installed and ready to use.
To download the latest ZPM client:
RUN mkdir -p /tmp/deps \
&& cd /tmp/deps \
&& wget -q https://pm.community.intersystems.com/packages/zpm/latest/installer -O zpm.xml
to install ZPM into IRIS:
" Do \$system.OBJ.Load(\"/tmp/deps/zpm.xml\", \"ck\")" \
Great!
To try ZPM with this repository do the following:
$ git clone https://github.com/intersystems-community/objectscript-zpm-template.git
Build and run the repo:
$ docker-compose up -d
Open IRIS terminal:
$ docker-compose exec iris iris session iris
USER>
Call ZPM:
USER>zpm
zpm: USER>
Install webterminal
zpm: USER>install webterminal
[webterminal] Reload START
[webterminal] Reload SUCCESS
[webterminal] Module object refreshed.
[webterminal] Validate START
[webterminal] Validate SUCCESS
[webterminal] Compile START
[webterminal] Compile SUCCESS
[webterminal] Activate START
[webterminal] Configure START
[webterminal] Configure SUCCESS
[webterminal] Activate SUCCESS
zpm: USER>
Use it!
And take a look at the whole process in this gif:
It turned out that we don't need a special repository to add ZPM to your docker container. You just need another dockerfile - like this one. And here is the related docker-compose file for a handy start. See how it works:
Article
Dmitrii Kuznetsov · Oct 7, 2019
How can you allow computers to trust one another in your absence while maintaining security and privacy?
“A Dry Martini”, he said. “One. In a deep champagne goblet.”
“Oui, monsieur.”
“Just a moment. Three measures of Gordons, one of vodka, half a measure of Kina Lillet. Shake it very well until it’s ice-cold, then add a large thin slice of lemon peel. Got it?”
“Certainly, monsieur.” The barman seemed pleased with the idea.
Casino Royale, Ian Fleming, 1953
OAuth helps to separate services with user credentials from “working” databases, both physically and geographically. It thereby strengthens the protection of identification data and, if necessary, helps you comply with the requirements of countries' data protection laws.
With OAuth, you can provide the user with the ability to work safely from multiple devices at once, while "exposing" personal data to various services and applications as little as possible. You can also avoid taking on "excess" data about users of your services (i.e. you can process data in a depersonalized form).
If you use InterSystems IRIS, you get a complete set of ready-made tools for testing and deploying OAuth and OIDC services, both autonomously and in cooperation with third-party software products.
OAuth 2.0 and OpenID Connect
OAuth and OpenID Connect — known as OIDC or simply OpenID — serve as a universal combination of open protocols for delegating access and identification — and in the 21st century, it seems to be a favorite. No one has come up with a better option for large-scale use. It's especially popular with frontenders because it sits on top of HTTP(S) protocols and uses a JWT (JSON Web Token) container.
OpenID works using OAuth — it is, in fact, a wrapper for OAuth. Using OpenID as an open standard for the authentication and creation of digital identification systems is nothing new for developers. As of 2019, it is in its 14th year (and its third version). It is popular in web and mobile development and in enterprise systems.
Its partner, the OAuth open standard for delegating access, is 12 years old, and it's been nine years since the relevant RFC 5849 standard appeared. For the purposes of this article, we will rely on the current version of the protocol, OAuth 2.0, and the current RFC 6749. (OAuth 2.0 is not compatible with its predecessor, OAuth 1.0.)
Strictly speaking, OAuth is not a protocol, but a set of rules (a scheme) for separating and transferring user identification operations to a separate trusted server when implementing an access-rights restriction architecture in software systems.
Be aware: OAuth can't say anything about a specific user! Who the user is, or where the user is, or even whether the user is currently at a computer or not. But OAuth makes it possible to interact with systems without user participation, using pre-issued access tokens. This is an important point (see "User Authentication with OAuth 2.0" on the OAuth site for more information).
The User-Managed Access (UMA) protocol is also based on OAuth. Using OAuth, OIDC and UMA together make it possible to implement a protected identity and access management (IdM, IAM) system in areas such as:
Using a patient's HEART (Health Relationship Trust) personal data profile in medicine.
Consumer Identity and Access Management (CIAM) platforms for manufacturing and trading companies.
Personalizing digital certificates for smart devices in IoT (Internet of Things) systems using the OAuth 2.0 Internet of Things (IoT) Client Credentials Grant.
A New Venn Of Access Control For The API Economy
Above all, do not store personal data in the same place as the rest of the system. Separate authentication and authorization physically. And ideally, give the identification and authentication to the individual person. Never store them yourself. Trust the owner's device.
Trust and Authentication
It is not a best practice to store users' personal data in one's own app or in combined storage along with the working database. Instead, we choose a party we trust to provide identification and authentication as a service.
The OAuth scheme is made up of the following parties:
The user
The client app
The identification service
The resource server
The action takes place in a web browser on the user's computer. The user has an account with the identification service. The client app has a signed contract with the identification service and reciprocal interfaces. The resource server trusts the identification service to issue access keys to anyone it can identify.
The user runs the client web app and requests a resource. The client app must present a key for that resource to gain access. If the user doesn't have a key, the client app contacts an identification service with which it has a contract for issuing keys to the resource server (passing the user on to the identification service).
The Identification Service asks what kind of keys are required.
The user provides a password to access the resource. At this point the user has been authenticated and the user's identity has been confirmed. The identification service issues the key to the resource (passing the user back to the client app), and the resource becomes available to the user.
Implementing an Authorization Service
On the InterSystems IRIS platform, you can assemble the service from the necessary components. For example:
Configure and launch an OAuth server with the demo client registered on it.
Configure a demo OAuth client by associating it with an OAuth server and web resources.
Develop client apps that can use OAuth. You can use Java, Python, C#, or NodeJS. Below is an example of the application code in ObjectScript.
There are multiple settings in OAuth, so checklists can be helpful. Let's walk through an example. Go to the IRIS management portal and select the section System Administration > Security > OAuth 2.0 > Server.
Each item below gives the name of a setting, followed by a colon and, where necessary, an example or explanation. As an alternative, you can use the screenshot hints in Daniel Kutac's three-part article, InterSystems IRIS Open Authorization Framework (OAuth 2.0) implementation - part 1, part 2, and part 3.
Note that all of the following screenshots are meant to serve as examples. You’ll likely need to choose different options when creating your own applications.
On the General Settings tab, use these settings:
Description: provide a description of the configuration, such as "Authorization server".
Issuer endpoint host name (the issuer endpoint is referred to below as EPG): the DNS name of your server.
Supported grant types (select at least one):
Authorization code
Implicit
Resource owner password credentials
Client credentials
SSL/TLS configuration: oauthserver
On the Scopes tab:
Add supported scopes: scope1 in our example
On the Intervals tab:
Access token interval: 3600
Authorization code interval: 60
Refresh token interval: 86400
Session termination interval: 86400
Client secret expiration period: 0
On the JWT Settings tab:
Signing algorithm: RS512
Key Management Algorithm: RSA-OAEP
Content Encryption Algorithm: A256CBC-HS512
On the Customization tab:
Authenticate class: %OAuth2.Server.Authenticate
Validate user class: %OAuth2.Server.Validate
Session maintenance class: OAuth2.Server.Session
Generate token class: %OAuth2.Server.JWT
Customization namespace: %SYS
Customization roles (select at least one): %DB_IRISSYS and %Manager
Now save the changes.
The next step is registering the client on the OAuth server. Click the Client Descriptions button, then click Create Client Description.
On the General Settings tab, enter the following information:
Name: OAuthClient
Description: provide a brief description
Client Type: Confidential
Redirect URLs: the address within the app to which the user is returned after identification by the OAuth client.
Supported grant types:
Authorization code: yes
Implicit
Resource owner password credentials
Client credentials
JWT authorization
Supported response types: Select all of the following:
code
id_token
id_token token
token
Authorization type: Simple
The Client Credentials tab should be auto-completed, but ensure the information there is correct for the client. On the Client Information tab:
Authorization screen:
Client name
Logo URL
Client homepage URL
Policy URL
Terms of Service URL
Now configure the binding on the OAuth server client by going to System Administration > Security > OAuth 2.0 > Client.
Create a Server Description:
Issuer endpoint: taken from the general parameters of the server (see above).
SSL/TLS configuration: choose from the preconfigured list.
Authorization server:
Authorization endpoint: EPG + /authorize
Token endpoint: EPG + /token
Userinfo endpoint: EPG + /userinfo
Token revocation endpoint: EPG + /revocation
Token introspection endpoint: EPG + /introspection
JSON Web Token (JWT) settings:
Source other than dynamic registration: choose JWKS from URL
URL: EPG + /jwks
From the server metadata (for example, scopes_supported and claims_supported), you can see which information about the user the server can provide to the OAuth client. It's worth noting that when implementing your application, you should ask the user which data they are willing to share. In the example below, we will only ask for permission for scope1.
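As a quick way to inspect that metadata, here is an ObjectScript sketch that fetches the server's discovery document over HTTPS. The host name and SSL/TLS configuration are placeholders, and the exact path depends on your server; the standard OpenID Connect location relative to the issuer endpoint is /.well-known/openid-configuration.
// Sketch: fetch the server metadata to see scopes_supported and claims_supported.
// Host name and SSL configuration are placeholders for your own values.
set req = ##class(%Net.HttpRequest).%New()
set req.Server = "your-oauth-server.example.com"
set req.Https = 1
set req.SSLConfiguration = "oauthclient"
set sc = req.Get("/oauth2/.well-known/openid-configuration")
if sc && (req.HttpResponse.StatusCode = 200) {
    set meta = {}.%FromJSON(req.HttpResponse.Data)
    write "scopes_supported: ", meta.%Get("scopes_supported").%ToJSON(), !
    write "claims_supported: ", meta.%Get("claims_supported").%ToJSON(), !
}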
Now save the configuration.
If there is an error indicating a problem with the SSL configuration, then go to System Administration > Security > SSL/TLS Configurations and remove the configuration.
Now we're ready to set up an OAuth client: System Administration > Security > OAuth 2.0 > Client > Client Configurations > Create Client Configuration. On the General tab, use these settings:
Application Name: OAuthClient
Client Name: OAuthClient
Description: enter a description
Enabled: Yes
Client Type: Confidential
SSL/TLS configuration: select oauthclient
Client Redirect URL: the DNS name of your server
Required grant types:
Authorization code: Yes
Implicit
Resource owner password credentials
Client credentials
JWT authorization
Authorization type: Simple
On Client Information tab:
Authorization screen:
Logo URL
Client homepage URL
Policy URL
Terms of Service URL
Default scope: taken from those specified earlier on the server (for example, scope1)
Contact email addresses: enter addresses, separated by commas
Default max age (in seconds): maximum authentication age or omit this option
On the JWT Settings tab:
JSON Web Token (JWT) settings
Create JWT settings from X509 credentials
IDToken Algorithms:
Signing: RS256
Encryption: A256CBC
Key: RSA-OAEP
Userinfo Algorithms
Access Token Algorithms
Query Algorithms
On the Client Credentials tab:
Client ID: as issued when the client was registered on the server (see above).
Client ID Issued: leave blank
Client secret: as issued when the client was registered on the server (see above).
Client Secret Expiry Period: leave blank
Client Registration URI: leave blank
Save the configuration.
Web app with OAuth authorization
OAuth relies on the communication channels between the participants in the interaction (server, clients, web application, user's browser, resource server) being protected in some way; most often this role is played by SSL/TLS. However, OAuth will also work over unprotected channels. The Keycloak server, for example, uses HTTP by default and does without channel protection, which simplifies development and debugging. The Keycloak documentation states that for real-world use of the services, channel protection must be enabled. The InterSystems IRIS developers take a stricter approach to OAuth: the use of SSL/TLS is mandatory. The only simplification is that you can use self-signed certificates or take advantage of the built-in IRIS PKI service (System Administration > Security > Public Key Infrastructure).
The user's authorization is verified by explicitly specifying two parameters: the name of your application as registered on the OAuth server and in the OAuth client, and the scope.
Parameter OAUTH2APPNAME = "OAuthClient";
set isAuthorized = ##class(%SYS.OAuth2.AccessToken).IsAuthorized(
..#OAUTH2APPNAME,
.sessionId,
"scope1",
.accessToken,
.idtoken,
.responseProperties,
.error)
If authorization is lacking, we prepare a link to request user identification and permission to work with our application. Here we need to specify not only the name of the application registered on the OAuth server and in the OAuth client and the requested scope, but also the redirect URL, that is, the point in the web application to which the user should be returned.
Parameter OAUTH2CLIENTREDIRECTURI = "https://52773b-76230063.labs.learning.intersystems.com/oauthclient/"
set url = ##class(%SYS.OAuth2.Authorization).GetAuthorizationCodeEndpoint(
..#OAUTH2APPNAME,
"scope1",
..#OAUTH2CLIENTREDIRECTURI,
.properties,
.isAuthorized,
.sc)
We use IRIS and register users on the IRIS OAuth server; for this example, it is enough to give a user just a name and a password. When the user follows the generated link, the server carries out the user identification procedure, asks the user for permission to let the web application work with the account data, and stores the result in the OAuth2.Server.Session global in the %SYS namespace:
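As a quick sanity check while debugging, you can look at this data from an IRIS terminal. This is only a sketch: it uses the global name exactly as given above, but the actual storage global may carry a different suffix in your version, so check the Globals page of the management portal in the %SYS namespace if nothing is shown.
// Sketch: inspect the stored session data from a terminal (name as given above)
zn "%SYS"
zwrite ^OAuth2.Server.Session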
Finally, we demonstrate the data of an authorized user. If the procedures were successful, we have, for example, an access token. Let's get it and validate it:
set valid = ##class(%SYS.OAuth2.Validation).ValidateJWT(
..#OAUTH2APPNAME,
accessToken,
"scope1",
.aud,
.JWTJsonObject,
.securityParameters,
.sc
)
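If validation succeeds, JWTJsonObject holds the token's claims as a dynamic object, so individual claims can be read directly. A small sketch using standard JWT claim names (which claims are actually present depends on your server configuration):
if valid {
    // read standard JWT claims from the validated access token
    write "subject (sub): ", JWTJsonObject.%Get("sub"), !
    write "issuer (iss): ", JWTJsonObject.%Get("iss"), !
    write "expires at (exp): ", JWTJsonObject.%Get("exp"), !
}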
The full working code of the OAuth example:
Class OAuthClient.REST Extends %CSP.REST
{

Parameter OAUTH2APPNAME = "OAuthClient";

Parameter OAUTH2CLIENTREDIRECTURI = "https://52773b-76230063.labs.learning.intersystems.com/oauthclient/";

// to keep sessionId
Parameter UseSession As Integer = 1;

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
  <Route Method="GET" Url = "/" Call = "Do" />
</Routes>
}

ClassMethod Do() As %Status
{
  // Check for accessToken
  set isAuthorized = ##class(%SYS.OAuth2.AccessToken).IsAuthorized(
    ..#OAUTH2APPNAME,
    .sessionId,
    "scope1",
    .accessToken,
    .idtoken,
    .responseProperties,
    .error)

  // to show accessToken
  if isAuthorized {
    set valid = ##class(%SYS.OAuth2.Validation).ValidateJWT(
      ..#OAUTH2APPNAME,
      accessToken,
      "scope1",
      .aud,
      .JWTJsonObject,
      .securityParameters,
      .sc)
    &html< Hello!<br> >
    w "Your access token = ", JWTJsonObject.%ToJSON()
    &html< </html> >
    quit $$$OK
  }

  // perform the process of user and client identification and get accessToken
  set url = ##class(%SYS.OAuth2.Authorization).GetAuthorizationCodeEndpoint(
    ..#OAUTH2APPNAME,
    "scope1",
    ..#OAUTH2CLIENTREDIRECTURI,
    .properties,
    .isAuthorized,
    .sc)
  if $$$ISERR(sc) {
    w "error handling here"
    quit $$$OK
  }

  // url magic correction: change slashes in the query parameter to their codes
  set urlBase = $PIECE(url, "?")
  set urlQuery = $PIECE(url, "?", 2)
  set urlQuery = $REPLACE(urlQuery, "/", "%2F")
  set url = urlBase _ "?" _ urlQuery

  &html<
    <html>
    <h1>Authorization in IRIS via OAuth2</h1>
    <a href = "#(url)#">Authorization in <b>IRIS</b></a>
    </html>
  >
  quit $$$OK
}

}
You can also find a working copy of the code on the InterSystems GitHub repository: https://github.com/intersystems-community/iris-oauth-example.
If necessary, enable extended debug logging on the OAuth server and the OAuth client; the messages are written to the ISCLOG global in the %SYS namespace:
set ^%ISCLOG = 5
set ^%ISCLOG("Category", "OAuth2") = 5
set ^%ISCLOG("Category", "OAuth2Server") = 5
For more details, see the IRIS Using OAuth 2.0 and OpenID Connect documentation.
Conclusion
As you've seen, all OAuth features are easily accessible and completely ready to use. If necessary, you can replace the handler classes and user interfaces with your own. You can configure the OAuth server and client settings from configuration files instead of using the management portal. Then that wonderful Ian Fleming intro gets reduced down to "vodka martini, shaken, not stirred".