Announcement
Anastasia Dyubaylo · Jan 17, 2023
Hey Community,
We are glad to invite you to the upcoming kick-off webinar on the InterSystems Developer Tools Contest.
In this webinar, we'll talk about how to choose a project and show you how to develop, build and deploy applications on InterSystems IRIS data platform. Also, there will be information about the hot internal projects at the moment (SQL client, VS Code unit tests and Jupyter notebooks), how to look at community opportunities in the Ideas portal, and what InterSystems would like to do with the management portal.
Date & Time: Monday, January 23 – 12 pm EST | 6 PM CET
Speakers: 🗣 @Raj.Singh5479, InterSystems Product Manager 🗣 @Dean.Andrews2971, InterSystems Head of Developer Relations 🗣 @Evgeny.Shvarov, InterSystems Developer Ecosystem Manager
>> Register here <<
Developers!
Don't miss the opportunity to register for the upcoming webinar!
The webinar will start tomorrow at 12 pm EST | 6 PM CET.

Hi dear community,
If you lack inspiration for the contest, here are some ideas:
* A tool to improve loading DDL or SQL statements into IRIS.
* Why? Because for now, we have to open an IRIS terminal and then run an ObjectScript command to load the DDL or SQL statements.
* It would be great to have a tool that can load a DDL or SQL file directly from a shell.
* Example : iris load -f /path/to/file.sql
* A tool to automatically export objectscript classes to the local folder
* I know we can do it with Timothy's tool, but I wish for a simple hook that just exports new or modified classes to the local folder, not a whole source control system.
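To make the first idea concrete, here is a rough Python sketch of what such a loader could do. The `iris load` command does not exist, and `split_sql_statements` and the whole script are hypothetical; a real tool would execute each statement against IRIS via a driver or Embedded Python.

```python
import sys

def split_sql_statements(sql_text):
    """Naively split a DDL/SQL script into statements.

    Strips '--' line comments and splits on semicolons. This does not
    handle semicolons inside string literals, which a real tool would.
    """
    lines = []
    for line in sql_text.splitlines():
        # drop everything after a '--' line comment
        lines.append(line.split("--", 1)[0])
    statements = [s.strip() for s in "\n".join(lines).split(";")]
    return [s for s in statements if s]

if __name__ == "__main__" and len(sys.argv) > 1:
    # Hypothetical usage: python load_sql.py /path/to/file.sql
    with open(sys.argv[1]) as f:
        for stmt in split_sql_statements(f.read()):
            # Here a real tool would execute `stmt` against IRIS.
            print(stmt)
```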
Announcement
Evgeny Shvarov · Jan 26, 2023
Here are the technology bonuses for the InterSystems Developer Tools Contest 2023 that will give you extra points in the voting:
Embedded Python usage
Docker container usage
ZPM Package Deployment
Online Demo
Code Quality pass
Article on Developer Community
The second article on Developer Community
Video on YouTube
First Time Contribution
Community Idea Implementation
See the details below.
Embedded Python - 3 points
Use Embedded Python in your application and collect 3 extra points. You'll need at least InterSystems IRIS 2021.2 for it.
Docker container usage - 2 points
The application gets a 'Docker container' bonus if it uses InterSystems IRIS running in a docker container. Here is the simplest template to start from.
ZPM Package deployment - 2 points
You can collect the bonus if you build and publish a ZPM (ObjectScript Package Manager) package for your application so that it can be deployed with:
zpm "install your-multi-model-solution"
command on IRIS with a ZPM client installed.
ZPM client. Documentation.
Online Demo of your project - 2 points
Collect 2 more bonus points if you provision your project to the cloud as an online demo. You can do it on your own, or you can use this template - here is an example. Here is the video on how to use it.
Code quality pass with zero bugs - 1 point
Include the code quality GitHub action for static code analysis and make it show 0 bugs for ObjectScript.
Article on Developer Community - 2 points
Post an article on Developer Community that describes the features of your project. Collect 2 points for each article. Translations to different languages work too.
The Second article on Developer Community - 1 point
You can collect one more bonus point for a second article or a translation regarding the application. A third article and beyond will not bring more points, but the attention will all be yours.
Video on YouTube - 3 points
Make a YouTube video that demonstrates your product in action and collect 3 bonus points for each. Examples.
First Time Contribution - 3 points
Collect 3 bonus points if you participate in InterSystems Open Exchange contests for the first time!
Community Idea Implementation - 3 points
You can get 3 extra bonus points if the dev tool implements one of the ideas listed as Community Opportunity on the InterSystems Idea portal.
The list of bonuses is subject to change. Stay tuned!
Good luck with the competition!

Bonus for the Community Idea implementation is introduced.

Rules changing to get additional points halfway through the contest period?
Hi Stefan! Usually not. We know that changing the rules doesn't help.
But sometimes, very rarely, we add bonuses on the go. This time, we believe this bonus will help draw attention to ideas published by community members and to implementing the solutions that the community requested and voted for. There are still 10 days to submit the app and one more week to polish it. We believe this bonus will not harm anyone. If it does, we can discuss it.

Hi Evgeny,
As you can see in the article below, we used one idea from InterSystems Idea.
https://community.intersystems.com/post/iris-tripleslash-lets-rock-together
We only saw this additional (and very welcome) bonus now. Here is the idea that inspired us, as mentioned in the article: https://ideas.intersystems.com/ideas/DP-I-175
Announcement
Anastasia Dyubaylo · Feb 14, 2023
Hi Community,
Let's meet together at the online meetup with the winners of the InterSystems Developer Tools Contest – a great opportunity to have a discussion with the InterSystems Experts team as well as our contestants.
Winners' demo included!
Date & Time: Thursday, February 16, 5 PM GMT | 12 PM EST
Join us to learn more about winners' applications and to have a talk with our experts.
➡️ REGISTER TODAY
See you all at our virtual meetup!
Article
Evgeny Shvarov · May 14, 2023
Hi Developers!
Solutions built with InterSystems IRIS BI can often grow quite big, with dozens of pivots and dashboards.
With every new release of an IRIS BI solution, we can introduce changes that affect the behavior of existing pivots or dashboards so they stop working. For example, if we change a dimension or measure name, forget to deploy some cubes or subject areas, or refactor via mass renaming of cubes and their elements, some widgets could stop functioning.
One solution is to test (manually?) every widget in every dashboard to see whether its MDX queries still work.
Today I want to introduce a tool to test all the pivots and dashboards automatically.
Install isc-dev IPM module created by @Gevorg.Arutiunian9096 as:
USER>zpm "install isc-dev"
or programmatically:
set sc=$zpm("install isc-dev")
NB: You'd need to have IPM client installed. If you don't have it you can use the following one-liner:
s r=##class(%Net.HttpRequest).%New(),r.Server="pm.community.intersystems.com",r.SSLConfiguration="ISC.FeatureTracker.SSL.Config" d r.Get("/packages/zpm/latest/installer"),$system.OBJ.LoadStream(r.HttpResponse.Data,"c")
The installed isc-dev module has two methods that can help with automatic testing. Here is the method to test all the pivots:
set sc=##class(dev.bi).checkPivots()
If there are issues, sc will contain the errors.
There is also an option to stop on the first error:
set sc=##class(dev.bi).checkPivots(1)
In this case, the first broken pivot is reported in the utility's log in the terminal.
Another method checks all the dashboards:
set sc=##class(dev.bi).checkDashboards()
It will check all the widgets and their filter settings.
These two utilities are very handy to use in unit tests. Here is an example of a universal unit test class that I can recommend for any IRIS BI solution:
Class dc.irisbi.unittests.TestBI Extends %UnitTest.TestCase
{
Method TestPivots()
{
Do $$$AssertStatusOK(##class(dev.bi).checkPivots(),"Test Pivots")
}
/// Test dashboards in BI package
Method TestDashboards()
{
Do $$$AssertStatusOK(##class(dev.bi).checkDashboards(),"Test Dashboards")
}
}
And here is the template project that uses it.
See how it works in a related video.
Hope this article is useful and will save a lot of important developers' time!
Happy coding!
Very helpful tool! This is much better than testing everything manually. I'll be adding it into the build pipeline for my system using IRIS BI.

Thank you, @Pravin.Barton! Any feedback and issues are welcome! And kudos to @Gevorg.Arutiunian9096 who introduced it! Here is also the related video.
Question
Humza Arshad · May 16, 2023
Hi guys,
I want to develop a web application in which a user can log in with the InterSystems credentials created in the Management Portal for a specific role. How can I authenticate the user and get a token or login cookie through which the user can call other APIs?

Hello @Humza.Arshad. Thanks for your question!
Since IRIS 2022.2, you can use JWT authentication to provide a RESTful way of logging in and maintaining that session, which is in line with how many frontend frameworks like to work. The documentation can be found on the JSON Web Token (JWT) Authentication page.
To take advantage of this, you will need to do the following:
Use Unauthenticated access on the web application that serves the UI app
Enable JWT authentication on the web application that handles REST requests
Set UseSession = 0 on the REST handler class for the web application that handles REST requests
Create your own custom login page in the front end. Upon login, this page should submit a payload containing { user: …, password: … } to the /login endpoint as explained in the documentation above
Add front end code to save the access token and refresh token that are returned. The access token needs to be supplied with every subsequent REST request as an Authorization header with the value ‘Bearer <access_token>’.
Add front end code to periodically refresh the access token – this is done by posting the { access_token: …, refresh_token: … } to the /refresh endpoint.
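The token-handling steps above can be sketched in Python (a real UI would do this in JavaScript). The helper names are made up, and the payload is decoded without signature verification only so the client can read the `exp` claim and know when to call the /refresh endpoint.

```python
import base64
import json
import time

def jwt_claims(token):
    """Decode the (unverified) payload segment of a JWT.

    The server verifies signatures; the client only needs the 'exp'
    claim to schedule a refresh.
    """
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore base64url padding
    return json.loads(base64.urlsafe_b64decode(payload))

def auth_header(access_token):
    # Every subsequent REST request carries the access token as a Bearer token.
    return {"Authorization": "Bearer " + access_token}

def needs_refresh(access_token, leeway=60):
    # Refresh a bit before 'exp' so requests don't race the expiry.
    return jwt_claims(access_token)["exp"] - leeway <= time.time()
```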
Take a look at this article, it's exactly what you need:
https://community.intersystems.com/post/creating-rest-api-jwt-authentication-objectscript
Announcement
Anastasia Dyubaylo · Jul 12, 2023
Hey Developers,
Enjoy watching the new video on InterSystems Developers YouTube:
⏯ Projecting Data into InterSystems IRIS with Foreign Tables
See the benefits of projecting data into InterSystems IRIS® data platform using Foreign Tables. This feature allows you to access external data without loading it into your instance, saving on storage space and guaranteeing your data is up to date.
Subscribe to our channel to stay tuned!
Announcement
Bob Kuszewski · Jun 2, 2023
We often get questions about recent and upcoming changes to the list of platforms and frameworks that are supported by the InterSystems IRIS data platform. This update aims to share recent changes as well as our best current knowledge on upcoming changes, but predicting the future is tricky business and this shouldn’t be considered a committed roadmap.
With that said, on to the update…
IRIS Production Operating Systems and CPU Architectures
Red Hat Enterprise Linux
Recent Changes
RHEL 9.2 & RHEL 8.8 were released in May 2023. Red Hat is planning to support these releases for 4 years. InterSystems is planning to do additional testing of IRIS on RHEL 9.2 through a new process we’re calling “Minor OS version certification” that is intended to provide additional assurance that a minor OS update didn’t break anything obvious.
With the release of RHEL 9.2, Red Hat has ended support for RHEL 9.1. This is consistent with the “odd/even” support cycle that Red Hat has been using since RHEL 8.0.
RHEL 8.4 extended maintenance ends 5/31/2023, which means that IRIS will stop supporting this minor version at that time as well.
Upcoming Changes
RHEL 9.3 is planned for later in the year. This will be a short-term-support release from Red Hat, so InterSystems won’t be recommending it for production deployments.
Previous Updates
IRIS 2022.1.2 adds support for RHEL 9.0. 9.0 is a major OS release that updates the Linux Kernel to 5.14, OpenSSL to 3.0, and Python 3.9
IRIS 2022.2.0 removes support for RHEL 7.x. RHEL 7.9 is still supported in earlier versions of IRIS.
Further reading: RHEL Release Page
Ubuntu
Recent Changes
Ubuntu 22.04.2 LTS was released February 22, 2023. InterSystems is currently performing additional testing of IRIS on 22.04.2 LTS through a new process we’re calling “Minor OS version certification”. So far, so good.
Upcoming Changes
The next major update of Ubuntu is scheduled for April, 2024
Previous Updates
IRIS 2022.1.1 adds support for Ubuntu 22.04. 22.04 is a major OS release that updates the Linux Kernel to 5.15, OpenSSL to 3.0.2, and Python 3.10.6
IRIS 2022.2.0 removes support for Ubuntu 18.04. Ubuntu 18.04 is still supported in earlier versions of IRIS.
IRIS 2022.1.1 & up containers are based on Ubuntu 22.04.
Further Reading: Ubuntu Releases Page
SUSE Linux
Upcoming Changes
SUSE Linux Enterprise Server 15 SP5 is currently in public beta testing. We expect SUSE to release 15 SP5 late Q2 or early Q3 with support added to IRIS after that. SP5 will include Linux Kernel 5.14.21, OpenSSL 3.0.8, and Python 3.11
General Support from SUSE for Linux Enterprise Server 15 SP3 came to an end on 12/31/2022, but extended security support will continue until December, 2025.
Previous Updates
IRIS 2022.3.0 adds support for SUSE Linux Enterprise Server 15 SP4. 15 SP4 is a major OS release that updates the Linux Kernel to 5.14, OpenSSL to 3.0, and Python 3.9
Further Reading: SUSE lifecycle
Oracle Linux
Upcoming Changes
Based on their history, Oracle Linux 9 will include RHEL 9.2 sometime in the second half of 2023.
Previous Updates
IRIS 2022.3.0 adds support for Oracle Linux 9. Oracle Linux 9 is a major OS release that tracks RHEL 9, so it, too, updates the Linux Kernel to 5.14, OpenSSL to 3.0, and Python 3.9
Further Reading: Oracle Linux Support Policy
Microsoft Windows
Upcoming Changes
Windows Server 2012 will reach its end of extended support in October, 2023. If you’re still running on the platform, now is the time to plan migration. IRIS 2023.2 will not be available for Windows Server 2012.
Previous Updates
We haven’t made any changes to the list of supported Windows versions since Windows Server 2022 was added in IRIS 2022.1
Further Reading: Microsoft Lifecycle
AIX
Upcoming Changes
InterSystems is working closely with IBM to add support for OpenSSL 3.0. This will not be included in IRIS 2023.2.0 as IBM will need to target the feature in a further TL release. The good news is that IBM is looking to release OpenSSL 3.0 for both AIX 7.2 & 7.3 and the timing looks like it should align for IRIS 2023.3.
Previous Updates
We haven’t made any changes to the list of supported AIX versions since AIX 7.3 was added and 7.1 removed in IRIS 2022.1
Further Reading: AIX Lifecycle
Containers
Upcoming Changes
IRIS containers will only be tagged with the year and release, such as “2023.2” instead of the full build numbers we’ve been using in the past. This way, your application can, by default, pick up the latest maintenance build of your release.
We are also adding “latest-em” and “latest-cd” tags for the most recent extended maintenance and continuous distribution IRIS release. These will be good for demos, examples, and development environments.
We will also start to tag the preview containers with “-preview” so that it’s clear which containers are preview builds and which is the most recent GA release.
These changes will all be effective with the 2023.2 GA release. We’ll be posting more about this in June.
Previous Updates
We are now publishing multi-architecture manifests for IRIS containers. This means that pulling the IRIS container tagged 2022.3.0.606.0 will download the right container for your machine’s CPU architecture (Intel/AMD or ARM).
IRIS Development Operating Systems and CPU Architectures
MacOS
Recent Changes
We’ve added support for MacOS 13 in IRIS 2023.1
Upcoming Changes
MacOS 14 is expected to be announced soon, with GA later in the year.
CentOS
We are considering removing support for CentOS/CentOS Stream. See reasoning below.
Red Hat has been running a developer program for a few years now, which gives developers access to free licenses for non-production environments. Developers currently using CentOS are encouraged to switch to RHEL via this program.
CentOS Stream is now “upstream” of RHEL, meaning that it has bugs & features not yet included in RHEL. It also updates daily, which can cause problems for developers building on the platform (to say nothing of our own testing staff).
We haven’t made any changes to the list of supported CentOS versions since we added support for CentOS 8-Stream and removed support for CentOS 7.9 in IRIS 2022.1
InterSystems Components
InterSystems API Manager (IAM)
IAM 3.2 was released this quarter and it included a change to the container’s base image from Alpine to Amazon Linux.
Caché & Ensemble Production Operating Systems and CPU Architectures
Previous Updates
Caché 2018.1.7 adds support for Windows 11
InterSystems Supported Platforms Documentation
The InterSystems Supported Platforms documentation is the source for the definitive list of supported technologies.
IRIS 2020.1 Supported Server Platforms
IRIS 2021.1 Supported Server Platforms
IRIS 2022.1 Supported Server Platforms
IRIS 2023.1 Supported Server Platforms
Caché & Ensemble 2018.1.7 Supported Server Platforms
… and that’s all folks. Again, if there’s something more that you’d like to know about, please let us know.
AIX 7.3 TL 1 and later has OpenSSL 3.X
From the linked IBM page, AIX 7.3 TL1 was released six months ago.

$ oslevel -s
7300-01-02-2320
$ openssl version
OpenSSL 3.0.8 7 Feb 2023 (Library: OpenSSL 3.0.8 7 Feb 2023)

Unfortunately, there are significant bugs in the OpenSSL 3 available for AIX 7.3 that make running IRIS (or any other heavy user of OpenSSL) impossible. We have a build in-house from IBM that fixes the problems; it should be available to the public later in the year.
Article
Dmitrii Kuznetsov · Oct 7, 2019
How can you allow computers to trust one another in your absence while maintaining security and privacy?
“A Dry Martini,” he said. “One. In a deep champagne goblet.”
“Oui, monsieur.”
“Just a moment. Three measures of Gordon’s, one of vodka, half a measure of Kina Lillet. Shake it very well until it’s ice-cold, then add a large thin slice of lemon peel. Got it?”
“Certainly, monsieur.” The barman seemed pleased with the idea.
— Casino Royale, Ian Fleming, 1953
OAuth helps to separate services with user credentials from “working” databases, both physically and geographically. It thereby strengthens the protection of identification data and, if necessary, helps you comply with the requirements of countries' data protection laws.
With OAuth, you can provide the user with the ability to work safely from multiple devices at once, while "exposing" personal data to various services and applications as little as possible. You can also avoid taking on "excess" data about users of your services (i.e. you can process data in a depersonalized form).
If you use InterSystems IRIS, you get a complete set of ready-made tools for testing and deploying OAuth and OIDC services, both autonomously and in cooperation with third-party software products.
OAuth 2.0 and OpenID Connect
OAuth and OpenID Connect — known as OIDC or simply OpenID — serve as a universal combination of open protocols for delegating access and identification — and in the 21st century, it seems to be a favorite. No one has come up with a better option for large-scale use. It's especially popular with frontenders because it sits on top of HTTP(S) protocols and uses a JWT (JSON Web Token) container.
OpenID works using OAuth — it is, in fact, a wrapper for OAuth. Using OpenID as an open standard for the authentication and creation of digital identification systems is nothing new for developers. As of 2019, it is in its 14th year (and its third version). It is popular in web and mobile development and in enterprise systems.
Its partner, the OAuth open standard for delegating access, is 12 years old, and it's been nine years since the relevant RFC 5849 standard appeared. For the purposes of this article, we will rely on the current version of the protocol, OAuth 2.0, and the current RFC 6749. (OAuth 2.0 is not compatible with its predecessor, OAuth 1.0.)
Strictly speaking, OAuth is not a protocol, but a set of rules (a scheme) for separating and transferring user identification operations to a separate trusted server when implementing an access-rights restriction architecture in software systems.
Be aware: OAuth can't say anything about a specific user! Who the user is, or where the user is, or even whether the user is currently at a computer or not. But OAuth makes it possible to interact with systems without user participation, using pre-issued access tokens. This is an important point (see "User Authentication with OAuth 2.0" on the OAuth site for more information).
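Since the whole scheme travels in JWT containers, it can help to see how little is inside one. Below is a minimal stdlib-only sketch that builds a token; HS256 is used here only because it needs no key material on disk, and production setups typically use RSA-family algorithms and a proper library.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded base64url encoding for each segment.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_hs256_jwt(claims: dict, secret: bytes) -> str:
    """Build header.payload.signature; HS256 keeps the sketch stdlib-only."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

# The token is just three dot-separated segments: header, claims, signature.
token = make_hs256_jwt({"sub": "alice", "scope": "scope1"}, b"demo-secret")
```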
The User-Managed Access (UMA) protocol is also based on OAuth. Using OAuth, OIDC and UMA together make it possible to implement a protected identity and access management (IdM, IAM) system in areas such as:
Using a patient's HEART (Health Relationship Trust) personal data profile in medicine.
Consumer Identity and Access Management (CIAM) platforms for manufacturing and trading companies.
Personalizing digital certificates for smart devices in IoT (Internet of Things) systems using the OAuth 2.0 Internet of Things (IoT) Client Credentials Grant.
A New Venn Of Access Control For The API Economy
Above all, do not store personal data in the same place as the rest of the system. Separate authentication and authorization physically. And ideally, give the identification and authentication to the individual person. Never store them yourself. Trust the owner's device.
Trust and Authentication
It is not a best practice to store users' personal data either in one’s own app or in combined storage along with a working database. Instead, we choose someone we trust to provide this service.
The scheme is made up of the following parts:
The user
The client app
The identification service
The resource server
The action takes place in a web browser on the user's computer. The user has an account with the identification service. The client app has a signed contract with the identification service and reciprocal interfaces. The resource server trusts the identification service to issue access keys to anyone it can identify.
The user runs the client web app, requesting a resource. The client app must present a key to that resource to gain access. If the user doesn’t have a key, the client app connects to an identification service with which it has a contract for issuing keys to the resource server (passing the user on to the identification service).
The Identification Service asks what kind of keys are required.
The user provides a password to access the resource. At this point, the user has been authenticated and identification of the user has been confirmed, thus providing the key to the resource (passing the user back to the client app), and the resource is made available to the user.
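The hand-off to the identification service in the flow above is just a redirect to its authorize endpoint with a few query parameters. Here is a rough sketch of how such a URL is assembled, per RFC 6749 §4.1.1; the endpoint and parameter values are illustrative only.

```python
from urllib.parse import urlencode

def authorization_url(auth_endpoint, client_id, redirect_uri, scope, state):
    """Assemble an OAuth 2.0 authorization-code request URL (RFC 6749 §4.1.1)."""
    params = {
        "response_type": "code",       # ask for an authorization code
        "client_id": client_id,        # the client registered on the server
        "redirect_uri": redirect_uri,  # where the user is sent back afterwards
        "scope": scope,
        "state": state,                # CSRF protection, echoed back to the client
    }
    return auth_endpoint + "?" + urlencode(params)

url = authorization_url(
    "https://auth.example.com/authorize",  # hypothetical identification service
    "OAuthClient", "https://app.example.com/callback", "scope1", "xyz123")
```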
Implementing an Authorization Service
On the InterSystems IRIS platform, you can assemble a service from the parts you need. For example:
Configure and launch an OAuth server with the demo client registered on it.
Configure a demo OAuth client by associating it with an OAuth server and web resources.
Develop client apps that can use OAuth. You can use Java, Python, C#, or NodeJS. Below is an example of the application code in ObjectScript.
There are multiple settings in OAuth, so checklists can be helpful. Let's walk through an example. Go to the IRIS management portal and select the section System Administration > Security > OAuth 2.0 > Server.
Each item will then contain the name of a settings line and a colon, followed by an example or explanation, if necessary. As an alternative, you can use the screenshot hints in Daniel Kutac's three-part article, InterSystems IRIS Open Authorization Framework (OAuth 2.0) implementation - part 1, part 2, and part 3.
Note that all of the following screenshots are meant to serve as examples. You’ll likely need to choose different options when creating your own applications.
On the General Settings tab, use these settings:
Description: provide a description of the configuration, such as "Authorization server".
Issuer endpoint (hereinafter EPG) host name: DNS name of your server.
Supported grant types (select at least one):
Authorization code
Implicit
Resource owner password credentials
Client credentials
SSL/TLS configuration: oauthserver
On the Scopes tab:
Add supported scopes: scope1 in our example
On the Intervals tab:
Access token interval: 3600
Authorization code interval: 60
Refresh token interval: 86400
Session termination interval: 86400
Client secret expiration period: 0
On the JWT Settings tab:
Signing algorithm: RS512
Key Management Algorithm: RSA-OAEP
Content Encryption Algorithm: A256CBC-HS512
On the Customization tab:
Authenticate class: %OAuth2.Server.Authenticate
Validate user class: %OAuth2.Server.Validate
Session maintenance class: OAuth2.Server.Session
Generate token class: %OAuth2.Server.JWT
Customization namespace: %SYS
Customization roles (select at least one): %DB_IRISSYS and %Manager
Now save the changes.
The next step is registering the client on the OAuth server. Click the Client Descriptions button, then click Create Client Description.
On the General Settings tab, enter the following information:
Name: OAuthClient
Description: provide a brief description
Client Type: Confidential
Redirect URLs: the address of the point to return to the app after identification from oauthclient.
Supported grant types:
Authorization code: yes
Implicit
Resource owner password credentials
Client credentials
JWT authorization
Supported response types: Select all of the following:
code
id_token
id_token token
token
Authorization type: Simple
The Client Account Details tab should be auto-completed, but ensure the information here is correct for the client.
On the Client Information tab:
Authorization screen:
Client name
Logo URL
Client homepage URL
Policy URL
Terms of Service URL
Now configure the binding on the OAuth server client by going to System Administration > Security > OAuth 2.0 > Client.
Create a Server Description:
Issuer endpoint: taken from the general settings of the server (see above).
SSL/TLS configuration: choose from the preconfigured list.
Authorization server:
Authorization endpoint: EPG + /authorize
Token endpoint: EPG + /token
Userinfo endpoint: EPG + /userinfo
Token revocation endpoint: EPG + /revocation
Token introspection endpoint: EPG + /introspection
JSON Web Token (JWT) settings:
Other source besides dynamic registration: choose JWKS from URL
URL: EPG + /jwks
From this list (for example, scopes_supported and claims_supported), you can see that the server can provide the OAuth client with different information about the user. It's worth noting that when implementing your application, you should ask the user what data they are ready to share. In the example below, we will only ask for permission for scope1.
Now save the configuration.
If there is an error indicating the SSL configuration, go to Settings > System Administration > Security > SSL/TLS Configurations and remove the configuration.
Now we're ready to set up an OAuth client: System Administration > Security > OAuth 2.0 > Client > Client configurations > Create Client configurations.
On the General tab, use these settings:
Application Name: OAuthClient
Client Name: OAuthClient
Description: enter a description
Enabled: Yes
Client Type: Confidential
SSL/TLS configuration: select oauthclient
Client Redirect URL: the DNS name of your server
Required Permission Types:
Authorization code: Yes
Implicit
Resource owner password credentials
Client credentials
JWT authorization
Authorization type: Simple
On Client Information tab:
Authorization screen:
Logo URL
Client homepage URL
Policy URL
Terms of Service URL
Default scope: taken from the scopes specified earlier on the server (for example, scope1)
Contact email addresses: enter addresses, separated by commas
Default max age (in seconds): maximum authentication age or omit this option
On the JWT Settings tab:
JSON Web Token (JWT) settings
Creating JWT settings from X509 account details
IDToken Algorithms:
Signing: RS256
Encryption: A256CBC
Key: RSA-OAEP
Userinfo Algorithms
Access Token Algorithms
Query Algorithms
On the Client Credentials tab:
Client ID: as issued when the client registered on the server (see above).
Client ID Issued: isn't filled in
Client secret: as issued when the client registered on the server (see above).
Client Secret Expiry Period: isn't filled in
Client Registration URI: isn't filled in
Save the configuration.
Web app with OAuth authorization
OAuth relies on the communication channels between the participants (server, clients, web application, user's browser, resource server) being protected somehow; most often this role is played by SSL/TLS. That said, OAuth will also work over unprotected channels: the Keycloak server, for example, uses plain HTTP by default and does without protection, which simplifies development and debugging. For real-world use of the services, the Keycloak documentation states that channel protection must be enabled. The InterSystems IRIS developers take a stricter approach to OAuth: using SSL/TLS is mandatory. The only simplification is that you can use self-signed certificates or take advantage of the built-in IRIS PKI service (System Administration >> Security >> Public Key System).
The user's authorization is verified by explicitly passing two parameters: the name of your application, as registered on the OAuth server and in the OAuth client, and the scope.
Parameter OAUTH2APPNAME = "OAuthClient";
set isAuthorized = ##class(%SYS.OAuth2.AccessToken).IsAuthorized(
..#OAUTH2APPNAME,
.sessionId,
"scope1",
.accessToken,
.idtoken,
.responseProperties,
.error)
If the user is not authorized, we prepare a link to request user identification and permission to work with our application. Here we need to specify not only the application name registered on the OAuth server and in the OAuth client and the requested scope, but also the redirect URI: the point in the web application to which the user should be returned.
Parameter OAUTH2CLIENTREDIRECTURI = "https://52773b-76230063.labs.learning.intersystems.com/oauthclient/"
set url = ##class(%SYS.OAuth2.Authorization).GetAuthorizationCodeEndpoint(
..#OAUTH2APPNAME,
"scope1",
..#OAUTH2CLIENTREDIRECTURI,
.properties,
.isAuthorized,
.sc)
We use IRIS and register users on the IRIS OAuth server; for example, it is enough to give a user just a name and a password. When the user follows the received link, the server performs user identification, asks the user for permission to work with the account data in the web application, and stores the result in the OAuth2.Server.Session global in the %SYS namespace.
Finally, we can display the data of an authorized user. If the procedures are successful, we have, for example, an access token. Let's validate it:
set valid = ##class(%SYS.OAuth2.Validation).ValidateJWT(
..#OAUTH2APPNAME,
accessToken,
"scope1",
.aud,
.JWTJsonObject,
.securityParameters,
.sc
)
The full working code of the OAuth example:
Class OAuthClient.REST Extends %CSP.REST
{
Parameter OAUTH2APPNAME = "OAuthClient";
Parameter OAUTH2CLIENTREDIRECTURI = "https://52773b-76230063.labs.learning.intersystems.com/oauthclient/";
// to keep sessionId
Parameter UseSession As Integer = 1;
XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
<Route Method="GET" Url = "/" Call = "Do" />
</Routes>
}
ClassMethod Do() As %Status
{
// Check for accessToken
set isAuthorized = ##class(%SYS.OAuth2.AccessToken).IsAuthorized(
..#OAUTH2APPNAME,
.sessionId,
"scope1",
.accessToken,
.idtoken,
.responseProperties,
.error)
// to show accessToken
if isAuthorized {
set valid = ##class(%SYS.OAuth2.Validation).ValidateJWT(
..#OAUTH2APPNAME,
accessToken,
"scope1",
.aud,
.JWTJsonObject,
.securityParameters,
.sc
)
&html< Hello!<br> >
w "Your access token = ", JWTJsonObject.%ToJSON()
&html< </html> >
quit $$$OK
}
// perform the process of user and client identification and get accessToken
set url = ##class(%SYS.OAuth2.Authorization).GetAuthorizationCodeEndpoint(
..#OAUTH2APPNAME,
"scope1",
..#OAUTH2CLIENTREDIRECTURI,
.properties,
.isAuthorized,
.sc)
if $$$ISERR(sc) {
w "error handling here"
quit $$$OK
}
// url magic correction: change slashes in the query parameter to its code
set urlBase = $PIECE(url, "?")
set urlQuery = $PIECE(url, "?", 2)
set urlQuery = $REPLACE(urlQuery, "/", "%2F")
set url = urlBase _ "?" _ urlQuery
&html<
<html>
<h1>Authorization in IRIS via OAuth2</h1>
<a href = "#(url)#">Authorization in <b>IRIS</b></a>
</html>
>
quit $$$OK
}
}
You can also find a working copy of the code on the InterSystems GitHub repository: https://github.com/intersystems-community/iris-oauth-example.
If necessary, enable the advanced debug message mode on the OAuth server and OAuth client; the messages are written to the ISCLOG global in the %SYS namespace:
set ^%ISCLOG = 5
set ^%ISCLOG("Category", "OAuth2") = 5
set ^%ISCLOG("Category", "OAuth2Server") = 5
For more details, see the IRIS Using OAuth 2.0 and OpenID Connect documentation.
Conclusion
As you've seen, all OAuth features are easily accessible and completely ready to use. If necessary, you can replace the handler classes and user interfaces with your own. You can configure the OAuth server and the client settings from configuration files instead of using the management portal. Then that wonderful Ian Fleming intro gets reduced down to "vodka martini, shaken not stirred"
Announcement
Anastasia Dyubaylo · Nov 1, 2019
Hi Everyone,
Please welcome the new Global Summit 2019 video on InterSystems Developers YouTube Channel:
⏯ InterSystems IRIS and Intel Optane Memory
Optane is a new class of memory from Intel that can accelerate the performance of hard drives. In this video, we will review the performance benefits and show high-level cost comparisons of using Intel Optane memory with InterSystems IRIS. We will also outline various use cases for Optane memory and storage.
Takeaway: Attendees will learn the benefits of using Intel's Optane technology with InterSystems IRIS.
Presenter: @Mark.Bolinsky, Senior Technology Architect, InterSystems
And...
Don't forget to subscribe to our InterSystems Developers YouTube Channel.
Enjoy and stay tuned!
Announcement
Anastasia Dyubaylo · Oct 30, 2019
Hi Community,
Please join the upcoming InterSystems Israel Meetup in Herzelia which will be held on November 21st, 2019!
It will take place in the Spaces Herzliya Oxygen Ltd from 9:00 a.m. to 5:30 p.m.
The event will be focused on the InterSystems IRIS: it will be divided into IRIS for Healthcare and IRIS Data Platform. A joint lunch will be also included.
Please check the draft of the agenda below:
09:00 – 13:00 Non-Healthcare Sessions:
API Management
Showcase: InterSystems IRIS Directions
Adopting InterSystems IRIS
REST at Ease
IRIS Containers for Developers
13:00 – 14:00 Joint lunch for both morning and afternoon groups
14:00 – 17:30 Healthcare Sessions:
API Management
Build HL7 Interfaces in a Flash
FHIR Update
Showcase: InterSystems IRIS Directions
Note: The final agenda will be published closer to the event.
So, remember:
⏱ Time: November 21st, 2019, from 9:30 a.m. to 5:30 p.m.
📍Venue: Spaces Herzliya Oxygen Ltd, 63 Medinat HaYehudim st., Herzelia, Israel
✅ Registration: Just send an email to ronnie.greenfield@intersystems.com*
We look forward to seeing you!
---
*Space is limited, so register today to secure your place. Admission is free; registration is mandatory for attendees.
Please check out the final agenda of the event:
Non-Healthcare Sessions:
📌 09:00 – 09:30 Gathering
📌 09:30 – 10:15 IRIS Data Platform Overview
📌 10:15 – 11:00 Adopting InterSystems IRIS
In this session, we will introduce the InterSystems IRIS Adoption Guide, and describe the process of moving from Caché and/or Ensemble to InterSystems IRIS. We will also briefly touch on the conversion process for existing installations of Caché/Ensemble-based applications.
Takeaway: InterSystems helps customers as they adopt InterSystems IRIS.
📌 11:00 – 11:45 API Management
This session will introduce the concept of API management and outline the InterSystems IRIS features that enable you to manage, monitor, and govern your APIs with full confidence.
Takeaway: InterSystems IRIS includes comprehensive capabilities for API management.
📌 11:45 – 12:15 Resources and Services for InterSystems Developers. ObjectScript Package Manager Introduction
Takeaway: Attendees will learn about the Developer Community, Open Exchange, and other resources and services available for developers on InterSystems data platforms, and will learn about the InterSystems Package Manager and how it can help in InterSystems IRIS solutions development.
📌 12:45 – 13:00 REST at Ease
This session provides an overview of how to build REST APIs. Topics will include: using the %JSON adapter to expose and consume JSON data for REST endpoints, code-first and spec-first approaches for REST development, and a brief discussion of proper API management.
Takeaway: Attendees will learn how to efficiently build, document, and manage REST APIs.
Healthcare Sessions:
📌 13:00 – 14:00 Welcome and Lunch
📌 14:00 – 14:45 InterSystems IRIS for Health Overview
📌 14:45 – 15:00 Showcase: InterSystems IRIS Directions
This session provides additional information about the new and future directions for InterSystems IRIS and InterSystems IRIS for Health.
Takeaway: InterSystems IRIS and IRIS for Health have a compelling roadmap, with real meat behind it.
📌 15:00 – 15:45 API Management
This session will introduce the concept of API management and outline the InterSystems IRIS features that enable you to manage, monitor, and govern your APIs with full confidence.
Takeaway: InterSystems IRIS includes comprehensive capabilities for API management.
📌 15:45 – 16:15 Resources and Services for InterSystems Developers. ObjectScript Package Manager Introduction
Takeaway: Attendees will learn about the Developer Community, Open Exchange, and other resources and services available for developers on InterSystems data platforms, and will learn about the InterSystems Package Manager and how it can help in InterSystems IRIS solutions development.
📌 16:15 – 17:00 Build HL7 Interfaces in a Flash
This session introduces our new HL7 productivity toolkit. We will give an overview, and demonstrate some key features, such as the Production Generator and Message Analyzer. We will also discuss how you can cost-effectively move from another interface engine to InterSystems technology.
Takeaway: You can build HL7 interfaces more efficiently with the new productivity toolkit in InterSystems IRIS for Health.
The agenda is full of interesting stuff. Join the InterSystems Israel Meetup in Herzelia! 👍🏼
I'll participate in the meetup with the session:
📌 11:45 – 12:15 Resources and Services for InterSystems Developers. ObjectScript Package Manager Introduction
Takeaway: Attendees will learn about the Developer Community, Open Exchange, and other resources and services available for developers on InterSystems data platforms, and will learn about the InterSystems Package Manager and how it can help in InterSystems IRIS solutions development.
Come join InterSystems Developers Meetup in Israel!
Announcement
Jeff Fried · Nov 4, 2019
The 2019.3 versions of InterSystems IRIS, InterSystems IRIS for Health, and InterSystems IRIS Studio are now Generally Available!
These releases are available from the WRC Software Distribution site, with build number 2019.3.0.311.0.
InterSystems IRIS Data Platform 2019.3 has many new capabilities including:
Support for InterSystems API Manager (IAM)
Polyglot Extension (PeX) available for Java
Java and .NET Gateway Reentrancy
Node-level Architecture for Sharding and SQL Support
SQL and Performance Enhancements
Infrastructure and Cloud Deployment Improvements
Port Authority for Monitoring Port Usage in Interoperability Productions
X12 Element Validation in Interoperability Productions
These are detailed in the documentation:
InterSystems IRIS 2019.3 documentation and release notes
InterSystems IRIS for Health 2019.3 includes all of the enhancements of InterSystems IRIS. In addition, this release includes FHIR searching with chained parameters (including reverse chaining) and minor updates to FHIR and other health care protocols.
FHIR STU3 PATCH Support
New IHE Profiles XCA-I and IUA
These are detailed in the documentation:
InterSystems IRIS for Health 2019.3 documentation and release notes
InterSystems IRIS Studio 2019.3 is a standalone development image supported on Microsoft Windows. It works with InterSystems IRIS and InterSystems IRIS for Health version 2019.3 and below, as well as with Caché and Ensemble.
See the InterSystems IRIS Studio Documentation for details
2019.3 is a CD release, so InterSystems IRIS and InterSystems IRIS for Health 2019.3 are only available in OCI (Open Container Initiative), a.k.a. Docker container, format. The platforms on which this is supported for production and development are detailed in the Supported Platforms document.
Having gone through the pain of installing Docker for Windows, then installing the InterSystems IRIS for Health 2019.3 image, and having got hold of a copy of the 2019.3 Studio, I was pleased when I saw this announcement and excitedly went looking for my 2019.3......exe, only to find out there is none and a small note at the end of the announcement saying that 2019.3 InterSystems IRIS and InterSystems IRIS for Health will only be released in CD form.
Yours
Nigel Salm
Nigel, just want to be sure that you read CD as Containers Deployment - so it will be available on every delivery site (WRC, download, AWS, GCP, Azure, Dockerhub) but in a container form.
InterSystems Docker Images: https://wrc.intersystems.com/wrc/coDistContainers.csp
Article
Timothy Leavitt · Mar 24, 2020
This article will describe processes for running unit tests via the InterSystems Package Manager (aka IPM - see https://openexchange.intersystems.com/package/InterSystems-Package-Manager-1), including test coverage measurement (via https://openexchange.intersystems.com/package/Test-Coverage-Tool).
Unit testing in ObjectScript
There's already great documentation about writing unit tests in ObjectScript, so I won't repeat any of that. You can find the Unit Test tutorial here: https://docs.intersystems.com/irislatest/csp/docbook/Doc.View.cls?KEY=TUNT_preface
It's best practice to include your unit tests somewhere separate in your source tree, whether it's just "/tests" or something fancier. Within InterSystems, we end up using /internal/testing/unit_tests/ as our de facto standard, which makes sense because tests are internal/non-distributed and there are types of tests other than unit tests, but this might be a bit complex for simple open source projects. You may see this structure in some of our GitHub repos.
From a workflow perspective, this is super easy in VSCode - you just create the directory and put the classes there. With older server-centric approaches to source control (those used in Studio) you'll need to map this package appropriately, and the approach for that varies by source control extension.
From a unit test class naming perspective, my personal preference (and the best practice for my group) is:
UnitTest.<package/class being tested>[.<method/feature being tested>]
For example, if the unit tests are for method Foo in class MyApplication.SomeClass, the unit test class would be named UnitTest.MyApplication.SomeClass.Foo; if the tests were for the class as a whole, it'd just be UnitTest.MyApplication.SomeClass.
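As a quick illustrative sketch of this convention (the application class, method, and expected value are all hypothetical), such a test class might look like:

```objectscript
/// Unit tests for MyApplication.SomeClass.Foo, named per the
/// UnitTest.<package/class being tested>.<method being tested> convention.
Class UnitTest.MyApplication.SomeClass.Foo Extends %UnitTest.TestCase
{

Method TestReturnsExpectedValue()
{
    // Hypothetical assertion: Foo(2) is expected to return 4
    Do $$$AssertEquals(##class(MyApplication.SomeClass).Foo(2), 4, "Foo(2) returns 4")
}

}
```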
Unit tests in IPM
Making the InterSystems Package Manager aware of your unit tests is easy! Just add a line to module.xml like the following (taken from https://github.com/timleavitt/ObjectScript-Math/blob/master/module.xml - a fork of @Peter.Steiwer 's excellent math package from the Open Exchange, which I'm using as a simple motivating example):
<Module>
  ...
  <UnitTest Name="tests" Package="UnitTest.Math" Phase="test"/>
</Module>
What this all means:
The unit tests are in the "tests" directory underneath the module's root.
The unit tests are in the "UnitTest.Math" package. This makes sense, because the classes being tested are in the "Math" package.
The unit tests run in the "test" phase in the package lifecycle. (There's also a "verify" phase in which they could run, but that's a story for another day.)
Running Unit Tests
With unit tests defined as explained above, the package manager provides some really helpful tools for running them. You can still set ^UnitTestRoot, etc. as you usually would with %UnitTest.Manager, but you'll probably find the following options much easier - especially if you're working on several projects in the same environment.
You can try out all of these by cloning the objectscript-math repo listed above and then loading it with zpm "load /path/to/cloned/repo/", or on your own package by replacing "objectscript-math" with your package names (and test names).
To reload the module and then run all the unit tests:
zpm "objectscript-math test"
To just run the unit tests (without reloading):
zpm "objectscript-math test -only"
To just run the unit tests (without reloading) and provide verbose output:
zpm "objectscript-math test -only -verbose"
To just run a particular test suite (meaning a directory of tests - in this case, all the tests in UnitTest/Math/Utils) without reloading, and provide verbose output:
zpm "objectscript-math test -only -verbose -DUnitTest.Suite=UnitTest.Math.Utils"
To just run a particular test case (in this case, UnitTest.Math.Utils.TestValidateRange) without reloading, and provide verbose output:
zpm "objectscript-math test -only -verbose -DUnitTest.Case=UnitTest.Math.Utils.TestValidateRange"
Or, if you're just working out the kinks in a single test method:
zpm "objectscript-math test -only -verbose -DUnitTest.Case=UnitTest.Math.Utils.TestValidateRange -DUnitTest.Method=TestpValueNull"
Test coverage measurement via IPM
So you have some unit tests - but are they any good? Measuring test coverage won't fully answer that question, but it at least helps. I presented on this at Global Summit back in 2018 - see https://youtu.be/nUSeGHwN5pc .
The first thing you'll need to do is install the test coverage package:
zpm "install testcoverage"
Note that this doesn't require IPM to install/run; you can find more information on the Open Exchange: https://openexchange.intersystems.com/package/Test-Coverage-Tool
That said, you can get the most out of the test coverage tool if you're also using IPM.
Before running tests, you need to specify which classes/routines you expect your tests to cover. This is important because, in very large codebases (for example, HealthShare), measuring and collecting test coverage for all of the files in the project may require more memory than your system has. (Specifically, gmheap for the line-by-line monitor, if you're curious.)
The list of files goes in a file named coverage.list within your unit test root; different subdirectories (suites) of unit tests can have their own copy of this to override which classes/routines will be tracked while the test suite is running.
For a simple example with objectscript-math, see: https://github.com/timleavitt/ObjectScript-Math/blob/master/tests/UnitTest/coverage.list ; the user guide for the test coverage tool goes into further details.
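For illustration only - the package name below is hypothetical, and the exact mask syntax is described in the Test Coverage Tool user guide linked above - a coverage.list might contain masks like:

```text
MyApplication.*.CLS
```

Each line names the classes (or routines) whose line-by-line coverage should be tracked while the suite runs; keeping this list tight is what keeps gmheap usage manageable on large codebases.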
To run the unit tests with test coverage measurement enabled, there's just one more argument to add to the command, specifying that TestCoverage.Manager should be used instead of %UnitTest.Manager to run the tests:
zpm "objectscript-math test -only -DUnitTest.ManagerClass=TestCoverage.Manager"
The output (even in non-verbose mode) will include a URL where you can view which lines of your classes/routines were covered by unit tests, as well as some aggregate statistics.
Next Steps
What about automating all of this in CI? What about reporting unit test results and coverage scores/diffs? You can do that too! For a simple example using Docker, Travis CI and codecov.io, see https://github.com/timleavitt/ObjectScript-Math ; I'm planning to write this up in a future article that looks at a few different approaches.
Excellent article Tim! Great description of how people can move the ball forward with the maturity of their development processes :)
Hello @Timothy.Leavitt Thank you for this great article!
I tried to add "UnitTest" tag to my module.xml but something wrong during the publish process.<UnitTest Name="tests" Package="UnitTest.Isc.JSONFiltering.Services" Phase="test"/>
The tests directory contains a directory tree UnitTest/Isc/JSONFiltering/Services/ with a %UnitTest.TestCase subclass.
Exported 'tests' to /tmp/dirLNgC2s/json-filter-1.2.0/tests/.
ERROR #5018: Routine 'tests' does not exist
[json-filter] Package FAILURE - ERROR #5018: Routine 'tests' does not exist
ERROR #5018: Routine 'tests' does not exist
I also tried with the objectscript-math project. This is the output of objectscript-math publish -v:
Exported 'src/cls/UnitTests' to /tmp/dir7J1Fhz/objectscript-math-0.0.4/src/cls/unittests/.
ERROR #5018: Routine 'src/cls/UnitTests' does not exist
[objectscript-math] Package FAILURE - ERROR #5018: Routine 'src/cls/UnitTests' does not exist
ERROR #5018: Routine 'src/cls/UnitTests' does not exist
Did I miss something, or is it a package manager issue? Thank you.
Perhaps try Name="/tests" with a leading slash?
Yes, that's it! We can see a dot.
It works fine. Thank you for your help.
@Timothy.Leavitt Do you all still use your Test Coverage Tool at InterSystems? I haven't seen any recent updates to it on the repo, so I'm wondering if you consider it still useful and it's just in a steady-state, stable place, or are there different tactics for test coverage metrics since you published?
@Michael.Davidovich yes we do! It's useful and just in a steady state (although I have a PR in process around some of the recent confusing behavior that's been reported in the community).
Thanks, @Timothy.Leavitt!
For others working through this too, I wanted to sum some points up that I discussed with Tim over PM.
- Tim reiterated the usefulness of the Test Coverage tool and the Cobertura output for finding starting places based on complexity and what are the right blocks to test.
- When it comes to testing persistent data classes, it is indeed tricky but valuable (e.g. data validation steps). Using transactions (TSTART and TROLLBACK) is a good approach for this.
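One common arrangement for that transaction-based approach is sketched below; this is an illustration, not the poster's actual code, and the persistent class MyApp.Person is hypothetical:

```objectscript
Class UnitTest.MyApp.Person Extends %UnitTest.TestCase
{

/// Open a transaction before any test in this class runs...
Method OnBeforeAllTests() As %Status
{
    TSTART
    Quit $$$OK
}

/// ...and roll it back afterwards, so test data never persists.
Method OnAfterAllTests() As %Status
{
    TROLLBACK
    Quit $$$OK
}

Method TestSave()
{
    // MyApp.Person is hypothetical; substitute your own persistent class
    Set person = ##class(MyApp.Person).%New()
    Set person.Name = "Test Person"
    Do $$$AssertStatusOK(person.%Save(), "Person saves successfully")
}

}
```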
I also discussed the video from some years ago on the mocking framework. It's an awesome approach, but for me, it depends on retooling classes to fit the framework. I'm not in a place where I want to or can rewrite classes for the sake of testing, however this might be a good approach for others. There may be other open source frameworks for mocking available later.
Hope this helps and encourages more conversation! In a perfect world we'd start with our tests and code from there, but well, the world isn't perfect!
Great summary ... thank you!
@Timothy.Leavitt and others: I know this isn't Jenkins support, but I seem to be having trouble allowing the account running Jenkins to get into IRIS. Just trying to get this to work locally at the moment. I'm running on Windows through an organizational account, so I created a new local account on the computer, jenkinsUser, which I understand is the 'user' that logs in and runs everything on Jenkins. When I launch IRIS in the build script using . . .
C:\MyPath\bin\irisdb -s C:\MyPath\mgr -U MYNAMESPACE 0<inFile
. . . I can see in the console that it's trying to log in. I turned on O/S authentication for the system and allowed the %System.Login function to use Kerberos. I can launch Terminal from my tray and I'm logged in without a user/password prompt.
I am guessing that IRIS doesn't know about my jenkinsUser local account, so it won't allow that user to use O/S authentication? I'm trying to piece this together in my head. How can I give this computer user, which is trying to run Jenkins, access to IRIS without authentication?
Hope this helps others who are trying to set this up. Not sure if this is right, but I created a new IRIS user and then created delegated access to %Service_Console and included this in my ZAUTHENTICATE routine. Seems to have worked.
Now . . . on to the next problem:
DO ##CLASS(UnitTest.Manager).OutputResultsXml("junit.xml")
^
<CLASS DOES NOT EXIST> *UnitTest.Manager
Please try %UnitTest.Manager
I had to go back . . . that was a custom class and method that was written for the Widgets Direct demo app. Trial and error, folks:
@Timothy.Leavitt your presentation mentioned a custom version of the Cobertura plugin for the scatter plot . . . is that still necessary, or does the current version support that? Not sure if I see any mention of the custom plugin on the GitHub page.
Otherwise, I seem to be missing something key: I don't have build logic in my script. I suppose I just thought that step was for automation purposes so that the latest code would be compiled on whatever server. I don't have anything like that yet and thought I could just run the test coverage utility, but it's coming up with nothing. I'll keep playing tomorrow but appreciate anyone's thoughts on this, especially if you've set it up before!
For those following along, I got this to work finally by creating the "coverage.list" file in the unit test root. I tried setting the parameter node "CoverageClasses" but that didn't work (maybe I used $LB wrong).
Still not sure how to get the scatter plot for complexity, as @Timothy.Leavitt mentioned in the presentation that the Cobertura plugin was customized. Any thoughts on that are appreciated!
I think this is it: GitHub - timleavitt/covcomplplot-plugin: Jenkins covcomplplot plugin. It's written by Tim, it's in the plugin library, and it looks like what was in the presentation; however, I have some more digging to do come Monday.
@Michael.Davidovich I was out Friday, so still catching up on all this - glad you were able to figure out coverage.list. That's generally a better way to go for automation than setting a list of classes.
re: the plugin, yes, that's it! There's a GitHub issue that's probably the same here: https://github.com/timleavitt/covcomplplot-plugin/issues/1 - it's back on my radar given what you're seeing.
So I originally installed the scatter plot plugin from the library, not the one from your repo. I uninstalled that and I'm trying to install the one you modified. I'm having a little trouble because it seems I have to download your source, make sure I have a JDK installed and Maven, and package the code into a .hpi file? Does this sound right? I'm getting some issues with the POM file while running 'mvn package'. Is it possible to provide the packaged file for those of us not Java-savvy?
For other n00bs like me . . . in GitHub you click the Releases link on the code page and you can find the packaged code.
Edit: I created a separate thread about this so it gets more visibility. The thread can be found here: https://community.intersystems.com/post/test-coverage-coverage-report-not-generating-when-running-unit-tests-zpm
...
Hello,
@Timothy.Leavitt, thanks for the great article! I am facing a slight problem and was wondering if you, or someone else, might have some insight into the matter.
I am running my unit tests in the following way with ZPM, as instructed. They work well and test reports are generated correctly. Test coverage is also measured correctly according to the logs. However, even though I instructed ZPM to generate Cobertura-style coverage reports, it is not generating one. When I run the GenerateReport() method manually, the report is generated correctly.
I am wondering what I am doing wrong. I used the test flags from the ObjectScript-Math repository, but they seem not to work.
Here is the ZPM command I use to run the unit tests:
zpm "common-unit-tests test -only -verbose
-DUnitTest.ManagerClass=TestCoverage.Manager
-DUnitTest.UserParam.CoverageReportClass=TestCoverage.Report.Cobertura.ReportGenerator
-DUnitTest.UserParam.CoverageReportFile=/opt/iris/test/CoverageReports/coverage.xml
-DUnitTest.Suite=Test.UnitTests.Fw
-DUnitTest.JUnitOutput=/opt/iris/test/TestReports/junit.xml
-DUnitTest.FailuresAreFatal=1":1
The test suite runs okay, but coverage reports do not generate. However, when I run these commands stated in the TestCoverage documentation, the reports are generated.
Set reportFile = "/opt/iris/test/CoverageReports/coverage.xml"
Do ##class(TestCoverage.Report.Cobertura.ReportGenerator).GenerateReport(<index>, reportFile)
Here is a short snippet from the logs where you can see that test coverage analysis is run:
Collecting coverage data for Test: .036437 seconds
Test passed
Mapping to class/routine coverage: .041223 seconds
Aggregating coverage data: .019707 seconds
Code coverage: 41.92%
Use the following URL to view the result:
http://192.168.208.2:52773/csp/sys/%25UnitTest.Portal.Indices.cls?Index=19&$NAMESPACE=COMMON
Use the following URL to view test coverage data:
http://IRIS-LOCALDEV:52773/csp/common/TestCoverage.UI.AggregateResultViewer.cls?Index=17
All PASSED
[COMMON|common-unit-tests] Test SUCCESS
What am I doing wrong?
Thank you, and have a good day!
Kari Vatjus-Anttila
%UnitTest mavens may be interested in this announcement:
https://community.intersystems.com/post/intersystems-testing-manager-new-vs-code-extension-unittest-framework
Question
Evgeny Shvarov · Jul 23, 2019
Hi guys! What is the IRIS analog for Ensemble.INC? Tried to compile the class in IRIS - says:
Error compiling routine: Util.LogQueueCounts. Errors: Util.LogQueueCounts.cls : Util.LogQueueCounts.1(7) : MPP5635 : No include file 'Ensemble'
You just have to enable Ensemble in the installer
<Namespace Name="${NAMESPACE}" Code="${DBNAME}-CODE" Data="${DBNAME}-DATA" Create="yes" Ensemble="1">
That helped! Thank you!
What do you mean? There is still an Ensemble.inc
Announcement
Sourabh Sethi · Jul 29, 2019
A SOLID Design in Cache Object
In this session, we will be discussing the SOLID principles of programming and will implement them in an example. I have used the Caché Object programming language for the examples. We will go step by step to understand the requirement, then look at the common mistakes we tend to make while designing, understand each principle, and then complete the design with its implementation via Caché Objects.
If you have any questions or suggestions, please write to me - sethisourabh.hit@gmail.com
CodeSet - https://github.com/sethisourabh/SolidPrinciplesTraining
Thanks for sharing this knowledge on the ObjectScript language. I hadn't heard of the SOLID principles before; I'll apply them in my next code. BTW: can you share your slides for an easier walkthrough?
Thank you for your response. I don't see any way to attach documents here. You can send your email id and I will send them over there. My email ID - sethisourabh.hit@gmail.com
Regards,
Sourabh
You could use https://www.slideshare.net/ or add the document to the GitHub repo. There is a way to post documents on InterSystems Community under Edit Post -> Change Additional Settings, which I documented here, but it's not user friendly and I didn't automatically see links to attached documents within the post, so I had to manually add the links. Community feedback suggests they may turn this feature off at some point, so I'd recommend any of the above options instead.
Thanks, @Stephen.Wilson! Yes, we plan to turn off the attachments feature. As you mention, there are a lot of better ways to expose presentations and code. And as you see, @Sourabh.Sethi6829 posted the recent package for his recent video on Open Exchange.
Do I need the code set for this session in Open Exchange?
Would be great - it's even more presence and developers can collaborate
DONE
Article
Murray Oldfield · Nov 14, 2019
Released with no formal announcement in [IRIS preview release 2019.4](https://community.intersystems.com/post/intersystems-iris-and-iris-health-20194-preview-published "IRIS preview release 2019.4") is the /api/monitor service exposing IRIS metrics in **_Prometheus_** format. Big news for anyone wanting to use IRIS metrics as part of their monitoring and alerting solution. The API is a component of the new IRIS _System Alerting and Monitoring (SAM)_ solution that will be released in an upcoming version of IRIS.
>However, you do not have to wait for SAM to start planning and trialling this API to monitor your IRIS instances. In future posts, I will dig deeper into the metrics available and what they mean and provide example interactive dashboards. But first, let me start with some background and a few questions and answers.
IRIS (and Caché) is always collecting dozens of metrics about itself and the platform it is running on. There have always been [multiple ways to collect these metrics to monitor Caché and IRIS](https://docs.intersystems.com/irislatest/csp/docbook/Doc.View.cls?KEY=GCM "multiple ways to collect these metrics to monitor Caché and IRIS"). I have found that few installations use IRIS and Caché built-in solutions. For example, History Monitor has been available for a long time as a historical database of performance and system usage metrics. However, there was no obvious way to surface these metrics and instrument systems in real-time.
IRIS platform solutions (along with the rest of the world) are moving from single monolithic applications running on a few on-premises instances to distributed solutions deployed 'anywhere'. For many use cases, existing IRIS monitoring options do not fit these new paradigms. Rather than completely reinvent the wheel, InterSystems looked to popular and proven open-source solutions for monitoring and alerting.
## Prometheus?
Prometheus is a well known and widely deployed open source monitoring system based on proven technology. It has a wide variety of plugins. It is designed to work well within the cloud environment, but also is just as useful for on-premises. Plugins include operating systems, web servers such as Apache and many other applications. Prometheus is often used with a front end client, for example, _Grafana_, which provides a great UI/UX experience that is extremely customisable.
## Grafana?
Grafana is also open source. As this series of posts progresses, I will provide sample templates of monitoring dashboards for common scenarios. You can use the samples as a base to design dashboards for what you care about. The real power comes when you combine IRIS metrics in context with metrics from your whole solution stack. From the platform components, operating system, IRIS and especially when you add instrumentation from your applications.
## Haven't I seen this before?
Monitoring IRIS and Caché with Prometheus and Grafana is not new. I have been using these applications for several years to monitor my development and test systems. If you search the Developer Community for "Prometheus", you will find other posts ([for example, some excellent posts by Mikhail Khomenko](https://community.intersystems.com/post/making-prometheus-monitoring-intersystems-cach%C3%A9 "or example, this one by Mikhail Khomenko")) that show how to expose Caché metrics for use by Prometheus.
>The difference now is that the /api/monitor API is included and enabled by default. There is no requirement to code your own classes to expose metrics.
# Prometheus Primer
Here is a quick orientation to Prometheus and some terminology. I want to give you the high-level picture, lay some groundwork, and open the door to how you might think about visualising or consuming the metrics provided by IRIS or other sources.
Prometheus works by _scraping_ or pulling time series data exposed from applications as HTTP endpoints (APIs such as IRIS /api/monitor). _Exporters_ and client libraries exist for many languages, frameworks, and open-source applications — for example, for web servers like Apache, operating systems, docker, Kubernetes, databases, and now IRIS.
Exporters are used to instrument applications and services and to expose relevant metrics on an endpoint for scraping. Standard components such as web servers and databases are supported by core exporters, and many other exporters are available open source from the Prometheus community.
## Prometheus Terminology
A few key terms are useful to know:
- **Targets** are the things you care about: a host, an application, or a service such as Apache, IRIS, or your own application.
- Prometheus **scrapes** targets over HTTP, collecting metrics as time-series data.
- **Time-series data** is exposed by applications themselves (for example, IRIS) or via exporters.
- **Exporters** are available for things you don't control, such as Linux kernel metrics.
- The resulting time-series data is stored locally on the Prometheus server in a database \*\*.
- The time-series database can be queried using an optimised **query language (PromQL)**, for example to create alerts, or by client applications such as Grafana to display the metrics in a dashboard.
>\*\* **Spoiler Alert;** For security, scaling, high availability and some other operational efficiency reasons, for the new SAM solution the database used for Prometheus time-series data is IRIS! However, access to the Prometheus database -- on IRIS -- is transparent, and applications such as Grafana do not know or care.
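To make PromQL queries concrete: Prometheus exposes an HTTP API, and an instant query is a GET against `/api/v1/query`. Here is a minimal Python sketch that builds such a request; the server address is an assumption for illustration, and the metric name is the journal free-space metric discussed later in this post:

```python
import urllib.parse

def build_query_url(prometheus_base, promql):
    """Build an instant-query URL for the Prometheus HTTP API (/api/v1/query)."""
    params = urllib.parse.urlencode({"query": promql})
    return prometheus_base.rstrip("/") + "/api/v1/query?" + params

# Hypothetical local Prometheus server; filter the metric by its "id" label.
url = build_query_url("http://localhost:9090", 'iris_jrn_free_space{id="WIJ"}')
```

The response is JSON containing the matching time series and their current values, which is exactly what clients such as Grafana consume.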
### Prometheus Data Model
Metrics returned by the API are in Prometheus format. Prometheus uses a simple text-based exposition format with one metric per line:
`<metric name>{<label name>="<label value>", ...} <value>`
Metrics can have labels as (key, value) pairs. Labels are a powerful way to filter metrics as dimensions. As an example, examine a single metric returned for IRIS /api/monitor. In this case journal free space:
iris_jrn_free_space{id="WIJ",dir="/fast/wij/"} 401562.83
The identifier tells you what the metric is and where it came from:
iris_jrn_free_space
Multiple labels can be used to decorate the metrics, and then used to filter and query. In this example, you can see the WIJ and the directory where the WIJ is stored:
id="WIJ",dir="/fast/wij/"
And a value: `401562.83` (MB).
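To illustrate the data model, here is a small Python sketch (not part of the original post) that splits a metric line like the one above into its name, labels, and value:

```python
import re

# One Prometheus text-format line: name, optional {labels}, then a value.
METRIC_LINE = re.compile(r'^(?P<name>[a-zA-Z_:][a-zA-Z0-9_:]*)'
                         r'(?:\{(?P<labels>[^}]*)\})?\s+(?P<value>\S+)$')
LABEL = re.compile(r'(\w+)="([^"]*)"')

def parse_metric(line):
    """Return (metric name, labels dict, float value) for one metric line."""
    m = METRIC_LINE.match(line.strip())
    if m is None:
        raise ValueError("not a metric line: " + line)
    labels = dict(LABEL.findall(m.group("labels") or ""))
    return m.group("name"), labels, float(m.group("value"))

name, labels, value = parse_metric('iris_jrn_free_space{id="WIJ",dir="/fast/wij/"} 401562.83')
# name == "iris_jrn_free_space", labels["id"] == "WIJ", value == 401562.83
```

In practice you would use a Prometheus client library rather than a regex, but this shows how little structure there is to the format.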
## What IRIS metrics are available?
The [preview documentation](https://irisdocs.intersystems.com/iris20194/csp/docbook/Doc.View.cls?KEY=GCM_rest "Will be subject to changes") has a list of metrics. However, be aware there may be changes. You can also simply query the `/api/monitor/metrics` endpoint and see the list. I use [Postman](https://www.getpostman.com "Postman") which I will demonstrate in the next community post.
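If you prefer a script to Postman, a minimal Python sketch can pull the same endpoint; the host and port here are assumptions, so substitute your instance's web server address and any authentication your setup requires:

```python
import urllib.request

def metrics_url(base_url):
    """The metrics endpoint path from the documentation: /api/monitor/metrics."""
    return base_url.rstrip("/") + "/api/monitor/metrics"

def fetch_metrics(base_url):
    """Fetch the raw Prometheus-format metric lines from an IRIS instance."""
    with urllib.request.urlopen(metrics_url(base_url)) as resp:
        return resp.read().decode("utf-8").splitlines()

# Hypothetical instance address:
# for line in fetch_metrics("http://localhost:52773"):
#     print(line)
```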
# What should I monitor?
Keep these points in mind as you think about how you will monitor your systems and applications.
- When you can, instrument key metrics that affect users.
  - Users don't care that one of your machines is short of CPU.
  - Users care if the service is slow or having errors.
  - For your primary dashboards, focus on high-level metrics that directly impact users.
- For your dashboards, avoid a wall of graphs.
  - Humans can't deal with too much data at once.
  - For example, have a dashboard per service.
- Think about services, not machines.
  - Once you have isolated a problem to one service, you can drill down and see if one machine is the problem.
# References
**Documentation and downloads** for: [Prometheus](https://prometheus.io "Prometheus") and [Grafana](https://grafana.com "Grafana")
I presented a pre-release overview of SAM (including Prometheus and Grafana) at **InterSystems Global Summit 2019**; you can find [the link at InterSystems learning services](https://learning.intersystems.com/mod/page/view.php?id=5599 "Learning Services"). If the direct link does not work, go to the [InterSystems learning services web site](https://learning.intersystems.com "Learning Services") and search for "System Alerting and Monitoring Made Easy".
Search here on the community for "Prometheus" and "Grafana".
Please include node_exporter setup.
What gets put into isc_prometheus.yml?
This is what the doc says to put in isc_prometheus.yml:
```yaml
- job_name: NODE
  metrics_path: /metrics
  scheme: http
  static_configs:
    - labels:
        cluster: "2"
        group: node
      targets:
        - csc2cxn00020924.cloud.kp.org:9100
        - csc2cxn00021271.cloud.kp.org:9100
```
It does not work.
The node_exporter is installed and running.
From what I can see the values returned are updated very quickly - maybe every second? I'm unclear as to how to contextualize the metrics for a periodic collection. Specifically, if I call the API every minute I may get a value for global references that is very low or very high - but it may not be indicative of the value over time. Is there any information on how the metrics are calculated internally that might help? Single points in time may be very deceptive.