Question
Scott Roth · Jan 7, 2019
Can someone tell me if intersystems-ru/deepsee-sysmon-dashboards is developed for a specific version of Ensemble? It looks like it could be useful to my group, but we aren't upgrading until later this year and we are on 2015.2.2. Thanks, Scott

Hi Scott! First, thanks for mentioning Open Exchange, I appreciate it :) Here is the link to Sysmon Dashboards on OEX. Also, here is the article by @Semen.Makarov, which could help. The tool just visualizes the data you already have in %SYSMONMGR; I believe that utility appeared in early versions of Caché. The visualization is better with DeepSee Web, which requires at least Caché 2014 for REST and JSON. I'll also ping @Daniel.Kutac, who initially introduced the tool, for more details.
Question
Kurt Hofman · May 13, 2019
Hello, I'm trying to get ODBC running on a Mac but I can't get it working. Does anyone have an overview of how to install, configure and use it in Excel on macOS? Regards, Kurt Hofman.

Thanks, but I can't find the correct way to install Caché ODBC. Do I have to extract it into a special folder and then follow the UNIX-like installation, or something else?

It actually does not matter where you install the drivers. My Caché runs in Docker, so I downloaded the ODBC drivers from the FTP, extracted them (just in Downloads), and ran ODBCInstall there from a terminal. With ODBCInstall I got the file mgr/cacheodbc.ini, which I copied to ~/Library/ODBC/odbc.ini, where it acts as a User DSN (/Library/ODBC/odbc.ini is used for a System DSN). DYLD_LIBRARY_PATH should point to the bin folder inside the extracted ODBC drivers folder; in my case:
export DYLD_LIBRARY_PATH=/Users/daimor/Downloads/ODBC-2018.1.2.309.0-macx64/bin
You can check the connection with the iODBC Administrator app, launching it right from the terminal:
DYLD_LIBRARY_PATH=/Users/daimor/Downloads/ODBC-2018.1.2.309.0-macx64/bin open /Applications/iODBC/iODBC\ Administrator64.app
and open Excel the same way:
DYLD_LIBRARY_PATH=/Users/daimor/Downloads/ODBC-2018.1.2.309.0-macx64/bin open /Applications/Microsoft\ Excel.app
This way you'll be able to test the settings without relogging in.

What do you have already? Did you configure the ODBC DSN and DYLD_LIBRARY_PATH? You can find some information about configuring ODBC here. First of all, I think you need iODBC installed on your Mac, then the Caché ODBC drivers with the DYLD_LIBRARY_PATH environment variable set correctly; do not forget to relogin after setting the variable. As proof that it's possible: done with Caché 2018.1.
Question
Nagarjuna Reddy Kurmayyagari · May 6, 2020
After loading the intersystems/iam:0.34-1-1 image locally, I am trying to do the next step of the configuration.
2) Configure your InterSystems IRIS instance
2a) Enable the /api/IAM web application
2b) Enable the IAM user
2c) Change the password for the IAM user
Do we need to change these settings in the iris.cpf file? Any input on where we have to configure this? I have IRIS 2019.1.1 installed on my machine.
Thanks,
Nag.

Hello Nagarjuna,
IAM runs on InterSystems IRIS Data Platform 2019.2 and higher. You need to upgrade your server.
Once you're done, you will find everything (the /api/iam application, the IAM user) in the System Management Portal.
HTH
Dan

2019.1.1 supports IAM. 2019.1.0 does not.
So no need to upgrade.
Docs:
Web Application management
User Management
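For reference, these settings can also be applied programmatically. The following is a hedged sketch only (class and property names come from the standard %SYS Security API, and the password value is a placeholder), so verify the exact application path and properties against your IRIS version:

// Run in the %SYS namespace
zn "%SYS"

// 2a) Enable the /api/iam web application (use the path exactly as it appears in the portal)
set sc = ##class(Security.Applications).Get("/api/iam", .app)
set app("Enabled") = 1
set sc = ##class(Security.Applications).Modify("/api/iam", .app)

// 2b) and 2c) Enable the IAM user and change its password ("NewSecretPassword" is a placeholder)
set sc = ##class(Security.Users).Get("IAM", .usr)
set usr("Enabled") = 1
set usr("Password") = "NewSecretPassword"
set sc = ##class(Security.Users).Modify("IAM", .usr)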
Question
Andrew Brown · Apr 7, 2021
We are looking at moving from Caché to IRIS; if we do this, we will want to use InterSystems Reports. We have a lot of Crystal Reports to convert. Is there a conversion tool or best practice for doing the conversion?

Hello! We are currently carrying out a similar project to move a system from Caché to IRIS, including the task of converting reports from Crystal Reports to the JReports format (Logi Report, https://www.logianalytics.com/jreport/). The project is nearing completion and we can share our experience.

We are also interested in converting SAP BusinessObjects Crystal, Webi, etc. reports to the JReports format. Please share any insights, tools, articles, etc. that you have!

We only used the official documentation. Our work on this topic has not yet been presented in the form of an article. Perhaps, after the end of the project, we will publish something on this topic.

I just received confirmation that Logi/JReports used to provide a converter; unfortunately, they have not supported it for a few years and it is no longer available.
Article
Athanassios Hatzis · Feb 16, 2017
Hi,
I would like to draw your attention on a recently published article, titled "A Quick Guide on How to Prevail in the Graph Database Arena", that has been posted also at LinkedIn. Intersystems Caché has been referenced several times. In the "Multi-model Database Engine" section of this article, there is a quick description of Caché as an
object database with relational access, integrated support for JSON documents and a multidimensional key-value storage mechanism that can be easily extended to cover Graph data model
I believe strongly that Intersystems Caché is an exceptional database product. Its architecture is unique and powerful. I have emphasized this in the following section of the same article,
No matter what is their physical implementation, i.e. hash tables or trees, based on this abstract data type you can model all four NoSQL database types, (Key/Value, Tabular/Columnar, Document, Graph). For one reason or another, we are of the opinion that associative/multidimensional arrays will eventually prevail in the world of databases.
These two key points justify that the graph data model can be a perfect fit on the architecture of InterSystems Caché. Although I have not seen any movement from InterSystems towards that area, I am confident there is great potential business value in integrating and expanding the multi-model functionality of Caché. I presume the database market is mature enough for graph databases.
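As a minimal illustration only (a sketch on top of plain globals, not an InterSystems product feature), a directed labeled graph can be laid out directly on the multidimensional storage:

// Vertices with embedded properties, plus edges indexed in both directions
set ^Node("alice") = $listbuild("Person","Alice")
set ^Node("bob") = $listbuild("Person","Bob")
set ^Edge("out","alice","KNOWS","bob") = ""
set ^Edge("in","bob","KNOWS","alice") = ""

// Traverse all outgoing edges of a node
set label = ""
for {
    set label = $order(^Edge("out","alice",label))
    quit:label=""
    set to = ""
    for {
        set to = $order(^Edge("out","alice",label,to))
        quit:to=""
        write "alice -",label,"-> ",to,!
    }
}

The bidirectional edge index is what makes both inbound and outbound traversal a simple $order scan, which is essentially the associative behaviour described above.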
But the problem with this database sector is that it is both cluttered and perplexed with many different kinds of graph databases. Let me focus on the conceptual/logical layer where my work is based. Depending on the structure of nodes and edges you get different graph topologies and stores.
Property Graph Data Model
Entity centric with embedded properties and edges, BIDIRECTIONAL LINKING, Directed Labeled Graph. Examples: Neo4j, OrientDB, ArangoDB, etc.
Triple/Quadruple Data Model
Edge centric with UNIDIRECTIONAL LINKING on vertices, Directed Labeled Graph. Examples: GraphDB, AllegroGraph, OpenLink Virtuoso.
Associative Data Model
Hypernodes, Hyperedges, BIDIRECTIONAL LINKING, Hypergraph/Bipartite Graph. Examples: Topic Maps, R3DM/S3DM, X10SYS (AtomicDB), HypergraphDB, Qlik.
I will not get into analyzing their differences in detail here; I have covered their differentiation criteria to some extent in the article mentioned above. My research and development work has been on the third type, the Associative Data Model, also known as the hypergraph or bipartite graph. Currently there are only two major players in the market with products based on associative technology.
In "Associative Data Modeling Demystified" series of posts that is written with a hands-on practice style, I introduce my audience to the design of R3DM/S3DM and I am making an attempt to clear the information glut of many-to-many relationships (a.k.a associations) with a thorough examination of well-known data models and graph/associative software products.
Part 6 - R3DM/S3DM: Build Powerful, Meaningful, Cohesive Relationships Easily
Part 5 - Qlik Associative Model
Part 4 - Association in RDF Data Model
Part 3 - Association in Property Graph Data Model - (Intersystems Caché is mentioned at this section)
Part 2 - Association in Topic Map Data Model
Part 1 - Relation, Relationship and Association
So what is the objective of writing in this community?
Well, there is a two-pronged approach:
To promote the fundamental principles of the R3DM/S3DM framework in the design of a hypergraph-associative based DBMS
To implement this technology on top of a suitable DBMS
What is the ultimate goal?
The ultimate goal is to have a single operational database that acts as a unification platform for all data from other data sources and enables 360-degree, self-service, dynamic data visualization and analysis of any part of the database, without ETL and without queries.
What is the current status of R3DM/S3DM?
There are currently two implementations :
A working prototype with OrientDB as the back-end storage and Wolfram Language as the front-end API
A partial implementation with InterSystems Caché as the back-end storage and the construction of R3DM/S3DM subsystems and low-level commands in Caché ObjectScript
In the sixth and last part of our series, detailed information about the architectural design and API commands of R3DM/S3DM is presented to the public.
How can you help?
You may actively take part in any discussion within this community about graph data modeling with InterSystems Caché. Make a start here. In particular, if you are interested in the R3DM/S3DM project, I suggest you connect with me on LinkedIn.
I am also actively looking for partners and investors for this technology so perhaps you would like to discuss your role or contribution.
Vendors and experts in the database domain are interested in this project. This is encouraging, but the plan of course is to reach production and deployment level, and that is where they can help.

Hello @athanassios.hatzis! I thought about "Graphs on Caché". I attached my point of view on how to store a simple graph DB. Slides: https://drive.google.com/open?id=0B7nnNs0_XYaSWXp1S3BoUjFXdFk
Question
Murali krishnan · Apr 13, 2017
Please let me know how to invoke a Java program from InterSystems. Also, let me know how to expose/consume web services/APIs from InterSystems.

Various options:
1. Call out to Java on the command line using $ZF: https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=RCOS_fzf-1
2. Access POJOs directly using Jalapeño: https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=GBJJ
3. Consume a web service using the Caché SOAP wizard: https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=ESOAP_web_client
4. Publish a web service from Caché: https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=ESOAP_web_service

From InterSystems Ensemble you can also use the Java Gateway: http://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=EJVG
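As a quick illustration of option 1 (a hedged sketch; the jar path and argument are placeholders, and $ZF(-1) callout must be permitted on the instance):

// Spawn a Java process from ObjectScript and capture its exit code
set cmd = "java -jar /opt/app/MyProgram.jar inputArg"
set rc = $ZF(-1, cmd)
write "Java process finished with exit code: ", rc, !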
Question
Murali krishnan · May 9, 2017
Please let me know if it is feasible to integrate security tools (W3af and IronWASP) with InterSystems. If so, please let me know how to do that.

Hi Murali,
from looking at both of their websites, it seems they are just web scanners?
If so, you don't need to do anything different to run them against your InterSystems-powered web pages. You can use them the same as with any other webpage you're scanning.
Cheers,
Fab
Announcement
Teo Rusu · Aug 8, 2018
Good afternoon everyone, I hope you are well. I am currently working on the opportunity below, and if anyone here would like to hear more details, please feel free to contact me on 01908 886 030 or teo.rusu@identifiglobal.com. I met with the client yesterday: a beautiful office in Waterloo, London, with the latest tech available and many benefits, including working from home 2 days/week as well!

InterSystems Caché Senior Software Engineer
London. Salary up to £75,000 (negotiable for the right candidate)

An exciting Software as a Service company creating enterprise software used by 33,000+ people globally. They provide a secure, global cloud platform that marketing teams use to rapidly expand services across channels and launch into new markets. In a highly collaborative working environment, their clients benefit from the best practices evolved from their work with many of the world's largest banks, retailers, agencies, and outsource service providers. The platform is the result of 15 years and $18 million in continuous product innovation.

Location: London. More than beautiful, shared office spaces with the latest technology included. A place you join as an individual, 'me', but where you become part of a greater 'we'. A place where we're redefining success measured by personal fulfilment, not just the bottom line.

Job description
The business is a hosted, multi-tenanted SaaS solution. User data is partitioned logically by our data-security framework. All user data is retained while users are on the system, and existing functionality is maintained (unless a new feature explicitly replaces it, in which case there needs to be an outlined process for replacement and migration of user data). Data access and security are of the utmost importance and must always be handled using the existing data-security framework.

The role
You will be required to design, code, test and deliver practical and architecturally robust business solutions. You will be expected to recognize the impacts your code could have throughout the solution and communicate these to the relevant people. You will document your code, when required, so other developers can easily understand its functionality. Producing clean, maintainable code is key to our organisation's ethos. You will be working on features across our platform plus integrations with third-party systems (for example, ERP solutions).

Required skills & experience
5+ years' experience developing in Caché ObjectScript, using CSP as a web frontend
Excellent understanding of SQL
Working knowledge of MUMPS
Enterprise product/project development experience
HTML
CSS
JavaScript

Desired skills & experience
OOD background/experience
Efficiency-driven, optimised coding style
Experience with SDLC methodologies, version control and software configuration management tools (preferably Jira)

If you are interested in hearing more about this opportunity, please feel free to contact Teo Rusu on 01908 886 030 or teo.rusu@identifiglobal.com
Article
Yuri Marx · Sep 3, 2018
Companies today face serious problems in managing their data and extracting strategic value from it. The structure and business logic of the data are fragmented across different solutions, architectures and technology platforms. In addition, different project teams, one for each solution, impose their views on the business, limiting the business to what each of these solutions is able to do. The database becomes a simple repository of processed data under the partial view delivered by each application, process, analysis, message and integration.
The value of the data for the company strategy is not consolidated at the corporate level, and so the database, as well as everything around it, becomes anemic: devoid of the logic and structure that transform data into information and corporate knowledge.
The technology department struggles to deliver structuring and strategic solutions to the business, as it needs to assemble complex architectures composed of several products from different providers.
Example: If a company needs to improve its logistics process, it will need to change its ERP and SCM applications, change the orchestrations in the ESB and manipulate messages and documents in MQ, and finally build new ETL maps and analytical visions to deliver what the user needs. These changes will also generate a strong impact on other processes, such as the purchasing process, requiring a change in the corresponding process. In addition, the entire data structure and business logic of the logistics theme will continue to reside in different systems and technology platforms, with different teams, because in the anemic model, structure and logic are separated and fragmented.
The correct scenario would be a single, interdisciplinary team changing the logic of the architectural block of the logistics theme within a unified data platform, evolving the structure and behavior of the logistics business theme uniformly under the analytical, transactional and SOA perspectives.
This eliminates complexity, redundancy and cost, and expands reuse, productivity and agility in delivery. It also increases performance and scalability and reduces network latency and the consumption of processing, memory and storage.
The ADP architecture also promotes the use of small, agile, multidisciplinary, full-stack teams, because all the transactional, analytical and business structure and behavior is delivered in a single technological environment.
The ADP architecture also enables what will be the future of technology: data lakes. In the data lake architecture, the ingestion and processing of data for the generation of strategic value is performed in real time, and the data cannot arrive anemic; this would pollute the lake. Since most data lake implementations use an anemic architecture, it is necessary to create noise reduction algorithms, but the lake remains polluted. This kills the 5 Vs of Big Data (velocity, variety, variability, volume and value).
So the solution is to adopt a data platform (InterSystems IRIS is an excellent option), retire the bag of legacy software responsible for fragmenting and separating data behavior and logic, and have, for each business theme, an architectural block with data structure and business logic together, from the transactional, analytical and integration perspectives.

Please consider posting it here.

Hi, Yuri! Thanks for posting the link to a really great post! But DC is more for articles posted directly here rather than links to somewhere else. Is it possible to repost it here too?

Thank you, Yuri!

OK, now I'm curious where it was originally posted! :)
Question
Chandra Bandi · Sep 24, 2018
Hello guys, can you please guide me on creating a RESTful API service in our Caché (using the InterSystems Caché kit) with CSP (Caché Server Pages) ObjectScript?
Class REST.NativeDispatcher Extends %CSP.REST
{

XData UrlMap [ XMLNamespace = "http://www.native.rest.com/urlmap" ]
{
<Routes>
<!-- <Route Url="/:name" Method="GET" Call="displaySystem" Cors="false" /> -->
<Route Url="/:ostype" Method="GET" Call="externalFreeze" Cors="false" />
</Routes>
}

ClassMethod externalFreeze(ostype As %String) As %Status
{
    // Freeze writes on the instance and report the status returned by the backup API.
    // NOTE: writes stay suspended until ##class(Backup.General).ExternalThaw() is called.
    set status = ##class(Backup.General).ExternalFreeze()
    write "EXTERNAL FREEZE RESPONSE FOUND "_status
    quit $$$OK
}

}
I have written the above code snippet, but I need to explore %CSP.REST further.
Can anyone suggest or share a sample code snippet for creating a REST web application using %CSP.REST and ObjectScript?
Thanks,
Chandrasekhar Reddy,
Email: cbandi@purestorage.com | +91 9880318877

The online documentation is here: https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=GREST

There is some training in the InterSystems Learning portal (you'll need to enrol/login): https://learning.intersystems.com/course/view.php?id=283
e.g.
Academy: Implementing RESTful Applications - https://learning.intersystems.com/course/view.php?id=83
Setting Up RESTful Services - https://learning.intersystems.com/course/view.php?id=776

I wrote this tutorial some time ago: https://community.intersystems.com/post/lets-write-angular-1x-app-cach%C3%A9-rest-backend-start-here
It covers making an AngularJS front end with a basic Cache REST backend.
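As a complement to the dispatch class above, here is a hedged sketch (standard %SYS Security API; the application path "/csp/restdemo", the namespace "USER" and the authentication flag are assumptions to adapt) showing how such a class is registered as the dispatch class of a web application:

// Run in the %SYS namespace
zn "%SYS"
set props("NameSpace") = "USER"
set props("DispatchClass") = "REST.NativeDispatcher"
// 32 = password authentication; adjust to your security requirements
set props("AutheEnabled") = 32
set sc = ##class(Security.Applications).Create("/csp/restdemo", .props)
if 'sc { do $system.Status.DisplayError(sc) }

Once the web application exists, a GET request to /csp/restdemo/<ostype> is routed to the externalFreeze() method by the UrlMap above.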
Question
Robert Driver · Jun 30
Can someone point me to learning resources / documentation for InterSystems Terminal? I have scoured YouTube, InterSystems documentation, and the internet. Many of the ObjectScript commands I found (such as those listed here) do not work in the version of Terminal that I have:
https://docs.intersystems.com/ens201817/csp/docbook/DocBook.UI.Page.cls?KEY=RCOS
So far, I have only found one YouTube video presenting commands that actually work in my Caché Terminal install:
https://www.youtube.com/watch?v=F3lw-2kGY6U&list=PLp4xNHWZ7IQmiSsryS0T3qjuVXlbSWqc8
Article
Vinicius Maranhao Ribeiro de Castro · Apr 2, 2020
Introduction
Nowadays, there are a lot of applications using the Open Authorization framework (OAuth) to access resources from all kinds of services in a secure, reliable and efficient manner. InterSystems IRIS is already compatible with the OAuth 2.0 framework; in fact, there is a great article in the community regarding OAuth 2.0 and InterSystems IRIS here.
However, with the advent of API management tools, some organizations are using them as a single point of authentication, preventing unauthorized requests from reaching downstream services and decoupling authorization/authentication complexities from the service itself.
As you may know, InterSystems has launched its API management tool, called InterSystems API Management (IAM), which is available with an IRIS Enterprise license (not IRIS Community Edition). Here is another great post in the community introducing InterSystems API Management.
This is the first part of a 3-part series of articles that will show how you can use IAM to simply add security, according to OAuth 2.0 standards, to a previously unauthenticated service deployed in IRIS.
In this first part, some OAuth 2.0 background is provided, together with some initial IRIS and IAM definitions and configurations, in order to facilitate the understanding of the whole process of securing your services.
After the first part, this article series will cover two possible scenarios to secure your services with IAM. In the first scenario, IAM will only validate the access token present in the incoming request and will forward the request to the backend if the validation succeeds. In the second scenario, IAM will both generate an access token (acting as an authorization server) and validate it.
Therefore, the second part will discuss and show in detail the steps needed to configure scenario 1, and the third part will discuss and demonstrate the configuration for scenario 2, together with some final considerations.
If you want to try IAM, please contact your InterSystems Sales Representative.
OAuth 2.0 Background
Every OAuth 2.0 authorization flow basically involves 4 roles:
Resource Owner (typically the end user)
Client
Authorization Server
Resource Server
For the sake of simplicity, this article will use the “Resource Owner Password Credentials” OAuth flow, but you can use any OAuth flow in IAM. Also, this article will not specify any scope.
Note: You should only use Resource Owner Password Credentials flow when the client app is highly trusted, as it directly handles user credentials. In most cases, the client should be a first-party app.
Typically, the Resource Owner Password Credentials flow follows these steps:
The user enters their credentials (for example username and password) in the client app
The client app sends the user credentials together with its own identification (client id and client secret, for example) to the authorization server. The authorization server validates the user credentials and client identification and returns an access token
The client uses the token to access resources on the resource server
The resource server validates the access token received before returning any information to the client
With that in mind, there are two scenarios where you can use IAM to deal with OAuth 2.0:
IAM acting as a validator, verifying the access token provided by the client app, forwarding the request to the resource server only if the access token is valid; In this case the access token would be generated by a third-party authorization server
IAM acting both as an authorization server, providing access token to the client, and as an access token validator, verifying the access token before redirecting the request to the resource server.
IRIS and IAM definitions
In this post, an IRIS web application called “/SampleService” will be used. As you can see from the screenshot below, this is an unauthenticated REST service deployed in IRIS:
Furthermore, on the IAM side, a Service called “SampleIRISService” containing one route is configured, as you can see in the screenshot below:
Also, a consumer called “ClientApp” is configured in IAM, initially without any credentials, to identify who is calling the API in IAM:
With the configurations above, IAM is proxying to IRIS every GET request sent to the following URL:
http://iamhost:8000/event
At this point, no authentication is used yet. Therefore, if we send a simple GET request, with no authentication, to the URL
http://iamhost:8000/event/1
we get the desired response.
In this article, we are going to use an app called “Postman” to send requests and check the responses. In the Postman screenshot below, you can see the simple GET request together with its response.
Continue reading to the second part of this series to understand how to configure IAM to validate access tokens present in the incoming requests.

Great article!
Article
Vinicius Maranhao Ribeiro de Castro · Apr 2, 2020
This 3-part series of articles shows how you can use IAM to simply add security, according to OAuth 2.0 standards, to a previously unauthenticated service deployed in IRIS.
The first part provided some OAuth 2.0 background, together with some initial IRIS and IAM definitions and configurations, in order to facilitate the understanding of the whole process of securing your services.
This part will now discuss and show in detail the steps needed to configure IAM to validate the access token present in the incoming request and forward the request to the backend if the validation succeeds.
The last part of the series will discuss and demonstrate the configuration needed for IAM to generate an access token (acting as an authorization server) and validate it, together with some important final considerations.
If you want to try IAM, please contact your InterSystems Sales Representative.
Scenario 1: IAM as an access token validator
In this scenario, an external authorization server that generates an access token in JWT (JSON Web Token) format will be used. This JWT is signed using the RS256 algorithm together with a private key. In order to verify the JWT signature, the other party (in this case IAM) needs to have the public key provided by the authorization server.
This JWT generated by the external authorization server also includes, in its body, a claim called “exp” containing the timestamp of when this token expires, and another claim called “iss” containing the address of the authorization server.
Therefore, IAM needs to verify the JWT signature with the authorization server’s public key and the expiration timestamp contained in the “exp” claim inside the JWT before forwarding the request to IRIS.
In order to configure that in IAM, let’s start by adding a plugin called “JWT” to our “SampleIRISService” in IAM. To do so, go to the Services page in IAM and copy the id of the “SampleIRISService”; we are going to use it later.
After that, go to Plugins, click the “New Plugin” button, locate the “JWT” plugin and click Enable.
In the following page, paste the “SampleIRISService” id in the “service_id” field and select the box “exp” in the “config.claims_to_verify” parameter.
Note the value of the parameter “config.key_claim_name” is “iss”. We are going to use that later.
Then, hit the “Create” button.
Once that is done, go to the “Consumers” section in the left menu and click on our previously created “ClientApp”. Go to the “Credentials” tab and click the “New JWT Credential” button.
In the following page, select the algorithm used to sign the JWT (in this case RS256) and paste the public key in the field “rsa_public_key” (this is the public key provided to you by the authorization server in PEM format).
In the “key” field, you need to insert the contents of the JWT claim that you entered in the field “config.key_claim_name” when adding the JWT plugin. Therefore, in this case, I need to insert the content of the iss claim of my JWT, which, in my case, is the address of the authorization server.
After that, click on “Create” button.
Hint: For debugging purposes, there is an online tool to decode JWT so you can check the claims and its values and verify its signature by pasting the public key. Here is the link of this online tool: https://jwt.io/#debugger
Now, with the JWT plugin added, it is no longer possible to send the request with no authentication. As you can see below, a simple GET request, with no authentication, to the URL
http://iamhost:8000/event/1
returns an unauthorized message together with the status code “401 Unauthorized”.
In order to get the results from IRIS, we need to add the JWT to the request.
Therefore, first we need to request the JWT from the authorization server. The custom authorization server that we are using here returns a JWT if a POST request is made, with some key-value pairs in the body including user and client information, to the following URL:
https://authorizationserver:5001/auth
Here is what this request and its response looks like:
Then, you can add the JWT obtained from the response below in the authorization header as a Bearer Token and send a GET request to the same URL previously used:
http://iamhost:8000/event/1
Or you can also add it as a querystring parameter, with the querystring key being the value specified in the field “config.uri_param_names” when adding the JWT plugin which, in this case, is “jwt”:
Finally, there is also the option to include JWT in the request as a cookie, if any name is entered in the field “config.cookie_names”.
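For completeness, here is a hedged ObjectScript sketch of the first option, sending the JWT in the Authorization header (the jwt variable is assumed to already hold the token returned by the authorization server):

// Call the IAM-protected route with the JWT as a Bearer token
set req = ##class(%Net.HttpRequest).%New()
set req.Server = "iamhost"
set req.Port = 8000
// jwt holds the token obtained from the authorization server (assumed)
do req.SetHeader("Authorization", "Bearer "_jwt)
set sc = req.Get("/event/1")
if sc { write req.HttpResponse.StatusCode, !, req.HttpResponse.Data.Read(), ! }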
Continue reading to the third and last part of this series to understand the configuration needed for IAM to generate an access token and validate it, together with some important final considerations.
Article
Vinicius Maranhao Ribeiro de Castro · Apr 2, 2020
This 3-part series of articles shows how you can use IAM to simply add security, according to OAuth 2.0 standards, to a previously unauthenticated service deployed in IRIS.
The first part provided some OAuth 2.0 background, together with some initial IRIS and IAM definitions and configurations, in order to facilitate the understanding of the whole process of securing your services.
The second part discussed and showed in detail the steps needed to configure IAM to validate the access token present in the incoming request and forward the request to the backend if the validation succeeds.
This last part of the series will discuss and demonstrate the configuration needed for IAM to generate an access token (acting as an authorization server) and validate it, together with some important final considerations.
If you want to try IAM, please contact your InterSystems Sales Representative.
Scenario 2: IAM as an authorization server and access token validator
In this scenario, differently from the first scenario, we are going to use a plugin called “OAuth 2.0 Authentication”.
In order to use IAM as the authorization server in this Resource Owner Password Credentials flow, the username and password must be authenticated by the client application. The request to get the access token from IAM should be made only if the authentication is successful.
Let’s start by adding it to our “SampleIRISService”. As you can see in the screenshot below, we have some different fields to fill in in order to configure this plugin.
First of all, we are going to paste the id of our “SampleIRISService” into the field “service_id” to enable this plugin to our service.
In the field “config.auth_header_name” we are going to specify the header name that will carry the authorization token. In this case, I’ll leave the default value of “authorization”.
The “OAuth 2.0 Authentication” plugin supports the Authorization Code Grant, Client Credentials, Implicit Grant or Resource Owner Password Credentials Grant OAuth 2.0 flows. As we are using the Resource Owner Password Credentials flow in this article, we will check the box “config.enable_password_grant”.
In the “config.provision_key” field, enter any string to be used as the provision key. This value will be used to request an access token to IAM.
In this case, I left all the other fields with the default value. You can check the full reference for each field in the plugin documentation available here.
Here is what the plugin configuration looks like at the end:
Once the plugin is created, we need to create the credentials to our “ClientApp” consumer.
To do so, go to “Consumers” on the left menu and click on “ClientApp”. Next, click on “Credentials” tab and then on “New OAuth 2.0 Application” button.
On the following page, enter any name to identify your application on the field “name”, define a client id and a client secret, respectively, on the fields “client_id” and “client_secret” and lastly, enter the URL in your application where users will be sent after authorization on the field “redirect_uri”. Then, click on “Create”.
Now, you are ready to send requests.
The first request that we need to make is to obtain the access token from IAM. The “OAuth 2.0 Authentication” plugin automatically creates an endpoint by appending the path “/oauth2/token” to the already created route.
Note: Make sure that you use HTTPS protocol and IAM’s proxy port listening to TLS/SSL requests (the default port is 8443). This is an OAuth 2.0 requirement.
Therefore, in this case, we would need to make a POST request to the URL:
https://iamhost:8443/event/oauth2/token
In the request body, you should include the following JSON:
{
"client_id": "clientid",
"client_secret": "clientsecret",
"grant_type": "password",
"provision_key": "provisionkey",
"authenticated_userid": "1"
}
As you can see, this JSON contains values defined both during “OAuth 2.0 Authentication” plugin creation, such as “grant_type” and “provision_key”, and during Consumer’s credentials creation, such as “client_id” and “client_secret”.
The parameter “authenticated_userid” should also be added by the client application when the username and password provided are successfully authenticated. Its value should be used to uniquely identify the authenticated user.
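In ObjectScript terms, the same token request could look like the hedged sketch below (the SSL client configuration name "IAM" is an assumption; the credential values are the placeholders from the JSON above):

// Request an access token from IAM over HTTPS
set req = ##class(%Net.HttpRequest).%New()
set req.Server = "iamhost"
set req.Port = 8443
set req.Https = 1
// "IAM" is an assumed name of a client SSL/TLS configuration defined in IRIS
set req.SSLConfiguration = "IAM"
set req.ContentType = "application/json"
set body = {"client_id":"clientid","client_secret":"clientsecret","grant_type":"password","provision_key":"provisionkey","authenticated_userid":"1"}
do req.EntityBody.Write(body.%ToJSON())
set sc = req.Post("/event/oauth2/token")
if sc { write req.HttpResponse.Data.Read(), ! }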
The request and its respective response should look like this:
With that, we can now make a request to get the event data, passing the “access_token” value from the response above as a Bearer Token in a GET request to the URL
https://iamhost:8443/event/1
If your access token expires, you can generate a new access token using the refresh token that you received together with the expired access token by making a POST request to the same endpoint used to get an access token, with a slightly different body:
{
"client_id": "clientid",
"client_secret": "clientsecret",
"grant_type": "refresh_token",
"refresh_token": "E50m6Yd9xWy6lybgo3DOvu5ktZTjzkwF"
}
The request and its respective response should look like this:
One interesting feature of the “OAuth 2.0 Authentication” plugin is the ability to view and invalidate access tokens.
To list the tokens, send a GET request to the following endpoint of IAM’s Admin API:
https://iamhost:8444/{workspace_name}/oauth2_tokens
where {workspace_name} is the name of the IAM workspace used. Make sure to enter the necessary credentials to call IAM’s Admin API in case you have enabled RBAC.
Note that “credential_id” is the id of the OAuth application that we created inside the ClientApp consumer (in this case called SampleApp) and “service_id” is the id of our “SampleIRISService”, to which this plugin is applied.
To invalidate a token, you can send a DELETE request to the following endpoint
https://iamhost:8444/Sample/oauth2_tokens/{token_id}
where {token_id} is the id of the token to be invalidated.
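A hedged sketch of the invalidation call (the tokenId variable and the "IAM" SSL configuration name are assumptions; add your Admin API credentials if RBAC is enabled):

// Invalidate a token via IAM's Admin API
set req = ##class(%Net.HttpRequest).%New()
set req.Server = "iamhost"
set req.Port = 8444
set req.Https = 1
set req.SSLConfiguration = "IAM"
// tokenId is assumed to hold the id of the token to invalidate
set sc = req.Send("DELETE", "/Sample/oauth2_tokens/"_tokenId)
write req.HttpResponse.StatusCode, !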
If we try to use the invalidated token, we get a message saying that the token is invalid or expired if we send a GET request containing this invalidated token as a Bearer Token to the URL:
https://iamhost:8443/event/1
Final Considerations
This article demonstrated how you can use IAM to add OAuth 2.0 authentication to a previously unauthenticated service deployed in IRIS. You should keep in mind that the service itself will continue to be unauthenticated in IRIS. Therefore, anyone who calls the IRIS service endpoint directly, bypassing the IAM layer, will be able to see the information without any authentication. For that reason, it is important to have security rules at the network level to prevent unwanted requests from bypassing the IAM layer.
You can learn more about IAM here.
If you want to try IAM, please contact your InterSystems Sales Representative.

Great article, Vinicius! It surely helps to get an idea of how OAuth works on IAM.
I have been trying to set up the OAuth plugin for implicit grant, but every time I send a POST message I get "invalid grant type", even when sending the grant type as implicit. Do we have any material or documentation to study on how we should do it?
Article
Evgeny Shvarov · May 25, 2020
Hi ObjectScript developers!
InterSystems ObjectScript is perhaps the best language on the planet for dealing with globals - and it is an interpreted language.
Yes, it has a compiler. But the compiler can still compile some lines of ObjectScript that will then fire as bugs at runtime.
There are some techniques to avoid that, such as unit testing, coding guidelines and your coding experience, of course ;)
Here I want to present yet another approach to reducing the number of errors in your ObjectScript runtime and enforcing coding guidelines - the ObjectScript Quality tool developed by Lite Solutions, an InterSystems solution partner.
See the details below.
We asked Lite Solutions to set up the analysis for the following 17 rules, which we find to be the most common "misses" of the compiler and which could be considered possible bugs and "code smells": violations of coding guidelines and deprecated functions.
You can check how these rules work against this ObjectScript class (probably the worst ObjectScript class ever), where each method represents a certain problem the tool recognizes. And here is the analysis of this artificially constructed class.
Here you can check other projects which are already being analyzed by the ObjectScriptQ tool.
How can you add the ObjectScript analysis to your project?
This is very easy. Lite Solutions provides free-of-charge analysis for all open source GitHub repositories with ObjectScript. Add this workflow.yml file to the
.github/workflows
folder of your public GitHub repository. After that, every push to the repository will trigger a new analysis, and you will get a fresh report on the issues found in the repository.
This tool can work with private repositories too - you can examine the possible options and pricing plans on the ObjectScriptQ site.
Collaboration and Evolution
If you find some false positives, or if you want to add a new rule to make ObjectScript better, submit an issue to the Lite Solutions repository or discuss it here on the Developer Community, e.g. right in this post.
Happy coding, and keep your ObjectScript cleaner and healthier!