Announcement
Teo Rusu · Aug 8, 2018
Good afternoon everyone, I hope you are well. I am currently working on the opportunity below, and if anyone here would like to hear more details please feel free to contact me on 01908 886 030 or teo.rusu@identifiglobal.com. I met with the client yesterday: a beautiful office in Waterloo, London, with the latest tech available and many benefits, including working from home 2 days/week!

InterSystems Caché Senior Software Engineer
London, salary up to £75,000 (negotiable for the right candidate)

An exciting Software as a Service company creating enterprise software used by 33,000+ people globally. It provides a secure, global cloud platform that marketing teams use to rapidly expand services across channels and launch into new markets. In a highly collaborative working environment, their clients benefit from best practices evolved from their work with many of the world’s largest banks, retailers, agencies, and outsource service providers. The platform is the result of 15 years and $18 million in continuous product innovation.

Location – London – more than beautiful, shared office spaces with the latest technology included. A place you join as an individual, 'me', but where you become part of a greater 'we'. A place where we’re redefining success, measured by personal fulfilment, not just the bottom line.

Job description
The business is a hosted, multi-tenanted SaaS solution. User data is partitioned logically by our data-security framework. All user data is retained for as long as users are on the system, and existing functionality is maintained (unless a new feature explicitly replaces it, in which case there needs to be an outlined process for the replacement and migration of user data).
Data access/security is of utmost importance and must always be done using the existing data-security framework.

The role
You will be required to design, code, test and deliver practical and architecturally robust business solutions. You will be expected to recognize the impacts your code could incur throughout the solution and communicate these to the relevant people. You will document your code, when required, so other developers can easily understand its functionality. Producing clean, maintainable code is key to our organisation's ethos. You will be working on features across our platform plus integrations with third-party systems (for example, ERP solutions).

Required skills & experience
- 5+ years' experience developing in Caché ObjectScript, using CSP as a web front end
- Excellent understanding of SQL
- Working knowledge of MUMPS
- Enterprise product/project development experience
- HTML
- CSS
- JavaScript

Desired skills & experience
- OOD background/experience
- Efficiency-driven/optimised coding style
- Experience with SDLC methodologies, version control and software configuration management tools (preferably Jira)

If you are interested in hearing more about this opportunity, please feel free to contact Teo Rusu on 01908 886 030 or teo.rusu@identifiglobal.com
Article
Yuri Marx · Sep 3, 2018
Companies today face serious problems in managing their data and extracting strategic value from it. The structure and business logic of data is fragmented across different solutions, architectures and technology platforms. In addition, different project teams, one for each solution, impose their views on the business, limiting the business to what each of these solutions is able to do. The database becomes a simple repository of processed data, shaped by the partial view delivered by each application, process, analysis, message and integration.
The value of the data to company strategy is not consolidated at the corporate level, and so the database, along with everything around it, becomes anemic: devoid of the logic and structure that transform data into corporate information and knowledge.
The technology department struggles to deliver structural, strategic solutions to the business, as it needs to assemble complex architectures composed of several products from different providers.
Example: If a company needs to improve its logistics process, it will need to change its ERP and SCM applications, change the orchestrations in the ESB and manipulate messages and documents in MQ, and finally build new ETL maps and analytical visions to deliver what the user needs. These changes will also generate a strong impact on other processes, such as the purchasing process, requiring a change in the corresponding process. In addition, the entire data structure and business logic of the logistics theme will continue to reside in different systems and technology platforms, with different teams, because in the anemic model, structure and logic are separated and fragmented.
The correct scenario would be a single, interdisciplinary team changing the logic of the architectural block of the logistics theme within a unified data platform, evolving the structure and behavior of the logistics business theme uniformly under the analytical, transactional and SOA perspectives.
This eliminates complexity, redundancy and cost and expands reuse, productivity, agility in delivery. It also increases performance and scalability and reduces network latency and consumption of processing, memory, and storage.
The ADP architecture also promotes the use of small, agile, multidisciplinary, full-stack teams, because in a single technological environment all the transactional, analytical and integration structure and behavior is delivered.
The ADP architecture also enables what will be the future of technology: data lakes. In the data lake architecture, the ingestion and processing of data to generate strategic value is performed in real time, and the data cannot arrive anemic; this would pollute the lake. Since most data lake implementations use an anemic architecture, it is necessary to create noise-reduction algorithms, but the lake remains polluted. This undermines the 5 Vs of Big Data (velocity, variety, variability, volume and value).
So the solution is to adopt a data platform (InterSystems IRIS is an excellent option) and retire the bag of legacy software responsible for fragmenting and separating data behavior and logic, making it impossible to keep them together in a single architectural block. Instead, each business theme gets an architectural block with its data structure and business logic together, from the transactional, analytical and integration perspectives.

Please consider posting it here.

Hi, Yuri! Thanks for posting the link to a really great post! But DC is more for articles posted directly here rather than links to somewhere else. Is it possible to repost it here too?

Thank you, Yuri!

OK, now I'm curious where it was originally posted! :)
Question
Chandra Bandi · Sep 24, 2018
Hello guys, can you please guide me in creating a RESTful API service in our Caché (using the InterSystems Caché kit) with CSP (Caché Server Pages) ObjectScript?
Class REST.NativeDIspatcher Extends %CSP.REST
{
XData UrlMap [XMLNamespace = "http://www.native.rest.com/urlmap" ]
{
<Routes>
<!-- <Route Url="/:name" Method="GET" Call="displaySystem" Cors="false" /> -->
<Route Url="/:ostype" Method="GET" Call="externalFreeze" Cors="false" />
</Routes>
}
ClassMethod externalFreeze(Ostype As %String) As %Status
{
    set status = ##class(Backup.General).ExternalFreeze()
    write "EXTERNAL FREEZE RESPONSE FOUND "_status
    quit $$$OK
}
}
I had written the above code snippet, but I need to explore the %CSP.REST class further.
Can anyone suggest or share a sample code snippet for creating a REST service web application using the %CSP.REST class in ObjectScript?
Thanks,
Chandrasekhar Reddy,
Email: cbandi@purestorage.com | +91 9880318877

The online documentation is here: https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=GREST

There is some training in the InterSystems Learning portal (you'll need to enrol/login): https://learning.intersystems.com/course/view.php?id=283

e.g.:
- Academy: Implementing RESTful Applications - https://learning.intersystems.com/course/view.php?id=83
- Setting Up RESTful Services - https://learning.intersystems.com/course/view.php?id=776

I wrote this tutorial some time ago: https://community.intersystems.com/post/lets-write-angular-1x-app-cach%C3%A9-rest-backend-start-here
It covers making an AngularJS front end with a basic Caché REST backend.
Question
Robert Driver · Jun 30
Can someone point me to learning resources/documentation for InterSystems Terminal? I have scoured YouTube, InterSystems documentation, and the internet. Many of the ObjectScript commands listed here do not work in the version of Terminal that I have:
https://docs.intersystems.com/ens201817/csp/docbook/DocBook.UI.Page.cls?KEY=RCOS
So far, I have only found one YouTube video that presented commands that actually work in my Caché Terminal install:
https://www.youtube.com/watch?v=F3lw-2kGY6U&list=PLp4xNHWZ7IQmiSsryS0T3qjuVXlbSWqc8
Article
Vinicius Maranhao Ribeiro de Castro · Apr 2, 2020
Introduction
Nowadays, there are a lot of applications using the Open Authorization framework (OAuth) to access resources from all kinds of services in a secure, reliable and efficient manner. InterSystems IRIS is already compatible with the OAuth 2.0 framework; in fact, there is a great article in the community regarding OAuth 2.0 and InterSystems IRIS here.
However, with the advent of API Management tools, some organizations are using it as a single point of authentication, preventing unauthorized requests to arrive at downstream services and decoupling authorization/authentication complexities from the service itself.
As you may know, InterSystems has launched its API Management tool, called InterSystems API Management (IAM), which is available with IRIS Enterprise license (not IRIS Community Edition). Here is another great post in the community introducing InterSystems API Management.
This is the first part of a 3-part series of articles showing how you can use IAM to simply add security, according to OAuth 2.0 standards, to a previously unauthenticated service deployed in IRIS.
This first part provides some OAuth 2.0 background, together with some initial IRIS and IAM definitions and configurations, to facilitate understanding of the whole process of securing your services.
After the first part, this article series will cover two possible scenarios for securing your services with IAM. In the first scenario, IAM will only validate the access token present in the incoming request, forwarding the request to the backend if the validation succeeds. In the second scenario, IAM will both generate an access token (acting as an authorization server) and validate it.
Therefore, the second part will discuss and show in detail the steps needed to configure scenario 1, and the third part will discuss and demonstrate the configurations for scenario 2, together with some final considerations.
If you want to try IAM, please contact your InterSystems Sales Representative.
OAuth 2.0 Background
Every OAuth 2.0 authorization flow basically consists of 4 parties:
User
Client
Authorization Server
Resource Server
For the sake of simplicity, this article will use the “Resource Owner Password Credentials” OAuth flow, but you can use any OAuth flow in IAM. Also, this article will not specify any scope.
Note: You should only use Resource Owner Password Credentials flow when the client app is highly trusted, as it directly handles user credentials. In most cases, the client should be a first-party app.
Typically, the Resource Owner Password Credentials flow follows these steps:
The user enters their credentials (for example username and password) in the client app
The client app sends the user credentials together with its own identification (client id and client secret, for example) to the authorization server. The authorization server validates the user credentials and client identification and returns an access token
The client uses the token to access resources on the resource server
The resource server validates the access token received before returning any information to the client
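The steps above can be sketched as a toy, in-memory simulation. This is purely illustrative: all names, credentials and tokens below are hypothetical stand-ins for the real parties, not IAM or IRIS APIs.

```python
import secrets

# Hypothetical in-memory stand-ins for the OAuth 2.0 parties.
USERS = {"alice": "s3cret"}                # resource owner credentials
CLIENTS = {"client-id": "client-secret"}   # registered client apps
TOKENS = {}                                # issued access token -> user

def issue_token(username, password, client_id, client_secret):
    """Authorization server in the Resource Owner Password Credentials grant:
    validates the user credentials and the client identification, then
    returns an access token."""
    if CLIENTS.get(client_id) != client_secret:
        raise PermissionError("unknown client")
    if USERS.get(username) != password:
        raise PermissionError("bad user credentials")
    token = secrets.token_urlsafe(16)
    TOKENS[token] = username
    return token

def get_resource(token):
    """Resource server: validates the access token before returning data."""
    if token not in TOKENS:
        return 401, "Unauthorized"
    return 200, f"event data for {TOKENS[token]}"

token = issue_token("alice", "s3cret", "client-id", "client-secret")
print(get_resource(token))     # (200, 'event data for alice')
print(get_resource("forged"))  # (401, 'Unauthorized')
```

Note that the client never shows the user's password to the resource server; only the opaque token travels with each resource request.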
With that in mind, there are two scenarios where you can use IAM to deal with OAuth 2.0:
IAM acting as a validator, verifying the access token provided by the client app, forwarding the request to the resource server only if the access token is valid; In this case the access token would be generated by a third-party authorization server
IAM acting both as an authorization server, providing access token to the client, and as an access token validator, verifying the access token before redirecting the request to the resource server.
IRIS and IAM definitions
In this post, we will use an IRIS web application called “/SampleService”. As you can see from the screenshot below, this is an unauthenticated REST service deployed in IRIS:
Furthermore, on the IAM side, a Service called “SampleIRISService” is configured, containing one route, as you can see in the screenshot below:
Also, a consumer called “ClientApp” is configured in IAM, initially without any credentials, to identify who is calling the API in IAM:
With the configurations above, IAM is proxying to IRIS every GET request sent to the following URL:
http://iamhost:8000/event
At this point, no authentication is used yet. Therefore, if we send a simple GET request, with no authentication, to the URL
http://iamhost:8000/event/1
we get the desired response.
In this article, we are going to use an app called Postman to send requests and check the responses. In the Postman screenshot below, you can see the simple GET request together with its response.
Continue reading to the second part of this series to understand how to configure IAM to validate access tokens present in incoming requests.

Great article!
Article
Vinicius Maranhao Ribeiro de Castro · Apr 2, 2020
In this 3-part series of articles, we show how you can use IAM to simply add security, according to OAuth 2.0 standards, to a previously unauthenticated service deployed in IRIS.
The first part provided some OAuth 2.0 background, together with some initial IRIS and IAM definitions and configurations, to facilitate understanding of the whole process of securing your services.
This part will now discuss and show in detail the steps needed to configure IAM to validate the access token present in the incoming request and forward the request to the backend if the validation succeeds.
The last part of this series will discuss and demonstrate the configurations needed for IAM to generate an access token (acting as an authorization server) and validate it, together with some important final considerations.
If you want to try IAM, please contact your InterSystems Sales Representative.
Scenario 1: IAM as an access token validator
In this scenario, we will use an external authorization server that generates an access token in JWT (JSON Web Token) format. This JWT is signed using the RS256 algorithm together with a private key. In order to verify the JWT signature, the other party (in this case IAM) needs to have the public key provided by the authorization server.
This JWT generated by the external authorization server also includes, in its body, a claim called “exp” containing the timestamp of when this token expires, and another claim called “iss” containing the address of the authorization server.
Therefore, IAM needs to verify the JWT signature with the authorization server’s public key and check the expiration timestamp contained in the “exp” claim inside the JWT before forwarding the request to IRIS.
In order to configure this in IAM, let’s start by adding a plugin called “JWT” to our “SampleIRISService”. To do so, go to the Services page in IAM and copy the id of the “SampleIRISService”; we are going to use it later.
After that, go to Plugins, click the “New Plugin” button, locate the “JWT” plugin and click Enable.
In the following page, paste the “SampleIRISService” id in the “service_id” field and select the box “exp” in the “config.claims_to_verify” parameter.
Note the value of the parameter “config.key_claim_name” is “iss”. We are going to use that later.
Then, hit the “Create” button.
Once that is done, go to the “Consumers” section in the left menu and click on our previously created “ClientApp”. Go to the “Credentials” tab and click the “New JWT Credential” button.
In the following page, select the algorithm used to sign the JWT (in this case RS256) and paste the public key in the field “rsa_public_key” (this is the public key provided to you by the authorization server in PEM format).
In the “key” field, you need to insert the contents of the JWT claim that you entered in the field “config.key_claim_name” when adding the JWT plugin. Therefore, in this case, I need to insert the content of the iss claim of my JWT, which, in my case, is the address of the authorization server.
After that, click on “Create” button.
Hint: For debugging purposes, there is an online tool to decode JWT so you can check the claims and its values and verify its signature by pasting the public key. Here is the link of this online tool: https://jwt.io/#debugger
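You can also inspect the claims locally. The sketch below, using only the Python standard library, decodes a JWT payload without verifying its signature (signature verification of RS256 needs the public key and a crypto library). The token built here is a toy, unsigned example just to show the “iss” and “exp” claims; it is not one issued by any real authorization server.

```python
import base64
import json
import time

def decode_jwt_payload(jwt):
    """Decode (WITHOUT verifying!) the payload of a JWT.
    A JWT is three base64url-encoded segments: header.payload.signature."""
    payload_b64 = jwt.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Build a toy JWT (unsigned, hypothetical issuer) to demonstrate decoding.
header = base64.urlsafe_b64encode(
    json.dumps({"alg": "RS256", "typ": "JWT"}).encode()).rstrip(b"=")
claims = {"iss": "https://authorizationserver:5001",
          "exp": int(time.time()) + 3600}
payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).rstrip(b"=")
toy_jwt = b".".join([header, payload, b"sig"]).decode()

decoded = decode_jwt_payload(toy_jwt)
print(decoded["iss"])  # https://authorizationserver:5001
```

This is handy for checking that the “iss” value you enter in the “key” field of the JWT credential really matches what the authorization server puts in the token.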
Now, with the JWT plugin added, it is no longer possible to send the request with no authentication. As you can see below, a simple GET request, with no authentication, to the URL
http://iamhost:8000/event/1
returns an unauthorized message together with the status code “401 Unauthorized”.
In order to get the results from IRIS, we need to add the JWT to the request.
Therefore, first we need to request the JWT from the authorization server. The custom authorization server that we are using here returns a JWT if a POST request is made, together with some key-value pairs in the body including user and client information, to the following URL:
https://authorizationserver:5001/auth
Here is what this request and its response looks like:
Then, you can add the JWT obtained from the response above in the Authorization header as a Bearer Token and send a GET request to the same URL previously used:
http://iamhost:8000/event/1
Or you can also add it as a querystring parameter, with the querystring key being the value specified in the field “config.uri_param_names” when adding the JWT plugin which, in this case, is “jwt”:
Finally, there is also the option to include JWT in the request as a cookie, if any name is entered in the field “config.cookie_names”.
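The three ways of attaching the JWT can be sketched as plain request-building code. The token value below is a hypothetical placeholder, and the querystring/cookie names must match whatever you configured in “config.uri_param_names” and “config.cookie_names”:

```python
from urllib.parse import urlencode

BASE = "http://iamhost:8000/event/1"
jwt = "eyJ...token..."  # hypothetical placeholder for the real JWT

# 1) Authorization header as a Bearer token
headers = {"Authorization": f"Bearer {jwt}"}

# 2) Querystring parameter; the key must match config.uri_param_names ("jwt")
url_with_qs = f"{BASE}?{urlencode({'jwt': jwt})}"

# 3) Cookie; the name must match one entered in config.cookie_names
cookies = {"jwt": jwt}

print(url_with_qs)  # http://iamhost:8000/event/1?jwt=eyJ...token...
```

Any one of the three forms is enough; the plugin checks the header, the querystring and the configured cookies for a token.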
Continue reading to the third and last part of this series to understand the configurations needed for IAM to generate an access token and validate it, together with some important final considerations.
Article
Vinicius Maranhao Ribeiro de Castro · Apr 2, 2020
In this 3-part series of articles, we show how you can use IAM to simply add security, according to OAuth 2.0 standards, to a previously unauthenticated service deployed in IRIS.
The first part provided some OAuth 2.0 background, together with some initial IRIS and IAM definitions and configurations, to facilitate understanding of the whole process of securing your services.
The second part discussed and showed in detail the steps needed to configure IAM to validate the access token present in the incoming request and forward the request to the backend if the validation succeeds.
This last part of the series will discuss and demonstrate the configurations needed for IAM to generate an access token (acting as an authorization server) and validate it, together with some important final considerations.
If you want to try IAM, please contact your InterSystems Sales Representative.
Scenario 2: IAM as an authorization server and access token validator
In this scenario, unlike the first one, we are going to use a plugin called “OAuth 2.0 Authentication”.
In order to use IAM as the authorization server in this Resource Owner Password Credentials flow, the username and password must be authenticated by the client application. The request to get the access token from IAM should be made only if the authentication is successful.
Let’s start by adding it to our “SampleIRISService”. As you can see in the screenshot below, we have some different fields to fill in to configure this plugin.
First of all, we are going to paste the id of our “SampleIRISService” into the field “service_id” to enable this plugin to our service.
In the field “config.auth_header_name” we are going to specify the header name that will carry the authorization token. In this case, I’ll leave the default value of “authorization”.
The “OAuth 2.0 Authentication” plugin supports the Authorization Code Grant, Client Credentials, Implicit Grant or Resource Owner Password Credentials Grant OAuth 2.0 flows. As we are using the Resource Owner Password Credentials flow in this article, we will check the box “config.enable_password_grant”.
In the “config.provision_key” field, enter any string to be used as the provision key. This value will be used to request an access token from IAM.
In this case, I left all the other fields with the default value. You can check the full reference for each field in the plugin documentation available here.
Here is what the plugin configuration looks like in the end:
Once the plugin is created, we need to create the credentials for our “ClientApp” consumer.
To do so, go to “Consumers” on the left menu and click on “ClientApp”. Next, click on “Credentials” tab and then on “New OAuth 2.0 Application” button.
On the following page, enter any name to identify your application in the “name” field, define a client id and a client secret in the “client_id” and “client_secret” fields respectively and, lastly, enter the URL in your application where users will be sent after authorization in the “redirect_uri” field. Then, click “Create”.
Now, you are ready to send requests.
The first request that we need to make is to obtain the access token from IAM. The “OAuth 2.0 Authentication” plugin automatically creates an endpoint by appending the path “/oauth2/token” to the already-created route.
Note: Make sure that you use HTTPS protocol and IAM’s proxy port listening to TLS/SSL requests (the default port is 8443). This is an OAuth 2.0 requirement.
Therefore, in this case, we would need to make a POST request to the URL:
https://iamhost:8443/event/oauth2/token
In the request body, you should include the following JSON:
{
"client_id": "clientid",
"client_secret": "clientsecret",
"grant_type": "password",
"provision_key": "provisionkey",
"authenticated_userid": "1"
}
As you can see, this JSON contains values defined both during “OAuth 2.0 Authentication” plugin creation, such as “grant_type” and “provision_key”, and during Consumer’s credentials creation, such as “client_id” and “client_secret”.
The parameter “authenticated_userid” should also be added by the client application when the username and password provided are successfully authenticated. Its value should be used to uniquely identify the authenticated user.
The request and its respective response should look like this:
With that, we can now make a request to get the event data including the “access_token” value from the response above as a “Bearer Token” in a GET request to the URL
https://iamhost:8443/event/1
If your access token expires, you can generate a new access token using the refresh token that you received together with the expired access token by making a POST request to the same endpoint used to get an access token, with a slightly different body:
{
"client_id": "clientid",
"client_secret": "clientsecret",
"grant_type": "refresh_token",
"refresh_token": "E50m6Yd9xWy6lybgo3DOvu5ktZTjzkwF"
}
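The two request bodies, for the initial token and for the refresh, differ only in the grant-specific fields. A small sketch of helpers that build them (the values used below are the placeholder values from this article, not real credentials):

```python
def password_grant_body(client_id, client_secret, provision_key, user_id):
    """Body for the initial token request (Resource Owner Password
    Credentials grant, with IAM's provision key)."""
    return {
        "client_id": client_id,
        "client_secret": client_secret,
        "grant_type": "password",
        "provision_key": provision_key,
        "authenticated_userid": user_id,
    }

def refresh_grant_body(client_id, client_secret, refresh_token):
    """Body for exchanging a refresh token for a new access token."""
    return {
        "client_id": client_id,
        "client_secret": client_secret,
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
    }

body = refresh_grant_body("clientid", "clientsecret",
                          "E50m6Yd9xWy6lybgo3DOvu5ktZTjzkwF")
print(body["grant_type"])  # refresh_token
```

Either body is then sent as JSON in a POST to https://iamhost:8443/event/oauth2/token, as described above.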
The request and its respective response should look like this:
One interesting feature of the “OAuth 2.0 Authentication” plugin is the ability to view and invalidate access tokens.
To list the tokens, send a GET request to the following endpoint of IAM’s Admin API:
https://iamhost:8444/{workspace_name}/oauth2_tokens
where {workspace_name} is the name of the IAM workspace used. Make sure to enter the necessary credentials to call IAM’s Admin API if you have enabled RBAC.
Note that “credential_id” is the id of the OAuth application that we created inside the ClientApp consumer (in this case called “SampleApp”), and “service_id” is the id of our “SampleIRISService”, to which this plugin is applied.
To invalidate a token, you can send a DELETE request to the following endpoint
https://iamhost:8444/Sample/oauth2_tokens/{token_id}
where {token_id} is the id of the token to be invalidated.
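As a small illustration, the list and invalidate endpoints can be composed like this (a sketch; the host, port and workspace name are the example values used in this article):

```python
def oauth2_tokens_endpoint(host, workspace, token_id=None):
    """Build the IAM Admin API endpoint for listing OAuth2 tokens
    (GET) or, with a token_id, for invalidating one (DELETE)."""
    url = f"https://{host}:8444/{workspace}/oauth2_tokens"
    return f"{url}/{token_id}" if token_id else url

print(oauth2_tokens_endpoint("iamhost", "Sample"))
# https://iamhost:8444/Sample/oauth2_tokens
print(oauth2_tokens_endpoint("iamhost", "Sample", "some-token-id"))
# https://iamhost:8444/Sample/oauth2_tokens/some-token-id
```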
If we then send a GET request containing the invalidated token as a Bearer Token to the URL below, we get a message saying that the token is invalid or expired:
https://iamhost:8443/event/1
Final Considerations
This article demonstrated how you can use IAM to add OAuth 2.0 authentication to an unauthenticated service deployed in IRIS. You should keep in mind that the service itself will continue to be unauthenticated in IRIS; therefore, anyone who calls the IRIS service endpoint directly, bypassing the IAM layer, will be able to see the information without any authentication. For that reason, it is important to have security rules at the network level to prevent unwanted requests from bypassing the IAM layer.
You can learn more about IAM here.
If you want to try IAM, please contact your InterSystems Sales Representative. Great article, Vinicius! It surely helps to get an idea of how OAuth works on IAM.
I have been trying to set up the OAuth plugin with implicit grant, but every time I send a POST message I get "invalid grant type", even when sending the grant type as implicit. Do we have some material or documentation to study on how we should do it?
Article
Evgeny Shvarov · May 25, 2020
Hi ObjectScript developers!
InterSystems ObjectScript is perhaps the best language on the planet to deal with globals - and it is an interpretable language.
Yes, it has a compiler. But even the compiler will compile some lines of ObjectScript that then fire as bugs at runtime.
There are some techniques to avoid that, such as unit testing, coding guidelines and your coding experience, of course ;)
Here I want to present yet another approach to reducing the number of errors in your ObjectScript runtime and enforcing coding guidelines: the ObjectScript Quality tool developed by Lite Solutions, an InterSystems solution partner.
See the details below.
We asked Lite Solutions to set up the analysis for the following 17 rules, which we find to be the most common "misses" of the compiler, and which can be considered possible bugs and "code smells": violations of coding guidelines and deprecated functions.
You can check how these rules work against this ObjectScript class (probably the worst ObjectScript class ever), where each method represents a certain problem the tool recognizes. And here is the analysis of this artificially flawed class.
Here you can check other projects which are already being analyzed by the ObjectScriptQ tool.
How can you add the ObjectScript analysis to your project?
This is very easy. Lite Solutions provides free-of-charge analysis for all open-source GitHub repositories with ObjectScript. Introduce this workflow.yml file in the
.github/workflows
folder of your public GitHub repository. After that, every push to the repository will trigger a new analysis, and you will get a freshly generated report on the issues found in the repository.
This tool can work with private repositories too - you can examine the possible options and pricing plans on ObjectScriptQ site.
Collaboration and Evolution
If you find some false positives, or if you want to add a new rule to make ObjectScript better, submit an issue to the Lite Solutions repository or discuss it here on the Developer Community, e.g. right in this post.
Happy coding, and keep your ObjectScript cleaner and healthier!
Announcement
Anastasia Dyubaylo · Mar 13, 2020
Hi Community!
New "Coding Talk" video is already on InterSystems Developers YouTube:
⏯ How to Enable Docker and VSCode to Your InterSystems IRIS Solution
In this video, presented by @Evgeny Shvarov, you will learn how to add an InterSystems IRIS Docker and VSCode environment to your current repository with InterSystems ObjectScript, using the iris-docker-dev-kit project on Open Exchange.
➡️ iris-docker-dev-kit – a set of files to facilitate development with InterSystems IRIS using Docker and VSCode. It gives you the option to develop your InterSystems ObjectScript solution in IRIS Community Edition or IRIS Community Edition for Health on your laptop using VSCode ObjectScript extension.
And...
You're very welcome to watch all Coding Talks in a dedicated "Coding Talks" playlist on our InterSystems Developers YouTube Channel.
Stay tuned! 👍🏼
Announcement
Anastasia Dyubaylo · Apr 20, 2020
Hi Community,
New "Coding Talk" video was specially recorded by @Evgeny.Shvarov for the second IRIS Programming Contest:
⏯ Creating CRUD REST API for InterSystems IRIS in 5 minutes
In this video, presented by @Evgeny Shvarov, you'll learn how to create your own basic CRUD API for InterSystems IRIS using the GitHub template and expose it with Open API spec.
➡️ Check out the app used in this demo: objectscript-rest-docker-template
And...
You're very welcome to join the second IRIS Programming Contest! Show your best coding skills and win cool prizes!
Stay tuned! 👍🏼
Announcement
Thomas Carroll · Feb 14, 2019
Breaking news!
InterSystems just announced the availability of the InterSystems IRIS for Health™ Data Platform across the Amazon Web Services, Google Cloud, and Microsoft Azure marketplaces.
With access to InterSystems unified data platform on all three major cloud providers, developers and customers have flexibility to rapidly build and scale the digital applications driving the future of care on the platform of their choice.
To learn more please follow this link.
Announcement
Anastasia Dyubaylo · May 13, 2019
Hey Community!
The latest webinar, recorded by InterSystems Sales Engineers @Sergey Lukyanchikov and @Eduard Lebedyuk, is already on InterSystems Developers YouTube! Please welcome:
"Machine Learning Toolkit (Python, ObjectScript, Interoperability, Analytics) for InterSystems IRIS"
Big applause to these speakers, thank you guys!
Want more?
Please find all the details in this post.
Enjoy watching the webinar!

Please don't forget to check out additional materials for this webinar:
- Python Gateway Part I: Introduction
- Python Gateway Part II: Installation

In addition, try out the apps on Open Exchange Marketplace:
- PythonGateway App
- RGateway App

Stay tuned!

Great webinar, machine learning is high in the market; knowledge is always good.
Announcement
Evgeny Shvarov · Sep 27, 2019
Hi Developers!
Just want to share the information with you that we support TechCrunch Disrupt Hackathon 2019 this year!
It will take place from 2-4 of October in San Francisco, CA.
We introduced special InterSystems prizes for participants whose solutions use InterSystems IRIS ($4,000) and InterSystems IRIS for Health ($4,000).
Learn more here.
If you happen to participate we wish you luck and hope you'll leverage InterSystems IRIS data platform functionality to win Disrupt Hackathon 2019!
Announcement
Anastasia Dyubaylo · Feb 5, 2021
Hi Developers,
Please welcome the new video on InterSystems Developers YouTube:
⏯ Building REST API with InterSystems IRIS Docker Container in 5 Minutes
See the process of creating a new REST API back end for an application using development tools like GitHub, Docker, VS Code, and the ZPM package manager in InterSystems IRIS data platform. Learn about using OpenAPI specifications for ground-up and API-first development approaches.
🗣 Presenter: @Sergei.Shutov, Principal Applications Developer, InterSystems
Useful links:
ObjectScript REST Docker template
OpenAPI Examples
AppS.REST framework
Enjoy watching this video! 👍🏼
Article
Tomohiro Iwamoto · Dec 24, 2020
## About this article:
In InterSystems IRIS, the default form of access to the management portal is plain HTTP, which means that if the client is in the office and the server is in the cloud, many users will want to encrypt their traffic in some way.
Thus, we would like to show you some ways to encrypt your traffic to and from the IRIS management portal (or various REST services) running on AWS.
> This article uses the IRIS built-in apache server for access. It should not be used for benchmarking purposes or as a method of access from production applications.
> It is intended to encrypt access for development, operation verification, and quick administration by a small number of people.
The best solution would be to prepare a domain name and an SSL server certificate issued by a public certificate authority. However, for the use cases mentioned above, that would not be easy in terms of cost.
Therefore, the following certificates are assumed to be used:
- Self-signed (a Japanese so-called “ore ore” certificate)
- A certificate issued by a self-built certificate authority (a Japanese so-called “ore ore” certificate authority)
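For reference, a self-signed certificate of the first kind can be generated with a single openssl command. This is only a minimal sketch; the CN below is the public hostname used throughout this article, and the setup.sh script referenced later performs this step for you.

```shell
# Minimal sketch: generate a self-signed ("ore ore") certificate with openssl.
# -nodes leaves the private key unencrypted; adjust the CN to your own host.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -keyout server.key -out server.crt \
    -subj "/CN=ec2-54-250-169-xxx.ap-northeast-1.compute.amazonaws.com"
```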
We additionally assume the following running environment:
| The PC environment ||
| ------------- | -------------:|
| O/S | Windows 10 |
| Browser | Chrome/FireFox/Edge |
| IDE | vscode+ObjectScript Extension |
| Unused port number in the local PC | 8888 |
| The secret key of the key pair used during the creation of the EC2 Instance | aws-secret.pem |
| AWS Environment ||
| ------------- | -------------:|
| IRIS host's public hostname | ec2-54-250-169-xxx.ap-northeast-1.compute.amazonaws.com |
| IRIS webserver port number | 52773 |
| O/S | Ubuntu 18.04LTS |
| O/S User name | ubuntu |
* * *
## Direct Access
### 1) Using Port Forwarding
This is the easiest way.
In the security group, the port for SSH (22) must be allowed inbound from the Internet.
C:\Users\xxxx>ssh -i aws-secret.pem -L 8888:localhost:52773 ubuntu@ec2-54-250-169-xxx.ap-northeast-1.compute.amazonaws.com
While running this command from the PC, you can access the IRIS host through the link below.
http://localhost:8888/csp/sys/%25CSP.Portal.Home.zen
Since you are logged in with ssh, you can use it for terminal operations such as those performed during the development process (starting and stopping IRIS, starting an IRIS session, etc.).
This method also works for the super-server port (51773), so it can encrypt the communication with Studio as well.
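As a sketch, both ports can be forwarded in a single SSH session; the local port numbers here are arbitrary choices, not prescribed by IRIS.

```shell
# Forward the web server port (52773) and the super-server port (51773)
# through one SSH session; Studio can then connect to localhost:51773.
ssh -i aws-secret.pem \
    -L 8888:localhost:52773 \
    -L 51773:localhost:51773 \
    ubuntu@ec2-54-250-169-xxx.ap-northeast-1.compute.amazonaws.com
```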
Note: For Windows, if you do not place the private key for ssh (aws-secret.pem) in %USERPROFILE%\, you will get an error.
C:\Users\xxxx>dir %USERPROFILE%\aws-secret.pem
2020/07/14 17:10 1,692 aws-secret.pem
1 File(s) 1,692 bytes
0 Dir(s) 100,576,694,272 bytes free
The vscode settings (settings.json) should look like the one below.
{
    "objectscript.conn": {
        "host": "localhost",
        "https": false,
        "port": 8888,
        "ns": "USER",
        "username": "xxx",
        "password": "xxx",
        "active": true
    }
}
### 2) Using a Reverse Proxy with SSL configuration
In this case, you deploy apache or Nginx with a self-signed certificate in a reverse proxy configuration and access the IRIS host through it.
In the security group, a port for HTTPS (443) must be allowed inbound from the Internet.
A script to configure apache and Nginx is available in the [link](https://github.com/IRISMeister/apache-ssl.git). This will allow you to access the IRIS host from your browser through the link below.
https://ec2-54-250-169-xxx.ap-northeast-1.compute.amazonaws.com/csp/sys/%25CSP.Portal.Home.zen
The vscode settings (settings.json) should look like the following:
{
    "objectscript.conn": {
        "host": "ec2-54-250-169-xxx.ap-northeast-1.compute.amazonaws.com",
        "https": true,
        "port": 443,
        "ns": "USER",
        "username": "xxx",
        "password": "xxx",
        "active": true
    }
}
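To quickly confirm that the proxy is serving over SSL, you can fetch the portal page with curl. The -k flag skips certificate verification, which is necessary here because the certificate is self-signed (alternatively, pass your own CA file with --cacert).

```shell
# Print only the HTTP status code; a 200 (or a redirect to the login page)
# means the SSL reverse proxy is reaching IRIS.
curl -sk -o /dev/null -w '%{http_code}\n' \
    https://ec2-54-250-169-xxx.ap-northeast-1.compute.amazonaws.com/csp/sys/%25CSP.Portal.Home.zen
```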
* * *
## Via Bastion host
Even if it is only for verification purposes, you may be uncomfortable exposing the IRIS host's SSH or HTTPS ports (and the user data and code behind them) directly to the Internet. In that case, you can use a bastion host.
The security group must allow inbound TCP traffic between the Bastion Host and the IRIS host.
It is assumed that the execution environment is as follows:
| AWS Environment ||
| ------------- | -------------:|
| IRIS host's public hostname | none |
| Internal IP address of the IRIS host | 10.0.1.81 |
| Public hostname of the Bastion host | ec2-54-250-169-yyy.ap-northeast-1.compute.amazonaws.com |
### 1) Using Port Forwarding
In the security group, the port for SSH (22) must be allowed inbound from the Internet.
The following commands are executed against the Bastion host:
C:\Users\xxxx>ssh -i aws-secret.pem -L 8888:10.0.1.81:52773 ubuntu@ec2-54-250-169-yyy.ap-northeast-1.compute.amazonaws.com
The rest is the same as in the direct-access case.
### 2) Using a Reverse Proxy with SSL configuration
In the security group, the HTTPS port (443) must be allowed inbound from the Internet.
Do the same task (deploy apache/Nginx) on the Bastion host.
Change the [destination URL](https://github.com/IRISMeister/apache-ssl/blob/master/apache-conf/other/iris.conf) to the internal IP address of the IRIS host: 10.0.1.81.
ProxyRequests Off
ProxyPass / http://10.0.1.81:52773/
ProxyPassReverse / http://10.0.1.81:52773/
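After editing iris.conf, the configuration can be validated and reloaded. This sketch assumes the Debian/Ubuntu apache layout matching the Ubuntu 18.04 host used in this article.

```shell
# Check the configuration syntax first, then reload apache
# without dropping existing connections.
sudo apachectl configtest && sudo systemctl reload apache2
```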
The rest is the same as in the direct-access case, except that the URL now uses the bastion host's name.
* * *
## Via AWS/ALB
It is not something people typically do, but you can attach a self-signed certificate to an AWS ALB. In this case, SSL is terminated at the ALB, saving you the trouble of preparing a separate SSL-enabled apache.
Since creating an ALB requires at least two AZs, I have used mirrored DMs created by ICM (InterSystems Cloud Manager).
(For more information about ICM, see [How to Configure an IRIS Cluster with ICM](https://jp.community.intersystems.com/post/icm%E3%82%92%E5%88%A9%E7%94%A8%E3%81%97%E3%81%A6iris%E3%82%AF%E3%83%A9%E3%82%B9%E3%82%BF%E3%83%BC%E3%82%92%E6%A7%8B%E6%88%90%E3%81%99%E3%82%8B%E6%96%B9%E6%B3%95))
default.json (excerpt)
{
    "Zone": "ap-northeast-1a,ap-northeast-1c",
    "Mirror": "true",
}
definitions.json
[
    {
        "Role": "DM",
        "Count": "2",
        "MirrorMap": "primary,backup",
        "ZoneMap": "0,1"
    }
]

Use the certificate files you created in [setup.sh](https://github.com/IRISMeister/apache-ssl/blob/master/setup.sh).
Import these files to ACM (AWS Certificate Manager).
Certificate body: contents of server.crt
Certificate private key: contents of server.key
Certificate chain: contents of inca.pem
Tip) If you have access to the awscli, uncomment the last line of [setup.sh](https://github.com/IRISMeister/apache-ssl/blob/master/setup.sh), and the certificate will be registered automatically.
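For reference, the import in the tip above corresponds to the following AWS CLI call; this sketch assumes your credentials and default region are already configured.

```shell
# Import the self-signed certificate, its private key, and the CA chain
# into ACM so the ALB can use them.
aws acm import-certificate \
    --certificate fileb://server.crt \
    --private-key fileb://server.key \
    --certificate-chain fileb://inca.pem
```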
Create a new ALB with the following settings:
Step 1: Configure Load Balancer
Name: anything
Scheme: internet-facing
IP address type: ipv4
Listeners: https (port: 443)
Availability Zones: ap-northeast-1a,ap-northeast-1c
Step 2: Configuration of Security Settings
Default certificate: Choose a certificate from ACM
Certificate Name: (Select the certificate you have just imported into ACM)
Step 3: Configure Security Groups
Security group settings: Create a new security group that allows only HTTPS (port 443).
Step 4: Configure Routing
Target group: New target group
Name: anything
Target Type: Instance
Protocol: HTTP
Port: 52773
Health Checks
Protocol: HTTP
Path: /csp/bin/Mirror_status.cxw
Health Check Advanced Settings
Port: Override → 52773
Step 5: Register Targets
"Add to registered" the EC2 Instance you've just provisioned.
After the ALB's status becomes active, you can access it over HTTPS using the ALB's DNS name.
> https://[ALB DNS Name]/csp/sys/exp/%25CSP.UI.Portal.SQL.Home.zen
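As a final sanity check, the health-check path configured above can be fetched through the ALB. Replace the placeholder with your ALB's DNS name; the -k flag is needed because the certificate is self-signed.

```shell
# Mirror_status.cxw returns 200 on the mirror primary, so a 200 here
# means the ALB is routing requests to a healthy instance.
curl -sk -o /dev/null -w '%{http_code}\n' \
    "https://[ALB DNS Name]/csp/bin/Mirror_status.cxw"
```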