Article
Mark Bolinsky · Mar 3, 2020

InterSystems IRIS and Intel Optane DC Persistent Memory

InterSystems and Intel recently conducted a series of benchmarks combining InterSystems IRIS with 2nd Generation Intel® Xeon® Scalable Processors, also known as "Cascade Lake", and Intel® Optane™ DC Persistent Memory (DCPMM). The goals of these benchmarks are to demonstrate the performance and scalability capabilities of InterSystems IRIS with Intel's latest server technologies in various workload settings and server configurations. Along with various benchmark results, three different use cases of Intel DCPMM with InterSystems IRIS are provided in this report.

Overview

Two separate types of workloads are used to demonstrate performance and scaling: a read-intensive workload and a write-intensive workload. These are demonstrated separately to show the impact of Intel DCPMM on two distinct use cases: increasing database cache efficiency in a read-intensive workload, and increasing write throughput for transaction journals in a write-intensive workload. In both scenarios, significant throughput, scalability, and performance gains for InterSystems IRIS are achieved.

The read-intensive workload leveraged a 4-socket server and massive long-running analytical queries across a dataset of approximately 1.2TB of total data. With DCPMM in "Memory Mode", benchmark comparisons yielded a significant reduction in elapsed runtime: approximately six times faster when compared to a previous-generation Intel E7v4 series processor with less memory. When comparing like-for-like memory sizes between the E7v4 and the latest server with DCPMM, there was a 20% improvement. This was due both to the increased InterSystems IRIS database cache capacity afforded by DCPMM and to the latest Intel processor architecture.

The write-intensive workload leverages a 2-socket server and the InterSystems HL7 messaging benchmark, which consists of numerous inbound interfaces; each inbound message undergoes several transformations and produces four outbound messages.
One of the critical components in sustaining high throughput is the message durability guarantee of IRIS for Health, and transaction journal write performance is crucial in that operation. With DCPMM in "App Direct" mode presenting a DAX XFS file system for transaction journals, this benchmark demonstrated a 60% increase in message throughput.

To summarize the test results and configurations: DCPMM offers significant throughput gains when used in the proper InterSystems IRIS setting and workload. The high-level benefits are increased database cache efficiency and reduced disk IO block reads in read-intensive workloads, and increased write throughput for journals in write-intensive workloads. In addition, Cascade Lake based servers with DCPMM provide an excellent upgrade path for those looking to refresh older hardware and improve performance and scaling. InterSystems technology architects are available to help with those discussions and provide advice on suggested configurations for your existing workloads.

READ-INTENSIVE WORKLOAD BENCHMARK

For the read-intensive workload, we used an analytical query benchmark comparing an E7v4 (Broadwell) with 512GiB and 2TiB database cache sizes against the latest 2nd Generation Intel® Xeon® Scalable Processors (Cascade Lake) with 1TB and 2TB database cache sizes using Intel® Optane™ DC Persistent Memory (DCPMM). We ran several workloads with varying global buffer sizes to show the impact and performance gain of larger caching. For each configuration iteration we ran a COLD and a WARM run. COLD is where the database cache was not pre-populated with any data. WARM is where the database cache had already been active and populated with data (or at least as much as it could hold) to reduce physical reads from disk.

Hardware Configuration

We compared an older 4-socket E7v4 (aka Broadwell) host to a 4-socket Cascade Lake server with DCPMM.
This comparison was chosen because it would demonstrate performance gains for existing customers looking for a hardware refresh along with using InterSystems IRIS. In all tests, the same version of InterSystems IRIS was used so that any software optimizations between versions were not a factor. All servers had the same storage on the same storage array so that disk performance wasn't a factor in the comparison. The working set is a 1.2TB database. The hardware configurations are shown in Figure-1 with the comparison between each of the 4-socket configurations:

Figure-1: Hardware configurations

Server #1 Configuration:
- Processors: 4 x E7-8890 v4 @ 2.5GHz
- Memory: 2TiB DRAM
- Storage: 16Gbps FC all-flash SAN @ 2TiB

Server #2 Configuration:
- Processors: 4 x Platinum 8280L @ 2.6GHz
- Memory: 3TiB DCPMM + 768GiB DRAM
- Storage: 16Gbps FC all-flash SAN @ TiB
- DCPMM: Memory Mode only

Benchmark Results and Conclusions

There is a significant reduction in elapsed runtime (approximately 6x) when comparing the 512GiB buffer pool to either the 1TiB or 2TiB DCPMM buffer pool sizes. In addition, comparing the 2TiB E7v4 DRAM and 2TiB Cascade Lake DCPMM configurations showed a ~20% improvement as well. This 20% gain is believed to be mostly attributable to the new processor architecture and the additional processor cores, given that the buffer pool sizes are the same. However, this is still significant in that the 4-socket Cascade Lake server tested had only 24 x 128GiB DCPMM installed, and can scale to 12TiB of DCPMM, which is about 4x the memory the E7v4 can support in the same 4-socket server footprint.

The following graphs in Figure-2 depict the comparison results. In both graphs, the y-axis is elapsed time (a lower number is better) comparing the results from the various configurations.

Figure-2: Elapsed time comparison of various configurations

WRITE-INTENSIVE WORKLOAD BENCHMARK

The workload in this benchmark was our HL7v2 messaging workload using all T4 type workloads.
The T4 workload used a routing engine to route separately modified messages to each of four outbound interfaces. On average, four segments of the inbound message were modified in each transformation (1-to-4 with four transforms). For each inbound message, four data transformations were executed, four messages were sent outbound, and five HL7 message objects were created in the database. Each system was configured with 128 inbound Business Services and 4800 messages sent to each inbound interface, for a total of 614,400 inbound messages and 2,457,600 outbound messages.

The measure of throughput in this benchmark workload is "messages per second". We also recorded the journal writes during the benchmark runs, because transaction journal throughput and latency are critical components in sustaining high throughput. They directly influence the message durability guarantees of IRIS for Health, and transaction journal write performance is crucial in that operation. When journal throughput suffers, application processes block on journal buffer availability.

Hardware Configuration

For the write-intensive workload, we decided to use a 2-socket server. This is a smaller configuration than our previous 4-socket configuration, with only 192GiB of DRAM and 1.5TiB of DCPMM. We compared the workload on Cascade Lake with DCPMM to the previous 1st Generation Intel® Xeon® Scalable Processors (Skylake) server. Both servers have locally attached 750GiB Intel® Optane™ SSD DC P4800X drives.
The hardware configurations are shown in Figure-3 with the comparison between each of the 2-socket configurations:

Figure-3: Write-intensive workload hardware configurations

Server #1 Configuration:
- Processors: 2 x Gold 6152 @ 2.1GHz
- Memory: 192GiB DRAM
- Storage: 2 x 750GiB P4800X Optane SSDs

Server #2 Configuration:
- Processors: 2 x Gold 6252 @ 2.1GHz
- Memory: 1.5TiB DCPMM + 192GiB DRAM
- Storage: 2 x 750GiB P4800X Optane SSDs
- DCPMM: Memory Mode & App Direct Modes

Benchmark Results and Conclusions

Test-1: This test ran the T4 workload described above on the Skylake server detailed as Server #1 Configuration in Figure-3. The Skylake server provided a sustained throughput of ~3355 inbound messages per second with a journal file write rate of 2010 journal writes/second.

Test-2: This test ran the same workload on the Cascade Lake server detailed as Server #2 Configuration in Figure-3, specifically with DCPMM in Memory Mode. This demonstrated a significant improvement in sustained throughput: ~4684 inbound messages per second with a journal file write rate of 2400 journal writes/second, a 39% increase compared to Test-1.

Test-3: This test ran the same workload on the Cascade Lake server detailed as Server #2 Configuration in Figure-3, this time using DCPMM in App Direct Mode but without actually configuring DCPMM to do anything. The purpose was to gauge the performance and throughput of Cascade Lake with DRAM only against Cascade Lake with DCPMM + DRAM. The results were not surprising: there was a gain in throughput without DCPMM being used, albeit a relatively small one. Sustained throughput improved to ~4845 inbound messages per second with a journal file write rate of 2540 journal writes/second. This is expected behavior because DCPMM has higher latency than DRAM, and with a massive influx of updates there is a performance penalty.
Put another way, there is a <5% reduction in write ingestion throughput when using DCPMM in Memory Mode on the exact same server. Additionally, comparing Skylake to Cascade Lake (DRAM only) showed a 44% increase over the Skylake server in Test-1.

Test-4: This test ran the same workload on the Cascade Lake server detailed as Server #2 Configuration in Figure-3, this time using DCPMM in App Direct Mode mounted as a DAX XFS file system for the journals. This yielded even more throughput: 5399 inbound messages per second with a journal file write rate of 2630 writes/second. This demonstrated that App Direct mode is the better use of DCPMM for this type of workload. Compared to the initial Skylake configuration in Test-1, this is a 60% increase in throughput.

InterSystems IRIS Recommended Intel DCPMM Use Cases

There are several use cases and configurations for which InterSystems IRIS will benefit from using Intel® Optane™ DC Persistent Memory.

Memory Mode

This is ideal for massive database caches for either a single InterSystems IRIS deployment or a large InterSystems IRIS sharded cluster where you want to have much more (or all!) of your database cached in memory. You will want to adhere to a maximum 8:1 ratio of DCPMM to DRAM, as this is important for the "hot memory" to stay in DRAM acting as an L4 cache layer. This is especially important for some shared internal IRIS memory structures such as seize resources and other memory cache lines.

App Direct Mode (DAX XFS) – Journal Disk Device

This is ideal for using DCPMM as a disk device for transaction journal files. DCPMM is presented to Linux as a mounted DAX-enabled XFS file system. The benefit of DAX XFS is that it bypasses the PCIe bus overhead, allowing direct memory access from the file system.
As demonstrated in the HL7v2 benchmark results, the write latency benefits significantly increased HL7 messaging throughput. Additionally, the storage is persistent and durable across reboots and power cycles, just like a traditional disk device.

App Direct Mode (DAX XFS) – Journal + Write Image Journal (WIJ) Disk Device

This use case extends App Direct mode to both the transaction journals and the write image journal (WIJ). Both of these files are write-intensive and will certainly benefit from ultra-low latency and persistence.

Dual Mode: Memory + App Direct Modes

When using DCPMM in dual mode, the benefits of DCPMM are extended to allow for both massive database caches and ultra-low latency for the transaction journal and/or write image journal devices. In this use case, DCPMM appears to the operating system both as a mounted XFS file system and as RAM. This is achieved by allocating a percentage of DCPMM as DAX XFS, with the remainder allocated in Memory Mode. As mentioned previously, the installed DRAM will operate as an L4-like cache to the processors.

"Quasi" Dual Mode

A variation on dual mode suits concurrent transactional and analytic workloads (also known as HTAP workloads), where a high rate of inbound transactions/updates for the OLTP side runs alongside heavy analytical or massive querying needs. Here, each InterSystems IRIS node type within an InterSystems IRIS sharded cluster operates with a different DCPMM mode: additional InterSystems IRIS compute nodes handle the massive querying/analytics workload with DCPMM in Memory Mode, benefiting from a massive database cache in the global buffers, while the data nodes run either in dual mode or in App Direct (DAX XFS) for the transactional workloads.
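To make the App Direct and dual-mode descriptions above concrete, here is a provisioning sketch using the standard `ipmctl` and `ndctl` utilities. This is illustrative only: the region and device names (`region0`, `/dev/pmem0`), the 75/25 split, and the mount point `/iris/journals` are assumptions that will differ per system, and the goal change requires a reboot.

```bash
# Hypothetical sketch; device names, split, and paths are assumptions.
# 1. Split DCPMM capacity: 75% Memory Mode, remainder App Direct ("dual mode").
ipmctl create -goal MemoryMode=75 PersistentMemoryType=AppDirect
# (reboot for the new goal to take effect)

# 2. Create an fsdax namespace on the resulting App Direct region:
ndctl create-namespace --mode=fsdax --region=region0

# 3. Build XFS on the pmem device and mount it with DAX enabled:
mkfs.xfs /dev/pmem0
mount -o dax /dev/pmem0 /iris/journals

# 4. Point the IRIS journal directories at the DAX-mounted path
#    (Management Portal: System Administration > Configuration > Journal Settings).
```

For pure App Direct use (the journal-only use case), the same steps apply with `MemoryMode=0`.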
Conclusion

There are numerous infrastructure options available for InterSystems IRIS. The application, workload profile, and business needs drive the infrastructure requirements, and those technology and infrastructure choices influence the success, adoption, and importance of your applications to your business. InterSystems IRIS with 2nd Generation Intel® Xeon® Scalable Processors and Intel® Optane™ DC Persistent Memory provides groundbreaking levels of scaling and throughput for the InterSystems IRIS based applications that matter to your business. Benefits of InterSystems IRIS on Intel DCPMM capable servers include:

- Increased memory capacity, so that multi-terabyte databases can reside completely in the InterSystems IRIS or InterSystems IRIS for Health database cache with DCPMM in Memory Mode. Compared to reading from storage (disks), this can improve query response performance by up to six times with no code changes, thanks to InterSystems IRIS's proven memory caching capabilities that take advantage of system memory as it increases in size.
- Improved performance for high-rate data interoperability applications based on InterSystems IRIS and InterSystems IRIS for Health, such as HL7 transformations: as much as 60% higher throughput using the same processors, changing only the transaction journal disk from the fastest available NVMe drives to DCPMM in App Direct mode as a DAX XFS file system. Exploiting both memory-speed data transfers and data persistence is a significant benefit to InterSystems IRIS and InterSystems IRIS for Health.
- Augmented compute resources where needed for a given workload, whether read-intensive, write-intensive, or both, without over-allocating entire servers just for the sake of one resource component, with DCPMM in dual mode.

InterSystems Technology Architects are available to discuss hardware architectures ideal for your InterSystems IRIS based application.
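The write-workload gains quoted above can be sanity-checked from the sustained message rates reported in Tests 1 through 4. This is just a quick arithmetic check added for illustration, not part of the original benchmark:

```bash
# Sustained inbound messages/second: Skylake (Test-1) = 3355,
# Memory Mode (Test-2) = 4684, DRAM only (Test-3) = 4845,
# App Direct DAX XFS (Test-4) = 5399.
awk 'BEGIN {
  base = 3355
  printf "Test-2: +%d%%\n", (4684 / base - 1) * 100
  printf "Test-3: +%d%%\n", (4845 / base - 1) * 100
  printf "Test-4: +%d%%\n", (5399 / base - 1) * 100
}'
# Prints: Test-2: +39%, Test-3: +44%, Test-4: +60%
```

The truncated percentages match the 39%, 44%, and 60% figures cited in the article.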
Great article, Mark! I have a few notes and questions:

1. Here's a brief comparison of different storage categories: Intel® Optane™ DC Persistent Memory has read throughput of 6.8 GB/s and write throughput of 1.85 GB/s (source). Intel® Optane™ SSD has read throughput of 2.5 GB/s and write throughput of 2.2 GB/s (source). Modern DDR4 RAM has read throughput of ~25 GB/s. While I certainly see the appeal of DC Persistent Memory if we need more memory than RAM can provide, is it useful on a smaller scale? Say I have a few hundred gigabytes of indices I need to keep in the global buffer and be able to read-access fast. Would plain DDR4 RAM be better? Costs seem comparable, and a read throughput of ~25 GB/s seems considerably better.
2. What RAM was used in the Server #1 configuration?
3. Why are there different CPUs between servers?
4. The workload link does not work.

The 6252 supports DCPMM, while the 6152 does not. The 6252 can be used for both DCPMM and DRAM configurations.

Hi Eduard, thanks for your questions.

1. On a small scale I would stay with traditional DRAM. DCPMM becomes beneficial at >1TB of capacity.
2. That was DDR4 DRAM in both the read-intensive and write-intensive Server #1 configurations. In the read-intensive server configuration it was specifically DDR-2400, and in the write-intensive server configuration it was DDR-2600.
3. There are different CPUs in the read-intensive workload configurations because this testing is meant to demonstrate upgrade paths from older servers to new technologies and the scalability increases offered in that scenario. The write-intensive workload only used a different server in the first test, to compare the previous generation to the current generation with DCPMM. The three following results demonstrate the differences in performance within the same server, just with different DCPMM configurations.
4. Thanks. I will see what happened to the link and correct it.

Correct. The Gold 6252 series (aka "Cascade Lake") supports both DCPMM and DRAM.
However, keep in mind that when using DCPMM you need to have DRAM as well, and should adhere to at most an 8:1 ratio of DCPMM:DRAM.
Announcement
Anastasia Dyubaylo · Sep 3, 2019

[September 10, 2019] Boston FHIR @ InterSystems Meetup

Hi Community! We are super excited to announce the Boston FHIR @ InterSystems Meetup on the 10th of September at the InterSystems meeting space! There will be two talks with Q&A and networking. Doors open at 5:30pm; we should start the first talk around 6pm. We will have a short break between talks for announcements, including job opportunities. Please check the details below.

#1

We are in the middle of changes in healthcare technology that affect the strategies of companies and organizations across the globe, including many startups right here in Massachusetts. Micky Tripathi from the Massachusetts eHealth Collaborative is going to talk to us about the opportunities and consequences of API-based healthcare.

By Micky Tripathi, MAeHC

#2 FHIR Analytics

The establishment of FHIR as a new healthcare data format creates new opportunities and challenges. Health professionals would like to acquire patient data from Electronic Health Records (EHR) with FHIR, and use it for population health management and research. FHIR provides resources and foundations based on XML and JSON data structures. However, traditional analytic tools are difficult to use with these structures. We created a prototype application to ingest FHIR bundles and save the Patient and Observation resources as objects/tables in InterSystems IRIS for Health. Developers can then easily create derived "fact tables" that de-normalize these tables for exploration and analytics. We will demo this application and our analytics tools using the InterSystems IRIS for Health platform.

By Patrick Jamieson, M.D., Product Manager for InterSystems IRIS for Health, and Carmen Logue, Product Manager - Analytics and AI

So, remember!

Date and time: Tuesday, 10 September 2019, 5:30 pm to 7:30 pm
Venue: 1 Memorial Dr, Cambridge, MA 02142, USA
Event webpage: Boston FHIR @ InterSystems Meetup
Article
Evgeny Shvarov · Sep 6, 2019

Using Package Manager with InterSystems IRIS in Docker Container

Hi Developers!

InterSystems Package Manager (ZPM) is a great thing, but it is even better if you don't need to install it and can use it immediately. There are several ways to do this, and here is one approach: an IRIS container with ZPM built via dockerfile. I've prepared a repository with a few lines in the dockerfile which download and install the latest version of ZPM. Add these lines to your standard dockerfile for IRIS community edition and you will have ZPM installed and ready to use.

To download the latest ZPM client:

    RUN mkdir -p /tmp/deps \
     && cd /tmp/deps \
     && wget -q https://pm.community.intersystems.com/packages/zpm/latest/installer -O zpm.xml

To install ZPM into IRIS:

    " Do \$system.OBJ.Load(\"/tmp/deps/zpm.xml\", \"ck\")" \

Great! To try ZPM with this repository, do the following:

    $ git clone https://github.com/intersystems-community/objectscript-zpm-template.git

Build and run the repo:

    $ docker-compose up -d

Open an IRIS terminal:

    $ docker-compose exec iris iris session iris
    USER>

Call ZPM:

    USER>zpm
    zpm: USER>

Install webterminal:

    zpm: USER>install webterminal
    [webterminal] Reload START
    [webterminal] Reload SUCCESS
    [webterminal] Module object refreshed.
    [webterminal] Validate START
    [webterminal] Validate SUCCESS
    [webterminal] Compile START
    [webterminal] Compile SUCCESS
    [webterminal] Activate START
    [webterminal] Configure START
    [webterminal] Configure SUCCESS
    [webterminal] Activate SUCCESS
    zpm: USER>

Use it! And take a look at the whole process in this gif:

It turns out that we don't need a special repository to add ZPM to your docker container. You just need another dockerfile - like this one. And here is the related docker-compose to make a handy start. See how it works:
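Putting the two snippets above together, a complete Dockerfile might look roughly like this. This is a sketch, not the repository's exact file: the base image tag and the pattern of piping a script into `iris session` during the build are assumptions; adjust them to your IRIS version and base image.

```dockerfile
# Hypothetical base image tag; use the community edition tag you target.
FROM store/intersystems/iris-community:2019.4.0.383.0

# Download the latest ZPM client during the image build
RUN mkdir -p /tmp/deps \
 && cd /tmp/deps \
 && wget -q https://pm.community.intersystems.com/packages/zpm/latest/installer -O zpm.xml

# Start IRIS, load the ZPM installer, and shut IRIS down again
RUN iris start IRIS \
 && printf ' Do $system.OBJ.Load("/tmp/deps/zpm.xml","ck")\n halt\n' \
    | iris session IRIS \
 && iris stop IRIS quietly
```

After building, `zpm` should be available in any terminal session opened against the container.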
Article
Henry Pereira · Sep 16, 2019

Power BI Connector for InterSystems IRIS. Part I

In an ever-changing world, companies must innovate to stay competitive. This ensures that they'll make decisions with agility and safety, aiming for future results with greater accuracy.

Business Intelligence (BI) tools help companies make intelligent decisions instead of relying on trial and error. These intelligent decisions can make the difference between success and failure in the marketplace. Microsoft Power BI is one of the industry's leading business intelligence tools. With just a few clicks, Power BI makes it easy for managers and analysts to explore a company's data. This is important because when data is easy to access and visualize, it's much more likely to be used to make business decisions. Power BI includes a wide variety of graphs, charts, tables, and maps. As a result, you can always find visualizations that are a good fit for your data.

BI tools are only as useful as the data that backs them, however. Power BI supports many data sources, and InterSystems IRIS is a recent addition to those sources. Since Power BI provides an exciting new way to explore data stored in IRIS, we'll be exploring how to use these two amazing tools together. This article will explain how to use IRIS tables and Power BI together on real data. In a follow-up article, we'll walk through using Power BI with IRIS cubes.

Project Prerequisites and Setup

You will need the following to get started:

- InterSystems IRIS Data Platform
- Microsoft Power BI Desktop (April 2019 release or more recent)
- InterSystems Samples-BI data

We'll be using the InterSystems IRIS Data Platform, so you'll need access to an IRIS install to proceed. You can download a trial version from the InterSystems website if necessary. There are two ways to install Microsoft Power BI Desktop: you can download an installer, or install it through the Microsoft Store.
Note that if you are running Power BI from a different machine than where you installed InterSystems IRIS, you will need to install the InterSystems IRIS ODBC drivers on that machine separately.

To create a dashboard in Power BI, we'll need some data. We'll be using the HoleFoods dataset provided by InterSystems here on GitHub. To proceed, either clone or download the repository.

In IRIS, I've created a namespace called SamplesBI. This is not required, but if you want to create a new namespace, go to System Administration > Configuration > System Configuration > Namespaces in the IRIS Management Portal and click New Namespace. Enter a name, then create a data file or use an existing one.

In the InterSystems IRIS Terminal, enter the namespace that you want to import the data into, in this case SamplesBI. Execute $System.OBJ.Load() with the full path of buildsample/Build.SampleBI.cls and the "ck" compile flags. Then execute the Build method of the Build.SampleBI class with the full path of the directory containing the sample files.

Connecting Power BI with IRIS

Now it's time to connect Power BI with IRIS. Open Power BI and click "Get Data". Choose "Database", and you will see the InterSystems IRIS connector. Enter the host address: this is the IP address of the host for your InterSystems IRIS instance (localhost in my case). The Port is the instance's superserver port (IRIS default is 57773), and the Namespace is where your HoleFoods data is located. Under Data Connectivity mode, choose "DirectQuery", which ensures you're always viewing current data. Next, enter the username and password to connect to IRIS. The defaults are "_SYSTEM" and "SYS".

You can import both tables and cubes you've created in IRIS. Let's start by importing some tables. Under Tables and HoleFoods, check:

- Country
- Outlet
- Product
- Region
- SalesTransaction

We're almost there! To tell Power BI about the relationships between our tables, click "Manage Relationships". Then, click New.
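For reference, the SamplesBI data-load steps described earlier might look like this in an IRIS terminal session. The install path is hypothetical; substitute the directory where you cloned the repository:

```
USER> zn "SamplesBI"

SAMPLESBI> do $System.OBJ.Load("/home/user/Samples-BI/buildsample/Build.SampleBI.cls","ck")

SAMPLESBI> do ##class(Build.SampleBI).Build("/home/user/Samples-BI/")
```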
Let's make two relationships: "SalesTransaction" and "Product". On top, select the "SalesTransaction" table and click the "Product" column. Next, select the "Product" table and click the "ID" column. You'll see that the Cardinality changes automatically to "Many to One (*:1)". Repeat this step for the following:

- "SalesTransaction(Outlet)" with "Outlet(ID)"
- "Outlet(Country)" with "Country(ID)"
- "Country(Region)" with "Region(ID)"

Note that these relationships are imported automatically if they are expressed as foreign keys. Power BI also has a Relationships schema viewer. If you click the button on the left side of the application, it will show our data model.

Creating a Dashboard

We now have everything we need to create a dashboard. Start by clicking the button on the left to switch from schema view back to Report view. On the Home tab under the Insert group, click TextBox to add a title. The Insert group includes static elements like text, shapes, and images we can use to enhance our reports.

It's time to add our first visualization! In the Fields pane, check "Name" on "Product" and "UnitsSold" on "SalesTransaction". Next, go to Style and select "Bold Header".

Now it's time to do some data transformation. Click the ellipsis next to "SalesTransaction" in the Fields pane, then click "Edit Query". This opens the Power Query Editor. Select the "DateOfSale" column and click "Duplicate Column". Rename the new column to "Year", then click "Date" and select "Year". Apply these changes. Next, select the new column and, on the "Modeling" tab, change "Default Summarization" to "Don't Summarize".

Add a "Line Chart" visualization, then drag "Year" to Axis, drag "Name" from "Region" to Legend, and drag "AmountOfSale" from "SalesTransaction" to Values.

Imagine that the HoleFoods sales team has a target of selling 2000 units. How can we tell if the team is meeting its goal? To answer that, let's add a visual for metrics and targets.
On "SalesTransaction" in the Fields pane, check "UnitsSold", then click Gauge Chart. Under the Style properties, set Max to 3000 and Target to 2000.

KPIs (Key Performance Indicators) are helpful decision-making tools, and Power BI has a convenient KPI visual we can use. To add it, under "SalesTransaction", check "AmountOfSale" and choose KPI under "Visualizations". Then drag "Year" to "Trend axis".

To align all charts and visuals, simply click and drag a visual; when an edge or center is close to aligning with the edge or center of another visual or set of visuals, red dashed lines appear. You can also go to the View tab and enable "Show Gridlines" and "Snap Objects to Grid".

We'll finish up by adding a map that shows HoleFoods' global presence. Set Longitude and Latitude on "Outlet" to "Don't Summarize" on the Modeling tab. You can find the map tool in the Visualizations pane. After adding it, drag the Latitude and Longitude fields from "Outlet" to the respective properties on the map. Also, from "SalesTransaction", drag "AmountOfSale" to the Size property and "UnitsSold" to ToolTips. And our dashboard is finally complete.

You can share your dashboard by publishing it to the Power BI Service. To do this, you'll have to sign up for a Power BI account.

Conclusion

In just a few minutes, we were able to connect Power BI to InterSystems IRIS and then create amazing interactive visualizations. As developers, this is great. Instead of spending hours or days developing dashboards for managers, we can get the job done in minutes. Even better, we can show managers how to quickly and easily create reports for themselves. Although developing visualizations is often part of a developer's job, our time is usually better spent developing mission-critical architecture and applications.
Using IRIS and Power BI together ensures that developer time is used effectively and that managers are able to access and visualize data immediately, without waiting weeks for dashboards to be developed, tested, and deployed to production.

Perfect! Great! Thanks, Henry. A few queries:

1. Does Power BI offer an advantage over InterSystems' own analytics? If yes, what are those? In general, I believe visualization is way better in Power BI, and data modelling would be much easier. In addition, Power BI lets its users leverage Microsoft's cognitive services. Did you notice any performance issues?
2. I believe the connector is free to avail; can you confirm if this is true? Thanks, SS.
3. Tagging @Carmen.Logue to provide more details.

That's right. There is no charge for the Power BI connector, but you do need licenses for Microsoft Power BI. The connector is available with Power BI starting in version 2019.2. See this article in the product documentation.

💡 This article is considered an InterSystems Data Platform Best Practice.

Nice article! While testing, trying to load the tables (which I can select), I got the following errors: I am using IRIS installed locally and Power BI Desktop. Any suggestions?

Access Denied errors can stem from a variety of reasons. As a sanity check, running the Windows ODBC connection manager's test function never hurts to rule out connectivity issues. In any case, you can consult with the WRC on support issues like this.
Article
Renan Lourenco · Mar 9, 2020

InterSystems IRIS for Health ENSDEMO (supports arm64)

# InterSystems IRIS for Health ENSDEMO

Yet another basic setup of ENSDEMO content into InterSystems IRIS for Health.

**Make sure you have Docker up and running before starting.**

## Setup

Clone the repository to your desired directory:

```bash
git clone https://github.com/OneLastTry/irishealth-ensdemo.git
```

Once the repository is cloned, execute:

**Always make sure you are inside the main directory to execute docker-compose commands.**

```bash
docker-compose build
```

## Run your Container

After building the image you can simply execute the command below and you will be up and running 🚀:

*-d will run the container detached from your command line session*

```bash
docker-compose up -d
```

You can now access the management portal through http://localhost:9092/csp/sys/%25CSP.Portal.Home.zen

- **Username:** SuperUser
- **Password:** SYS
- **SuperServer port:** 9091
- **Web port:** 9092
- **Namespace:** ENSDEMO

![ensdemo](https://openexchange.intersystems.com/mp/img/packages/468/screenshots/zhnwycjrflt4q7gttwsidcntxk.png)

To start a terminal session execute:

```bash
docker exec -it ensdemo iris session iris
```

To start a bash session execute:

```bash
docker exec -it ensdemo /bin/bash
```

Using the [InterSystems ObjectScript](https://marketplace.visualstudio.com/items?itemName=daimor.vscode-objectscript) Visual Studio Code extension, you can access the code straight from _vscode_:

![vscode](https://openexchange.intersystems.com/mp/img/packages/468/screenshots/bgirfnblz2zym4zi2q92lnxkmji.png)

## Stop your Container

```bash
docker-compose stop
```

## Support to ZPM

```bash
zpm "install irishealth-ensdemo"
```

Nice, Renan! And ZPM it, please, too!

Interesting. Is it available for InterSystems IRIS?

Will do soon!

Haven't tested, but I would guess yes. I will run some tests changing the version in the Dockerfile and post the outcome here.

Hi, here is a similar article about ENSDEMO for IRIS and IRIS for Health.
https://community.intersystems.com/post/install-ensdemo-iris Works for IRIS4Health. Also available as a ZPM module now:

```
USER>zpm "install irishealth-ensdemo"
[irishealth-ensdemo] Reload START (/usr/irissys/mgr/.modules/USER/irishealth-ensdemo/1.0.0/)
[irishealth-ensdemo] Reload SUCCESS
[irishealth-ensdemo] Module object refreshed.
[irishealth-ensdemo] Validate START
[irishealth-ensdemo] Validate SUCCESS
[irishealth-ensdemo] Compile START
[irishealth-ensdemo] Compile SUCCESS
[irishealth-ensdemo] Activate START
[irishealth-ensdemo] Configure START
[irishealth-ensdemo] Configure SUCCESS
[irishealth-ensdemo] MakeDeployed START
[irishealth-ensdemo] MakeDeployed SUCCESS
[irishealth-ensdemo] Activate SUCCESS

USER>
```

Here is the set of productions available: Is there any documentation on what the ens-demo module can do? Unfortunately there isn't as much as I'd like there to be. Even when ENSDEMO was part of Ensemble, the information was a bit scattered all over. If you access the Ensemble documentation and search for "Demo." you can see some of the references I mentioned. (Since IRIS does not have ENSDEMO by default, the documentation has also been removed.) Thanks, @Renan.Lourenco ! Perhaps we could wrap this part of the documentation as a module too. Could be a nice extension to the app. I like your idea @Evgeny.Shvarov !! How do you envision that, a simple index with easy access like: DICOM: Link1 Link2 HL7 Link1 Link2 Or something more elaborate? Also, would that be a separate module altogether or part of the existing one? I see that the documentation pages are IRIS CSP classes. So I guess it could work if installed in IRIS. I guess also there is a set of static files (FILECOPY could help). IMHO, the reasonable approach is to have a separate repo ensdemo-doc and a separate module, which would be a dependent module of irishealth-ensdemo. So people could contribute to documentation independently and update it independently too.
I had my bit of fun with the documentation before; it is not as straightforward as it appears to be. That's why I thought of having a separate index. I guess you know more about it. I'd also ping @Dmitry.Maslennikov, as he tried to make a ZPM package for the whole documentation.
Announcement
Anastasia Dyubaylo · Apr 12, 2023

[Video] System Performance Review for InterSystems IRIS Applications

Hey Developers, Enjoy watching the new video on InterSystems Developers YouTube: ⏯ System Performance Review for InterSystems IRIS Applications @ Global Summit 2022 If you hit a bottleneck in system resources, your users will suffer. So to ensure your systems are the right size, you need to understand how your InterSystems IRIS applications use system resources. This is even more important when you move applications to the cloud, where vendors impose strict limits on capacity and throughput. In this session, you'll learn the key metrics and how to collect them. 🗣 Presenter: @Murray.Oldfield, Principal Technology Architect, InterSystems Hope you like it and stay tuned! 👍
Announcement
Bob Kuszewski · Nov 22, 2022

Announcing the InterSystems Container Registry web user interface

Announcing the InterSystems Container Registry web user interface InterSystems is pleased to announce the release of the InterSystems Container Registry web user interface. This tool is designed to make it easier to discover, access, and use the many container images hosted on ICR. The InterSystems Container Registry UI is available at: https://containers.intersystems.com/contents Community Edition Containers When you visit the ICR user interface, you'll have access to InterSystems' publicly available containers, including the IRIS community edition containers. Use the left-hand navigation to select the product family that interests you, then the container, and finally select the specific version. Enterprise Edition Containers You can see the private containers, including the IRIS enterprise edition, by clicking on the Login button. Once logged in, the left-hand navigation will include all the containers you have access to. Enjoy! That's great, it can be very useful to quickly check whether a version is still available or which one just came out. A great resource for our community - thanks! This is nice !! Thank you @Robert.Kuszewski !! SUPER !!!
Announcement
Raj Singh · Jan 10, 2023

InterSystems Package Manager 0.5.2 release

We have just released a minor update to the package manager, which has been renamed from ZPM to IPM as I explained in November. It is purely a bug-fix release, properly interpreting ROBOCOPY return codes and fixing a regression that prevented installation of certain packages. Get it here: https://github.com/intersystems/ipm/releases/tag/v0.5.2
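For context on the ROBOCOPY fix: ROBOCOPY uses bit-flag exit codes, where values below 8 are success variants and 8 or above indicate that some copying failed, so a naive "non-zero means error" check misfires. A sketch of interpreting them (my own illustration, not IPM's actual code):

```python
# ROBOCOPY exit codes are bit flags; codes < 8 are success variants,
# codes >= 8 mean at least part of the copy failed.
ROBOCOPY_FLAGS = {
    1: "one or more files copied",
    2: "extra files or directories detected",
    4: "mismatched files or directories detected",
    8: "some files or directories could not be copied",
    16: "serious error - robocopy did not copy anything",
}

def robocopy_ok(exit_code: int) -> bool:
    """Return True if the ROBOCOPY exit code denotes success."""
    return exit_code < 8

def describe(exit_code: int) -> list[str]:
    """List the meaning of each bit set in the exit code."""
    return [msg for flag, msg in ROBOCOPY_FLAGS.items() if exit_code & flag]

print(robocopy_ok(1))   # True: files were copied successfully
print(robocopy_ok(8))   # False: some copies failed
```

This is why a tool shelling out to ROBOCOPY has to mask the exit code rather than treat it like a conventional process status.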
Announcement
Olga Zavrazhnova · Jan 12, 2023

Community Roundtable: Modern development process with InterSystems IRIS

Hi Community, Let's meet virtually at our first Community Roundtable in 2023! Join the discussion on best practices for developing in VS Code/Studio, use cases, and Q&A. 📅 Date: January 26 🕑 Time: 9:00 am ET | 3:00 pm CET UPDATE: the roundtable recording is available here. Let us know in advance which questions you'd like us to cover - comment on this post or send your questions to me in DM. Not a member of Global Masters? Join now - log in using InterSystems SSO credentials. The image (here and on Global Masters) incorrectly shows the time as 4:00 pm CET. Hi @John.Murray thank you for bringing our attention to that! Working on a new picture right now, and will replace the current one asap :) One of my colleagues just tried using the registration link but got this response from the Global Masters page it leads to: Hi @John.Murray, that could happen if the onboarding challenge is not completed - the one which unlocks the Global Masters program. Maybe you could send me his/her email address in a DM? I will forward an invitation by email; no additional registration will be needed.
Announcement
Anastasia Dyubaylo · Apr 3, 2023

Winners of Tech Article Contest: InterSystems IRIS Tutorials

Hi Developers! We have great new articles for you to read and enjoy, thanks to our wonderful participants of the 4th InterSystems Tech Article Contest: InterSystems IRIS Tutorials! 🌟 24 AMAZING ARTICLES 🌟 And now it's time to announce the winners! Let's meet the winners and their articles: ⭐️ Expert Awards – winners selected by InterSystems experts: 🥇 1st place: InterSystems Embedded Python in glance by @Muhammad.Waseem 🥈 2nd place: InterSystems Embedded Python with Pandas - Part 1 by @Rizmaan.Marikar2583 🥉 3rd place: SQLAlchemy - the easiest way to use Python and SQL with IRIS's databases by @Heloisa.Paiva ⭐️ Community Award – winner selected by Community members, article with the most likes: 🏆 Setting up VS Code to work with InterSystems technologies by @Maria.Gladkova And... ⭐️ We'd like to reward some more authors for the number of contributions: @Robert.Cemper1003: 4 articles! @Heloisa.Paiva: 3 articles! @Irene.Mikhaylova: 3 articles! These authors will get a Magic Keyboard Folio for iPad or a Bose SoundLink Micro Bluetooth Speaker! Let's congratulate all our heroes who took part in Tech Article Contest #4: @Robert.Cemper1003 @Heloisa.Paiva @Muhammad.Waseem @wang.zhe @Irene.Mikhaylova @Maria.Gladkova @Yone.Moreno @Akio.Hashimoto1419 @Julian.Matthews7786 @Daniel.Aguilar @water.huang @Oliver.Wilms @Rizmaan.Marikar2583 @姚.鑫 @Zhong.Li7025 @Jude.Mukkadayil @Roger.Merchberger THANK YOU ALL! You have made an incredible contribution to our Dev Community! The prizes are in production now. We will contact all the participants when they are ready to ship. Congratulations to all the winners and participants 👏👏 Thanks to the organizers for providing this amazing opportunity to participate. Awesome! Thanks for the prize and congratulations to the winners and all participants! Python WON! One of the winners' articles uses my project SQLAlchemy-IRIS. And one more could use it too; there is an example with it in the comments.
Congrats to the winners and thank you to everyone who took time to contribute :) Congratulations to all, amazing articles
Announcement
Anastasia Dyubaylo · Mar 10, 2023

InterSystems Developer Ecosystem Winter News 2022/2023

Hello and welcome to the Developer Ecosystem Winter News! This winter (or summer for those of you from another hemisphere) we've had a lot of online and offline activities in the InterSystems Developer Ecosystem. In case you missed something, we've prepared for you a selection of the hottest news and topics to catch up on! News 🔥 Top 2022 Articles for InterSystems Developers 🔥 Top Questions about InterSystems Data Platforms for 2022 🔥 Top 2022 Applications on InterSystems Open Exchange 🔥 Top 2022 Videos for InterSystems Developers 👋 See what you've been up to in 2022! 💡 InterSystems Ideas News #2, #3, #4 🔥 New Global Masters badges - for reviews on Open Exchange 📝 Updated Vulnerability Handling Policy 📝 February 15, 2023 – Alert: Use of Large Pages for Shared Memory on Windows Platforms 📝 Speeding up AND becoming more predictable - updates to our release cadence ✅ InterSystems Supported Platforms Update Feb-2023 👋 The Second Batch of Caelestinus - a Digital Health Interoperability and FHIR Startup Incubator is Starting! 
Contests & Events InterSystems Developer Tools Contest Announcement Kick-off Webinar Technology Bonuses Time to Vote Technical Bonuses Results Winners Announcement Meetup with Winners Advent of Code Contest Announcement Winners Announcement Community Roundtables Modern development process with InterSystems IRIS AI / ML 📄 [DC Contest] Spanish Tech Article Contest: Second Edition ⏯️ [Webinar] Deltanji demo: source control tailored for InterSystems IRIS ⏯️ [Webinar] Validating FHIR profiles with InterSystems IRIS for Health ☕️ [Meetup] First Dutch HealthShare user group meeting Latest Releases ⬇️ InterSystems announces availability of InterSystems IRIS, IRIS for Health, & HealthShare Health Connect 2022.1.2 ⬇️ InterSystems announces availability of InterSystems IRIS, IRIS for Health, & HealthShare Health Connect 2021.1.3 ⬇️ InterSystems announces General Availability of InterSystems IRIS, IRIS for Health, HealthShare Health Connect, & InterSystems IRIS Studio 2022.3 ⬇️ InterSystems IRIS, IRIS for Health, & HealthShare Health Connect 2022.3 developer previews Preview 3 Preview 4 Preview 5 Preview 6 ⬇️ InterSystems IRIS, IRIS for Health, & HealthShare Health Connect 2023.1 developer previews Preview 1 Preview 2 ⬇️ InterSystems Package Manager Release 0.5.2 ⬇️ IAM 3.0 Release Announcement ⬇️ VS Code ObjectScript releases v2.6.0 ⬇️ Health Data De-ID Early Access Program ⬇️ New bio and pinned posts in your profile! 
⬇️ What's new on Open Exchange Best Practices & Key Questions 🔥 Best Practices of Winter 2022 Splitting ORU Messages using ObjectScript and DTL Data models in InterSystems IRIS Tips and tricks of the brand new LOAD DATA command Create Stored Procedures using Embedded Python Getting an Angular UI for your InterSystems IRIS application in 5 minutes Working with Several ObjectScript Projects simultaneously Using VSCode and Docker in InterSystems IRIS Introduction to Web Scraping with Embedded Python - Let’s Extract python job’s UI for Ensemble Workflow in Angular Implementing IRIS Integrations with .NET or Java using PEX Display Management Portal Dashboard by using Python Flask web and Bootstrap Frameworks with the help of embedded python ❓ Key Questions of Winter 2022: December, January, February 💡 How to add InterSystems certification to your DC profile People and Companies to Know About 👋 Yuxiang Niu - New Developer Community Moderator 🌟 Global Masters of Winter 2022: December, January, February Top Open Exchange Developers and Applications for 2022 Top InterSystems Community Contributors for 2022 Job Opportunities 💼 Internship Available 💼 Senior Interoperability Engineer - Remote 💼 Senior Interface Engineer opportunity - Remote So... Here is our take on the most interesting and important things! What were your highlights from this past season? Share them in the comments section and let's remember the fun we've had!
Article
Maria Gladkova · Mar 15, 2023

Setting up VS Code to work with InterSystems technologies

Hi all! In this article I would like to review those VS Code extensions which I use myself to work with InterSystems and which make my work much more convenient. I am sure this article will be useful for those who are just starting their journey to learn InterSystems technologies. However, I also hope that this article could be useful for experienced developers with many years of experience and open up new possibilities for them when using VS Code for development. I recommend that everyone who works with InterSystems has these extensions installed and in this article I will show how to use some of them. You can read more about the functionality and use of each extension in the Extensions section of VS Code, there you can also download, update and uninstall extensions: After installation, extension icons appear on the side or at the bottom of the code editor. Mandatory extensions I think it makes sense to start our journey with those basic extensions, without which working with InterSystems in VS Code becomes impossible. The InterSystems Server Manager Extension for VS Code assists in specifying server connections. The InterSystems ObjectScript Extension for VS Code assists in writing and compiling code files. The InterSystems Language Server Extension for VS Code provides a language server implementation for ObjectScript, enabling coloring, code completion, linting, and more. Together, these extensions provide developers with a streamlined way of creating, testing, and deploying full-stack applications built on InterSystems. Additional extensions In addition to the basic necessary extensions, VS Code offers many other extensions. You can write code without them, but using them makes development more efficient when using any technology stack, including InterSystems technologies. I will describe a few which seem to me to be a must-have. The Docker extension makes the management of dockerised projects a little easier. 
You can automatically generate a Dockerfile for projects, run images and manage running containers. SQLTools and the SQLTools Driver for InterSystems IRIS are two very handy extensions that work together. Using them, you can create and execute SQL queries against the database in VS Code without having to go into the Management Portal to interact with the table contents there. Today, it's hard to imagine developing a project without using version control. Most often this is Git, and Visual Studio Code has minimal support for it right out of the box. But if that's not enough for you, check out the next two extensions: Git Graph - shows branches and their status schematically. This is useful in situations where you need to quickly understand the status of branches, e.g. when they merge. GitLens - allows you to see the history of changes to the highlighted line and its author. It is indispensable for teamwork! EditorConfig - an extension to improve code appearance; it requires a .editorconfig file in which you can specify any code formatting settings. It's important to note that by default such functionality is provided by the InterSystems Language Server extension for VS Code: to apply standard ObjectScript code formatting in VS Code you use the key combination Windows - [Shift + Alt + F], Mac - [Shift + Option + F], Ubuntu - [Ctrl + Shift + I]. But with a .editorconfig file you can specify your own code formatting for different files within the project. In this article, I've only looked at the extensions I've used myself. I would be grateful if you could take the time to write in the comments what else can be used to make development more convenient. Then this article will become even more useful! Very nice initial How-To! Thank you especially for the animated GIFs! Excellent read, will be installing a couple of these now! Hi @Maria.Gladkova, Nice article and very well explained.
Thanks Very nice article @Maria.Gladkova to start developing with VSCode! The most noteworthy extensions I'm using too in my VSCode setup: Encode decode (convert text to other formats like Base64) OpenAPI (swagger) Editor (specifying & documenting your REST api endpoints) WSL (enabling direct use of the Windows Subsystem for Linux in VSCode) Thank you very much, I've already installed and tried some of them out! Good luck coding! Thanks! Thank you!
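Following up on the EditorConfig extension discussed in the article above: a minimal .editorconfig might look like the sketch below. The specific settings (and the idea of 4-space indentation for ObjectScript files) are only an illustrative assumption, not a recommended house style.

```ini
# Top-most EditorConfig file for the project
root = true

[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true

# ObjectScript sources: 4-space indentation (illustrative choice)
[*.{cls,mac,int}]
indent_style = space
indent_size = 4
```

Place the file at the project root; the extension applies the matching section to each file you edit.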
Article
Yuri Marx Pereira Gomes · Apr 13, 2023

Adding Google Social Login into InterSystems Management Portal

It is a recommended security practice to log into sensitive administrator portals without typing passwords. Thus, it is necessary to identify and authenticate the users correctly. A common technique employed by web portals and mobile applications is to use Google social login. Today, Google Gmail has 2 billion users (source: https://www.usesignhouse.com/blog/gmail-stats). Therefore, it is a perfect shared login service for authenticating InterSystems IRIS users when they need to manage their instances. This article will detail all the steps to embed Google Login into your InterSystems Management Portal. Register your InterSystems instance in the Google Console 1. Go to https://console.cloud.google.com and log in with your Google user account. 2. In the header, click Select a project: 3. Click the button NEW PROJECT: 4. Create a sample project for this article called InterSystemsIRIS and click the button CREATE: 5. Go to the header again and select the created InterSystemsIRIS project hyperlink in the table: 6. Now the selected project is the working one: 7. In the header, look for credentials in the Search field and choose API credentials (third option in this image): 8. At the top of the screen, click the + CREATE CREDENTIALS button and select the OAuth 2.0 Client ID option: 9. Now click CONFIGURE CONSENT SCREEN: 10. Choose External (any person who has Gmail is able to use it) and click the CREATE button: 11. In Edit app registration, complete the field values as follows: App Information (use your email for user support email): App Logo (save any vertical InterSystems logo on your computer and use it here): App Domain (set home page with the value http://localhost:52773/csp/sys/%25CSP.Portal.Home.zen) For Authorized domains, it is not necessary to set anything because this sample will use localhost. Set the developer contact information with your email and click the SAVE AND CONTINUE button: 12.
Click ADD OR REMOVE SCOPES and select the following scopes, scroll the dialog, and click the UPDATE button: 13. Your non-sensitive, sensitive, and restricted scopes are listed. Accept all of them and click the SAVE AND CONTINUE button: 14. Include your email into the Test users list (using the +ADD USERS button) and click the SAVE AND CONTINUE button: 15. The wizard shows you the Summary of the filled fields. Scroll the screen and click the BACK TO DASHBOARD button.16. Now, it is time to configure the credentials for this new project. Select the option Credentials: 17. Click the button +CREATE CREDENTIALS and select OAuth client ID (we need to take this step again because now we have OAuth consent screen defined): 18. Select Web application in the Application Type and InterSystems IRIS in the Name field: 19. Configure JavaScript origins and redirects (click +ADD URI to get more fields) as follows and click the SAVE button: JavaScript URI 1: http://localhost:52773 JavaScript URI 2: http://localhost Redirect URI 1: http://localhost:52773 Redirect URI 2: http://localhost Redirect URI 3: http://localhost:52773/csp/sys/dc.login.GoogleLogin.cls 20. Now we have a Client ID and a Client Secret. You should copy them to a text file on your disk or save the data using DOWNLOAD JSON: Now we will embed Google Login to our local InterSystems IRIS instance. InterSystems IRIS Delegated Authentication Mechanism to the Management Portal InterSystems IRIS allows us to use a few authentication options, including Instance Authentication (a default option that uses a user and a password managed by the IRIS instance), LDAP (users managed by an enterprise repository), and Delegated Authentication (for customized authentication mechanism).We will choose Delegated Authentication because this is the option where the login uses OAuth and a custom code to get user credentials and authenticate users without the input passwords. 
When using OAuth, the OAuth server (Google in our use case) will authenticate the user and return the logged-in user's details to InterSystems IRIS (the OAuth client). In our example, we want to set up delegated authentication for the Management Portal (the /csp/sys application). To make it happen, you need to take the following steps: Create a class extending %CSP.Login and write the custom HTML, JavaScript, and CSS code for the customized login. In this article's sample we will embed a Google button to log in with the Google user. Create and write ObjectScript in the routine ZAUTHENTICATE.mac with the custom backend authentication logic. Enable delegated authentication in the InterSystems IRIS instance. Set up delegated authentication and the custom login page (the class extending %CSP.Login) for the /csp/sys web application (the Management Portal web application). All these steps should be done in the %SYS namespace. Using a sample application to learn how to embed Google Login into Management Portal To learn how to embed Google Social Login into an instance of InterSystems IRIS, we will use the application Google IRIS Login (https://openexchange.intersystems.com/package/Google-IRIS-Login). Follow these steps in order to get it: 1. Clone/git pull the repo into any local directory: ```bash git clone https://github.com/intersystems-community/intersystems-iris-dev-template.git ``` 2. Go to the .env file and input your CLIENT_ID generated in your Google Cloud Console in the first part of this article. 3. Open the terminal in this directory and call the command to build InterSystems IRIS in a container: ```bash docker-compose build ``` 4. To run InterSystems IRIS in a container, execute: ```bash docker-compose up -d ``` 5. Access the Management Portal (http://localhost:52773/csp/sys/%2525CSP.Portal.Home.zen) and use your Google user to enter: Ready! In the upcoming sections, we will see all the steps one needs to configure and develop inside IRIS to embed Google Social Login.
Creating a custom Login Page extended from %CSP.Login To embed the Google Login into the Management Portal we need to create a new Login page extending from %CSP.Login. This allows us to reuse the default Login Page but requires adding the necessary source code for supporting Google Login. The custom login page will override the following class methods: OnLoginPage: used to write the HTML and JavaScript code to render the login form with the Google Login button. DrawHead: employed to add JavaScript code that will persist Google Login credentials returned into cookies for the backend logic use. GoogleLogin.cls source code Important actions on this page: 1. We got CLIENT_ID and LOGIN_URI from the env file to use as parameters for the Google Login button:

```objectscript
Set envClientId = ##class(%SYSTEM.Util).GetEnviron("CLIENT_ID")
Set envLoginURI = ##class(%SYSTEM.Util).GetEnviron("LOGIN_URI")
```

2. Before the content, we needed to load the Google JavaScript library:

```html
<script src="https://accounts.google.com/gsi/client" async defer></script>
```

3. Our next step was to write the HTML code to render the Google button using LOGIN_URI and CLIENT_ID environment variables collected from the .env file:

```html
<div id="g_id_onload"
     data-client_id="#(envClientId)#"
     data-context="signin"
     data-ux_mode="popup"
     data-callback="handleCredentialResponse"
     data-login_uri="#(envLoginURI)#"
     data-auto_prompt="false">
</div>
<div class="g_id_signin"
     data-type="standard"
     data-shape="rectangular"
     data-theme="outline"
     data-text="signin_with"
     data-size="large"
     data-onsuccess="onSignIn"
     data-logo_alignment="left">
</div>
```

4.
On the DrawHead class method, we wrote JavaScript functions to get the Google user credentials and store them in cookies for the backend use:

```javascript
function setCookie(name, value, days) {
    var expires = "";
    if (days) {
        var date = new Date();
        date.setTime(date.getTime() + (days*24*60*60*1000));
        expires = "; expires=" + date.toUTCString();
    }
    document.cookie = name + "=" + (value || "") + expires + "; path=/";
}

function newSession() {
    eraseCookie('email');
    window.location.reload();
}

function eraseCookie(name) {
    // This function will attempt to remove a cookie from all paths.
    var pathBits = location.pathname.split('/');
    var pathCurrent = ' path=';
    // do a simple pathless delete first.
    document.cookie = name + '=; expires=Thu, 01-Jan-1970 00:00:01 GMT;';
    for (var i = 0; i < pathBits.length; i++) {
        pathCurrent += ((pathCurrent.substr(-1) != '/') ? '/' : '') + pathBits[i];
        document.cookie = name + '=; expires=Thu, 01-Jan-1970 00:00:01 GMT;' + pathCurrent + ';';
    }
}

function deleteAllCookies() {
    const cookies = document.cookie.split(";");
    for (let i = 0; i < cookies.length; i++) {
        const cookie = cookies[i];
        const eqPos = cookie.indexOf("=");
        const name = eqPos > -1 ? cookie.substr(0, eqPos) : cookie;
        document.cookie = name + "=;expires=Thu, 01 Jan 1970 00:00:00 GMT";
    }
}

// called when page is loaded
function pageLoad() {
    // see if we can give focus to the UserName field:
    if (self.document.Login && self.document.Login.IRISUsername) {
        self.document.Login.IRISUsername.focus();
        self.document.Login.IRISUsername.select();
    }
    return true;
}

function decodeJwtResponse(token) {
    var base64Url = token.split('.')[1];
    var base64 = base64Url.replace(/-/g, '+').replace(/_/g, '/');
    var jsonPayload = decodeURIComponent(window.atob(base64).split('').map(function(c) {
        return '%' + ('00' + c.charCodeAt(0).toString(16)).slice(-2);
    }).join(''));
    return JSON.parse(jsonPayload);
}

function handleCredentialResponse(response) {
    const responsePayload = decodeJwtResponse(response.credential);
    console.log("ID: " + responsePayload.sub);
    console.log('Full Name: ' + responsePayload.name);
    console.log('Given Name: ' + responsePayload.given_name);
    console.log('Family Name: ' + responsePayload.family_name);
    console.log("Image URL: " + responsePayload.picture);
    console.log("Email: " + responsePayload.email);
    setCookie('email', responsePayload.email);
    setCookie('googleToken', response.credential);
    if (responsePayload.email !== null) {
        document.getElementsByName('Login')[0].submit();
    }
}
```

5. To see all the content for the Login Page go to the src > cls > dc > login > GoogleLogin.cls file. Creating the backend code to authenticate Google users To write the backend logic to handle custom login using the delegated mechanism, it is necessary to apply a macro with the name ZAUTHENTICATE.mac into the %SYS namespace. ZAUTHENTICATE.mac source code Important actions in this file: 1. Define and store an encrypted password for the new user selected in the Google Login prompt:

```objectscript
Set GooglePassword = $SYSTEM.Util.GetEnviron("GOOGLE_PASSWORD")
Set GlobalPassword = $EXTRACT(##class(dc.login.CypherUtil).DoAESCBCEncrypt(GooglePassword),1,20)
```

2.
Refer to the Management Portal application (csp/sys) for Google Login:

```objectscript
Set App = $SYSTEM.Util.GetEnviron("ISC_App")
Set:App="" App = "/csp/sys"
Set GN = "^%ZAPM.AppsDelegate"
If $EXTRACT(App,*)'="/" {
    Set App=App_"/"
}
```

3. Get the Google user selected and stored in the email cookie:

```objectscript
Set Email = %request.GetCookie("email")
```

4. Test if the user already exists. If not, create the user as a Superuser (My choice is the %All role, but you can pick another option, for example, the %Developer role). If the user already exists, just log in:

```objectscript
Set EmailUser = $PIECE(Email,"@")
Set qry = "select * from Security.Users where EmailAddress = ?"
Set tStatement = ##class(%SQL.Statement).%New()
Set qStatus = tStatement.%Prepare(qry)
Set rset = tStatement.%Execute(Email)
While rset.%Next() {
    Set UserExists = 1
}
If UserExists = 1 {
    Set Username = rset.Name
    Set Password = GlobalPassword
    Quit $SYSTEM.Status.OK()
} Else {
    Set $nameSpace = "%SYS"
    if '##class(Security.Users).Exists($PIECE(Email,"@")) {
        Set status = ##class(Security.Users).Create($PIECE(Email,"@"),"%ALL",GlobalPassword,$PIECE(Email,"@"),"","","",0,1,"",1,0,"","","","")
        Set status = ##class(Security.Users).Get($PIECE(Email,"@"),.properties)
        Set properties("EmailAddress")=Email
        Set status = ##class(Security.Users).Modify($PIECE(Email,"@"), .properties)
        If status = 1 {
            Set Username = $PIECE(Email,"@")
            Set Password = GlobalPassword
            Quit $SYSTEM.Status.OK()
        }
    }
}
```

5. Compile these files into the %SYS namespace. It is time to set some parameters to ensure the delegated authentication using the custom source code written above. Adjust the settings to ensure delegated login 1. Go to the Management Portal (http://localhost:52773/csp/sys/%2525CSP.Portal.Home.zen) and log in with _system/sys credentials. 2. Proceed to System Administration > Security > System Security > Authentication/Web Session Options: 3. The option "Allow Delegated authentication" must be checked: 4.
Now go to System Administration > Security > Users and select the user CSPSystem. The user must have the role DB_IRISSYS or the %All role (this is necessary because this user is utilized to run the login page and the ZAUTHENTICATE.mac logic with the permissions required to execute it): 5. Move to System Administration > Security > Applications > Web Applications and select the /csp/sys application. 6. In the "Allowed Authentication Methods" field, check the option "Delegated". 7. The Login Page field must have dc.login.GoogleLogin.cls as a value (we need it to call our custom login page instead of the IRIS default page): 8. Now, to ensure correct functioning, attach a shell to your IRIS Docker instance: 9. Restart the instance from the shell by executing the command "iris restart iris": 10. Finally, go to the Login page again. Now you have Google as an option to log in. Enjoy it! wow - what an incredibly in-depth tutorial! thank you @Yuri.Gomes for the time you took putting this together for the Community! Many thanks! @Yuri.Gomes Thank you so much for this great work - tutorial, and application! 👏👏 Looks like you implemented the idea: 💡Please add google oauth authorization to login to the management portal💡 from Ideas Portal ✨ Many thanks!👏 Thanks! Yes, I implemented it. Thank you, @Yuri.Gomes! Can we have a similar implementation for GitHub OAuth too? GitHub is the most popular site for developers, so authentication via GitHub will be very handy for developers. Yes, I will work on GitHub too.
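As a language-neutral illustration of what the article's decodeJwtResponse function does, here is a Python sketch of base64url-decoding a JWT payload. Note that this only parses the token; in production, Google ID tokens must also be signature-verified (for example with the google-auth library) before being trusted:

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Parse the payload (second segment) of a JWT WITHOUT verifying it."""
    payload_b64 = token.split(".")[1]
    # base64url uses '-' and '_'; restore the standard alphabet and padding
    payload_b64 = payload_b64.replace("-", "+").replace("_", "/")
    payload_b64 += "=" * (-len(payload_b64) % 4)
    return json.loads(base64.b64decode(payload_b64))

# Build a fake unsigned token just to demonstrate the parsing
header = base64.urlsafe_b64encode(b'{"alg":"none"}').rstrip(b"=").decode()
claims = base64.urlsafe_b64encode(
    b'{"email":"user@example.com","name":"Test User"}').rstrip(b"=").decode()
token = f"{header}.{claims}."

print(decode_jwt_payload(token)["email"])  # user@example.com
```

The padding restoration is the step most naive decoders miss, since JWT segments strip trailing "=" characters.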
Announcement
Danusa Calixto · Apr 14, 2023

2nd InterSystems Tech Article Contest in Portuguese

Hey Developers, We'd like to invite you to join our next 🏆 InterSystems Tech Article Contest in Portuguese 🏆 Duration: May 12 - Jun 12, 2023 We can't wait for your articles! Good luck 🍀
Question
Humza Arshad · May 5, 2023

Can I connect InterSystems to third party app

I'm working with a client who uses InterSystems in his hospital for the management of patients and nurses. Now he wants to develop a web app that can schedule nurse shifts. Is there any way I can authenticate users against his hospital's InterSystems system and get data from it whenever a user logs in through the web app? @Humza.Arshad You can create a REST API, a SOAP API, or any other kind of protocol/technology to handle authentication and to retrieve and send data to IRIS. See the documentation: REST SOAP These other two links are Open Exchange applications for creating REST services and form UIs: RESTForms2 RESTFormsUI2 Search the Learning Portal for a course that can help you. Best Regards.
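To make the advice above concrete, here is a minimal Python sketch of calling a REST endpoint exposed by IRIS using HTTP Basic authentication. The endpoint path /api/myapp/shifts and the credentials are purely hypothetical; you would define the web application and its REST dispatch class in IRIS yourself:

```python
import base64
import urllib.request

def basic_auth_request(url: str, username: str, password: str) -> urllib.request.Request:
    """Build a GET request carrying an HTTP Basic Authorization header."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Basic {token}")
    return req

# Hypothetical IRIS REST endpoint - adjust host, port, and path to your setup
req = basic_auth_request("http://localhost:52773/api/myapp/shifts",
                         "nurse_scheduler", "secret")
print(req.get_header("Authorization"))
# To actually send it: urllib.request.urlopen(req)
```

IRIS can map the Basic credentials to an IRIS user via the web application's authentication settings, so the web app never stores hospital passwords itself.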