Announcement
Anastasia Dyubaylo · Feb 2, 2021

Online Meetup with the InterSystems Multi-Model Contest Winners

Hi Community, We're pleased to invite you to the online meetup with the winners of the InterSystems Multi-Model Contest! Date & Time: Friday, February 5, 2021 – 10:00 EDT What awaits you at this virtual Meetup? Our winners' bios. Short demos on their applications. An open discussion about technologies being used, bonuses, questions. Plans for the next contests. Our speakers: @José.Pereira, Business Intelligence Developer at Shift Consultoria e Sistemas Ltda @Renato.Banzai, Machine Learning Engineer Coordinator at Itaú Unibanco @Henrique.GonçalvesDias, System Management Specialist / Database Administrator, Sao Paulo Federal Court @Evgeny.Shvarov, InterSystems Developer Ecosystem Manager You will also have the opportunity to ask our developers any questions in a special webinar chat. We will be happy to talk to you at our Virtual Meetup! ➡️ REGISTER TODAY! 🤩 YouTube stream here: https://youtu.be/m6xF5I6wfhg Please join! Hey Developers! The recording of this virtual meetup is already on InterSystems Developers YouTube: ⏯ Online Meetup with the InterSystems Multi-Model Contest Winners Big applause to all the speakers! 👏🏼
Announcement
Anastasia Dyubaylo · Feb 3, 2021

InterSystems Grand Prix Contest Kick-off Webinar

Hi Developers, We're pleased to invite all the developers to the upcoming InterSystems Grand Prix contest kick-off webinar! This webinar is dedicated to our mega Grand Prix contest. We invite you to use IntegratedML, Native API, multi-model, Analytics and NLP, Open API and Interoperability, and IKO. In this webinar, we'll talk about the topics to expect from participants and show you how to develop, build, and deploy applications on the InterSystems IRIS data platform. Date & Time: Monday, February 8 — 10:00 AM EDT Speakers: 🗣 @Evgeny.Shvarov, InterSystems Developer Ecosystem Manager 🗣 @Thomas.Dyar, InterSystems Product Specialist - Machine Learning So... We will be happy to talk to you at our webinar! ➡️ JOIN THE KICK-OFF WEBINAR Hey guys, InterSystems Product Specialist @Thomas.Dyar will be one of the webinar speakers! Save your seat today – register here! 👈🏼 Hey Developers, The recording of this webinar is available on InterSystems Developers YouTube! Please welcome: ⏯ InterSystems Grand Prix Contest Kick-off Webinar Big applause to our speakers! 👏🏼 And thanks to everyone for joining our webinar!
Announcement
Anastasia Dyubaylo · Oct 12, 2020

Online Meetup with the Winners of the InterSystems Full Stack Contest

Hi Community! We're pleased to invite you to the Online meetup with the winners of the InterSystems Full Stack contest! Date & Time: Friday, October 16, 2020 – 11:00 EDT What awaits you at this virtual Meetup? Our winners' bios. Short demos on their applications. An open discussion about technologies being used, bonuses, questions. Plans for the next contests. Our speakers: @Henrique.GonçalvesDias, System Management Specialist / Database Administrator, Sao Paulo Federal Court @Dmitry.Maslennikov, Co-founder, CTO and Developer Advocate, CaretDev Corp @MikhailenkoSergey, Chief Specialist, JSC Mosvodokanal @Vasiliy.Bondar, Chief technical officer, Yagoda LLC @Evgeny.Shvarov, InterSystems Developer Ecosystem Manager You will also have the opportunity to ask any questions to our developers in a special webinar chat. We will be happy to talk to you at our Virtual Meetup! ➡️ JOIN US ON THE ONLINE MEETUP! TOMORROW! Don't miss our virtual meetup! 😉 Register here ➡️ Please join today's meetup using this link. See you! Hey Developers! The recording of this virtual Meetup is available on InterSystems Developers YouTube: ⏯ Online Meetup with the Winners of the InterSystems Full Stack Programming Contest Big applause to all the speakers! 👏🏼 And thanks to everyone for joining our online Meetup!
Announcement
Anastasia Dyubaylo · Nov 6, 2020

New Video: InterSystems IRIS Adaptive Analytics in Action

Hey Developers, See how InterSystems IRIS Adaptive Analytics can be used to aggregate and query billions of records from a virtual cube, applying machine learning and analytics to that data. ⏯ InterSystems IRIS Adaptive Analytics in Action 👉🏼 Subscribe to InterSystems Developers YouTube. Enjoy and stay tuned!
Announcement
Anastasia Dyubaylo · Nov 27, 2020

Advent of Code 2020 with InterSystems! Join to win prizes!

Hey Developers, Are you ready to participate in the annual December competition? Join the Advent of Code 2020 with InterSystems and participate in our ObjectScript contest to win some $$$ prizes! 🏆 Our Leaderboard 🏆 Here you can find our last year's participants. 👉🏼 Join the ObjectScript private leaderboard with the 130669-ab1f69bf code. Note: You need to sign in to Advent of Code (e.g. with a GitHub / Google / Twitter / Reddit account) to see the leaderboard and participate in the contest. Prizes: 🥇 1st place - $3,000 🥈 2nd place - $2,000 🥉 3rd place - $1,000 All winners will also get a special high-level Global Masters badge. Note: InterSystems employees are not eligible for money prizes. Win Conditions: To win our prize you should be at the top of the ObjectScript Leaderboard, upload all the solutions to a public repository on GitHub, and present the code in InterSystems ObjectScript in UDL form as presented in the template below. ⬇️ The Advent of Code 2020 contest ObjectScript template So! The first puzzles will unlock on December 1st at midnight EST (UTC-5). See you then and good luck to all of you! Also join the Discord channel to discuss terms, rules, everything. As we discussed and decided in Discord, it's OK to keep the repo private until the end of the contest. At the moment, the following members are in the 🔥 top 20 🔥 according to today's results: Hey Devs! Let's see the top 20 participants 🤩 Our 🔥 top 20! Today's best 20 🔥 Top 20)) Brilliant 20 participants 🤩 Please enjoy the best 20 participants! Only 4 days left till the end of the competition 💪 Let's see the results! Devs! Please enjoy the results of the best 20 🤩 Only 2 days left! Best 20 participants at the moment 🤩 Another way to look at the results -- by medal count. Border is part 1 (doesn't count), background is part 2. https://bitbucket.org/bertsarens/aoc2020/src/master/ https://github.com/uvg/AdventOfCode/tree/master/2020/COS OK. The first time I followed it to the end. The code I was forced to produce by time pressure is so ugly and dirty that I refuse to publish it. Nevertheless, it is a clear demonstration of what ISOS (aka COS) was NOT designed for: higher mathematical calculus, matrix transformations, arrays with independent dimensions, ... [Though I have to admit that I encountered tensors, rotors, matrix transformations, Mr. Fermat's theories, ... only at university 50+ years ago. And I didn't miss them in between.] There was no demand for the real strengths of IRIS! In addition, I verified my personal credo that writing some code in ISOS/COS only because it is possible makes no sense and is of no commercial value, especially with the broad range of options in IRIS to include external code.
Article
Henrique Dias · Dec 27, 2020

How difficult is it to create a report using InterSystems Reports?

How difficult is it to create a report using InterSystems Reports? Spoiler: Super easy! I created two simple examples using InterSystems IRIS + InterSystems Reports and will try to share how easy it is to set up these two applications together! In the next few paragraphs, I describe a step-by-step guide that shows the ease and simplicity of this process. After installing the Logi Report Designer, open it and choose the InterSystems IRIS icon. For the second step, you have the option of choosing an existing Catalog or creating a new one. We will create a new catalog, saving it in the folder of your choice, and then we will create a new DataSource. Fill in the fields with the server's IP or DNS name, port, namespace, user, and password. You can then add tables, views, and queries to your data source. With your connection information and your tables, views, and/or queries configured, let's move on. Click the New icon and choose New Report. For our report, I chose the Table (Group Left), as in the image below: The Table Wizard will guide us through choosing the data source, data display, grouping, summary, chart, filters, and style. I'm using the covid table that @Evgeny.Shvarov created for the iris-analytics-template as the data source for this report sample. The chosen fields are Confirmed and Deaths. The group for this report will be Country_Region. The Summary for the Country_Region group will be Confirmed with the Aggregate Function (SUM); the breaking field will be Country_Region in the Footer position. In the chart step, I chose the Bar Chart with Country_Region as my Category and showed values for the Confirmed field. I'm applying one of the pre-existing styles; the chosen one was "Classic." The image below shows how the report looks after the wizard finishes. The file covid19_cases_-_WorldCount-Sample.pdf is a "print sample" of the report. If you want another simple sample, my repo has the file refugees_CitiesImpacted_ReportSample.pdf, showing the report for the Refugee Admission to the US Ending FY 2018 dataset, which is available on Refugee Admission to the US Ending FY 2018 - dataset by associatedpress | data.world If you liked the app and think I deserve your vote, please vote for iris-analytics-package! https://openexchange.intersystems.com/contest/current
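If you want to check the underlying data before building the report, here is a minimal sketch of the same aggregation done with dynamic SQL from an IRIS terminal. The table and column names (dc_data.covid19, Country_Region, Confirmed) are assumptions based on the iris-analytics-template dataset mentioned above, so adjust them to the actual schema in your namespace.

```
Class dc.Sample.CovidReportCheck Extends %RegisteredObject
{

/// Print total confirmed cases per country, mirroring the report's grouping and SUM.
ClassMethod ConfirmedByCountry()
{
    // Build the statement in two steps (ObjectScript commands cannot span lines)
    set sql = "SELECT Country_Region, SUM(Confirmed) AS TotalConfirmed FROM dc_data.covid19"
    set sql = sql _ " GROUP BY Country_Region ORDER BY TotalConfirmed DESC"
    set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
    while rs.%Next() {
        write rs.%Get("Country_Region"), ": ", rs.%Get("TotalConfirmed"), !
    }
}

}
```

Running do ##class(dc.Sample.CovidReportCheck).ConfirmedByCountry() should list the same totals that appear in the report's Country_Region group summaries.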
Article
Yuri Marx · Jan 25, 2021

Creating the Documentation Portal for your InterSystems IRIS Application

The ObjectScript language of InterSystems IRIS has a very powerful metadata engine called XData. This feature allows you to create metadata definitions in your classes, to be used by the compiler or by programs that extend the standard features of the language based on the XData definitions in their scope. An example is IRIS Publisher, an open source application from the community capable of collecting all XData definitions in HTML and/or Markdown format and generating a web portal with all the application documentation assembled from these XData elements. See the whole process here: Let's see a practical example; follow these steps: Access the link https://openexchange.intersystems.com/package/IRIS-Publisher to download IRIS Publisher. Clone the project repository from git into a local folder of your choice: git clone https://github.com/yurimarx/iris-publisher Open the project's source code. It is in the iris-publisher folder. We will create some XData elements. Go to the Person class inside src/dc/Sample. Inspect the two example XData definitions between lines 34 to 43. /// Documentation for Person in HTML XData PersonDocHtml [ MimeType = text/html ] { <h1>This is the Person class</h1> } /// Documentation for Person in Markdown XData PersonDocMarkdown [ MimeType = text/markdown ] { <h1>This is the Person class in MD</h1> } There are 3 large blocks: the text after ///, where you can write a description of the XData element; the XData definition itself with its MimeType (only HTML and Markdown are captured); and finally, between {}, all the HTML or Markdown content that will compose your documentation. In this example we use both, but we recommend defining only one of them. It is possible to annotate all the classes of the project, or some of them, with XData, creating very complete documentation for your application. After documenting your classes using XData, we will compile the project documentation. In the terminal, execute: docker-compose up -d --build After execution, your instance will be live and you will be able to run the IRIS Publisher API. Run http://localhost:52773/swagger-ui/index.html?Url=http://localhost:52773/api/mgmnt/v1/USER/spec/crud#/default/UpdateDocConfig to define the title and developer information and to describe your application; use the following content as an example: { "SiteName":"Publisher", "Summary":"Documentation of the Publisher", "Description":"This an Application to generate documentation from XDATA", "DeveloperName":"Yuri Gomes", "DeveloperEmail": "yurimarx@gmail.com", "DeveloperWebsite": "ymservices.tech" } Run http://localhost:52773/swagger-ui/index.html?url=http://localhost:52773/api/mgmnt/v1/USER/spec/crud#/default/InitiatePublisher to generate the documentation and start the documentation portal for your application. Go to http://localhost:8000 and see all the documentation online! If you enjoyed it, vote for my app: https://openexchange.intersystems.com/contest/current
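To make the idea more concrete, here is a minimal sketch of how XData blocks like the ones above can be read programmatically through the %Dictionary API, which is the kind of introspection a generator such as IRIS Publisher relies on. It is an illustrative sketch, not the tool's actual implementation; the reader class name is made up, while dc.Sample.Person comes from the template above.

```
Class dc.Sample.XDataReader Extends %RegisteredObject
{

/// List every XData block of a class and print its MimeType, description, and content.
ClassMethod PrintXData(className As %String = "dc.Sample.Person") As %Status
{
    // %Dictionary.XDataDefinition is keyed as "ClassName||XDataName"
    set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT Name FROM %Dictionary.XDataDefinition WHERE parent = ?", className)
    while rs.%Next() {
        set xdata = ##class(%Dictionary.XDataDefinition).%OpenId(className_"||"_rs.%Get("Name"))
        continue:'$isobject(xdata)
        write "XData ", xdata.Name, " (", xdata.MimeType, ") - ", xdata.Description, !
        do xdata.Data.Rewind()
        while 'xdata.Data.AtEnd { write xdata.Data.ReadLine(), ! }
        write !
    }
    quit $$$OK
}

}
```

From an IRIS terminal, do ##class(dc.Sample.XDataReader).PrintXData("dc.Sample.Person") would dump both documentation blocks shown above.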
Announcement
Evgeny Shvarov · Feb 4, 2021

Technology bonuses in InterSystems IRIS Grand Prix Contest

Hi Developers! Here are the technology bonuses for the InterSystems Grand Prix Contest that will give you extra points in the voting.

| Group | Bonus | Points |
|---|---|---|
| General | Docker | 2 |
| General | ZPM | 2 |
| General | Unit Testing | 2 |
| API and languages | REST API | 2 |
| API and languages | ODBC/JDBC | 2 |
| API and languages | Embedded Python usage | 4 |
| API and languages | Native API in Java, Python, .NET, node.js | 3 |
| Multi-model | Globals (key-value) | 2 |
| Multi-model | SQL | 2 |
| Multi-model | Object | 2 |
| Multi-model | New model | 3 |
| Analytics | IRIS BI | 2 |
| Analytics | IRIS NLP | 2 |
| Analytics | InterSystems Reports | 3 |
| AI/ML | Integrated ML | 4 |
| AI/ML | Python or Julia Gateway | 3 |
| Interoperability | BPL | 3 |
| Interoperability | Custom Adapter | 2 |
| Interoperability | PEX | 4 |
| Interoperability | Workflow Engine | 2 |
| FHIR | FHIR Server REST API | 3 |
| FHIR | FHIR SQL Scheme usage | 2 |
| FHIR | Healthcare Data Transformations | 3 |
| | Total | 58 |

Below are the details and useful links on all the technical bonuses.

General bonuses

ZPM Package deployment - 2 points
You can collect the bonus if you build and publish the ZPM (ObjectScript Package Manager) package for your Full-Stack application so it can be deployed with the zpm "install your-multi-model-solution" command on IRIS with the ZPM client installed. ZPM client. Documentation.

Docker container usage - 2 points
The application gets a 'Docker container' bonus if it uses InterSystems IRIS running in a docker container. Here is the simplest template to start from.

Unit Testing - 2 points
Applications that have Unit Testing for the InterSystems IRIS code will collect the bonus. Learn more about ObjectScript Unit Testing in Documentation and on Developer Community.

InterSystems IRIS API

InterSystems IRIS REST API usage - 2 points
You get the bonus if you access InterSystems IRIS via REST API in your Full-Stack application. You can either build the REST API yourself, use any built-in one, or install it via ZPM. Learn more on InterSystems REST API.

Embedded Python usage - 4 points
Embedded Python needs a certain docker image, e.g. this one: intersystemsdc/iris-ml-community:2020.3.0.302.0-zpm See the related video. Here is the template which shows how Embedded Python works and how to make a ZPM package to deploy it.

InterSystems Native API usage - 3 points
You get this bonus if you access the data in your Full-Stack application using any of the InterSystems Native API options: .NET, Java, Python, Node.js. Learn more here.

InterSystems JDBC usage - 2 points
InterSystems IRIS provides a JDBC driver to access the data. You get the bonus if you refer to the data in your Full-Stack application using SQL and InterSystems JDBC.

Multi-model bonuses

InterSystems Globals (key-value) - 2 points
InterSystems Globals are multidimensional sparse arrays used to store any data in InterSystems IRIS. Each global node can be considered a key for which you can set a value. InterSystems IRIS provides a set of APIs, including ObjectScript commands and the Native API, to manage Globals.
Tools: Managing globals in the management portal
Documentation: Using Multidimensional Storage (Globals), Using Globals
Articles: Globals are Magic Swords for managing data, The art of mapping Globals to Classes
Videos: Globals QuickStart
You can collect 2 points for using Globals via ObjectScript or Native API in your application.

InterSystems SQL - 2 points
InterSystems IRIS provides SQL access to data via ObjectScript, REST API, and JDBC.
Tools: VSCode SQL Tools, DBeaver, SQL in Management Portal, other SQL tools
Documentation: SQL Access, InterSystems SQL Reference
Articles: Class Queries in ObjectScript
Videos: SQL Things you should know
Collect 2 bonus points by using InterSystems SQL in your application.
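To show how little code the Globals and SQL bonuses actually require, here is a minimal sketch that touches both models from ObjectScript. The ^DemoCountry global and the class name are just illustrations; the SQL part queries a system dictionary table so it runs in any namespace.

```
Class dc.Sample.MultiModelDemo Extends %RegisteredObject
{

/// Touch the key-value (globals) and relational (SQL) models in a few lines.
ClassMethod Run()
{
    // Key-value access: every global node is a key with a value
    set ^DemoCountry("BR") = "Brazil"
    set ^DemoCountry("US") = "United States"
    set code = ""
    for {
        set code = $order(^DemoCountry(code), 1, name)
        quit:code=""
        write code, " -> ", name, !
    }

    // SQL access via dynamic SQL (any projected table works the same way)
    set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT TOP 3 Name FROM %Dictionary.ClassDefinition ORDER BY Name")
    while rs.%Next() { write rs.%Get("Name"), ! }
}

}
```

Run it with do ##class(dc.Sample.MultiModelDemo).Run() in an IRIS terminal.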
InterSystems Objects - 2 points
InterSystems IRIS provides a way to store and change instances of objects in globals via ObjectScript/REST API, the Native API for Java/.NET/Node.js/Python, and XEP for Java/.NET.
Documentation: Object Access
Get 2 bonus points for the usage of Object Access in your application.

Your data model - 2 points
InterSystems IRIS can be used as a data platform that exposes your own data model API. You are able to use ObjectScript, REST API, or Native API to expose your own API which provides any special data model, like Time-Series, Spatial, Graph, RDF/Triple, Column store, or Document store. Introduce any new data-model API and collect 2 bonus points.

IRIS Analytics Bonuses

InterSystems IRIS BI - 2 points
InterSystems IRIS Business Intelligence is a feature of IRIS which gives you the option to create BI cubes and pivots against persistent data in IRIS and then deliver this information to users via interactive dashboards. Learn more. The basic iris-analytics-template contains examples of an IRIS BI cube, pivot, and dashboard.

InterSystems IRIS NLP (iKnow) - 2 points
InterSystems NLP, a.k.a. iKnow, is an InterSystems IRIS feature: a library for Natural Language Processing that identifies entities (phrases) and their semantic context in natural language text in English, German, Dutch, French, Spanish, Portuguese, Swedish, Russian, Ukrainian, Czech, and Japanese. Learn more about iKnow on Open Exchange.
Examples: Covid iKnow, Text Navigator, Samples Aviation, and more
Use iKnow to manage unstructured data in your analytics solution and collect the bonus.

InterSystems Reports - 3 points
InterSystems Reports is a feature of InterSystems IRIS which lets you design printed reports, send them via email on a schedule, and deliver interactive reports to clients. InterSystems Reports is a repackaging of Logi Report (formerly named JReport®), a product of Logi Analytics®. Learn more in Documentation. Check the GitHub repo with examples of InterSystems Reports. Also, watch the video with the demo of InterSystems Reports and try it with the Learning Lab. You can download the InterSystems Reports designer and server in the WRC download section. License keys for InterSystems Reports Designer and Server will be available in Discord.

AI/ML

IntegratedML - 3 points
IntegratedML is a feature of InterSystems IRIS that expands SQL with a set of ML instructions that let you simplify and automate AI and Machine Learning calculations for your solution. Learn more on IntegratedML. You need special images of IRIS to use IntegratedML; check them here.
Examples: A basic IntegratedML template, several examples on Open Exchange
Usage of IntegratedML in your IRIS Analytics solution gives you this bonus.

Python Gateway usage - 2 points
Python Gateway is an addon to InterSystems IRIS which gives you a way to use Python in the InterSystems IRIS environment: execute arbitrary Python code; seamlessly transfer data from InterSystems IRIS into Python; build intelligent Interoperability business processes with the Python Interoperability Adapter; save, examine, modify, and restore Python context from InterSystems IRIS. Learn more about Python Gateway. You can use the Python Gateway template, which includes IntegratedML too.

Interoperability Bonuses

Business Process BPL or Business Rules Usage - 2 points
One of the key features of IRIS Interoperability Productions is a business process, which can be described by BPL (Business Process Language). Learn more on Business Processes in the documentation.
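For readers new to BPL, the minimal shape of a BPL business process is an ObjectScript class with a BPL XData block. The sketch below is illustrative only (the class name and trace message are made up); real processes are usually drawn in the visual BPL editor, which generates this XData for you.

```
Class dc.Sample.DemoProcess Extends Ens.BusinessProcessBPL
{

XData BPL [ XMLNamespace = "http://www.intersystems.com/bpl" ]
{
<process language='objectscript' request='Ens.Request' response='Ens.Response' >
<sequence>
<!-- A single trace step: writes a message to the production event log when the process runs -->
<trace value='"demo process received a request"' />
</sequence>
</process>
}

}
```

Once compiled, such a class can be added as a business process to a production and combined with business rules, which are covered next.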
Business Rule is a no-code/low-code approach to managing the processing logic of an interoperability production. In InterSystems IRIS you can create a business rule either visually or via its ObjectScript representation. You can collect the Business Process/Business Rule bonus if you create and use a business process or business rule in your interoperability production. Business Rule Example. Learn more on Business Rules in the documentation.

Custom Interoperability Adapter Usage - 2 points
An InterSystems Interoperability production can contain inbound or outbound adapters that are used by the production's business services and operations to communicate with external systems. You can use out-of-the-box adapters (like File or Email) or develop your own. You get the bonus if you develop your own custom inbound or outbound adapter and use it in your production. Example of an adapter. Learn more on adapters.

Production EXtension (PEX) Usage - 4 points
PEX is a Java or .NET extension of Interoperability productions. You get this bonus if you use PEX with Java or .NET in your interoperability production. PEX Demo. Learn more on PEX in Documentation.

Workflow Engine Usage - 2 points
The Workflow Engine is a part of IRIS Interoperability which can be used to automate the distribution of tasks among users. You get this bonus if you include the usage of the Workflow Engine in your interoperability production. Learn more on Workflows in Documentation. There are Community modules WorkflowAPI and WorkflowUI-ngx which provide a nice Angular UI layer for the Workflow Engine.

FHIR Bonuses

FHIR Server REST API usage - 3 points
You get the bonus if you use the REST API endpoint of the FHIR Server in InterSystems IRIS for Health. You can take the IRIS-FHIR-Template, which prepares the FHIR server during the docker image build. The documentation for FHIR API 4.0.1 can be found here. Learn more in the InterSystems IRIS for Health documentation.

FHIR SQL Schema usage - 2 points
You can collect this technology bonus if you use the FHIR SQL schema in the SQL queries of your application. You can use this schema, e.g., for building an FHIR analytics solution. Use the HSFHIR_I0001_R schema for full resources and the HSFHIR_I0001_S schema to search resources with SQL. Check the examples in the template.

Healthcare standards transformations - 3 points
InterSystems IRIS for Health contains Healthcare Interoperability modules that help to perform data transformations from different healthcare standards to FHIR and vice versa. Make CDA to FHIR, HL7v2 to FHIR, or any other transformations in your application to collect this bonus. See the examples of HL7v2 to FHIR and CDA to FHIR transformations. Learn more in the documentation.

The list of bonuses is subject to change. Stay tuned!

The part on Embedded Python has been updated: here is the template that can be taken as a foundation to build an Embedded Python solution with IRIS, and here is an article that describes Embedded Python usage and packaging with ZPM for deployment.
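Since the Embedded Python bonus was just updated above, here is a minimal sketch of what the ObjectScript side of Embedded Python looks like on an IRIS image that ships with it. The class name is illustrative; %SYS.Python is the documented entry point for importing Python modules, so treat this as a starting point rather than a complete solution.

```
Class dc.Sample.EmbeddedPythonDemo Extends %RegisteredObject
{

/// Import a Python standard-library module and call it from ObjectScript.
ClassMethod Demo()
{
    // Import returns a proxy object whose attributes and methods are callable from ObjectScript
    set math = ##class(%SYS.Python).Import("math")
    write "pi = ", math.pi, !
    write "sqrt(2) = ", math.sqrt(2), !
}

}
```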
Article
Sergey Mikhailenko · Oct 20, 2020

InterSystems: Solution for Technical Support and DBMS-Interoperability Administration

In this article, we'll talk about an application that I use every day when monitoring applications and integration solutions on the InterSystems IRIS platform and finding errors when they occur. While looking for a solution for logging object changes in InterSystems IRIS, Ensemble, and Caché DBMS, I came across a great article about [logging with macros](https://community.intersystems.com/post/logging-using-macros-intersystems-cach%C3%A9). Inspired by the idea, I forked the project the article had described and adapted it to some specific needs. The resulting solution is implemented as a subclass of the %CSP.Util.Pane panel, with a main window for commands, a Run button, and command configuration. This application enables viewing and editing global arrays, executing queries (including JDBC and ODBC), emailing search results as zipped XLS files, and viewing and editing objects, and it offers several simple graphs based on the system logs. The apptools-admin application is based on jQuery-UI, UiKit, chart.js, and jsgrid.js. You are welcome to have a look at the [source code](https://openexchange.intersystems.com/package/apptools-admin).

### Installation

All installation methods are described in detail in the repo. However, the simplest approach is to use the package manager command:

```
zpm "install apptools-admin"
[apptools-admin] Reload START
[apptools-admin] Reload SUCCESS
[apptools-admin] Module object refreshed.
[apptools-admin] Validate START
[apptools-admin] Validate SUCCESS
[apptools-admin] Compile START
[apptools-admin] Compile SUCCESS
[apptools-admin] Activate START
[apptools-admin] Configure START
http://hp-msw:52773/apptools/apptools.core.LogInfo.cls
http://hp-msw:52773/apptools/apptools.Tabs.PanelUikitPermissMatrx.cls?autoload=Matrix
[apptools-admin] Configure SUCCESS
[apptools-admin] Activate SUCCESS
```

Open the first suggested link in the address field of the browser, then in the loaded panel enter `?` and press the "Execute" button. The application then displays command examples.

![](https://lh5.googleusercontent.com/Tsh6XG7TAcQJHcxWPFIWU8FK6rPFYhxzTvxtiKvjw_QAKxGicy_sJt0WhTcG8zBXNvkQzLlRQPTN4juAk8vOn3gyUXJREfgPs9rqUoM8)

### Commands

In the panel, you can run utilities, view and edit globals, and execute queries. Each launch is saved in history in the context of the namespace, so it can be found and repeated. In this context, the word "launch" means starting the execution of commands, and "commands" means everything that we enter in the panel. This screenshot shows an example of the view command for the global array `^%apptools.History`:

![](https://lh4.googleusercontent.com/Viy-pXX3dVlNrfUX7SV4Alxb9pM3I-uDKAYgHRVJKP1hK9BuvkMIuP6oPfDNYrmJb-VTl8b12Fy61q63O-nH0FYG2u8zIeux2e-vvl1h)

As you know, automatic error detection and notifications can be handled by popular solutions like Prometheus. But often the severity of errors can be assessed visually. Very often I need to quickly get information about bugs in production in all namespaces. For this, I implemented a utility:

`##class(apptools.core.Production).FindAndDrawAllErr`

This starts a daily search for errors in each namespace that contains working productions, and allows you to view these errors with a quick transition to visual tracing. You can run this utility like any other in the apptools panel with the `xec` prefix.
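If you prefer to trigger that error search from an IRIS terminal rather than from the panel, a call along these lines should work. The class and method names come from the article; the empty argument list is an assumption, so check the actual signature in the repo.

```
// Hypothetical terminal invocation of the error-search utility described above
do ##class(apptools.core.Production).FindAndDrawAllErr()
```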
![](https://lh3.googleusercontent.com/0olzck-lNvNLsCwBphoTWLwZdSZJrNpb3qbkul4WuuXD9NnMnwpXofCsay9FxVW8S4iWvZD7L3z-s5UrKpicBofeXUrHsAfeQrnkEU8C-fjXqcdV3dmVGBcZOtgnSuFxWAHI-2Dr)

All useful commands can be saved in a global, in the context of the namespace, so they can be found and repeated at any time.

![](https://lh4.googleusercontent.com/qBRtuZL_gOFOZD92CQOr0-w-NH8PfpVhIpQZYZENmblg8_jpW-dN_pF7bKiPAcWjkE3Tew6pU0k0NsLelUE1KFcCd4Xhl3bF4SjNdtttUGqNq0_eW6GtTIiP9iBx7bjJ2UnAkrF8)

### Globals

A large part of the apptools-admin application is dedicated to working with globals. Globals can be viewed in reverse order, as well as by applying a filter to both the global reference and the data. The displayed nodes can be edited or deleted.

![](https://lh4.googleusercontent.com/_FfwdGX_A11k4ue8vZ51_3qwuVvTJd8a0UgFqDPsRJICYuUGmcRMFjOxdG1sdHkLJR3Ea7m30BHSpx33wjDDCd5qVvN01ewWUefSfgNaTzA9Z9HK2iFdYmZZ9yLYuTlTSHFAGfqJ)

You can enter the `*` wildcard after the global name to get a list of globals with additional characteristics. A second `*` will add a new field, Allocated MB. A third one will add the Used MB field. This syntax resolves to a union of the two reports, and the asterisks divide the report, which is typically rather long, into manageable sections.

![](https://lh6.googleusercontent.com/1osTx0tWcdQlteMHlFjIw3K6SEjH_3gO6EpTUEsfyPgR3_ns8LR3mMIPQGt31ToANPqx0fB_Fkjh6tc6WeUSwS9_8bYx5UgRjHnOkUF0o0izVz7dBB9eok2skmsCoWZbeB7gk_kY)

When you get a report as a list of globals (in the screenshot above), you can follow the active links to view the global itself. You can also view and edit the global in the standard way from the management portal, by clicking `R` or `W` in the `Permission` field. Quite often, writing to a global is used to log the states of variables and objects when debugging a project. I use special macros for this:

`set $$$AppL("MSW","anyText")=$$$AppObJs(%request)`

In this example, `$$$AppL` forms a reference to a global with the `^log` prefix and the date and time in the index value. `$$$AppObJs` is the object serialization macro.

![](https://lh4.googleusercontent.com/AOTb2Axpzo_YlYNJacWM4k9RAdO0OmYlkYjnUtvEWM3Djc9VQL6NTuEo1mXR5m5K-PtHtsRVUXNwsd7lwkKjuicOvRCq5j2Mwx5P2eBN8lpyPiFacue4riVFkmakPidY5P5-Iyrw)

You can view the log global in the panel, and the object can be displayed in the window fully formatted.

### Query

The function that sees almost as much use as globals is query. You run this function by entering a statement as a command. For example, you can execute an SQL statement.

![](https://lh3.googleusercontent.com/uPQs2IAuSpdORQXaxy_rlzSFmaB9RxKoiVWRGyLsG_tthobEpxU8uBunOxOTi695q9yDCHr0Xjez8IE-U8HKWKOzpvczDmmgaFrcmHfCpo6hMXsxJCP05LtdeohTiTrrooYuSRyh)

You can also save the result in the global `^mtempSQLGN`.

![](https://lh4.googleusercontent.com/HE44MxizdxfluYQyuEvEs1k7vmSNganzvoxPWTGfYnjJwgYWD7u9fBlCmHUFT2LOPzfLp8vBC23yJyDDYvnMZU7gwIoJjKaVMhv5WQ8Da2-_F-NrNdjpcYyd4V0BakEiRCcrbejZ)

Subsequently, the saved result in the global can be displayed in the panel.

![](https://lh3.googleusercontent.com/yJVXXpPBZMsT-eXI0FCaWs6f7YvWpMH4zBIKAv-ejtpCdAxqK8fSh3YEy_IbF-aPv9ijRLZXcgy026xLLEAS449CtVjzeKiv2coQa9eK7OmyIbCFOs7pLxJa7Trw525xO3DJFsMH)

### Converting Reports to Excel Format

One of the things that was missing in the standard management portal was the ability to execute queries configured against JDBC or ODBC database sources, output the results in XLS format, and then archive and send the file via email.
To achieve this in the application, you simply select the Upload to Excel file checkbox before executing the command. This feature saves a lot of time in my daily routine, and allows me to successfully incorporate ready-made modules into new applications and integrated solutions.

![](https://lh4.googleusercontent.com/LhyeRllHAL6q-rBiRNbCAgGOflKF8OZjomLMCjVapJ2qbvhouPS44dIHmbwt4I3-LmADhgaSNPg-u57am73bcdNGTH97rWtdL1FEmXHI5O9eQYyTBINjidT2H8TGIrXIc6kt4MnV)

To enable this functionality, you first need to configure the path for creating files on the server, user credentials, and the mail server. For that, in turn, you need to edit the nodes of the global program settings, `^%apptools.Setting`.

![](https://lh4.googleusercontent.com/cTDe7pUN7bhHYiweuWbdL0bXsF98UoVCPsyLt84xlp-vCEH5edjvTgxiNfPIZbKRnCGpUk1m8mr0aPKHFMs0JdIDdwqS53wCF_997Z3KrRrBqv6jKCam0zlPkklC_YTxm8gRXhPb)

### Saving Reports Globally

Quite often, you need to save the results of a report execution to a global. For this, you can use these procedures:

| | Functions |
|---|---|
| For JDBC: | ##class(apptools.core.sys).SqlToDSN |
| For ODBC: | ##class(apptools.core.sys).SaveGateway |
| For SQL: | ##class(apptools.core.sys).SaveSQL |
| For Query: | ##class(apptools.core.sys).SaveQuery |

For example, the `##class(apptools.core.sys).SaveQuery` function saves the result of the query `%SYSTEM.License:Counts` to the global `^mtempGN`.

![](https://lh5.googleusercontent.com/VTwoteSkKE0MRg00CojD8HCpcK7CNAr8wAVldyVRp3dweYbXampTmhfAkwdIqGdj6H3zkJQ4_qdnCugQkkdpkga1hbXCghSHyZ5pIOufqwu5vcEv9YF3zdE__AwHaPN-5DeK2t9k)

You can then display in the panel what you've saved with the following command: `result ^mtempGN("%SYSTEM.License:Counts", 0)`

![](https://lh5.googleusercontent.com/KCIekwZw3guq79GWxVdHYdAbWQc4u97-dr-hWT26lYE2oEzUTSkwCE4ki1zvNqRFBg6dKQshSqcy3YSgUbjFKgX3v7Ecpa5Bm_NEQuZhP8Fn8p1gzrmAdTR-Cg9jBeVcNWGukW3a)

### Enhanced Functionality Modules

What else simplified and automated my work? Changes that enabled me to execute custom modules when forming a query string. I can embed new functionality into the report on the fly — like active links for additional operations on the data. Let's see some examples. We display the query result in the browser using the function: `##class(apptools.core.LogInfoPane).DrawSQL`

![](https://lh3.googleusercontent.com/2s0tgxOgbOBLy-Pt1e8bx_gKJWNe5YQ6AWLRUCU02TcpTiUscKYeoBEce2qdzCGlbAPzIukRn5EuJ9jwu8eATPCH13zoR8A2fQoAWZfx3RpieD_8rACgikcCZpcIoAIofxlzv2mT)

Let's add the word-marking function `##class(apptools.core.LogInfo).MarkRed` to parameter 5.

![](https://lh6.googleusercontent.com/OyotzU3vmjoXw_MzA6amZbpPlpbL-li71OH5JRw7sAfiVoEsAvi8wSfY588kzdyXTURtGtinj0WvIKDhNLGyy50BD40E7NEQSpNv2Iv85lQisJaMBquvheuXVrMravp6OlNxkcqI)

In the same way, you can supplement the output with additional features, for example, active links or tooltips. The globals editor in this solution is implemented according to the same principle.
Here's a list of functions for outputting globals and queries in tabular form:

| | Functions |
|---|---|
| For globals: | ##class(apptools.core.LogInfoPane).DrawArray("^mtempSQLGN") |
| For SQL: | ##class(apptools.core.LogInfoPane).DrawSQL("select * From %SYS.ProcessQuery") |
| For Query: | ##class(apptools.core.LogInfoPane).DrawSQL("query %SYSTEM.License:Counts") |
| For global result: | ##class(apptools.core.LogInfoPane).DrawSQL("result ^mtempSQLGN") |

### Working with the apptools.core.Parameter Class

This link will open the CSP application in a browser in the context of an instance on which apptools-admin is installed: `http://localhost:52773/apptools/apptools.Form.Exp.cls?NSP=APP&SelClass=apptools.core.Parameter` Or select the active link in the panel.

![](https://lh4.googleusercontent.com/dt3oFX7Aum3yuJ4lvOtmhUWqm55GyFPPRGbsW7phZWRAnnJkB5xE0CdD3ddFEnS0-5xzSD_ydNe2hXp8Eqk1R39aioTZunY7bymF4EkPfaukm86sfFb-YrQp5Mx_KOyU9sr9cGbR)

The CSP application will be loaded for editing instances of stored classes, in this example `apptools.core.Parameter`.

![](https://lh5.googleusercontent.com/OLXVridH04HDDbufXUp9kZ70h08ptXrRcvDRThemPEira4KANa2ECTVGUJm7nuc3crqAnerWcJMToyipqM4YZCcnwqWRVXbOFKN0ZakCvpqrpdMsQZ0yXtCBgUt8z2U_JOmKnSFF)

![](https://lh4.googleusercontent.com/8JO4JQRssC22LGhab6dZiG6PnD2NRYIQtYsY9zj-Z99IIHxVpekzsxfV3Pw04SgxEqto_JepTXcht6vBu17D834Z_7_Hh-Yr5GmXSOsI5axLf7vIHxUi-tmTwcJH9DFlomurpgCH)

### Creating an apptools.core.Parameter Instance via the Table Navigator

Open this link in a browser in the context of the instance on which apptools-admin is installed: `http://localhost:52773/apptools/apptools.Form.Exp.cls?panel=AccordionExp&NSP=APP` Or select the active link in the panel.

![](https://lh3.googleusercontent.com/5me2dJ5aItW6iixR4mBfVRMJHDZfXRnq_pkGrlmCtCTeKzsRx0MbopN0YcLdvEsoWs46Aqw_0fVJGk6L88AaFNWeajDgggpwYawRTbdUIhHRxWo7pFiv3dqj_JdO5wgmm_uZTL5_)

The CSP application will be loaded for navigating the stored classes with the ability to edit them.

![](https://lh3.googleusercontent.com/TjtNmzFRS8fTOpsRU0XAXYCHYHNpenI9H1WsEPEtVz7bT2jhakKjsfdJ_inLXX-cBsu5PlKgSJjIS3VoHXD6dqzEm0PDrhy2eOPFT-BoHx6ToPB6Jio21lN1bloGk1xtdlRR7Gd-)

### An example of a simple CSP application

Open this link in a browser in the context of the instance on which apptools-admin is installed: `http://localhost:52773/apptools/apptools.Tabs.PanelSample.cls` Or select the active link in the panel.

![](https://lh3.googleusercontent.com/mqMOjz96bd2YmJdIQtdVYsZhYDFW73BMJtjQh2q1wzzKYrzE39kWGTd2-M0kpBQlxIT2bkv2V7o7ieIlyV7aU8XNF29oI3spIoLGJJAHOKppLrTVrrR2XwOJHAQgLXM3TEQPWGGj)

This example also shows the ability to edit instances of the class `apptools.core.Parameter`.

![](https://lh4.googleusercontent.com/sFXN0QJJb1UuyNJPhykvDOJeOEWC3RrO7oV1dqYixKnPlgEDFJdBqj5bORhaXlftxvngbu-UdgCqvG2UEr7_hKUhjGtJk6jrDNgc43f7DwWCmuDnFubMuIcavHAh7Z1--R72Pf_Q)

### Graphs

To visualize database growth, the application offers a page that displays a graph of the monthly measured database size. This graph is derived from the "Expand" records in the IRIS log file (cconsole.log for Caché), working backward from the current day. The program traverses the log, finds the database expansion records, and subtracts the incremental megabytes from the current database size. The result is a graph of database growth. For example, the screenshot below shows a graph of events in InterSystems IRIS built from the log file.
![](https://lh4.googleusercontent.com/EbO0ZVyJwj1EgKF9SR6BpKPyBERj3cNgK4ckrDrzVWVu35LUlQAINvsbArTJ946XQWBhDUzS_dm4m3ize-RM7EjyRLQkesaNvNQOvK8FuUGwKx_8gqYlCMvmC2Xy2ih0xgZKx_q-)

Another example below: a chart of events in the system based on the system log file (cconsole.log).

![](https://lh6.googleusercontent.com/s3Uz-F88rFnBWSCS5_m4vtCQL9kdS2dEL101oWtlfmmNpfjF1PgtPppI2GC1g3syXIr39X1dUBO0O-gC5mDXcT1k6xOkXrz19TeRqpRAWrNG_FL6kMoyAZqS2N7mIDjG2BKpPy_j)

### Summary

The application we've discussed in this article was designed to help me perform my daily tasks. It includes a set of modules which you can use as building blocks for a custom administrator tool. I would be very glad if you found it useful in your work. You are welcome to add your wishes and suggestions as tasks to the project [repo](https://github.com/SergeyMi37/apptools-admin).
Announcement
Anastasia Dyubaylo · Dec 4, 2020

New Video: Building a REST API with InterSystems IRIS

Hi Community, See how to build a REST API with InterSystems IRIS in just five minutes, leveraging Docker containers: ⏯ Building a REST API with InterSystems IRIS Subscribe to InterSystems Developers YouTube and stay tuned!
Announcement
Anastasia Dyubaylo · Dec 11, 2020

New Video: The Freedom of Visualization Choice: InterSystems BI

Hi Community, Please welcome the new video on InterSystems Developers YouTube: ⏯ The Freedom of Visualization Choice: InterSystems BI Find out how three visualization tools available to InterSystems customers — InterSystems IRIS Business Intelligence, Microsoft Power BI, and InterSystems Reports — can be used to answer one data question. See a demonstration of each and learn about the benefits of each tool and what makes them different. You can find additional materials for this video in this InterSystems Online Learning Course. Enjoy watching this video! 👍🏼
Announcement
Anastasia Dyubaylo · Jan 7, 2021

InterSystems Multi-Model Contest Kick-off Webinar

Hi Developers, We're pleased to invite all the developers to the upcoming InterSystems Multi-Model contest kick-off webinar! This webinar is dedicated to the Multi-Model contest. In this webinar, we will demonstrate the APIs for each data model in action. Date & Time: Monday, January 11 — 10:00 AM EDT Speakers: 🗣 @Benjamin.DeBoe, InterSystems Product Manager 🗣 @Robert.Kuszewski, InterSystems Product Manager - Developer Experience 🗣 @Evgeny.Shvarov, InterSystems Developer Ecosystem Manager So... We will be happy to talk to you at our webinar! ➡️ JOIN THE KICK-OFF WEBINAR Today! Don't miss our kick-off webinar! ➡️ JOIN THE WEBINAR HERE Hey Developers, The recording of this webinar is available on InterSystems Developers YouTube! Please welcome: ⏯ InterSystems Multi-Model Contest Kick-off Webinar Big applause to our speakers! 👏🏼 And thanks to everyone for joining our webinar!
Article
Mikhail Khomenko · Jan 21, 2021

InterSystems Kubernetes Operator Deep Dive: Part 2

In the previous article, we looked at one way to create a custom operator that manages the IRIS instance state. This time, we’re going to take a look at a ready-to-go operator, InterSystems Kubernetes Operator (IKO). Official documentation will help us navigate the deployment steps. Prerequisites To deploy IRIS, we need a Kubernetes cluster. In this example, we’ll use Google Kubernetes Engine (GKE), so we’ll need to use a Google account, set up a Google Cloud project, and install gcloud and kubectl command line utilities. You’ll also need to install the Helm3 utility: $ helm version version.BuildInfo{Version:"v3.3.4"...} Note: Be aware that on Google free tier, not all resources are free. It doesn’t matter in our case which type of GKE we use – zonal, regional, or private. After we create one, let’s connect to the cluster. We’ve created a cluster called “iko” in a project called “iko-project”. Use your own project name in place of “iko-project” in the later text. This command adds this cluster to our local clusters configuration: $ gcloud container clusters get-credentials iko --zone europe-west2-b --project iko-project Install IKO Let’s deploy IKO into our newly-created cluster. The recommended way to install packages to Kubernetes is using Helm. IKO is not an exception and can be installed as a Helm chart. Choose Helm version 3 as it's more secure. Download IKO from the WRC page InterSystems Components, creating a free developer account if you do not already have one. At the moment of writing, the latest version is 2.0.223.0. Download the archive and unpack it. We will refer to the unpacked directory as the current directory. The chart is in the chart/iris-operator directory. If you just deploy this chart, you will receive an error when describing deployed pods: Failed to pull image "intersystems/iris-operator:2.0.0.223.0": rpc error: code = Unknown desc = Error response from daemon: pull access denied for intersystems/iris-operator, repository does not exist or may require 'docker login'. So, you need to make an IKO image available from the Kubernetes cluster. Let’s push this image into Google Container Registry first: $ docker load -i image/iris_operator-2.0.0.223.0-docker.tgz $ docker tag intersystems/iris-operator:2.0.0.223.0 eu.gcr.io/iko-project/iris-operator:2.0.0.223.0 $ docker push eu.gcr.io/iko-project/iris-operator:2.0.0.223.0 After that, we need to direct IKO to use this new image. You should do this by editing the Helm values file: $ vi chart/iris-operator/values.yaml ... operator: registry: eu.gcr.io/iko-project ... Now, we’re ready to deploy IKO into GKE: $ helm upgrade iko chart/iris-operator --install --namespace iko --create-namespace $ helm ls --all-namespaces --output json | jq '.[].status' "deployed" $ kubectl -n iko get pods # Should be Running with Readiness 1/1 Let’s look at the IKO logs: $ kubectl -n iko logs -f --tail 100 -l app=iris-operator … I1212 17:10:38.119363 1 secure_serving.go:116] Serving securely on [::]:8443 I1212 17:10:38.122306 1 operator.go:77] Starting Iris operator Custom Resource Definition irisclusters.intersystems.com was created during IKO deployment. You can look at the API schema it supports, although it is quite long: $ kubectl get crd irisclusters.intersystems.com -oyaml | less One way to look at all available parameters is to use the “explain” command: $ kubectl explain irisclusters.intersystems.com Another way is using jq. 
For instance, viewing all top-level configuration settings: $ kubectl get crd irisclusters.intersystems.com -ojson | jq '.spec.versions[].schema.openAPIV3Schema.properties.spec.properties | to_entries[] | .key' "configSource" "licenseKeySecret" "passwordHash" "serviceTemplate" "topology" Using jq in this way (viewing the configuration fields and their properties), we can find out the following configuration structure: configSource name licenseKeySecret name passwordHash serviceTemplate metadata annotations spec clusterIP externalIPs externalTrafficPolicy healthCheckNodePort loadBalancerIP loadBalancerSourceRanges ports type topology arbiter image podTemplate controller annotations metadata annotations spec affinity nodeAffinity preferredDuringSchedulingIgnoredDuringExecution requiredDuringSchedulingIgnoredDuringExecution podAffinity preferredDuringSchedulingIgnoredDuringExecution requiredDuringSchedulingIgnoredDuringExecution podAntiAffinity preferredDuringSchedulingIgnoredDuringExecution requiredDuringSchedulingIgnoredDuringExecution args env imagePullSecrets initContainers lifecycle livenessProbe nodeSelector priority priorityClassName readinessProbe resources schedulerName securityContext serviceAccountName tolerations preferredZones updateStrategy rollingUpdate type compute image podTemplate controller annotations metadata annotations spec affinity nodeAffinity preferredDuringSchedulingIgnoredDuringExecution requiredDuringSchedulingIgnoredDuringExecution podAffinity preferredDuringSchedulingIgnoredDuringExecution requiredDuringSchedulingIgnoredDuringExecution podAntiAffinity preferredDuringSchedulingIgnoredDuringExecution requiredDuringSchedulingIgnoredDuringExecution args env imagePullSecrets initContainers lifecycle livenessProbe nodeSelector priority priorityClassName readinessProbe resources limits requests schedulerName securityContext serviceAccountName tolerations preferredZones replicas storage accessModes dataSource apiGroup kind name resources limits requests selector storageClassName volumeMode volumeName updateStrategy rollingUpdate type data image mirrored podTemplate controller annotations metadata annotations spec affinity nodeAffinity preferredDuringSchedulingIgnoredDuringExecution requiredDuringSchedulingIgnoredDuringExecution podAffinity preferredDuringSchedulingIgnoredDuringExecution requiredDuringSchedulingIgnoredDuringExecution podAntiAffinity preferredDuringSchedulingIgnoredDuringExecution requiredDuringSchedulingIgnoredDuringExecution args env imagePullSecrets initContainers lifecycle livenessProbe nodeSelector priority priorityClassName readinessProbe resources limits requests schedulerName securityContext serviceAccountName tolerations preferredZones shards storage accessModes dataSource apiGroup kind name resources limits requests selector storageClassName volumeMode volumeName updateStrategy rollingUpdate type There are so many settings, but, you don’t need to set them all. The defaults are suitable. You can see examples of configuration in the file iris_operator-2.0.0.223.0/samples. To run a minimal viable IRIS, we need to specify only a few settings, like IRIS (or IRIS-based application) version, storage size, and license key. Note about license key: we’ll use a community IRIS, so we don’t need a key. We cannot just omit this setting, but can create a secret containing a pseudo-license. 
License secret generation is simple: $ touch iris.key # remember that a real license file is used in the most cases $ kubectl create secret generic iris-license --from-file=iris.key An IRIS description understandable by IKO is: $ cat iko.yaml apiVersion: intersystems.com/v1alpha1 kind: IrisCluster metadata: name: iko-test spec: passwordHash: '' # use a default password SYS licenseKeySecret: name: iris-license # use a Secret name bolded above topology: data: image: intersystemsdc/iris-community:2020.4.0.524.0-zpm # Take a community IRIS storage: resources: requests: storage: 10Gi Send this manifest into the cluster: $ kubectl apply -f iko.yaml $ kubectl get iriscluster NAME DATA COMPUTE MIRRORED STATUS AGE iko-test 1 Creating 76s $ kubectl -n iko logs -f --tail 100 -l app=iris-operator db.Spec.Topology.Data.Shards = 0 I1219 15:55:57.989032 1 iriscluster.go:39] Sync/Add/Update for IrisCluster default/iko-test I1219 15:55:58.016618 1 service.go:19] Creating Service default/iris-svc. I1219 15:55:58.051228 1 service.go:19] Creating Service default/iko-test. I1219 15:55:58.216363 1 statefulset.go:22] Creating StatefulSet default/iko-test-data. We see that some resources (Service, StatefulSet) are going to be created in a cluster in the “default” namespace. In a few seconds, you should see an IRIS pod in the “default” namespace: $ kubectl get po -w NAME READY STATUS RESTARTS AGE iko-test-data-0 0/1 ContainerCreating 0 2m10s Wait a little until the IRIS image is pulled, that is, until Status becomes Ready and Ready becomes 1/1. You can check what type of disk was created: $ kubectl get pv NAME CAPACITY ACCESS MODES RECLAIM POLICY STATUS CLAIM STORAGECLASS REASON AGE pvc-b356a943-219e-4685-9140-d911dea4c106 10Gi RWO Delete Bound default/iris-data-iko-test-data-0 standard 5m Reclaim policy “Delete” means that when you remove Persistent Volume, GCE persistent disk will be also removed. There is another policy, “Retain”, that allows you to save Google persistent disks to survive Kubernetes Persistent Volumes deletion. You can define a custom StorageClass to use this policy and other non-default settings. An example is present in IKO’s documentation: Create a storage class for persistent storage. Now, let’s check our newly created IRIS. In general, traffic to pods goes through Services or Ingresses. By default, IKO creates a service of ClusterIP type with a name from the iko.yaml metadata.name field: $ kubectl get svc iko-test NAME TYPE CLUSTER-IP EXTERNAL-IP PORT(S) AGE iko-test ClusterIP 10.40.6.33 <none> 1972/TCP,52773/TCP 14m We can call this service using port-forward: $ kubectl port-forward svc/iko-test 52773 Navigate a browser to http://localhost:52773/csp/sys/UtilHome.csp and type _system/SYS. You should see a familiar IRIS user interface (UI). Custom Application Let’s replace a pure IRIS with an IRIS-based application. First, download the COVID-19 application. We won’t consider a complete, continuous deployment here, just minimal steps: $ git clone https://github.com/intersystems-community/covid-19.git $ cd covid-19 $ docker build --no-cache -t covid-19:v1 . As our Kubernetes is running in a Google cloud, let’s use Google Docker Container Registry as an image storage. We assume here that you have an account in Google Cloud allowing you to push images. 
Use your own project name in the below-mentioned commands: $ docker tag covid-19:v1 eu.gcr.io/iko-project/covid-19:v1 $ docker push eu.gcr.io/iko-project/covid-19:v1 Let’s go to the directory with iko.yaml, change the image there, and redeploy it. You should consider removing the previous example first: $ cat iko.yaml ... data: image: eu.gcr.io/iko-project/covid-19:v1 ... $ kubectl delete -f iko.yaml $ kubectl -n iko delete deploy -l app=iris-operator $ kubectl delete pvc iris-data-iko-test-data-0 $ kubectl apply -f iko.yaml You should recreate the IRIS pod with this new image. This time, let’s provide external access via Ingress Resource. To make it work, we should deploy an Ingress Controller (choose nginx for its flexibility). To provide a traffic encryption (TLS), we will also add yet another component – cert-manager. To install both these components, we use a Helm tool, version 3. $ helm repo add ingress-nginx https://kubernetes.github.io/ingress-nginx $ helm upgrade nginx-ingress \ --namespace nginx-ingress \ ingress-nginx/ingress-nginx \ --install \ --atomic \ --version 3.7.0 \ --create-namespace Look at an nginx service IP (it’s dynamic, but you can make it static): $ kubectl -n nginx-ingress get svc NAME TYPE CLUSTER-IP EXTERNAL-IP PORT(S) AGE nginx-ingress-ingress-nginx-controller LoadBalancer 10.40.0.103 xx.xx.xx.xx 80:32032/TCP,443:32374/TCP 88s Note: your IP will differ. Go to your domain registrar and create a domain name for this IP. For instance, create an A-record: covid19.myardyas.club = xx.xx.xx.xx Some time will pass until this new record propagates across DNS servers. The end result should be similar to: $ dig +short covid19.myardyas.club xx.xx.xx.xx Having deployed Ingress Controller, we now need to create an Ingress resource itself (use your own domain name): $ cat ingress.yaml apiVersion: extensions/v1beta1 kind: Ingress metadata: name: iko-test annotations: kubernetes.io/ingress.class: nginx nginx.ingress.kubernetes.io/use-regex: "true" nginx.ingress.kubernetes.io/ssl-redirect: "true" certmanager.k8s.io/cluster-issuer: lets-encrypt-production # Cert manager will be deployed below spec: rules: - host: covid19.myardyas.club http: paths: - backend: serviceName: iko-test servicePort: 52773 path: / tls: - hosts: - covid19.myardyas.club secretName: covid19.myardyas.club $ kubectl apply -f ingress.yaml After a minute or so, IRIS should be available at http://covid19.myardyas.club/csp/sys/UtilHome.csp (remember to use your domain name) and the COVID-19 application at http://covid19.myardyas.club/dsw/index.html (choose namespace IRISAPP). Note: Above, we’ve exposed the HTTP IRIS port. If you need to expose via nginx TCP super-server port (1972 or 51773), you can read instructions at Exposing TCP and UDP services. Add Traffic Encryption The last step is to add traffic encryption. Let’s deploy cert-manager for that: $ kubectl apply -f https://raw.githubusercontent.com/jetstack/cert-manager/v0.10.0/deploy/manifests/00-crds.yaml $ helm upgrade cert-manager \ --namespace cert-manager \ jetstack/cert-manager \ --install \ --atomic \ --version v0.10.0 \ --create-namespace $ cat lets-encrypt-production.yaml apiVersion: certmanager.k8s.io/v1alpha1 kind: ClusterIssuer metadata: name: lets-encrypt-production spec: acme: # Set your email. 
Let’s Encrypt will send notifications about certificates expiration email: mvhoma@gmail.com server: https://acme-v02.api.letsencrypt.org/directory privateKeySecretRef: name: lets-encrypt-production solvers: - http01: ingress: class: nginx $ kubectl apply -f lets-encrypt-production.yaml Wait a few minutes until cert-manager notices IRIS-application ingress and goes to Let’s Encrypt for a certificate. You can observe Order and Certificate resources in progress: $ kubectl get order NAME STATE AGE covid19.myardyas.club-3970469834 valid 52s $ kubectl get certificate NAME READY SECRET AGE covid19.myardyas.club True covid19.myardyas.club 73s This time, you can visit a more secured site version - https://covid19.myardyas.club/dsw/index.html: About Native Google Ingress Controller and Managed Certificates Google supports its own ingress controller, GCE, which you can use in place of an nginx controller. However, it has some drawbacks, for instance, lack of rewrite rules support, at least at the moment of writing. Also, you can use Google managed certificates in place of cert-manager. It’s handy, but initial retrieval of certificate and any updates of Ingress resources (like new path) causes a tangible downtime. Also, Google managed certificates work only with GCE, not with nginx, as noted in Managed Certificates. Next Steps We’ve deployed an IRIS-based application into the GKE cluster. To expose it to the Internet, we’ve added Ingress Controller and a certification manager. We’ve tried the IrisCluster configuration to highlight that setting up IKO is simple. You can read about more settings in Using the InterSystems Kubernetes Operator documentation. A single data server is good, but the real fun begins when we add ECP, mirroring, and monitoring, which are also available with IKO. Stay tuned and read the upcoming article in our Kubernetes operator series to take a closer look at mirroring.
Article
Yuri Marx · Feb 18, 2021

Do security scan in your InterSystems IRIS container

There are many options for doing a full security scan of your docker images; the most popular option is Anchore Community Edition. Anchore uses the main public vulnerability databases available, including CVE. Installing Anchore is very easy (source: https://engine.anchore.io/docs/quickstart/); follow these steps: Create a folder in your OS and download the Anchore docker-compose file to the created folder. curl -O https://engine.anchore.io/docs/quickstart/docker-compose.yaml Run: docker-compose up -d Check docker services availability (services with up status): docker-compose ps Check Anchore services availability (services up, with the product version in the last row): docker-compose exec api anchore-cli system status Now wait for the vulnerability database sync (about 30 to 120 minutes, depending on the internet speed). You can check the progress by running this command: docker-compose exec api anchore-cli system feeds list When all feeds are synced, you can begin to use Anchore. Doing a security scan is simple, but you need to know the name of the docker image to be scanned. I will scan the latest InterSystems IRIS docker image (after `image add`, write your docker image name): # docker-compose exec api anchore-cli image add store/intersystems/iris-community:2020.4.0.524.0 # docker-compose exec api anchore-cli image wait store/intersystems/iris-community:2020.4.0.524.0 You will see this message until the analysis ends: Status: analyzing Waiting 5.0 seconds for next retry. With the image added, you can see the analysis status/content: docker-compose exec api anchore-cli image content store/intersystems/iris-community:2020.4.0.524.0 os: available files: available npm: available gem: available python: available java: available binary: available go: available malware: available With the status analyzed, it is possible to list the current vulnerabilities found: docker-compose exec api anchore-cli image vuln store/intersystems/iris-community:2020.4.0.524.0 all Finally, to know whether your image passed the security scan, run: docker-compose exec api anchore-cli evaluate check store/intersystems/iris-community:2020.4.0.524.0 See more details in this tutorial: https://anchore.com/blog/docker-image-security-in-5-minutes-or-less/. Enjoy! Thanks @YURI MARX GOMES! Well explained.
Announcement
Anastasia Dyubaylo · May 30, 2022

InterSystems Grand Prix Contest 2022: Voting time!

Hey Developers, Let the voting week begin! It's time to cast your votes for the best applications in the Grand Prix Programming Contest! 🔥 You decide: VOTE HERE 🔥 How to vote? Details below.

Experts nomination: An experienced InterSystems jury will choose the best apps to nominate for the prizes in the Experts Nomination. Please welcome our experts: ⭐️ @Alexander.Woodhead, Technical Specialist ⭐️ @Steven.LeBlanc, Product Specialist ⭐️ @Alexander.Koblov, Senior Support Specialist ⭐️ @Daniel.Kutac, Senior Sales Engineer ⭐️ @Eduard.Lebedyuk, Senior Cloud Engineer ⭐️ @Steve.Pisani, Senior Solution Architect ⭐️ @Timothy.Leavitt, Development Manager ⭐️ @Thomas.Dyar, Product Specialist ⭐️ @Andreas.Dieckow, Product Manager ⭐️ @Benjamin.DeBoe, Product Manager ⭐️ @Carmen.Logue, Product Manager ⭐️ @Luca.Ravazzolo, Product Manager ⭐️ @Stefan.Wittmann, Product Manager ⭐️ @Raj.Singh5479, Product Manager ⭐️ @Robert.Kuszewski, Product Manager ⭐️ @Jeffrey.Fried, Director of Product Management ⭐️ @Dean.Andrews2971, Head of Developer Relations ⭐️ @Evgeny.Shvarov, Developer Ecosystem Manager

Community nomination: For each user, the higher score from the two categories below is selected:

| Conditions | 1st place | 2nd place | 3rd place |
|---|---|---|---|
| If you have an article posted on DC and an app uploaded to Open Exchange (OEX) | 9 | 6 | 3 |
| If you have at least 1 article posted on DC or 1 app uploaded to OEX | 6 | 4 | 2 |
| If you make any valid contribution to DC (posted a comment/question, etc.) | 3 | 2 | 1 |

| Level | 1st place | 2nd place | 3rd place |
|---|---|---|---|
| VIP Global Masters level or ISC Product Managers | 15 | 10 | 5 |
| Ambassador GM level | 12 | 8 | 4 |
| Expert GM level or DC Moderators | 9 | 6 | 3 |
| Specialist GM level | 6 | 4 | 2 |
| Advocate GM level or ISC Employees | 3 | 2 | 1 |

Blind vote! The number of votes for each app will be hidden from everyone. Once a day we will publish the leaderboard in the comments to this post. The order of projects on the contest page will be as follows: the earlier an application was submitted to the competition, the higher it will be in the list. P.S. Don't forget to subscribe to this post (click on the bell icon) to be notified of new comments. To take part in the voting, you need to: Sign in to Open Exchange – DC credentials will work. Make any valid contribution to the Developer Community – answer or ask questions, write an article, contribute applications on Open Exchange – and you'll be able to vote. Check this post on the options to make helpful contributions to the Developer Community. If you change your mind, cancel your choice and give your vote to another application! Support the application you like! Note: contest participants are allowed to fix bugs and make improvements to their applications during the voting week, so don't miss out – subscribe to application releases! So! After the first day of the voting we have: Expert Nomination, Top 3: Docker InterSystems Extension by @Dmitry.Maslennikov, webterminal-vscode by @John.Murray, Water Conditions in Europe by @Evgeniy.Potapov ➡️ Voting is here. Community Nomination, Top 3: webterminal-vscode by @John.Murray, Docker InterSystems Extension by @Dmitry.Maslennikov, Disease Predictor by @Yuri.Gomes ➡️ Voting is here. Experts, we are waiting for your votes! 🔥 Participants, improve & promote your solutions! Seems to be some confusion about whose (or which) app was first in the Community section after the first day: Here are the results after 2 days of voting: Expert Nomination, Top 3: Water Conditions in Europe by @Evgeniy.Potapov, Docker InterSystems Extension by @Dmitry Maslennikov, test-dat by @Oliver.Wilms ➡️ Voting is here.
Community Nomination, Top 3: iris-megazord by @José.Pereira, webterminal-vscode by @John Murray, Docker InterSystems Extension by @Dmitry Maslennikov ➡️ Voting is here. So, the voting continues. Please support the application you like! Devs! Here are the top 5 for now: Expert Nomination, Top 5: Water Conditions in Europe by @Evgeniy.Potapov, CloudStudio by @Sean.Connelly, iris-fhir-client by @Muhammad.Waseem, iris-megazord by @José.Pereira, Docker InterSystems Extension by @Dmitry Maslennikov ➡️ Voting is here. Community Nomination, Top 5: iris-megazord by @José.Pereira, webterminal-vscode by @John.Murray, iris-fhir-client by @Muhammad.Waseem, CloudStudio by @Sean.Connelly, Docker InterSystems Extension by @Dmitry Maslennikov ➡️ Voting is here. Who is gonna be the winner?! 😱 Only 1 day left till the end of the voting period. Support our participants with your votes! Last day of voting! ⌛ Don't miss the opportunity to support the application you like! Our contestants need your votes! 📢 Developers! Last call! Only a few hours left until the end of voting! Cast your votes for the applications you like!