Announcement
Evgeny Shvarov · Jul 6, 2023

Technological Bonuses Results for InterSystems Grand Prix Contest 2023

Hi Developers! Here are the bonus results for the applications in the InterSystems Grand Prix Programming Contest 2023.

Bonus categories and their nominal points:

| Bonus | Nominal points |
|---|---|
| LLM AI or LangChain | 6 |
| FHIR SQL Builder | 5 |
| FHIR | 3 |
| IntegratedML | 4 |
| Native API | 3 |
| Embedded Python | 4 |
| Interoperability | 3 |
| PEX | 2 |
| Adaptive Analytics | 3 |
| Tableau, PowerBI, Logi | 3 |
| IRIS BI | 3 |
| Columnar Index | 1 |
| Docker | 2 |
| ZPM | 2 |
| Online Demo | 2 |
| Unit Testing | 2 |
| Community Idea Implementation | 4 |
| First Article on DC | 2 |
| Second Article on DC | 1 |
| Code Quality | 1 |
| First Time Contribution | 3 |
| Video on YouTube | 3 |
| **Total** | **62** |

Bonuses awarded per application:

| Application | Points awarded | Total |
|---|---|---|
| oex-mapping | 4, 3, 2, 2, 2, 2, 2, 1, 1, 3 | 22 |
| appmsw-warm-home | 2, 2, 2, 2, 1 | 9 |
| RDUH Interface Analyst HL7v2 Browser Extension | 3, 3 | 6 |
| irisapitester | 4, 2, 2, 2, 1, 1, 3 | 15 |
| oex-vscode-snippets-template | 2, 2, 4, 1 | 9 |
| IRIS FHIR Transcribe Summarize Export | 6, 3, 4, 2, 2, 2, 2, 1, 1, 3, 3 | 29 |
| IntegratedMLandDashboardSample | 4, 3, 2, 2, 1 | 12 |
| iris-user-manager | 2, 2, 1 | 5 |
| irisChatGPT | 6, 5, 4, 2, 2, 2, 2, 1, 1, 3 | 28 |
| fhir-chatGPT | 6, 3, 4, 2, 1 | 16 |
| iris-fhir-generative-ai | 6, 3, 4, 3, 2, 2, 2, 2, 1, 1, 3 | 29 |
| IRIS Data Migration Manager | none | 0 |
| password-app-iris-db | 3, 2, 2, 2, 3, 3 | 15 |
| interoperability_GPT | 6, 4, 3, 2, 1 | 16 |
| FHIR Editor | 3, 2 | 5 |
| Recycler | 3 | 3 |
| ZProfile | 2, 2, 2, 2, 3 | 11 |
| DevBox | 6, 2, 3 | 11 |
| FHIR - AI and OpenAPI Chain | 6, 3, 2, 2, 2, 2, 1, 1, 3, 3 | 25 |
| IntegratedML-IRIS-PlatformEntryPrediction | 4, 3, 3 | 10 |

Please post new implementations and corrections here in the comments or in Discord.

Hi @Evgeny.Shvarov! I used Java to connect to IRIS in the application and associated an article with it, but I did not see it in the bonus points. Can they be added?

Hi Zhang! We don't have points for using Java. Which bonus are you talking about? If you mean the Native API, you haven't used it: you used only JDBC in your project, without the Native SDK.

Hi @Evgeny.Shvarov, thanks for publishing the bonuses. Please note that I have added FHIR SQL Builder functionality in the new release of my irisChatGPT application, so please consider it. Thanks!

Hi Muhammad! Your points were added to the table! Thank you!
Hi @Semion.Makarov! In release 1.0.9 I added a BI dashboard to do analytics on the app logs of iris-fhir-generative-ai, and a second article explaining that analytics. So I'd like to ask for the IRIS BI and Second Article bonuses. PS: Sorry for publishing this so late, but I only had the idea late on Sunday. 😄 Thanks!

Hi Jose! I've applied these bonuses to your app.
Article
Alex Woodhead · Jun 15, 2023

LangChain InterSystems PDF to Interview Questions and FlashCards

Demonstration example for the current Grand Prix contest, using a more complex parameter template to test the AI.

Interview Questions

There is documentation. A recruitment consultant wants to quickly challenge candidates with technical questions relevant to a role. Can they automate making a list of questions and answers from the available documentation?

Interview Answers and Learning

One of the most effective ways to cement new facts into accessible long-term memory is phased recall. In essence, you take a block of text information and reorganize it into a series of self-contained questions and facts. Now imagine two questions: What day of the week is the trash bin placed outside for collection? When is the marriage anniversary? Quickly recalling the correct answers can mean a happier life!! Recalling the answer to each question IS the mechanism that fixes a fact into memory. Phased recall re-asks each question with longer and longer time gaps as long as the correct answer is recalled. For example:

- You consistently get the right answer: the question is asked again tomorrow, in 4 days, in 1 week, in 2 weeks, in 1 month.
- You consistently get the answer wrong: the question is asked every day until it starts to be recalled.

If you can easily see challenging answers, it is productive to re-work difficult answers to make them more memorable. There is a free software package called Anki that provides this full phased-recall process for you. If you can automate the creation of questions and answers into a text file, Anki will create the new flashcards for you.

Hypothesis

We can use LangChain to transform InterSystems PDF documentation into a series of questions and answers to:

- make interview questions and answers
- make learner Anki flashcards

Create a new virtual environment:

```
mkdir chainpdf
cd chainpdf
python -m venv .
scripts\activate
pip install openai
pip install langchain
pip install wget
pip install lancedb
pip install tiktoken
pip install pypdf
set OPENAI_API_KEY=[ Your OpenAI Key ]
python
```

Prepare the docs:

```python
import wget

# download the documentation PDFs
url = 'https://docs.intersystems.com/irisforhealth20231/csp/docbook/pdfs.zip'
wget.download(url)

# extract docs
import zipfile
with zipfile.ZipFile('pdfs.zip', 'r') as zip_ref:
    zip_ref.extractall('.')
```

Extract PDF text:

```python
from langchain.document_loaders import PyPDFLoader
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.text_splitter import CharacterTextSplitter
from langchain.prompts.prompt import PromptTemplate
from langchain import OpenAI
from langchain.chains import LLMChain

# To limit the example. From the documentation site I could see that
# the documentation sets are:
#   GCOS = Using ObjectScript
#   RCOS = ObjectScript Reference
pdfFiles = ['./pdfs/pdfs/GCOS.pdf', './pdfs/pdfs/RCOS.pdf']

# The prompt will be really big and needs to leave space for the answer
# to be constructed, therefore reduce the input string
text_splitter = CharacterTextSplitter(
    separator="\n\n",
    chunk_size=200,
    chunk_overlap=50,
    length_function=len,
)

# split document text into chunks
documentsAll = []
for file_name in pdfFiles:
    loader = PyPDFLoader(file_name)
    pages = loader.load_and_split()
    # strip unwanted padding
    for page in pages:
        del page.lc_kwargs
        page.page_content = "".join(page.page_content.split('\xa0'))
    documents = text_splitter.split_documents(pages)
    # ignore the cover pages
    for document in documents[2:]:
        # skip table of contents
        if '........' in document.page_content:
            continue
        documentsAll.append(document)
```

Prep the search template:

```python
_GetDocWords_TEMPLATE = """From the following documents create a list of distinct facts.
For each fact create a concise question that is answered by the fact.
Do NOT restate the fact in the question.

Output format:
Each question and fact should be output on a separate line delimited by a comma character
Escape every double quote character in a question with two double quotes
Add a double quote to the beginning and end of each question
Escape every double quote character in a fact with two double quotes
Add a double quote to the beginning and end of each fact
Each line should end with {labels}

The documents to reference to create facts and questions are as follows:
{docs}
"""

PROMPT = PromptTemplate(
    input_variables=["docs", "labels"],
    template=_GetDocWords_TEMPLATE
)

llm = OpenAI(temperature=0, verbose=True)
chain = LLMChain(llm=llm, prompt=PROMPT)
```

Process each document and place the output in a file:

```python
# open an output file
with open('QandA.txt', 'w') as file:
    # iterate over each text chunk
    for document in documentsAll:
        # set the label for the Anki flashcard
        source = document.metadata['source']
        if 'GCOS.pdf' in source:
            label = 'Using ObjectScript'
        else:
            label = 'ObjectScript Reference'
        output = chain.run(docs=document, labels=label)
        file.write(output + '\n')
        file.flush()
```

There were some retry and force-close messages during the loop; I anticipate this is OpenAI limiting the API to fair use. Alternatively, a local LLM could be applied instead.
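Those retry and force-close messages come from the API throttling the loop. One mitigation (a sketch of my own, not part of LangChain) is an exponential-backoff wrapper around the rate-limited call:

```python
import time
import random

def run_with_backoff(fn, *args, max_retries=5, base_delay=1.0, **kwargs):
    """Call fn, retrying with exponential backoff plus jitter on failure.

    Intended to wrap rate-limited calls such as chain.run(docs=..., labels=...).
    """
    for attempt in range(max_retries):
        try:
            return fn(*args, **kwargs)
        except Exception:
            if attempt == max_retries - 1:
                raise
            # sleep 1s, 2s, 4s, ... plus up to 1s of jitter before retrying
            time.sleep(base_delay * (2 ** attempt) + random.random())
```

Inside the loop, `run_with_backoff(chain.run, docs=document, labels=label)` would then replace the direct `chain.run(...)` call.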
Examine the output file:

```
"What are the contexts in which ObjectScript can be used?", "You can use ObjectScript in any of the following contexts: Interactively from the command line of the Terminal, As the implementation language for methods of InterSystems IRIS object classes, To create ObjectScript routines, and As the implementation language for Stored Procedures and Triggers within InterSystems SQL.", Using ObjectScript,
"What is a global?", "A global is a sparse, multidimensional database array.", Using ObjectScript,
"What is the effect of the ##; comment on INT code line numbering?", "It does not change INT code line numbering.", Using ObjectScript,
"What characters can be used in an explicit namespace name after the first character?", "letters, numbers, hyphens, or underscores", Using ObjectScript
"Are string equality comparisons case-sensitive?", "Yes" Using ObjectScript,
"What happens when the number of references to an object reaches 0?", "The system automatically destroys the object.",Using ObjectScript
Question: "What operations can take an undefined or defined variable?", Fact: "The READ command, the $INCREMENT function, the $BIT function, and the two-argument form of the $GET function.", Using ObjectScript, a
```

While the model made a good attempt at the requested format, there is some deviation. Reviewing manually, I can pick some questions and answers to continue the experiment.
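Part of that manual review can be automated. A small filter (a hypothetical helper of my own, assuming the three-field question/fact/label layout requested in the prompt) keeps only the lines that parse cleanly and strips the stray `Question:`/`Fact:` prefixes the model sometimes emits:

```python
import csv
import io

def clean_qa_lines(raw_text):
    """Keep only lines that parse as exactly three CSV fields
    (question, fact, label), stripping stray 'Question:'/'Fact:'
    prefixes that deviate from the requested format."""
    rows = []
    for line in raw_text.splitlines():
        line = line.replace("Question:", "").replace("Fact:", "").strip()
        if not line:
            continue
        parsed = next(csv.reader(io.StringIO(line), skipinitialspace=True), None)
        fields = [f.strip() for f in (parsed or []) if f.strip()]
        if len(fields) == 3:
            rows.append(fields)
    return rows
```

Lines that cannot be recovered this way still need a human pass, but the bulk of the output survives the filter.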
Importing FlashCards into Anki

The reviewed text file:

```
"What are the contexts in which ObjectScript can be used?", "You can use ObjectScript in any of the following contexts: Interactively from the command line of the Terminal, As the implementation language for methods of InterSystems IRIS object classes, To create ObjectScript routines, and As the implementation language for Stored Procedures and Triggers within InterSystems SQL.", "Using ObjectScript"
"What is a global?", "A global is a sparse, multidimensional database array.", "Using ObjectScript"
"What is the effect of the ##; comment on INT code line numbering?", "It does not change INT code line numbering.", "Using ObjectScript"
"What characters can be used in an explicit namespace name after the first character?", "letters, numbers, hyphens, or underscores", "Using ObjectScript"
"Are string equality comparisons case-sensitive?", "Yes", "Using ObjectScript"
"What happens when the number of references to an object reaches 0?", "The system automatically destroys the object.", "Using ObjectScript"
"What operations can take an undefined or defined variable?", "The READ command, the $INCREMENT function, the $BIT function, and the two-argument form of the $GET function.", "Using ObjectScript"
```

Creating a new Anki card deck:

- Open Anki and select File -> Import
- Select the reviewed text file
- Optionally create a new card deck for "ObjectScript"
- A basic card type is fine for this format

There was a mention of a "Field 4", so the records should be checked. Anki import success. Let's study! Now choose the reinforcement schedule. Happy learning!!

References

Anki software is available from https://apps.ankiweb.net/
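The phased-recall schedule described at the start of this article can be sketched as a tiny scheduler (illustrative only; Anki's real algorithm is considerably more elaborate):

```python
# days until the next review after 1, 2, 3, ... consecutive correct answers:
# tomorrow, 4 days, 1 week, 2 weeks, 1 month
SUCCESS_INTERVALS = [1, 4, 7, 14, 30]

def next_interval(streak, last_correct):
    """Return the number of days until a card is asked again.

    streak counts consecutive correct answers; a wrong answer
    drops the card back to daily review until it is recalled.
    """
    if not last_correct:
        return 1  # asked every day until it starts to be recalled
    # cap at the longest interval once the ladder is exhausted
    idx = min(streak - 1, len(SUCCESS_INTERVALS) - 1)
    return SUCCESS_INTERVALS[idx]
```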
Article
Claudio Devecchi · Jun 20, 2023

Fast API Development using InterSystems Open Exchange Tools

In this article, I will share the theme that @Rochael.Ribeiro and I presented at the Global Summit 2023, in the Tech Exchange room. We talk about the following topics:

- Open Exchange tools for fast APIs
- Open API Specification
- Traditional versus fast API development
- Composite API (Interoperability)
- Spec-first or API-first approach
- API governance and monitoring
- Demo (video)

Open Exchange Tools for Fast APIs

As we are talking about fast development of modern APIs (REST/JSON), we will use two InterSystems Open Exchange tools. The first is a framework for rapid development of APIs, which we will detail in this article: https://openexchange.intersystems.com/package/IRIS-apiPub

The second is Swagger, used as a user interface for the specification and documentation of the REST APIs developed on the IRIS platform, as well as for their use/execution. The basis for its operation is the Open API Specification (OAS) standard, described below: https://openexchange.intersystems.com/package/iris-web-swagger-ui

What is the Open API Specification (OAS)?

It is a standard used worldwide to define, document, and consume APIs. In most cases, APIs are designed even before implementation; I'll talk more about that in the next topics. It is important because it defines and documents REST APIs for use on both the provider and the consumer side. The same standard also speeds up tests and API calls in market tools (REST API clients) such as Swagger, Postman, Insomnia, etc.

The traditional way to publish an API using IRIS

Imagine we have to build and publish a REST API from an existing IRIS method (picture below). In the traditional way:

1. We have to think about how consumers will call it. For example: which path and verb will be used, and what the response will look like, whether a JSON object or plain text.
2. Build a new method in a %CSP.REST class that will handle the HTTP request and call it.
3. Handle the method's response, turning it into the intended HTTP response for the end user.
4. Think about how we're going to provide the success code and how we're going to handle exceptions.
5. Map the route for our new method.
6. Provide the API documentation to the end user. We will probably build the OAS content manually.
7. And if, for example, we have a request or response payload (object), the implementation time will increase, because it must also be documented in OAS.

How can we be faster? By simply tagging the IRIS method with the [WebMethod] attribute. Whatever it is, the framework will take care of its publication, using the OAS 3.x standard.

Why is the OAS 3.x standard so important? Because it also documents in detail all properties of the input and output payloads. In this way, any REST client tool on the market (Insomnia, Postman, Swagger, etc.) can instantly couple to the APIs and provide sample content to call them easily. Using Swagger, we can already visualize our API (image above) and call it. This is also very useful for testing.

API customization

But what if I need to customize my API? For example: instead of the method name, I want the path to be something else, and I want the input parameters to be in the path rather than in a query param. We define a specific notation on top of the method, where we can complement the meta-information that the method itself does not provide. In this example we define another path for our API and complement the information so that the end user has a friendlier experience.

Projection map for REST APIs

This framework supports numerous parameter types. In this map we can highlight the complex types (objects). They are automatically exposed as a JSON payload, and each property is properly documented (OAS) for the end user.

Interoperability (composite APIs)

By supporting complex types, you can also expose Interoperability services. It is a favorable scenario for building composite APIs, which orchestrate multiple external components (outbounds).
This means that objects or messages used as the request or response will be automatically published and readable by tools like Swagger. It is also an excellent way to test interoperability components, because a payload template is usually already loaded, so the user knows which properties the API uses. First the developer can focus on testing, then shape the API through customization.

Spec-first or API-first approach

Another concept widely used today is defining the API even before its implementation. With this framework it is possible to import an Open API spec: it creates the method structure (spec) automatically, leaving only the implementation to be written.

API governance and monitoring

For API governance, it is also recommended to use IAM. In addition to having multiple plugins, IAM can quickly couple to APIs through the OAS standard. apiPub offers additional tracing for APIs (see the demo video).

Demo

Download & Documentation

InterSystems Open Exchange: https://openexchange.intersystems.com/?search=apiPub
Complete documentation: https://github.com/devecchijr/apiPub

Very rich material for a new development method, bringing convenience and innovation. This will help a lot with the agility of API development. Congratulations, Claudio, for sharing the knowledge.

Thank you @Thiago.Simoes
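As a footnote to the OAS discussion in this article: the OAS document that Swagger and similar tools consume is plain JSON (or YAML). A minimal hypothetical example for a single GET operation, with its path parameter and response payload properties documented, looks like this:

```json
{
  "openapi": "3.0.0",
  "info": { "title": "Sample API", "version": "1.0.0" },
  "paths": {
    "/person/{id}": {
      "get": {
        "summary": "Get a person by id",
        "parameters": [
          { "name": "id", "in": "path", "required": true,
            "schema": { "type": "integer" } }
        ],
        "responses": {
          "200": {
            "description": "The person record",
            "content": {
              "application/json": {
                "schema": {
                  "type": "object",
                  "properties": {
                    "name": { "type": "string" },
                    "age": { "type": "integer" }
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}
```

Because every path, parameter, and payload property is machine-readable, any OAS-aware client can generate sample calls from a document like this without extra documentation work.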
Announcement
Anastasia Dyubaylo · Jul 11, 2023

Online Meetup with the winners of the InterSystems Grand Prix Contest 2023

Hi Community, Let's meet together at the online meetup with the winners of the InterSystems Grand Prix Contest 2023 – a great opportunity to have a discussion with the InterSystems Experts team as well as our contestants. Winners' demo included! Date & Time: Thursday, July 13, 11 am EDT | 5 pm CEST Join us to learn more about winners' applications and to have a talk with our experts. ➡️ REGISTER TODAY See you all at our virtual meetup! 👉 LIVE NOW
Article
Niyaz Khafizov · Jul 6, 2018

The way to launch Apache Spark + Apache Zeppelin + InterSystems IRIS

Hi all. Yesterday I tried to connect Apache Spark, Apache Zeppelin, and InterSystems IRIS. During the process I had trouble connecting them all together, and I did not find a useful guide. So, I decided to write my own.

Introduction

First, what are Apache Spark and Apache Zeppelin, and how do they work together? Apache Spark is an open-source cluster-computing framework. It provides an interface for programming entire clusters with implicit data parallelism and fault tolerance, so it is very useful when you need to work with Big Data. Apache Zeppelin is a notebook that provides a cool UI for working with analytics and machine learning. Together, they work like this: IRIS provides data, Spark reads the provided data, and in a notebook we work with the data.

Note: I have done the following on Windows 10.

Apache Zeppelin

Now we will install all the necessary programs. First of all, download Apache Zeppelin from the official Apache Zeppelin site. I used zeppelin-0.8.0-bin-all.tgz. It includes Apache Spark, Scala, and Python. Unzip it to any folder. After that you can launch Zeppelin by calling \bin\zeppelin.cmd from the root of your Zeppelin folder. Wait until the "Done, zeppelin server started" string appears, then open http://localhost:8080 in your browser. If everything is okay, you will see the "Welcome to Zeppelin!" message.

Note: I assume that InterSystems IRIS is already installed. If not, download and install it before the next step.

Apache Spark

So, we have the browser window open with a Zeppelin notebook. In the upper-right corner click on "anonymous", and then click on "Interpreter". Scroll down and find spark. Next to spark, find the edit button and click on it. Scroll down and add dependencies on intersystems-spark-1.0.0.jar and intersystems-jdbc-3.0.0.jar. I installed InterSystems IRIS to the C:\InterSystems\IRIS\ directory, so the artifacts I need to add are there. My files are here. And save it.

Check that it works

Let us check it.
Create a new note, and in a paragraph paste the following code:

```
var dataFrame = spark.read.format("com.intersystems.spark")
  .option("url", "IRIS://localhost:51773/NAMESPACE")
  .option("user", "UserLogin")
  .option("password", "UserPassword")
  .option("dbtable", "Sample.Person")
  .load()
```

- dbtable — the name of your table.
- url — the IRIS address. It is formed as follows: IRIS://ipAddress:superserverPort/namespace:
  - the IRIS protocol is a JDBC connection over TCP/IP that offers Java shared-memory connections;
  - ipAddress — the IP address of the InterSystems IRIS instance. If you are connecting locally, use 127.0.0.1 instead of localhost;
  - superserverPort — the superserver port number of the IRIS instance, which is not the same as the webserver port number. To find the superserver port number, in the Management Portal go to System Administration > Configuration > System Configuration > Memory and Startup;
  - namespace — an existing namespace in the InterSystems IRIS instance. In this demo, we connect to the USER namespace.

Run the paragraph. If everything is okay, you will see FINISHED. My notebook:

Conclusion

In conclusion, we found out how Apache Spark, Apache Zeppelin, and InterSystems IRIS can work together. In my next articles, I will write about data analysis.

Links

The official site of Apache Spark
Apache Spark documentation
IRIS Protocol
Using the InterSystems Spark Connector

💡 This article is considered an InterSystems Data Platform Best Practice.
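The URL format described above can be captured in a small helper (my own sketch, not part of the Spark connector) to avoid malformed connection strings:

```python
def iris_url(host, superserver_port, namespace):
    """Build a connection URL of the form IRIS://host:port/namespace,
    the format expected by the Spark connector's "url" option."""
    if not namespace:
        raise ValueError("namespace must be an existing IRIS namespace")
    return f"IRIS://{host}:{superserver_port}/{namespace}"
```

The result would then be passed to `.option("url", ...)` in the paragraph above.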
Article
sween · Jul 29, 2021

BILLIONS - Monetizing the InterSystems FHIR® with Google Cloud's Apigee Edge

We are ridiculously good at mastering data. The data is clean, multi-sourced, related, and we publish it with levels of decay that guarantee the data is current. We chose the HL7 Reference Information Model (RIM) to land the data, and we enable exchange of the data through Fast Healthcare Interoperability Resources (FHIR®).

We are also a high-performing, full-stack team and like to keep our operational resources on task, so managing the underlying infrastructure to host the FHIR® data repository for ingestion and consumption is not in the cards for us. For this, we chose the [FHIR® Accelerator Service](https://docs.intersystems.com/components/csp/docbook/Doc.View.cls?KEY=FAS) to handle storage, credentials, backup, development, and FHIR® interoperability.

Our data is marketable and well served as an API, so we will **monetize** it. This means we need to package our data/API up for appropriate sale — which includes: a developer portal, documentation, sample code, testing tools, and other resources to get developers up and running quickly against our data. We need to focus on making our API as user-friendly as possible, with some tooling to ward off abuse and protect our business against denial-of-service attacks. For the customers using our data, we chose [Google Cloud's Apigee Edge](https://apigee.google.com/edge).

![image](/sites/default/files/inline/images/venmo2.png)

> ### With our team focused and our back office entirely powered as services, we are set to make **B I L L I O N S**, and this is an account as to how.

# Provisioning

High-level tasks for provisioning in the [FHIR® Accelerator Service](https://docs.intersystems.com/components/csp/docbook/Doc.View.cls?KEY=FAS) and [Google Cloud's Apigee Edge](https://apigee.google.com/edge).
## FHIR® Accelerator Service

Head over to the AWS Marketplace and subscribe to the InterSystems FHIR® Accelerator Service, or sign up for a trial account directly [here](https://portal.trial.isccloud.io/account/signup). After your account has been created, create a FHIR® Accelerator deployment to store and sell your FHIR® data.

![image](/sites/default/files/inline/images/fhiraas2.gif)

After a few minutes, the deployment will be ready for use, and you can complete the following tasks:

1. Create an API key in the Credentials section and record it.
2. Record the newly created FHIR® endpoint from the Overview section.

![image](/sites/default/files/inline/images/fhiraas4.png)

![image](/sites/default/files/inline/images/fhiraas5.png)

## Google Cloud Apigee Edge

Within your Google Cloud account, create a project and enable it for use with Apigee Edge. To understand a little bit of the magic going on in the following setup: we are enabling a virtual network, a load balancer, and SSL/DNS for our endpoint, and making some choices on whether or not it's going to be publicly accessible.

> Fair warning here: if you create this as an evaluation and start making M I L L I O N S, it cannot be converted to a paid plan later on to continue on to making B I L L I O N S.

![image](/sites/default/files/inline/images/apigee1.png)

![image](/sites/default/files/inline/images/apigeetwo_0.png)

![image](/sites/default/files/inline/images/apigeethree.png)

## Build the Product

Now, let's get on to building the product for the two initial customers of our data, Axe Capital and Taylor Mason Capital.

![image](/sites/default/files/inline/images/drawing.png)

### Implement Proxy

Our first piece of the puzzle is the mechanics of our proxy from Apigee to the FHIR® Accelerator Service. At its core, we are implementing a basic reverse proxy that backs the Apigee Load Balancer with our FHIR® API.
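With the API key and endpoint recorded from the Accelerator deployment, a direct call to the FHIR® repository needs only the endpoint URL and a key header. A sketch that assembles those pieces for any HTTP client (the `x-api-key` header name and the example values are assumptions for illustration):

```python
def fhir_request(base_url, api_key, resource):
    """Build the URL and headers for a FHIR® Accelerator call.

    base_url and api_key come from the deployment's Overview and
    Credentials sections. Returns (url, headers) ready to hand to
    any HTTP client.
    """
    url = f"{base_url.rstrip('/')}/{resource}"
    headers = {
        "x-api-key": api_key,                 # assumed header name
        "Accept": "application/fhir+json",
    }
    return url, headers
```

Later, the Apigee proxy hides this key from customers entirely: they present their own per-app key, and the proxy swaps in the single Accelerator key behind the scenes.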
Remember that we created all of the Apigee infrastructure during the setup process when we enabled the GCP project for Apigee.

![image](/sites/default/files/inline/images/apiproxy.gif)

### Configure the Proxy

Configuring the proxy basically means defining a number of policies applied to the traffic/payload as it flows through (PreFlow/PostFlow), to shape the interaction and keep customers/applications behaving safely against the API. Below, we configure a series of policies that:

1. Add CORS headers.
2. Remove the API key from the query string.
3. Add the FHIR® Accelerator API key to the headers.
4. Impose a quota/limit.

![image](/sites/default/files/inline/images/flow.png)

A mix of XML directives and a user interface is available to configure the policies, as below.

![image](/sites/default/files/inline/images/configureproxy.gif)

### Add a Couple of Developers, Axe and Taylor

Next we need to add some developers, which is as simple as adding users to any directory. This is required to enable the applications that are created in the next step and supplied to our customers.

![image](/sites/default/files/inline/images/developers.png)

### Configure the Apps, One per Customer

Applications are where we break apart our *product* and logically divide it up among our customers; here we will create one app per customer. An important note: in this demonstration, this is where the API key for a particular customer is assigned, after we assign the developer to the app.

![image](/sites/default/files/inline/images/apps.gif)

### Create the Developer Portal

The Developer Portal is the "**clown suit**" and front door for our customers, where they can interact with what they are paying for. It comes packed with powerful customization, a specific URL for the product it is attached to, and support for importing a Swagger/OpenAPI spec so developers can interact with the API through a Swagger-based UI.
Lucky for us, the Accelerator Service comes with a Swagger definition, so we just have to know where to look for it and make some modifications so that the definitions work against our authentication scheme and URL. We don't spend a lot of time here in the demonstration, but you should if you plan on setting yourself apart for paying customers.

![image](/sites/default/files/inline/images/portal.gif)

### Have Bobby Send a Request

Let's let Bobby Axelrod run up a tab by sending his first requests to our super awesome data wrapped up in FHIR®. Keep in mind that the key and the endpoint being used are assigned by Apigee Edge, but access to the FHIR® Accelerator Service goes through the single key we supplied in the API proxy.

![image](/sites/default/files/inline/images/axerequest.gif)

![image](/sites/default/files/inline/images/showmethemoney.png)

### Rate Limit Bobby with a Quota

Let's just say one of our customers has a credit problem, so we want to limit the use of our data on a rate basis. If you recall, we specified a rate of 30 requests a minute when we set up the proxy, so let's test that below.

![image](/sites/default/files/inline/images/quota.gif)

### Bill Axe Capital

I will get ahead of your expectations here so you won't be too disappointed by how rustic the billing demonstration is, but it does employ a technique to generate an invoicing report that removes things that may or may not be the customer's fault in the proxy integration. For instance, in the rate-limit demo above we sent in 35 requests but limited things to 30, so a quick filter in the billing report removes those and shows we are competent enough to bill only for our customers' actual utilization.

![image](/sites/default/files/inline/images/billing.gif)

To recap, monetizing our data included:

* Safety against abuse and DDoS protection.
* Developer Portal and customization for the customer.
* Documentation through Swagger UI.
* Control over the requests pre/post our API.

... and a way to invoice for **B I L L I O N S**.

This is very cool. Well done.

💡 This article is considered an InterSystems Data Platform Best Practice.
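The policy chain and quota from the demo can be sketched in miniature (illustrative Python only; real Apigee policies are configured in XML or the UI, and the `x-api-key` header name here is an assumption):

```python
class ProxyPolicies:
    """Toy model of the PreFlow policies described in this article:
    strip the customer's apikey from the query string, attach the single
    FHIR Accelerator key as a backend header, add a CORS header, and
    enforce a quota of `limit` requests per window per app."""

    def __init__(self, fhir_api_key, limit=30):
        self.fhir_api_key = fhir_api_key
        self.limit = limit
        self.counts = {}  # app key -> requests seen this window

    def handle(self, app_key, query):
        # policy 4: impose the quota (simplified: no window reset here)
        self.counts[app_key] = self.counts.get(app_key, 0) + 1
        if self.counts[app_key] > self.limit:
            return {"status": 429, "reason": "quota exceeded"}
        # policy 2: remove the customer's API key from the query string
        forwarded_query = {k: v for k, v in query.items() if k != "apikey"}
        # policies 1 and 3: CORS header outbound, Accelerator key toward backend
        headers = {
            "Access-Control-Allow-Origin": "*",
            "x-api-key": self.fhir_api_key,
        }
        return {"status": 200, "query": forwarded_query, "headers": headers}
```

Replaying the demo in this toy model, 35 requests from one app yield 30 successes and 5 quota rejections, which is exactly the slice the billing filter later removes.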
Announcement
Thomas Dyar · Dec 14, 2021

InterSystems IRIS and IRIS for Health 2021.2 preview is published

Preview releases are now available for the 2021.2 version of InterSystems IRIS, IRIS for Health, and HealthShare Health Connect. As this is a preview release, we are eager to learn from your experiences with this new release ahead of its General Availability release next month. Please share your feedback through the Developer Community so we can build a better product together.

InterSystems IRIS Data Platform 2021.2 makes it even easier to develop, deploy, and manage augmented applications and business processes that bridge data and application silos. It has many new capabilities, including:

Enhancements for application and interface developers:
- Embedded Python
- Interoperability productions in Python
- Updates to the Visual Studio Code ObjectScript Extension Pack
- New Business Services and Operations that let users set up and run SQL queries with minimal custom coding

Enhancements for Analytics and AI:
- The new SQL LOAD command efficiently loads CSV and JDBC source data into tables
- Enhancements to Adaptive Analytics

Enhancements for cloud and operations tasks:
- New Cloud Connectors make it simple to access and use cloud services within InterSystems IRIS applications
- IKO enhancements improve the manageability of Kubernetes resources

Enhancements for database and system administrators:
- Online Shard Rebalancing automates the distribution of data across nodes without interrupting operations
- The Adaptive SQL engine uses fast block sampling and automation to collect advanced table statistics and leverages runtime information for improved query planning
- Storage needs for InterSystems IRIS are reduced with new stream and journal-file compression settings
- Support for TLS 1.3 and OpenSSL 1.1.1, using system-provided libraries
- The new ^TRACE utility reports detailed process statistics such as cache hits and reads

More details on all of these features can be found in the product documentation:

- InterSystems IRIS 2021.2 documentation and release notes
- InterSystems IRIS for Health 2021.2 documentation and release notes
- HealthShare Health Connect 2021.2 documentation and release notes

InterSystems IRIS 2021.2 is a Continuous Delivery (CD) release, which now comes with classic installation packages for all supported platforms, as well as container images in OCI (Open Container Initiative), a.k.a. Docker, container format. Container images are available for OCI-compliant runtime engines for Linux x86-64 and Linux ARM64, as detailed in the Supported Platforms document.

Full installation packages for each product are available from the WRC's product download site. Using the "Custom" installation option enables users to pick the options they need, such as InterSystems Studio and IntegratedML, to right-size their installation footprint. Installation packages and preview keys are available from the WRC's preview download site.

Container images for the Enterprise Edition, Community Edition, and all corresponding components are available from the InterSystems Container Registry using the following commands:

```
docker pull containers.intersystems.com/intersystems/iris:2021.2.0.617.0
docker pull containers.intersystems.com/intersystems/iris-ml:2021.2.0.617.0
docker pull containers.intersystems.com/intersystems/irishealth:2021.2.0.617.0
docker pull containers.intersystems.com/intersystems/irishealth-ml:2021.2.0.617.0
```

For a full list of the available images, please refer to the ICR documentation. Alternatively, tarball versions of all container images are available via the WRC's preview download site. The build number for this preview release is 2021.2.0.617.0.

Interoperability productions with Python and Cloud Connectors? YEEEESSSSSSS. However, containers.intersystems.com is giving out bad credentials... or am I the sole brunt of cruelty here?
```
(base) sween @ dimsecloud-pop-os ~
└─ $ ▶ docker login -u="ron.sweeney@integrationrequired.com" containers.intersystems.com
Password:
Error response from daemon: Get https://containers.intersystems.com/v2/: unauthorized: BAD_CREDENTIAL
```

I was able to get in, for example:

$ docker-ls tags --registry https://containers.intersystems.com intersystems/irishealth
... requesting list . done
repository: intersystems/irishealth
tags:
- 2019.1.1.615.1
- 2020.1.0.217.1
- 2020.1.1.408.0
- 2020.2.0.211.0
- 2020.3.0.221.0
- 2020.4.0.547.0
- 2021.1.0.215.0
- 2021.2.0.617.0

No problem: download some .tar.gz from wrc > previews, then

docker load -i <downloaded>

... off it goes with docker run or a Dockerfile + docker-compose.

And here it is, containers.intersystems.com gone:

$ docker pull containers.intersystems.com/intersystems/irishealth-community:2021.2.0.617.0
Error response from daemon: Get "https://containers.intersystems.com/v2/": Service Unavailable

Could you push those images to Docker Hub, as usual before? It's more stable.

Hi Dmitry, Thanks for the heads-up, we are working to bring containers.intersystems.com back online. The Docker Hub listings will be updated with the 2021.2 preview images in the next day or so, and we will update this announcement when they're available! Kind Regards, Thomas Dyar

Good news -- containers.intersystems.com is back online. Please let us know if you encounter any issues!
Regards, Thomas Dyar

And images with ZPM package manager 0.3.2 are available accordingly:

intersystemsdc/iris-community:2021.2.0.617.0-zpm
intersystemsdc/iris-ml-community:2021.2.0.617.0-zpm
intersystemsdc/iris-community:2021.1.0.215.3-zpm
intersystemsdc/irishealth-community:2021.1.0.215.3-zpm
intersystemsdc/irishealth-ml-community:2021.1.0.215.3-zpm

And to launch IRIS do:

docker run --rm --name my-iris -d --publish 9091:1972 --publish 9092:52773 intersystemsdc/iris-community:2021.2.0.617.0-zpm
docker run --rm --name my-iris -d --publish 9091:1972 --publish 9092:52773 intersystemsdc/iris-ml-community:2021.2.0.617.0-zpm
docker run --rm --name my-iris -d --publish 9091:1972 --publish 9092:52773 intersystemsdc/irishealth-community:2021.2.0.617.0-zpm
docker run --rm --name my-iris -d --publish 9091:1972 --publish 9092:52773 intersystemsdc/irishealth-ml-community:2021.2.0.617.0-zpm

And for terminal do:

docker exec -it my-iris iris session IRIS

and to start the control panel:

http://localhost:9092/csp/sys/UtilHome.csp

To stop and destroy the container do:

docker stop my-iris

And the FROM clause in a Dockerfile can look like:

FROM intersystemsdc/iris-community:2021.2.0.617.0-zpm

Or to take the latest image:

FROM intersystemsdc/iris-community

Excellent! That's comfort. Available on Docker Hub too.

Will we have an arm64 version?

I was trying to install the preview version on Ubuntu 20.04.3 LTS ARM64 (in a VM on a Mac M1), but irisinstall gave the following error. Installing zlib1g-dev did not solve the problem. Could anyone suggest what I am missing?

-----
Your system type is 'Ubuntu LTS (ARM64)'.
zlib1g version 1 is required.
** Installation aborted **

Based on the msg alone, you would need:

sudo apt install zlib1g

Instead of:

sudo apt install zlib1g-dev

Thanks. But it looks like zlib1g is already installed...

$ sudo apt install zlib1g
Reading package lists... Done
Building dependency tree
Reading state information... Done
zlib1g is already the newest version (1:1.2.11.dfsg-2ubuntu1.2).
0 upgraded, 0 newly installed, 0 to remove and 11 not upgraded.

I know this is not the proper way to do it, but I worked around this issue by deleting the line for zlib1g in the file package/requirements_check/requirements.lnxubuntu2004arm64.isc. Looks like the instance is working fine, so I suspect there is something wrong with the requirement checking in the installation.
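The manual edit described above can be sketched as a tiny helper (illustrative only, and just as unsupported as hand-editing the file; the function name is mine), assuming the requirements-check file is plain text with one requirement per line:

```python
def drop_requirement(requirements_text: str, package: str = "zlib1g") -> str:
    """Return the requirements-check file content with every line that
    mentions `package` removed, so the installer skips that check."""
    kept = [line for line in requirements_text.splitlines() if package not in line]
    return "\n".join(kept) + "\n"
```

Applied to requirements.lnxubuntu2004arm64.isc, this drops only the zlib1g line and leaves the other checks in place.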
Announcement
Jeff Fried · Jan 27, 2020

InterSystems IRIS and IRIS for Health 2020.1 preview is published

Preview releases are now available for the 2020.1 version of InterSystems IRIS and IRIS for Health! Kits and Container images are available via the WRC's preview download site. The build number for these releases is 2020.1.0.199.0. (Note: first release was build 197, updated to 199 on 2/12/20)

InterSystems IRIS Data Platform 2020.1 has many new capabilities including:
- Kernel Performance enhancements, including reduced contention for blocks and cache lines
- Universal Query Cache - every query (including embedded & class ones) now gets saved as a cached query
- Universal Shard Queue Manager - for scale-out of query load in sharded configurations
- Selective Cube Build - to quickly incorporate new dimensions or measures
- Security improvements, including hashed password configuration
- Improved TSQL support, including JDBC support
- Dynamic Gateway performance enhancements
- Spark connector update
- MQTT support in ObjectScript

(NOTE: this preview build does not include TLS 1.3 and OpenLDAP updates, which are planned for General Availability)

InterSystems IRIS for Health 2020.1 includes all of the enhancements of InterSystems IRIS. In addition, this release includes:
- In-place conversion to IRIS for Health
- HL7 Productivity Toolkit including Migration Tooling and Cloverleaf conversion
- X12 enhancements
- FHIR R4 base standard support

As this is an EM (Extended Maintenance) release, customers may want to know the differences between 2020.1 and 2019.1. These are listed in the release notes:

InterSystems IRIS 2020.1 release notes
IRIS for Health 2020.1 release notes

Draft documentation can be found here:

InterSystems IRIS 2020.1 documentation
IRIS for Health 2020.1 documentation

The platforms on which InterSystems IRIS and IRIS for Health 2020.1 are supported for development and production are detailed in the Supported Platforms document.

Jeffrey, thank you for the info. Do you already know that the Supported Platforms document link is broken? (404)

Hi Jeff!
What are the Docker image tags for Community Editions? I've just uploaded the Community Editions to the Docker Store (2/13 - updated with new preview build):

docker pull store/intersystems/iris-community:2020.1.0.199.0
docker pull store/intersystems/irishealth-community:2020.1.0.199.0

Thanks, Steve! Will native install kits be available for the Community Editions as well? Yes, full kit versions of the 2020.1 Community Edition Preview are available through the WRC download site as well.

I'm getting this error when I attempt to access the link ... Jeffery

If you don't use that link and first log into the WRC application: https://wrc.intersystems.com/wrc/enduserhome.csp Can you then go to: https://wrc.intersystems.com/wrc/coDistribution2.csp Then select Preview? Some customers have had problems with the distrib pages because their site restricts access to some JS code we get from a third party.

I get the same result using your suggested method, Brendan. I'm not technically a customer; I work for a Services Partner of ISC. I am a DC Moderator though (if that carries any weight), so it would be nice to keep abreast of the new stuff.

OK, I needed to do one more click: your Org does not have a support contract, so you can't have access to these pages, sorry. Maybe Learning Services could help you out, but I can't grant you access to the kits on the WRC.

Hello, I took this for a spin and noticed that the new Prometheus metrics are not available on it like they were in 2019.4? (ie: https://community.intersystems.com/post/monitoring-intersystems-iris-using-built-rest-api). Am I missing something, or is the metrics API still under consideration to make it into this build?

The correct link is https://docs.intersystems.com/iris20201/csp/docbook/platforms/index.html. I fixed the typo in the post. Thanks for pointing that out!

Seems to be there for me...
Hello Jeffrey, We're currently working on IRIS for Health 2020.1 build 197, and we were wondering what fixes or additions went into the latest build 199. InterSystems used to publish all fixes with each FT build version; is there such a list? Thank you, Yuriy

The Preview has been updated with build 2020.1.0.199.0. This includes a variety of changes, primarily corrections for issues found under rare conditions in install, upgrade, and certain distributed configurations. None of these changes impacts any published API. Thank you for working with the preview and for your feedback!

Hi Yuriy - Thanks for pointing this out. We did not prepare a list for this, but I did make a comment on this thread, including verifying that none of these changes impacts any published API. If there is a change resolving an issue you reported through the WRC, you'll see that this is resolved via the normal process. We will be publishing detailed change notes with the GA release. -Jeff
Article
sween · Mar 4, 2024

InterSystems IRIS® CloudSQL Metrics to Google Cloud Monitoring

If you are a customer of the new InterSystems IRIS® Cloud SQL and InterSystems IRIS® Cloud IntegratedML® cloud offerings and want to access the metrics of your deployments and send them to your own observability platform, here is a quick and dirty way to get it done by sending the metrics to Google Cloud Platform Monitoring (formerly Stackdriver). The Cloud portal does contain an at-a-glance view of some top-level metrics, powered by a metrics endpoint that is exposed to you, but without some inspection you would not know it was there.

🚩 This approach is most likely taking advantage of a "to be named feature", so with that being said, it is not future-proof and definitely not supported by InterSystems.

So what if you wanted a more comprehensive set exported? This technical article/example shows a technique to scrape and forward metrics to observability; it can be modified to suit your needs, to scrape ANY metrics target and send to ANY observability platform using the OpenTelemetry Collector. The mechanics leading up to the above result can be accomplished in many ways, but here we are standing up a Kubernetes pod to run a Python script in one container and the OTel Collector in another to pull and push the metrics... definitely a choose-your-own-adventure, but for this example and article, k8s is the actor pulling this off with Python.

Steps:
- Prereqs
- Python
- Container
- Kubernetes
- Google Cloud Monitoring

Prerequisites:
- An active subscription to IRIS® Cloud SQL
- One deployment, running, optionally with IntegratedML
- Secrets to supply to your environment

Environment Variables

Obtain Secrets

I dropped this in a teaser as it is a bit involved and somewhat off target of the point, but these are the values you will need to generate the secrets.
ENV IRIS_CLOUDSQL_USER 'user'
ENV IRIS_CLOUDSQL_PASS 'pass'
☝ These are your credentials for https://portal.live.isccloud.io

ENV IRIS_CLOUDSQL_USERPOOLID 'userpoolid'
ENV IRIS_CLOUDSQL_CLIENTID 'clientid'
ENV IRIS_CLOUDSQL_API 'api'
☝ These you have to dig out of the development tools for your browser: `aud` = clientid, `iss` = userpoolid, `api` = request url

ENV IRIS_CLOUDSQL_DEPLOYMENTID 'deploymentid'
☝ This can be derived from the Cloud Service Portal

Python:

Here is the Python hackery to pull the metrics from the Cloud Portal and export them locally as metrics for the otel collector to scrape:

iris_cloudsql_exporter.py

import time
import os
import requests
import json
from warrant import Cognito
from prometheus_client.core import GaugeMetricFamily, REGISTRY, CounterMetricFamily
from prometheus_client import start_http_server
from prometheus_client.parser import text_string_to_metric_families


class IRISCloudSQLExporter(object):

    def __init__(self):
        self.access_token = self.get_access_token()
        self.portal_api = os.environ['IRIS_CLOUDSQL_API']
        self.portal_deploymentid = os.environ['IRIS_CLOUDSQL_DEPLOYMENTID']

    def collect(self):
        # Requests fodder
        url = self.portal_api
        deploymentid = self.portal_deploymentid
        print(url)
        print(deploymentid)
        headers = {
            'Authorization': self.access_token,  # needs to be refresh_token, eventually
            'Content-Type': 'application/json'
        }
        metrics_response = requests.request("GET", url + '/metrics/' + deploymentid, headers=headers)
        metrics = metrics_response.content.decode("utf-8")
        for iris_metrics in text_string_to_metric_families(metrics):
            for sample in iris_metrics.samples:
                labels_string = "{1}".format(*sample).replace('\'', "\"")
                labels_dict = json.loads(labels_string)
                labels = list(labels_dict.keys())  # label names, one per key
                if len(labels) > 0:
                    g = GaugeMetricFamily("{0}".format(*sample), 'Help text', labels=labels)
                    g.add_metric(list(labels_dict.values()), "{2}".format(*sample))
                else:
                    g = GaugeMetricFamily("{0}".format(*sample), 'Help text', labels=labels)
                    g.add_metric([""], "{2}".format(*sample))
                yield g

    def get_access_token(self):
        try:
            user_pool_id = os.environ['IRIS_CLOUDSQL_USERPOOLID']  # isc iss
            username = os.environ['IRIS_CLOUDSQL_USER']
            password = os.environ['IRIS_CLOUDSQL_PASS']
            clientid = os.environ['IRIS_CLOUDSQL_CLIENTID']  # isc aud
            print(user_pool_id)
            print(username)
            print(password)
            print(clientid)
            try:
                u = Cognito(
                    user_pool_id=user_pool_id,
                    client_id=clientid,
                    user_pool_region="us-east-2",  # needed by warrant, should be derived from poolid doh
                    username=username
                )
                u.authenticate(password=password)
            except Exception as p:
                print(p)
        except Exception as e:
            print(e)
        return u.id_token


if __name__ == '__main__':
    start_http_server(8000)
    REGISTRY.register(IRISCloudSQLExporter())
    while True:
        REGISTRY.collect()
        print("Polling IRIS CloudSQL API for metrics data....")
        # looped e loop
        time.sleep(120)

Docker:

Dockerfile

FROM python:3.8
ADD src /src
RUN pip install prometheus_client
RUN pip install requests
WORKDIR /src
ENV PYTHONPATH '/src/'
ENV PYTHONUNBUFFERED=1
ENV IRIS_CLOUDSQL_USERPOOLID 'userpoolid'
ENV IRIS_CLOUDSQL_CLIENTID 'clientid'
ENV IRIS_CLOUDSQL_USER 'user'
ENV IRIS_CLOUDSQL_PASS 'pass'
ENV IRIS_CLOUDSQL_API 'api'
ENV IRIS_CLOUDSQL_DEPLOYMENTID 'deploymentid'
RUN pip install -r requirements.txt
CMD ["python", "/src/iris_cloudsql_exporter.py"]

docker build -t iris-cloudsql-exporter .
docker image tag iris-cloudsql-exporter sween/iris-cloudsql-exporter:latest
docker push sween/iris-cloudsql-exporter:latest

Deployment:

k8s; create us a namespace:

kubectl create ns iris

k8s; add the secret:

kubectl create secret generic iris-cloudsql -n iris \
  --from-literal=user=$IRIS_CLOUDSQL_USER \
  --from-literal=pass=$IRIS_CLOUDSQL_PASS \
  --from-literal=clientid=$IRIS_CLOUDSQL_CLIENTID \
  --from-literal=api=$IRIS_CLOUDSQL_API \
  --from-literal=deploymentid=$IRIS_CLOUDSQL_DEPLOYMENTID \
  --from-literal=userpoolid=$IRIS_CLOUDSQL_USERPOOLID

otel; create the config:

apiVersion: v1
data:
  config.yaml: |
    receivers:
      prometheus:
        config:
          scrape_configs:
            - job_name: 'IRIS CloudSQL'
              # Override the global default and scrape targets from this job every 30 seconds.
              scrape_interval: 30s
              scrape_timeout: 30s
              metrics_path: /
              static_configs:
                - targets: ['192.168.1.96:5000']
    exporters:
      googlemanagedprometheus:
        project: "pidtoo-fhir"
    service:
      pipelines:
        metrics:
          receivers: [prometheus]
          exporters: [googlemanagedprometheus]
kind: ConfigMap
metadata:
  name: otel-config
  namespace: iris

k8s; load the otel config as a configmap:

kubectl -n iris create configmap otel-config --from-file config.yaml

k8s; deploy a load balancer (definitely optional), MetalLB. I do this to scrape and inspect from outside of the cluster.
cat <<EOF | kubectl apply -n iris -f -
apiVersion: v1
kind: Service
metadata:
  name: iris-cloudsql-exporter-service
spec:
  selector:
    app: iris-cloudsql-exporter
  type: LoadBalancer
  ports:
    - protocol: TCP
      port: 5000
      targetPort: 8000
EOF

gcp; we need the keys to Google Cloud, and the service account needs to be scoped roles/monitoring.metricWriter:

kubectl -n iris create secret generic gmp-test-sa --from-file=key.json=key.json

k8s; the deployment/pod itself, two containers:

deployment.yaml

apiVersion: apps/v1
kind: Deployment
metadata:
  name: iris-cloudsql-exporter
  labels:
    app: iris-cloudsql-exporter
spec:
  replicas: 1
  selector:
    matchLabels:
      app: iris-cloudsql-exporter
  template:
    metadata:
      labels:
        app: iris-cloudsql-exporter
    spec:
      containers:
        - name: iris-cloudsql-exporter
          image: sween/iris-cloudsql-exporter:latest
          ports:
            - containerPort: 5000
          env:
            - name: "GOOGLE_APPLICATION_CREDENTIALS"
              value: "/gmp/key.json"
            - name: IRIS_CLOUDSQL_USERPOOLID
              valueFrom:
                secretKeyRef:
                  name: iris-cloudsql
                  key: userpoolid
            - name: IRIS_CLOUDSQL_CLIENTID
              valueFrom:
                secretKeyRef:
                  name: iris-cloudsql
                  key: clientid
            - name: IRIS_CLOUDSQL_USER
              valueFrom:
                secretKeyRef:
                  name: iris-cloudsql
                  key: user
            - name: IRIS_CLOUDSQL_PASS
              valueFrom:
                secretKeyRef:
                  name: iris-cloudsql
                  key: pass
            - name: IRIS_CLOUDSQL_API
              valueFrom:
                secretKeyRef:
                  name: iris-cloudsql
                  key: api
            - name: IRIS_CLOUDSQL_DEPLOYMENTID
              valueFrom:
                secretKeyRef:
                  name: iris-cloudsql
                  key: deploymentid
        - name: otel-collector
          image: otel/opentelemetry-collector-contrib:0.92.0
          args:
            - --config
            - /etc/otel/config.yaml
          volumeMounts:
            - mountPath: /etc/otel/
              name: otel-config
            - name: gmp-sa
              mountPath: /gmp
              readOnly: true
          env:
            - name: "GOOGLE_APPLICATION_CREDENTIALS"
              value: "/gmp/key.json"
      volumes:
        - name: gmp-sa
          secret:
            secretName: gmp-test-sa
        - name: otel-config
          configMap:
            name: otel-config

kubectl -n iris apply -f deployment.yaml

Running

Assuming nothing is amiss, let's peruse the namespace and see how we are doing.
✔ 2 config maps, one for GCP, one for otel
✔ 1 load balancer
✔ 1 pod, 2 containers
✔ successful scrapes

Google Cloud Monitoring

Inspect observability to see if the metrics are arriving ok and be awesome in observability!
Announcement
AYUSH Shetty · May 18

Do you have any openings for InterSystems developer Job

I am writing to express my interest in the "IRIS Ensemble Integration" role. I have 2 years of experience as an Ensemble IRIS Developer, working with Ensemble and IRIS for integration, server management, and application development. Looking for more opportunities to work with IRIS, Caché, and ObjectScript.
Article
Ben Spead · May 21

ISCLauncher - Get immediate access to InterSystems knowledge and support records!

For over 15 years I have been playing with ways to speed up the way I use InterSystems systems and technology via AutoHotkey scripting. As a power keyboard user (I avoid my mouse when possible), I found it very helpful to set up hotkeys to get to my most frequently accessed systems and research utilities as quickly as possible. While I have used this approach for many years, this is the first time that I am introducing my approach and a customer-facing hotkey script to the D.C. and OEx...

ISCLauncher is a hotkey program based on AutoHotkey (a Windows scripting language) which provides quick access to a number of useful InterSystems resources and online systems (Windows OS only). Use it to quickly access the following InterSystems resources:

Documentation Search
Developer Community Search
D.C. A.I.
Online Learning
WRC Issues
iService Issues
CCR records
... plus more!

To try it out for yourself, use the "Download" button on the ISCLauncher Open Exchange listing, which will pull down a Zip file from which you can extract the contents. Run ISCLauncher.exe and you will see the following in your Windows SysTray:

To pull up the Help screen so you can see all of the things that it can do, once you are running ISCLauncher, press [Ctrl]+[Windows]+[?]:

The power of ISCLauncher is that it can turn plain text into a hyperlink. E.g., if you have an ID from a WRC, iService or CCR record in an email, chat or notes, simply highlighting the record ID and using the appropriate hotkey will allow you to jump directly to that record. See a demo for a CCR lookup below ([Ctrl]+[Windows]+[c]):

To access a record even faster, use the ISC Uber-Key ([Ctrl]+[Windows]+[Space]) to try to automatically determine the record type and navigate immediately there (credit to @Chad.Severtson for the original ISC Uber-Key code from years ago!).
In addition to WRC, iService and CCR records - do quick searches against things like InterSystems Documentation ([Ctrl]+[Windows]+[b] ) or the Developer Community ([Ctrl]+[Windows]+[d]): Make this tool even more powerful by adding your own hotkeys for things that you frequently type or open on your desktop. For some inspiration, here is my personal launcher which I have tuned over the years: Have ideas how to make this more powerful? Add comments below. Also, once this is in GitHub you can feel free to create PRs with your suggestions.
Announcement
Anastasia Dyubaylo · Jun 2

Winners of the InterSystems FHIR and Digital Health Interoperability Contest 2025

Hi Community, It's time to announce the winners of the InterSystems FHIR and Digital Health Interoperability Contest! Thanks to all our amazing participants who submitted 11 applications 🔥 Now it's time to announce the winners! Experts Nomination 🥇 1st place and $5,000 go to the FHIRInsight app by @José.Pereira, @henry, @Henrique 🥈 2nd place and $2,500 go to the iris-fhir-bridge app by @Muhammad.Waseem 🥉 3rd place and $1,000 go to the health-gforms app by @Yuri.Gomes 🏅 4th place and $500 go to the fhir-craft app by @Laura.BlázquezGarcía 🏅 5th place and $300 go to the CCD Data Profiler app by @Landon.Minor 🌟 $100 go to the IRIS Interop DevTools app by @Chi.Nguyen-Rettig 🌟 $100 go to the hc-export-editor app by @Eric.Fortenberry 🌟 $100 go to the iris-medbot-guide app by @shan.yue 🌟 $100 go to the Langchain4jFhir app by @ErickKamii 🌟 $100 go to the ollama-ai-iris app by @Oliver.Wilms Community Nomination 🥇 1st place and $1,000 go to the iris-medbot-guide app by @shan.yue 🥈 2nd place and $600 go to the FHIRInsight app by @José.Pereira, @henry, @Henrique 🥉 3rd place and $300 go to the FhirReportGeneration app by @XININGMA 🏅 4th place and $200 go to the iris-fhir-bridge app by @Muhammad.Waseem 🏅 5th place and $100 go to the fhir-craft app by @Laura.BlázquezGarcía Our sincerest congratulations to all the winners! Join the fun next time ;)
Announcement
Bob Kuszewski · Jun 20

InterSystems API Manager (IAM) 3.10 Release Announcement

InterSystems is pleased to announce that IAM 3.10 has been released. IAM 3.10 is the first significant release in about 18 months, so it includes many significant new features that are not available in IAM 3.4, including:

- Added support for incremental config sync for hybrid mode deployments. Instead of sending the entire entity config to data planes on each config update, incremental config sync lets you send only the changed configuration to data planes.
- Added the new configuration parameter admin_gui_csp_header to Gateway, which controls the Content-Security-Policy (CSP) header served with Kong Manager. This defaults to off, and you can opt in by setting it to on. You can use this setting to strengthen security in Kong Manager.
- AI RAG Injector (ai-rag-injector): Added the AI RAG Injector plugin, which allows automatically injecting documents to simplify building RAG pipelines.
- AI Sanitizer (ai-sanitizer): Added the AI Sanitizer plugin, which can sanitize the PII information in requests before the requests are proxied by the AI Proxy or AI Proxy Advanced plugins.
- Kafka Consume (kafka-consume): Introduced the Kafka Consume plugin, which adds Kafka consumption capabilities to Kong Gateway.
- Redirect (redirect): Introduced the Redirect plugin, which lets you redirect requests to another location.
- … and many more

Customers upgrading from earlier versions of IAM must get a new IRIS license key in order to use IAM 3.10. Kong has changed their licensing in a way that requires us to provide you with new license keys. When you are upgrading IAM, you will need to install the new IRIS license key on your IRIS server before starting IAM 3.10.

IAM 2.8 has reached its end-of-life, and current customers are strongly encouraged to upgrade as soon as possible. IAM 3.4 will reach end-of-life in 2026, so start planning that upgrade soon.
IAM is an API gateway between your InterSystems IRIS servers and applications, providing tools to effectively monitor, control, and govern HTTP-based traffic at scale. IAM is available as a free add-on to your InterSystems IRIS license.

IAM 3.10 can be downloaded from the Components area of the WRC Software Distribution site. Follow the Installation Guide for guidance on how to download, install, and get started with IAM. The complete IAM 3.10 documentation gives you more information about IAM and using it with InterSystems IRIS. Our partner Kong provides further documentation on using IAM in the Kong Gateway (Enterprise) 3.10 documentation.

IAM is only available in OCI (Open Container Initiative), a.k.a. Docker, container format. Container images are available for OCI-compliant run-time engines for Linux x86-64 and Linux ARM64, as detailed in the Supported Platforms document.

The build number for this release is IAM 3.10.0.2. This release is based on Kong Gateway (Enterprise) version 3.10.0.2.
Announcement
Irène Mykhailova · Jun 23

Stream / watch keynotes from the InterSystems Ready 2025 online

Hi Community! We have great news for those of you who are interested in what's happening at the InterSystems Ready 2025 but couldn't attend in person. All the keynotes are being streamed! Moreover, you can watch them afterwards if they happen at an inopportune time. Keynotes from Day 1 are already ready 😉 And don't forget to check out the rest of the keynotes: Keynotes from day 2 Keynotes from day 3 It promises to be epic!
Announcement
Olga Zavrazhnova · Jun 19, 2019

How to earn points on InterSystems Global Masters Advocate Hub

It’s no secret that the InterSystems Global Masters program is integrated with Developer Community, Open Exchange, and Ideas Portal. Whenever you contribute to any of these platforms, you automatically earn points and badges on Global Masters. We’ve created a short guide to help you discover the best ways to earn points on Global Masters:

Please note that points are automatically awarded on the 4th day after you make a contribution on DC, OEX, or the Ideas Portal (activities made outside of the Global Masters platform).

HOW TO EARN POINTS ON GLOBAL MASTERS

Each published post on Developer Community: 200
Published post on DC ES / PT / JP / CN / FR: 400
1st comment on DC / each comment*: 300 / 30
Comment on DC ES / PT / JP / CN / FR: 60
1st answer marked as Accepted / each accepted answer: 1 000 / 150
Translate an article / a question: 150 / 30
Publish 1 / 5 / 10 / 25 / 50 articles on DC: 1 500 / 7 500 / 15 000 / 40 000 / 75 000
First published question on DC: 500
Publish 1 / 5 / 10 / 25 / 50 questions on DC: 500 / 2 000 / 5 000 / 15 000 / 30 000
Each application on Open Exchange: 800
Bonus points for each ZPM application: 400
Publish 1 / 5 / 10 / 25 applications on Open Exchange: 1 000 / 10 000 / 25 000 / 75 000
1 / 5 / 10 / 25 / 50 Accepted Answers on DC: 1 000 / 4 000 / 8 000 / 20 000 / 40 000
Bonus points for each of your DC posts that gathered 750+ / 2000+ / 5000+ / 15000+ views: 200 / 500 / 1000 / 3000
Read an article on DC: 10
Watch a video: 10
Share an article / video in social networks: 50
Write 1 / 2 / 3 / 4 / 5 articles with the Best Practices tag: 1000 / 3000 / 7000 / 10 000 / 15 000
50 / 100 / 250 / 500 / 1000 application downloads on Open Exchange: 2 500 / 5 000 / 7 500 / 12 500 / 25 000
Make a review for InterSystems / InterSystems products: 2 000 - 3 000
Invite your colleague to Developer Community: 1000
Create a video about your OEX application: 3000

*counted only comments that were published after registration on the Global Masters Advocate Hub.
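As an illustrative sketch of how the milestone rows add up (the point values are transcribed from the table above, but the summing rule is my reading of the challenges, not an official Global Masters formula):

```python
# Milestone rows for publishing articles on DC, from the table above:
# 1 / 5 / 10 / 25 / 50 articles -> 1 500 / 7 500 / 15 000 / 40 000 / 75 000 points.
ARTICLE_MILESTONES = {1: 1500, 5: 7500, 10: 15000, 25: 40000, 50: 75000}

def article_milestone_points(articles_published: int) -> int:
    """Sum the milestone bonuses unlocked so far, assuming each milestone
    challenge is awarded once when its threshold is reached."""
    return sum(points for threshold, points in ARTICLE_MILESTONES.items()
               if articles_published >= threshold)

print(article_milestone_points(12))  # 1500 + 7500 + 15000 = 24000
```

Under that reading, a member with 12 published articles has passed the 1-, 5-, and 10-article milestones.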
Complete challenges, get badges, and climb up the levels: Insider > Advocate > Specialist > Expert > Ambassador > Legend.

**Please note the level system is not available on the new Global Masters platform starting from April 2024. We are working on bringing it back!

The higher level you are, the more interesting prizes are available! And... please check the additional information about Global Masters:

What is Global Masters? Start Here
Global Masters Badges Descriptions
Global Masters Levels Descriptions

If you have not joined the InterSystems Global Masters Advocacy Hub yet, let's get started right now! Feel free to ask your questions in the comments to this post.

Thanks, Anastasia! Very helpful! I believe we also have a series upon the number of accepted answers, like 10, 25, 50, 100 accepted answers. Do we?

Thank you for this quick reference table (and for my *looks up amount of points for comments* 30 points!)

Hi Evgeny, let me answer - we do not have such a series so far, and I think it would be good to have such series & badges to recognize the authors.

Are these automated in any way? Wondering if mine is bugged, because I've certainly posted questions and comments before, but those badges were never unlocked. Their descriptions below say "first" question/comment and I don't know if mine are being detected: https://community.intersystems.com/post/changes-global-masters-program-new-level-new-badges-new-possibilities

Hi David! This should be automatic. We'll investigate.

I wrote a post on DC in 2017? Do I have to 'register' it to get points on Global Masters? Kind regards, Stephen

Hi David! We have fixed this issue. Thank you for the feedback! Thank you! You're very quick!

Hi Stephen, I see you have joined Global Masters recently, that is so great!
- this post is not counted in the "Write a post on Developer Community" challenge (100 points), as it was published before you registered
- it is counted in all other types of challenges listed above, e.g. "Write 10 posts on DC"
This was really helpful. Thank you!

This is an excellent article and is worth bumping the thread :)

Great! This is very helpful!

This is helpful. Thank you!

"Invite your colleague to Developer Community" - is there a formal way to do this via the D.C. interface? I looked around and couldn't seem to find an 'invite a friend' option or anything like that. I have some colleagues whom I think would benefit from getting involved in the D.C. (CC: @Anastasia.Dyubaylo / @Evgeny.Shvarov)

Hi @Benjamin.Spead, you can do that via this Global Masters challenge (this challenge is currently in your "Later" tab).

Thank you @Olga.Zavrazhnova2637! I knew I had seen it somewhere at some point. I just had a conversation with a new colleague yesterday about the value of the D.C. and Global Masters, so I will send her an invite :)

It's a good idea! Do you mean to have a UI on DC to prepare an email invitation to join DC for a friend developer, with a standard invitation text?

This was more to figure out the proper way to do this in order for tracking for the badge, etc. on the G.M. platform. It makes sense that it needs to originate in a challenge (and thank you to Olga for pointing that out). I don't think that just having a form on the D.C. to invite a friend necessarily makes sense, as anyone can just shoot a friend an email with the link. If others would like to see this as a new feature, I won't object though.

Hello, please, can you explain how to translate an article/question? Regards

You can see the language of the article in the upper left side of the window; click on it and a list of languages will be displayed. Select the language to translate the article into, and a new window will be shown with 2 options: translate and request translation. Select the first and you will be able to translate the article.

Oh, I see. Unfortunately, there is no Italian language available.

You can make a suggestion in the Ideas Portal. I think there is enough support there. So?? @Luca.Ravazzolo ??
Thank you for the table! Thanks, Anastasia! Very helpful! Thanks for the help.

Hello! Thank you very much for remembering these points! I was reading and noticed that the option "Share an article / video in Social Networks" is no longer available in publications; I think this referred to the old platform, right?

Hi Marcelo! Yes, the option to share was available for any article and video on the old platform. On the new platform, we still have some articles/videos for sharing, but only for selected content. Social sharing “asks” are tagged with the “social share” tag on the platform when available. Thank you!!

Thanks for the tips. Thanks for the information, it is very useful. Very helpful, thanks.

I was curious, how long should it take for there to be an update to your points after reading an article / posting a reply? Is this something that should happen immediately, or take some time?

This is so cool! Can't wait to get active in the InterSystems community and earn some points.

Hi Henry! The points are awarded on the 4th day after you post a comment or article; this delay is intentional, for moderation purposes. However, if you notice a delay longer than that, please let me know. That could indicate a possible issue with the integration between your profiles on DC and GM that we may need to look into 😊

Hi Olga! Thank you for the quick response. That definitely makes sense.

Hi Olga, I had a question about the point structure regarding comments. Do you get 300 points for being the first to comment on a post, or is it 300 points for your first-ever comment, then 30 points for every subsequent one? Thanks for your help!

That’s a good question! You get 300 points for your first-ever comment on DC, and it comes with a badge too. Then, 30 points for every subsequent one. We’ll update the table to make this clearer. Now I really like the idea of awarding bonus points to the author of the first answer to a question 😄; maybe we should introduce something like that!
Thanks for the clarification! And I agree, that would be a good way to incentivize initiation on posts. I also think you should give points to those who suggested great ideas on how to score points 😄