Announcement
Anastasia Dyubaylo · Jul 9

Kick-off Webinar for InterSystems Developer Tools Contest 2025

Hey Community,

We're pleased to invite all developers to the upcoming kick-off webinar for the InterSystems Developer Tools Contest!

Date & Time: Monday, July 14 – 11 am EDT | 5 pm CEST

Discover the exciting challenges and opportunities that await developers in this contest. We will also discuss the topics we would like the participants to cover and show you how to develop, build, and deploy applications using the InterSystems IRIS data platform.

Speakers:
🗣 @Raj.Singh5479, Project Manager - Developer Experience, InterSystems
🗣 @Evgeny.Shvarov, Senior Startups and Community Programs Manager, InterSystems
🗣 @DKG, Developer Relations Evangelist, InterSystems

✅ Register for the kick-off today!
Announcement
Evgeny Shvarov · Jul 31

Technology Bonuses Results for the InterSystems Developer Tools Contest 2025

We are happy to present the bonuses page for the applications submitted to the InterSystems Developer Tools Contest 2025! See the results below.

Bonus categories and nominal points: Vector Search (3), Embedded Python (3), Interoperability (3), IRIS BI (3), VSCode Plugin (3), FHIR Tools (3), Docker (2), IPM (2), Community Idea Implementation (4), Find Embedded Python bug (2), First Article on DC (2), Second Article on DC (1), Video on YouTube (3), First Time Contribution (3); maximum 37.

| Project | Points awarded | Total bonus |
| --- | --- | --- |
| Interoperability REST API Template | 3 + 3 | 6 |
| iristest-html | 3 | 3 |
| Global-Inspector | 2 + 2 + 2 + 3 | 9 |
| InterSystems Testing Manager for VS Code | 3 + 4 + 2 + 1 + 3 | 13 |
| IPM Explorer for VSCode | 3 + 2 + 3 + 3 | 11 |
| typeorm-iris | 2 + 4 + 2 + 3 | 11 |
| addsearchtable | 3 | 3 |
| iris-message-search | 3 + 2 + 2 | 7 |
| iris4word | 3 + 3 + 3 + 2 + 2 + 4 + 2 + 1 + 3 | 23 |
| PyObjectscript Gen | | 0 |
| IrisTest | 2 + 3 | 5 |
| dc-artisan | 3 + 3 + 3 + 3 + 2 + 2 + 2 + 1 + 3 | 22 |
| templated_email | 3 + 3 + 2 + 2 + 2 | 12 |
| wsgi-to-zpm | 3 + 2 | 5 |
| toolqa | 3 + 3 + 2 + 2 + 1 | 11 |
| dataset-sample-split | 3 + 2 + 3 | 8 |
| snipforge | 3 + 3 | 6 |

Please post any new implementations or corrections here in the comments or in Discord.

Hi @Evgeny.Shvarov, how are you? What about this item? It could be a new article or a translation, right? "The second article on Developer Community – 1 point. You can collect one more bonus point for the second article or the translation of the first article. The second article should go into detail about a feature of your project. The 3rd and further articles will not bring more points, but the attention will be all yours. *This bonus is subject to the discretion of the experts, whose decision is final."

Hi! Right, it can be a translation of the first article, or you can provide a new article if you want.

@Semion.Makarov, this is exactly what I'm questioning: whether any other detail may have prevented the score from being received.

Andre, you can choose either option, a translation or a new article; the only requirements are that the theme relates to the app and that a new article is longer than 1,000 characters. No other restrictions.

So the point was not computed. OK.
Please provide a link to the translation and I'll update the table.

https://pt.community.intersystems.com/post/toolqa-espera-acabou Thank you, @Semion.Makarov

Done

Hi @Semion.Makarov, I'm not sure if we can claim points for "Community Idea Implementation" on dc-artisan. We don't literally implement the idea from https://ideas.intersystems.com/ideas/DPI-I-557, but the rag-pipeline mode was heavily inspired by it. Our goal was to showcase embedded chunks, management, and vector ingestion, which aligns closely with the core concept of that idea. It's not a direct, word-for-word implementation of a GUI like vectorAdmin, but the underlying idea is very similar.

Hi! It's pretty cool that you were inspired by one of the ideas! But that idea requires a UI for vector DB management, and your app doesn't follow the idea's main line.

The video for InterSystems Testing Manager for VS Code is now on YouTube.

Thank you! The bonus was added.

Can I get a bonus for the Community Idea Implementation? As part of the main project, to properly test across multiple versions of IRIS, I implemented another project and wrote an article about it, with examples from the typeorm-iris project.

I know that only Python matters nowadays, but what about the other languages officially supported by InterSystems? I found significant bugs in the Node.js driver, and it is not worth anything.

Published a video about typeorm-iris.

I think IPM Explorer for VSCode has been incorrectly awarded the IPM bonus. From the bonuses article: this entry isn't published as an IPM package but as a VS Code extension on Microsoft's Marketplace.

Hi! I've added the bonus for the idea. About languages: in this contest we have bonuses only for Embedded Python.

Hi! I've updated the bonuses, thank you!

Hi! The judges decided to add this bonus to this app because it is directly related to IPM.

@Dmitry.Maslennikov Could you please add this link to your OEX app?

First article for dc-artisan: https://community.intersystems.com/node/585174

I've added the bonus. Thank you!
Hmm, seems unfair they didn't take that view last year for my IPM-in-VS-Code entry: https://community.intersystems.com/post/technological-bonuses-results-developer-tools-contest-2024

Ask @Evgeny.Shvarov for details.

I discussed that with Evgeny and the team, and we decided to remove this bonus. @John.Murray

Hi @Evgeny.Shvarov, dc-artisan now has a YouTube promo/demo video.

Hi! The bonus was added.

Hi @Semion.Makarov, second article for dc-artisan: https://pt.community.intersystems.com/post/do-barro-à-obra-prima-conheça-o-dc-artisan-e-crie-prompts-com-qualidade

Hi! The bonus was added. Thanks!
Announcement
Anastasia Dyubaylo · Aug 5

Online Meetup with the Winners of the InterSystems Developer Tools Contest 2025

Hi Community,

Let's meet at the online meetup with the winners of the InterSystems Developer Tools Contest! It's a great opportunity to chat with the InterSystems Experts team and our contestants. Winners' demos included!

Date & Time: Friday, August 8, 11:30 am EDT | 5:30 pm CEST

Join us to learn more about the winners' applications and to talk with our experts.

➡️ REGISTER TODAY

See you all at our virtual meetup!

Please join the meetup in 5 minutes: https://riverside.fm/webinar/directlink/eyJzbHVnIjoiZGV2ZWxvcGVyLXJlbGF0aW9ucy1zdHVkaW8iLCJldmVudElkIjoiNjg4OGMxNzExYTdhNjk0N2ZjNjIzMzk5IiwicHJvamVjdElkIjoiNjg4OGMxNzExYTdhNjkyY2VlNjIzMzk2In0=
Announcement
Jess Jowdy · Aug 18

[Demo Video] Introducing InterSystems Go tHE DIStance

#InterSystems Demo Games entry

⏯️ Introducing InterSystems Go tHE DIStance

InterSystems Go tHE DIStance is a next-generation care management solution aimed at helping health plans address rising unmanaged healthcare costs and declining quality ratings, particularly in chronic condition management like diabetes. The platform integrates rich, real-time clinical data (rather than relying solely on delayed claims data) into interactive views that can track HEDIS performance, identify care gaps, and support targeted, personalized outreach through built-in communication tools. Care managers can access a unified clinical record, engagement history, and recommended next actions to improve both individual patient outcomes and broader population health.

Presenters:
🗣 @Jessica.Jowdy, Manager of Healthcare Sales Engineering, InterSystems
🗣 @James.Carney, Team Lead of Emerging Markets & SMB Sales Engineering, InterSystems
🗣 @Clayton.Lewis, Senior Technical Specialist, InterSystems

🗳 If you like this video, don't forget to vote for it in the Demo Games!
Announcement
Bob Kuszewski · Jun 20

InterSystems API Manager (IAM) 3.10 Release Announcement

InterSystems is pleased to announce that IAM 3.10 has been released. IAM 3.10 is the first major release in about 18 months, so it includes many significant new features that are not available in IAM 3.4, including:

- Incremental config sync for hybrid mode deployments. Instead of sending the entire entity config to data planes on each config update, incremental config sync lets you send only the changed configuration to data planes.
- The new Gateway configuration parameter admin_gui_csp_header, which controls the Content-Security-Policy (CSP) header served with Kong Manager. It defaults to off, and you can opt in by setting it to on to strengthen security in Kong Manager.
- AI RAG Injector (ai-rag-injector): a new plugin that automatically injects documents to simplify building RAG pipelines.
- AI Sanitizer (ai-sanitizer): a new plugin that can sanitize PII in requests before the requests are proxied by the AI Proxy or AI Proxy Advanced plugins.
- Kafka Consume (kafka-consume): a new plugin that adds Kafka consumption capabilities to Kong Gateway.
- Redirect (redirect): a new plugin that lets you redirect requests to another location.
- … and many more

Customers upgrading from earlier versions of IAM must get a new IRIS license key in order to use IAM 3.10. Kong has changed their licensing in a way that requires us to provide you with new license keys. When you are upgrading IAM, you will need to install the new IRIS license key on your IRIS server before starting IAM 3.10.

IAM 2.8 has reached its end of life, and current customers are strongly encouraged to upgrade as soon as possible. IAM 3.4 will reach end of life in 2026, so start planning that upgrade soon.
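For reference, opting in to the new CSP header looks like a one-line change in Kong's configuration file (a sketch only; standard kong.conf property syntax is assumed, and Kong settings can typically also be supplied via a corresponding KONG_-prefixed environment variable):

```
# kong.conf: opt in to serving the Content-Security-Policy header with Kong Manager
admin_gui_csp_header = on
```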
IAM is an API gateway between your InterSystems IRIS servers and applications, providing tools to effectively monitor, control, and govern HTTP-based traffic at scale. IAM is available as a free add-on to your InterSystems IRIS license.

IAM 3.10 can be downloaded from the Components area of the WRC Software Distribution site. Follow the Installation Guide for guidance on how to download, install, and get started with IAM. The complete IAM 3.10 documentation gives you more information about IAM and using it with InterSystems IRIS. Our partner Kong provides further documentation on using IAM in the Kong Gateway (Enterprise) 3.10 documentation.

IAM is only available in OCI (Open Container Initiative), a.k.a. Docker, container format. Container images are available for OCI-compliant runtime engines for Linux x86-64 and Linux ARM64, as detailed in the Supported Platforms document.

The build number for this release is IAM 3.10.0.2. This release is based on Kong Gateway (Enterprise) version 3.10.0.2.
Article
Liam Evans · Jul 14

Hosting a Flask REST API on InterSystems IRIS using WSGI

For my intern project, I am building a Flask REST API backend. My goal is to host it on InterSystems IRIS using the WSGI interface. This is a relatively new approach and is currently used in only a handful of projects, such as AskMe. To help others get started, I decided to write this article to simplify the process.

Creating a Basic Flask App

First, let's create a minimal Flask application:

```python
from flask import Flask
from flask_cors import CORS

app = Flask(__name__)
CORS(app)

@app.route('/test')
def test():
    return "Test"

if __name__ == "__main__":
    app.run()
```

This simple app runs a Flask server with one API endpoint at /test that returns the text "Test". Let's analyze the code line by line:

```python
from flask import Flask
from flask_cors import CORS
```

We are using Flask as our web framework to build the REST API, so we import Flask. CORS (Cross-Origin Resource Sharing) is imported via flask_cors to allow requests from different domains (important for frontend apps hosted elsewhere to access your API without security errors).

```python
app = Flask(__name__)
CORS(app)
```

We create an instance of the Flask application named app and wrap it with CORS middleware, enabling cross-origin requests by default. Without this, browsers might block API calls from other domains.

```python
@app.route('/test')
def test():
    return "Test"
```

Here we define a route /test using the @app.route decorator, which binds the URL /test to the Python function test(). When a user sends an HTTP GET request to /test, Flask invokes test(), which simply returns the string "Test". This serves as a simple API endpoint for testing connectivity.

```python
if __name__ == "__main__":
    app.run()
```

This block checks whether the script is being run directly (not imported as a module). If so, it starts Flask's built-in development server locally on your machine. Note: when deploying on InterSystems IRIS using WSGI, this section is ignored because IRIS manages the server.
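That WSGI note is worth unpacking. WSGI is just a calling convention: the host (here, IRIS) imports your module and invokes a callable with an `environ` dict and a `start_response` function, and Flask's `app` object implements exactly that protocol. Here is a stdlib-only sketch of the same `/test` endpoint as a bare WSGI callable (illustrative only; the hand-rolled `application` function stands in for what Flask does for you):

```python
from wsgiref.util import setup_testing_defaults

def application(environ, start_response):
    # Route on PATH_INFO, mimicking the Flask /test endpoint above
    if environ.get("PATH_INFO") == "/test":
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"Test"]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not Found"]

# Exercise the callable directly, no server needed
environ = {}
setup_testing_defaults(environ)
environ["PATH_INFO"] = "/test"

captured = {}
def start_response(status, headers):
    captured["status"] = status

body = b"".join(application(environ, start_response))
print(captured["status"], body.decode())
```

Any WSGI-compliant host can drive such a callable, which is why IRIS only needs to know the module name and the callable name.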
Configuring the Web App on IRIS

Once your Flask app is ready, the next step is to host it on IRIS by configuring a Web Application with the WSGI option.

1. Open the IRIS Management Portal for your instance.
2. Go to System > Security Management > Web Applications, or search for "Web Applications".
3. Click the Create New Web Application button.
4. Fill out the form as follows:
   - Name: give your web app a path, for example /flaskapp1/api.
   - Description: add a brief description, like "My Flask API backend".
   - Enabled: make sure this checkbox is checked.
5. Scroll down and check the WSGI (Experimental) option to enable running Python WSGI apps, then fill in the WSGI details:
   - Application Name: main (the name of the Python module that defines your app, i.e. main.py without the extension).
   - Callable Name: app (this matches the Flask application variable name in your code).
   - WSGI App Directory: the full path to the directory containing your main.py file, for example /path/to/.
   - Debug: unchecked.
6. Click Save.

Here's what this configuration looks like in the portal:

Testing Your API

After saving, your Flask API should now be available at:

https://base.<your-domain>.com/flaskapp1/api/test

or, more generally:

https://base.<your-domain>.com/<path>/test

Open this URL in your browser or test it using Postman or curl. You should see the response: Test

What's Next?

This setup lets you run a Flask REST API hosted directly on IRIS using the WSGI integration. From here, you can expand your API with more endpoints and connect to IRIS data sources to build powerful applications. Put in the comments what you plan on building with Flask 😜
Announcement
Fabiano Sanches · Jun 27

InterSystems Cloud Services - Release Notes - 27 June 2025

Reference: Build 2025.1.0.1.24372U.25e14d55

Overview

This release introduces significant enhancements to security, analytics capabilities, and user experience, along with important operational improvements aimed at reducing downtime and improving reliability.

New Features and Enhancements

| Category | Feature / Improvement | Details |
| --- | --- | --- |
| Analytics | Adaptive Analytics in Data Fabric Studio | InterSystems Data Fabric Studio now includes Adaptive Analytics as an optional feature, offering advanced analytics capabilities directly within your workflow. |
| Security | Enhanced Firewall Management | The firewall management page now supports creating explicit inbound and outbound firewall rules specifically for port 22, providing greater security and access control. |
| Security | Custom APIs Security Update | Custom APIs have transitioned from ID tokens to access tokens, strengthening security by improving authentication mechanisms. |
| Security | Enforcement of HTTPS for Custom APIs | Custom APIs no longer support HTTP; all communication is now exclusively over HTTPS, ensuring encrypted and secure data transmission. |
| Security | General Security Improvements | Multiple security enhancements applied, reinforcing the security posture across the platform. |
| User Experience | New Feature Announcements and Widgets | Additional widgets have been introduced to communicate new features, announcements, and important updates directly within the Cloud Service Portal. |
| Operations | Improved Timezone Change Performance | Downtime associated with the timezone-change operation on production environments has been reduced from approximately 2 minutes to about 15 seconds, minimizing impact on operations. |

Recommended Actions

- Explore Adaptive Analytics within Data Fabric Studio to enhance your data-driven decision-making capabilities.
- Review firewall settings to leverage the new inbound/outbound port 22 rules. The first deploy you perform will define the rules, so make sure to review the outbound rules.
- Ensure Custom APIs use updated SDKs that utilize access tokens instead of ID tokens, and confirm HTTPS-only configurations are correctly applied.

Support

For assistance, open a support case via iService or directly through the InterSystems Cloud Service Portal.

Thank you for choosing InterSystems Cloud Services.
Article
Sylvain Guilbaud · Jul 18

Python tool for exporting/importing InterSystems API Manager configurations

🛠️ Managing InterSystems API Manager (IAM = Kong Gateway) configurations in CI/CD

🔍 Context: InterSystems IAM configurations

As part of integrating InterSystems IRIS into a secure and controlled environment, InterSystems IAM relies on Kong Gateway to manage exposed APIs. Kong acts as a modern API gateway, capable of handling authentication, security, traffic management, plugins, and more. However, maintaining consistent Kong configurations (routes, services, plugins, etc.) across different environments (development, testing, production) is a major challenge. This is where tools like deck and this Python script become highly valuable.

⚙️ Overview of kong_config_tool.py

This tool allows you to:

- Export the current configuration of a Kong Gateway into a versionable YAML file.
- Import a YAML configuration into a Kong Gateway (via deck sync).
- Automate full logging (logs, stdout/stderr) for accurate tracking.
- Integrate easily into a CI/CD pipeline.

🎯 Goals and Benefits

🔄 Consistent Synchronization Across Environments

The tool simplifies propagating the Kong configuration between environments. By exporting from dev and importing into staging or prod, you ensure functional parity.

🔐 Traceability and Audits via Logs

With the --log option, all operations (including internal deck commands) are logged: who executed what, what configuration was applied, and what Kong's response was (number of resources created, modified, etc.).

🧪 CI/CD Pipeline Integration

In GitLab CI, GitHub Actions, or Jenkins:

- The export step can be triggered automatically after API changes.
- The import step can deploy the Kong config on every merge or release.
- The generated YAML files can be version-controlled in Git.
🧰 Example GitLab Pipeline

```yaml
stages:
  - export
  - deploy

export_kong:
  stage: export
  script:
    - python3 kong_config_tool.py --export --log
  artifacts:
    paths:
      - kong.yaml
      - kong_config_tool.log

deploy_kong:
  stage: deploy
  script:
    - python3 kong_config_tool.py --import --log
```

🛡️ Security and Reproducibility

Since InterSystems IAM is often used in sensitive environments (healthcare, finance...), it's essential to:

- Avoid manual errors by using deck sync
- Ensure each deployment applies the exact same configuration
- Maintain a clear audit trail via .log files

💡 Tool Highlights

| Feature | Description |
| --- | --- |
| --export | Saves the current config to a file like kong-<timestamp>.yaml |
| --import | Applies the contents of kong.yaml to the Gateway |
| --log | Enables full logging (stdout, stderr, logs) |
| Automatic symlink | kong.yaml is always a symlink to the latest exported version |
| Easy integration | No heavy dependencies; relies on standard Python and deck |

📦 Conclusion

The kong_config_tool.py script is a key component for industrializing Kong configuration management in the context of InterSystems IAM. It enables:

- Better configuration control
- Enhanced traceability
- Smooth integration into CI/CD pipelines
- Compliance with security requirements

🚀 Potential Future Enhancements

- Native GitOps integration (ArgoCD, FluxCD)
- Configuration validation with deck diff
- Error notifications (Slack, Teams)

🧬 Python Code Overview

The kong_config_tool.py script is a Python CLI tool designed to automate configuration exports and imports for Kong Gateway using deck, while ensuring robust logging.

📁 General Structure

```python
#!/usr/bin/env python3
import argparse
import subprocess
from datetime import datetime
from pathlib import Path
import sys
import logging
```

Uses only standard Python modules:

- argparse: to handle command-line options.
- subprocess: to run deck commands.
- logging: for structured output (console + file).
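Since GitHub Actions is mentioned as an alternative CI system, the same two steps translate roughly as follows (a sketch only: the workflow and job names, the checkout/upload steps, and the assumption that deck and the script are available on the runner are all mine, not from the original article):

```yaml
name: kong-config

on: [push]

jobs:
  export-kong:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: python3 kong_config_tool.py --export --log
      - uses: actions/upload-artifact@v4
        with:
          name: kong-config
          path: |
            kong.yaml
            kong_config_tool.log

  deploy-kong:
    runs-on: ubuntu-latest
    needs: export-kong
    steps:
      - uses: actions/checkout@v4
      - run: python3 kong_config_tool.py --import --log
```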
🧱 Logger Initialization

```python
logger = logging.getLogger("kong_config_tool")
```

Initializes a named logger, configurable based on whether a log file is requested.

📝 setup_logging(log_file=None)

This function:

- Creates handlers for console and/or file.
- Redirects sys.stdout and sys.stderr to the log file if --log is provided.

🔎 This captures everything: Python logs, print(), errors, and also output from deck.

📤 export_kong_config()

```python
deck_dir = Path.cwd()
output_file = deck_dir / f"kong-{timestamp}.yaml"
```

- Executes deck gateway dump -o ... to export the current configuration.
- Captures stdout and stderr and sends them to logger.debug(...).
- Creates or updates a kong.yaml symlink pointing to the exported file, simplifying future imports.
- Logs and exits on failure.

📥 import_kong_config()

- Checks for the presence of the kong.yaml file (symlink or actual file).
- Runs deck gateway sync kong.yaml.
- Captures and logs the full output.
- Handles errors via CalledProcessError.

🔁 This logic mirrors the export process.

🚀 main()

The main entry point:

- Handles the --export, --import, and --log arguments.
- Calls the appropriate functions.

Example usage:

```
python kong_config_tool.py --export --log
python kong_config_tool.py --import --log
```

💡 If --log is omitted, output goes to the console only.

🧪 Typical CI/CD Execution

Export:

```
python kong_config_tool.py --export --log
```

Results:

- kong-2025-07-18_12-34-56.yaml (versionable content)
- kong.yaml (useful symlink for import)
- kong_config_tool.log (audit log)

Import:

```
python kong_config_tool.py --import --log
```

Results: applies the configuration to a new gateway (staging, prod, etc.)
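The symlink step in export_kong_config() is what keeps --import simple. In isolation, the pattern looks like this (a minimal stdlib sketch with made-up file names, run in a throwaway temporary directory):

```python
import tempfile
from pathlib import Path

workdir = Path(tempfile.mkdtemp())

# Simulate two successive timestamped exports
for name in ("kong-2025-07-18_12-00-00.yaml", "kong-2025-07-18_13-00-00.yaml"):
    (workdir / name).write_text(f"# dump {name}\n")
    symlink = workdir / "kong.yaml"
    # Remove any previous link first: symlink_to() fails if the path already exists
    if symlink.exists() or symlink.is_symlink():
        symlink.unlink()
    symlink.symlink_to(name)

# kong.yaml now always resolves to the most recent export
latest = (workdir / "kong.yaml").resolve().name
print(latest)
```

Because the link target is relative (just the file name), the whole directory can be moved or checked out elsewhere and the link still resolves.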
- kong_config_tool.log (to prove what was done)

✅ Code Summary Table

| Feature | Implementation |
| --- | --- |
| Intuitive CLI interface | argparse with help descriptions |
| Clean export | deck gateway dump + timestamp |
| Controlled import | deck gateway sync kong.yaml |
| Full logging | logging + stdout/stderr redirection |
| Resilience | Error handling via try/except |
| CI/CD ready | Simple interface, no external dependencies |

kong_config_tool.py

```python
#!/usr/bin/env python3
import argparse
import subprocess
from datetime import datetime
from pathlib import Path
import sys
import logging

logger = logging.getLogger("kong_config_tool")


def setup_logging(log_file=None):
    logger.setLevel(logging.DEBUG)
    formatter = logging.Formatter("[%(asctime)s] %(levelname)s: %(message)s",
                                  "%Y-%m-%d %H:%M:%S")
    handlers = []

    # Console handler
    console_handler = logging.StreamHandler()
    console_handler.setFormatter(formatter)
    handlers.append(console_handler)

    if log_file:
        file_handler = logging.FileHandler(log_file)
        file_handler.setFormatter(formatter)
        handlers.append(file_handler)

        # Redirect all stdout and stderr to the log file
        log_file_handle = open(log_file, "a")
        sys.stdout = log_file_handle
        sys.stderr = log_file_handle

    for handler in handlers:
        logger.addHandler(handler)


def export_kong_config():
    deck_dir = Path.cwd()
    current_date = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
    output_file = deck_dir / f"kong-{current_date}.yaml"
    try:
        logger.info(f"Exporting Kong config to: {output_file}")
        result = subprocess.run(
            ["deck", "gateway", "dump", "-o", str(output_file)],
            check=True,
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
            text=True
        )
        # Log and print deck output
        logger.debug(result.stdout)
        logger.debug(result.stderr)
        print(result.stdout)
        print(result.stderr)

        symlink_path = deck_dir / "kong.yaml"
        if symlink_path.exists() or symlink_path.is_symlink():
            symlink_path.unlink()
        symlink_path.symlink_to(output_file.name)
        logger.info(f"Symlink created/updated: {symlink_path} -> {output_file.name}")
    except subprocess.CalledProcessError as e:
        logger.error(f"Export failed: {e}")
        logger.error(e.stderr)
        print(e.stderr)
        sys.exit(1)


def import_kong_config():
    deck_file = Path.cwd() / "kong.yaml"
    if not deck_file.exists():
        logger.error(f"Configuration file {deck_file} not found.")
        sys.exit(1)
    try:
        logger.info("Syncing kong.yaml to gateway...")
        result = subprocess.run(
            ["deck", "gateway", "sync", str(deck_file)],
            check=True,
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
            text=True
        )
        # Log and print deck output
        logger.debug(result.stdout)
        logger.debug(result.stderr)
        # print(result.stdout)
        # print(result.stderr)
        logger.info("Sync completed successfully.")
    except subprocess.CalledProcessError as e:
        logger.error(f"Sync failed: {e}")
        logger.error(e.stderr)
        print(e.stderr)
        sys.exit(1)


def main():
    parser = argparse.ArgumentParser(
        description=(
            "Tool to export or import Kong Gateway configuration using deck.\n\n"
            "Exactly one of the options --export or --import is required.\n\n"
            "Examples:\n"
            "  python kong_config_tool.py --export --log\n"
            "  python kong_config_tool.py --import --log\n\n"
            "For help, run:\n"
            "  python kong_config_tool.py --help"
        ),
        formatter_class=argparse.RawTextHelpFormatter
    )
    group = parser.add_mutually_exclusive_group(required=True)
    group.add_argument("-e", "--export", dest="export_config", action="store_true",
                       help="Export Kong configuration to a timestamped YAML file "
                            "and create/update kong.yaml symlink.")
    group.add_argument("-i", "--import", dest="import_config", action="store_true",
                       help="Import (sync) Kong configuration from kong.yaml into Kong Gateway.")
    parser.add_argument(
        "--log",
        dest="log_file",
        nargs="?",
        const="kong_config_tool.log",
        help="Enable logging to a file. Use default 'kong_config_tool.log' if no path is given."
    )
    args = parser.parse_args()
    setup_logging(args.log_file)

    if args.export_config:
        export_kong_config()
    elif args.import_config:
        import_kong_config()


if __name__ == "__main__":
    main()
```
Announcement
Mohammad Ali · Jul 25

Hiring InterSystems IRIS for Health Developer – Remote (USA Only)

We are looking for experienced InterSystems IRIS for Health developers to join a long-term healthcare technology project. You must be USA-based and available to work full-time remotely.

🔹 Requirements:

- Strong experience with InterSystems IRIS for Health
- Solid understanding of HL7, FHIR, and healthcare integration workflows
- Proficiency in ObjectScript and Python/Java/SQL
- Prior experience with EHR, LIS, or other clinical systems is a big plus
- Must be based in the United States

Interested candidates can DM here or email: ali.ceo@softhawker.com
Announcement
Fabiano Sanches · Aug 19

InterSystems Cloud Services - Release Notes - 18 August 2025

Build 2025.1.0.1.24372U.f00326d.

Overview

This release delivers expanded Azure support for InterSystems Data Fabric Studio, enhanced subscription flexibility, major module updates, and multiple improvements to networking, security, and API responsiveness.

New Features and Enhancements

| Category | Feature / Improvement | Details |
| --- | --- | --- |
| Azure Support | Enhanced InterSystems Data Fabric Studio (IDFS) on Azure | Improved stability, compatibility, and performance for IDFS deployments in Azure environments. Deployment update required. |
| Subscription | More Granular Usage Options | Subscription model updated to provide finer-grained usage configuration, enabling better cost control and service alignment. |
| Modules | Data Fabric Studio - Supply Chain 1.1.0 | Released version 1.1.0 of the Supply Chain module, introducing functional enhancements and performance optimizations. Deployment update required. |
| Networking | Conflict Notification for Network Routes | Network Connect now provides clearer, more actionable notifications when conflicting network routes are detected, improving troubleshooting speed. |
| Networking | Enhanced Network Connectivity Test | Connectivity testing now delivers more detailed results, helping identify and resolve network issues faster. |
| Security | Improved SSL/TLS Configuration | Updated SSL/TLS defaults and configuration processes to ensure stronger encryption and compliance with best practices. |
| Custom APIs | Accurate Asynchronous Job Responses | Custom APIs now correctly reflect the final outcome of asynchronous jobs in responses, improving reliability for client integrations. |

Recommended Actions

- If you're using Data Fabric Studio - Supply Chain, request an update to v1.1.0 to leverage its latest capabilities.

Support

For assistance or to learn more about these updates, open a support case via iService or through the InterSystems Cloud Service Portal.

©2025 InterSystems Corporation. All Rights Reserved.
Article
Murray Oldfield · Nov 12, 2016

InterSystems Data Platforms Capacity Planning and Performance Series Index

# Index

This is a list of all the posts in the Data Platforms' capacity planning and performance series, in order, plus a general list of my other posts. I will update it as new posts in the series are added.

> You will notice that I wrote some posts before IRIS was released and refer to Caché. I will revisit the posts over time, but in the meantime the advice for configuration is generally the same for Caché and IRIS. Some command names may have changed; the most obvious example is that anywhere you see the `^pButtons` command, you can replace it with `^SystemPerformance`.

---

> While some posts are updated to preserve links, others will be marked as strikethrough to indicate that the post is legacy. Generally, I will say, "See: some other post" if it is appropriate.

---

#### Capacity Planning and Performance Series

Generally, posts build on previous ones, but you can also just dive into subjects that look interesting.

- [Part 1 - Getting started on the Journey, collecting metrics.][1]
- [Part 2 - Looking at the metrics we collected.][2]
- [Part 3 - Focus on CPU.][3]
- [Part 4 - Looking at memory.][4]
- [Part 5 - Monitoring with SNMP.][5]
- [Part 6 - Caché storage IO profile.][6]
- [Part 7 - ECP for performance, scalability and availability.][7]
- [Part 8 - Hyper-Converged Infrastructure Capacity and Performance Planning][8]
- [Part 9 - Caché VMware Best Practice Guide][9]
- [Part 10 - VM Backups and IRIS freeze/thaw scripts][10]
- [Part 11 - Virtualizing large databases - VMware cpu capacity planning][11]

#### Other Posts

This is a collection of posts generally related to architecture that I have on the Community.
- [Understanding free memory on a Linux IRIS database server][34] - [Access IRIS database ODBC or JDBC using Python.][33] - [Ansible modules and IRIS demo.][32] - [AWS Capacity Planning Review Example.][29] - [Using an LVM stripe to increase AWS EBS IOPS and Throughput.][28] - [YASPE - Parse and chart InterSystems Caché pButtons and InterSystems IRIS SystemPerformance files for quick performance analysis of Operating System and IRIS metrics.][27] - [SAM - Hacks and Tips for set up and adding metrics from non-IRIS targets][12] - [Monitoring InterSystems IRIS Using Built-in REST API - Using Prometheus format.][13] - [Example: Review Monitor Metrics From InterSystems IRIS Using Default REST API][14] - [InterSystems Data Platforms and performance – how to update pButtons.][15] - [Extracting pButtons data to a csv file for easy charting.][16] - [Provision a Caché application using Ansible - Part 1.][17] - [Windows, Caché and virus scanners.][18] - [ECP Magic.][19] - [Markdown workflow for creating Community posts.][20] - [Yape - Yet another pButtons extractor (and automatically create charts)][21] See: [YASPE](https://community.intersystems.com/post/yaspe-yet-another-system-performance-extractor). - [How long does it take to encrypt a database?][22] - [Minimum Monitoring and Alerting Solution][23] - [LVM PE Striping to maximize Hyper-Converged storage throughput][24] - [Unpacking pButtons with Yape - update notes and quick guides][25] - [Decoding Intel processor models reported by Windows][26] - [AWS storage. High write IOPS. 
Compare gp3 and io2][30]
- [[Video] Best Practices for InterSystems IRIS System Performance in the Cloud][31]

Murray Oldfield
Principal Technology Architect
InterSystems
Follow the community or @murrayoldfield on Twitter

[1]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-part-1
[2]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-part-2
[3]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-part-3-focus-cpu
[4]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-part-4-looking-memory
[5]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-part-5-monitoring-snmp
[6]: https://community.intersystems.com/post/data-platforms-and-performance-part-6-cach%C3%A9-storage-io-profile
[7]: https://community.intersystems.com/post/data-platforms-and-performance-part-7-ecp-performance-scalability-and-availability
[8]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-part-8-hyper-converged-infrastructure-capacity
[9]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-part-9-cach%C3%A9-vmware-best-practice-guide
[10]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-vm-backups-and-cach%C3%A9-freezethaw-scripts
[11]: https://community.intersystems.com/post/virtualizing-large-databases-vmware-cpu-capacity-planning
[12]: https://community.intersystems.com/post/sam-hacks-and-tips-set-and-adding-metrics-non-iris-targets
[13]: https://community.intersystems.com/post/monitoring-intersystems-iris-using-built-rest-api
[14]: https://community.intersystems.com/post/example-review-monitor-metrics-intersystems-iris-using-default-rest-api
[15]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-how-update-pbuttons
[16]:
https://community.intersystems.com/post/extracting-pbuttons-data-csv-file-easy-charting [17]: https://community.intersystems.com/post/provision-cach%C3%A9-application-using-ansible-part-1 [18]: https://community.intersystems.com/post/windows-cach%C3%A9-and-virus-scanners [19]: https://community.intersystems.com/post/ecp-magic [20]: https://community.intersystems.com/post/markdown-workflow-creating-community-posts [21]: https://community.intersystems.com/post/yape-yet-another-pbuttons-extractor-and-automatically-create-charts [22]: https://community.intersystems.com/post/how-long-does-it-take-encrypt-database [23]: https://community.intersystems.com/post/minimum-monitoring-and-alerting-solution [24]: https://community.intersystems.com/post/lvm-pe-striping-maximize-hyper-converged-storage-throughput [25]: https://community.intersystems.com/post/unpacking-pbuttons-yape-update-notes-and-quick-guides [26]: https://community.intersystems.com/post/decoding-intel-processor-models-reported-windows [27]: https://community.intersystems.com/post/yaspe-yet-another-system-performance-extractor [28]: https://community.intersystems.com/post/using-lvm-stripe-increase-aws-ebs-iops-and-throughput [29]: https://community.intersystems.com/post/aws-capacity-planning-review-example [30]: https://community.intersystems.com/post/aws-storage-high-write-iops-compare-gp3-and-io2 [31]: https://www.youtube.com/watch?v=2nUzXjymRXs [32]: https://community.intersystems.com/post/ansible-modules-and-iris-demo [33]: https://community.intersystems.com/post/access-iris-database-odbc-or-jdbc-using-python [34]: https://community.intersystems.com/post/understanding-free-memory-linux-database-server
Announcement
Celeste Canzano · May 12

Do you have the InterSystems IRIS SQL Specialist certification? Beta testers needed for our upcoming InterSystems IRIS SQL Professional certification exam

Hello IRIS community, InterSystems Certification is currently developing a certification exam for InterSystems IRIS SQL professionals, and if you match the exam candidate description given below, we would like you to beta test the exam! The exam will be available for beta testing starting May 19, 2025.

Please note: Only candidates with the pre-existing InterSystems IRIS SQL Specialist certification are eligible to take the beta. Interested in the beta but don't have the SQL Specialist certification? Take the SQL Specialist exam! Eligible candidates will receive an email from the certification team with instructions on scheduling the exam. Beta testing will be completed by October 30, 2025.

What are my responsibilities as a beta tester?

You will schedule and take the exam by July 15th. The exam will be administered in an online proctored environment free of charge (the standard fee of $150 per exam is waived for all beta testers). The InterSystems Certification team will then perform a careful statistical analysis of all beta test data to set a passing score for the exam. The analysis of the beta test results will take 6-8 weeks, and once the passing score is established, you will receive an email notification from InterSystems Certification informing you of the results. If your score on the exam is at or above the passing score, you will have earned the certification! Note: Beta test scores are completely confidential.

How is this exam different from the InterSystems IRIS SQL Specialist exam?

This new exam - InterSystems IRIS SQL Professional - covers higher-level SQL topics and is recommended for candidates with 4 to 6 years of relevant experience, compared to the 1 to 2 years recommended for the SQL Specialist exam. Interested in participating? Read the Exam Details below.
Exam Details

Exam title: InterSystems IRIS SQL Professional

Candidate description: A developer or solutions architect who
- Designs IRIS SQL applications
- Manages IRIS SQL operations
- Uses IRIS SQL
- Loads and efficiently queries datasets stored in IRIS SQL

Number of questions: 38

Time allotted to take exam: 2 hours

Recommended preparation: Review the content below before taking the exam. Online Learning: Using SQL in InterSystems IRIS (learning path, 3h 45m)

Recommended practical experience: 4 to 6 years of experience developing and managing IRIS SQL applications is recommended. At least 2 years of experience working with ObjectScript and globals in InterSystems IRIS is recommended.

Exam practice questions: A set of practice questions is provided here to familiarize candidates with question formats and approaches.

Exam format: The questions are presented in two formats: multiple choice and multiple response. Access to InterSystems IRIS Documentation will be available during the exam.

DISCLAIMER: Please note this exam has a 2-hour time limit. While InterSystems documentation will be available during the exam, candidates will not have time to search the documentation for every question. Thus, completing the recommended preparation before taking the exam, and searching the documentation only when absolutely necessary during the exam, are both strongly encouraged!

System requirements for beta testing:
- Working camera & microphone
- Dual-core CPU
- At least 2 GB of available RAM
- At least 500 MB of available disk space
- Minimum internet speed: 500 kb/s download, 500 kb/s upload

Exam topics and content

The exam contains questions that cover the areas for the stated role as shown in the exam topics chart immediately below.

1. Designs IRIS SQL applications
- 1.1 Designs a SQL schema: Distinguishes use cases for row vs columnar table layout; distinguishes use cases for different index types
- 1.2 Designs advanced schemas: Recalls anatomy of Globals (subscript and value); interprets relationship between table structure and Globals; distinguishes the (Globals) level at which mirroring/journaling operates from the SQL layer; distinguishes the differences between date/time data types; interprets the overhead associated with stream data; identifies use cases for text search
- 1.3 Writes business logic: Identifies use cases for UDFs, UDAFs, and SPs
- 1.4 Develops Object/Relational applications: Recalls SQL best practices when defining classes; uses Object access to interact with individual rows; identifies SQL limitations with class inheritance; uses serial and object properties; identifies use cases for collection properties; distinguishes class relationships from Foreign Keys
- 1.5 Deploys SQL applications: Determines what needs to be part of a deployment

2. Uses IRIS SQL
- 2.1 Manages IRIS query processing: Identifies benefits of the universal query cache; lists considerations made by the optimizer; differentiates client and server-side problems; uses Statement Index to find statement metadata; distinguishes between the use of parameters and constants in a query; distinguishes between transaction and isolation levels
- 2.2 Interprets query plans: Identifies the use of indices in a query plan; identifies vectorized (columnar) query plans; uses hints to troubleshoot query planning; identifies opportunities for indices, based on a query plan
- 2.3 Uses IRIS SQL in applications: Distinguishes use cases for Dynamic SQL and Embedded SQL
- 2.4 Uses IRIS-specific SQL capabilities: Uses arrow syntax for implicit joining; determines use cases for explicit use of collation functions

3. Manages IRIS SQL operations
- 3.1 Manages SQL operations: Identifies use cases for purging queries and rebuilding indices; recalls impact of purging queries and rebuilding indices; identifies use cases for un/freezing query plans, including automation; identifies use cases for (bitmap) index compaction; uses the runtime stats in the Statement Index to find statements with optimization opportunities
- 3.2 Configures InterSystems SQL options: Recalls relevant system configuration options (e.g. lock threshold); differentiates scale-out options, ECP, and sharding
- 3.3 Manages SQL security: Recalls to apply SQL privilege checking when using Embedded SQL
- 3.4 Uses PTools for advanced performance analysis: Identifies use cases for using PTools

Interested in participating? Eligible candidates will receive an email from the certification team with instructions on how to schedule and take the exam.

Hello Celeste! This is really interesting. How are the eligible candidates chosen? Is there a way to apply? Thank you.

Hi Pietro! Unlike prior certification exam betas, only folks who hold the InterSystems IRIS SQL Specialist certification are eligible. There is no application process; rather, the certification team will be reaching out directly to eligible individuals on May 19th. Anyone who holds an active SQL Specialist certification will receive an email next Monday with instructions on how to access and take the beta exam. The email will be sent to the address associated with your account on Credly, our digital badging platform. If you do not yet have the SQL Specialist certification, I encourage you to consider taking the InterSystems IRIS SQL Specialist certification exam. Once you pass this exam and obtain the certification, you will receive an email from the certification team regarding the beta. Please let me know if I can clarify anything!

Thank you for the clarifications Celeste!

Dear @Celeste.Canzano, can anyone take this beta test?
or are only those who hold an active SQL Specialist certification eligible?

Regards, Harshitha

Hi Harshitha! Only those who hold an active SQL Specialist certification are eligible for the SQL Professional beta. Please let me know if you have any additional questions.
Question
Murali krishnan · May 3, 2017

DEV, TEST environment setup in InterSystems

InterSystems is all about namespaces. Each namespace can be mapped to one or more databases and vice versa. On my desktop with InterSystems, can I have DEV and TEST environments pointing to different namespaces at the same time? If I am right here, then the DEV environment is nothing but the namespace that we work in... Please let me know.

Slight correction: the system mode is per instance and not per namespace.

Well spotted, I've updated my comment.

Hi Murali,

You're perfectly right. You can have multiple namespaces on the same Caché instance for different purposes. These should have a naming convention to identify their purpose. That convention is really down to you. I normally postfix the name with -DEV and -TEST, e.g. FOO-DEV & FOO-TEST.

These namespaces will share commonly mapped code from the main library, but unless configured otherwise they are completely independent from each other. You can dev and test in them respectively without fear of polluting one another.

Tip: You can mark the mode of an instance via the Management Portal > System > Configuration > Memory and Startup. On that configuration page you will see a drop-down for "System Mode" with the options:
- Live System
- Test System
- Development System
- Failover System

The options are mostly inert, but what they will do is paint a box at the top of your Management Portal screens. If you mark it as Live, you will get a red box that you can't miss.

Sean
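As a minimal sketch of working with per-purpose namespaces from a terminal session: the `$Namespace` special variable both reports and switches the current namespace. The namespace names below follow Sean's suggested convention and are assumptions; substitute your own.

```objectscript
    // Show the namespace the session is currently in
    Write $Namespace,!

    // Switch into the development namespace (assumed name: FOO-DEV)
    Set $Namespace = "FOO-DEV"
    // ... develop and run code here, isolated from TEST ...

    // Switch into the test namespace (assumed name: FOO-TEST)
    Set $Namespace = "FOO-TEST"
```

Note that setting `$Namespace` throws an error if the target namespace does not exist, so it is worth wrapping the switch in a Try/Catch in scripted code.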
Question
Esther Guite · Sep 27, 2018

Running InterSystems Caché workflow using Atelier

Has anyone successfully executed the INSTFREEZE and INSTATHAW scripts in the Atelier tool? We are having issues with the FREEZE script, whereas Thaw works fine. Would appreciate any inputs!

I'm not familiar with INSTFREEZE and INSTATHAW; can you describe where those scripts come from? However, if these call into the database and ultimately invoke the ExternalFreeze and ExternalThaw methods, the permissions for running an ExternalFreeze are more strict than those for running an ExternalThaw, which may be the root of the difference as to why thaw is succeeding and freeze is failing.

Thanks for providing inputs. Yes, agreed: we are invoking the external freeze and external thaw operations to suspend and resume the system. The Thaw operation works as expected, but the ExternalFreeze operation fails; I believe this is an authentication issue. Is there a way to address the authentication issue from Caché ObjectScript using the $ZF(-1) function or any other option? We enabled OS-level authentication through the admin console and succeeded when running from the Windows command prompt, but the same fails from Caché ObjectScript code.

@Thomas Granger - Thank you for your response. You said it right! That's exactly the problem: the specific permissions for ExternalFreeze.

@John Murray - Thank you for your response and for sharing the article. Really appreciate it! Will definitely take a look and try that!

Building off the comment by @Thomas.Granger about this perhaps being a permissions issue, I suggest you check out this previous DC article by me: https://community.intersystems.com/post/who-does-windows-think-i-am
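For context, a minimal sketch of the freeze/thaw pair such scripts ultimately call. This assumes the scripts wrap the documented `Backup.General` class methods; it must run in the %SYS namespace under an account with sufficient privileges, which is exactly why freeze commonly fails where thaw succeeds.

```objectscript
    // Run in the %SYS namespace.
    // Suspend database writes ahead of an external snapshot/backup.
    Set sc = ##class(Backup.General).ExternalFreeze()
    If $$$ISOK(sc) {
        // ... take the external snapshot or file-level backup here ...

        // Resume normal write daemon activity.
        Set sc = ##class(Backup.General).ExternalThaw()
    }
    Else {
        Write "ExternalFreeze failed - check privileges and cconsole.log",!
    }
```

A practical design note: keep the freeze window as short as possible, since writes are queued in memory while the system is frozen.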
Question
Guillaume Lepretre · Jun 20, 2018

[InterSystems 2017] Use the email alert request

Hello, I used the operation EnsLib.EMail.AlertOperation to send mail to handle errors. However, I want to include more information about the error (the session ID of the message, the date, the namespace, etc.). What is the best way to do it? I tried to add information in the OnAlertRequest method as below, but then I would need to change the mail operation in all namespaces... Thanks.

I need to get the URL of the Message Viewer screen with the session ID. For example: http://localhost:57772/csp/svcptl/EnsPortal.VisualTrace.zen?SESSIONID=40241. Is it possible?

You already have a value of SessionId, so concatenate the rest? To get the port and host, call:

set sc=##class(%Studio.General).GetWebServerPort(.port, .server)

Some ideas:
- session ID - you're getting it with pAlertRequest.SessionId, no?
- date - get it from pAlertRequest.AlertTime
- namespace - wouldn't it always be the current namespace? Get it with $namespace

What other data do you need? Also, please post your code as text.

You can do it in three ways:

1 - set ^mtemperro($ZNSPACE,..%PackageName()_"."_..%ClassName(),$horolog) = mensagem_" CACHE error: "_$zerror_" - "_$SYSTEM.OBJ.DisplayError()

2 - Throw ##class(%Exception.General).%New("Failed to flag the PDA release as tracked.",1,..%ClassName()_".upByRastreado","Report the problem to support.").Log()

3 - Declare an exception variable in your class:

#dim exception As %Exception.AbstractException
catch exception { do exception.Log() }

Then just check the Caché administration portal: System Operation -> System Logs -> Application Error Log
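The ideas above can be combined into one helper. This is a hedged sketch, not a definitive implementation: the method name `BuildTraceLink` is hypothetical, the CSP application path `/csp/svcptl/` is simply copied from the example URL in the question (yours may differ), and it assumes `pAlertRequest` is the Ens.AlertRequest passed into OnAlertRequest.

```objectscript
/// Hypothetical helper: build a Visual Trace URL for an alert e-mail body.
Method BuildTraceLink(pAlertRequest As Ens.AlertRequest) As %String
{
    // Ask the instance for its private web server host and port
    Set sc = ##class(%Studio.General).GetWebServerPort(.port, .server)
    If $$$ISERR(sc) Quit ""

    // Assumed CSP application path, taken from the question's example URL
    Quit "http://"_server_":"_port_"/csp/svcptl/EnsPortal.VisualTrace.zen?SESSIONID="_pAlertRequest.SessionId
}
```

The date (`pAlertRequest.AlertTime`) and namespace (`$namespace`) can then be concatenated into the e-mail text alongside this link in the same method.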