Clear filter
Article
Sylvain Guilbaud · Jul 18
🛠️ Managing InterSystems API Manager (IAM = Kong Gateway) configurations in CI/CD
🔍 Context: InterSystems IAM configurations
As part of integrating InterSystems IRIS into a secure and controlled environment, InterSystems IAM relies on Kong Gateway to manage exposed APIs. Kong acts as a modern API gateway, capable of handling authentication, security, traffic management, plugins, and more.
However, maintaining consistent Kong configurations (routes, services, plugins, etc.) across different environments (development, testing, production) is a major challenge. This is where tools like deck and this Python script become highly valuable.
⚙️ Overview of kong_config_tool.py
This tool allows you to:
- Export the current configuration of a Kong Gateway into a versionable YAML file.
- Import a YAML configuration into a Kong Gateway (via `deck sync`).
- Automate full logging (logs, stdout/stderr) for accurate tracking.
- Integrate easily into a CI/CD pipeline.
🎯 Goals and Benefits
🔄 Consistent Synchronization Across Environments
The tool simplifies propagating Kong configuration between environments. By exporting from dev and importing into staging or prod, you ensure functional parity.
🔐 Traceability and Audits via Logs
With the --log option, all operations (including internal deck commands) are logged:
- Who executed what
- What configuration was applied
- What Kong’s response was (number of resources created, modified, etc.)
🧪 CI/CD Pipeline Integration
In GitLab CI, GitHub Actions, or Jenkins:
- The export step can be triggered automatically after API changes.
- The import step can deploy the Kong config on every merge or release.
- The generated YAML files can be version-controlled in Git.
🧰 Example GitLab Pipeline
```yaml
stages:
  - export
  - deploy

export_kong:
  stage: export
  script:
    - python3 kong_config_tool.py --export --log
  artifacts:
    paths:
      - kong.yaml
      - kong_config_tool.log

deploy_kong:
  stage: deploy
  script:
    - python3 kong_config_tool.py --import --log
```
🛡️ Security and Reproducibility
Since InterSystems IAM is often used in sensitive environments (healthcare, finance...), it’s essential to:
- Avoid manual errors by using `deck sync`
- Ensure each deployment applies the exact same configuration
- Maintain a clear audit trail via `.log` files
💡 Tool Highlights
| Feature | Description |
| --- | --- |
| `--export` | Saves the current config to a file like `kong-<timestamp>.yaml` |
| `--import` | Applies the contents of `kong.yaml` to the Gateway |
| `--log` | Enables full logging (stdout, stderr, logs) |
| Automatic symlink | `kong.yaml` is always a symlink to the latest exported version |
| Easy integration | No heavy dependencies; relies on standard Python and `deck` |
📦 Conclusion
The kong_config_tool.py script is a key component for industrializing Kong configuration management in the context of InterSystems IAM. It enables:
- Better configuration control
- Enhanced traceability
- Smooth integration into CI/CD pipelines
- Compliance with security requirements
🚀 Potential Future Enhancements
- Native GitOps integration (ArgoCD, FluxCD)
- Configuration validation with `deck diff`
- Error notifications (Slack, Teams)
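As a sketch of the `deck diff` validation idea, a hypothetical GitLab job (the job name and `validate` stage are assumptions, not part of the tool) could dry-run the configuration before any sync:

```yaml
validate_kong:
  stage: validate
  script:
    # Dry run: report what a sync would change without applying anything
    - deck gateway diff kong.yaml
```

A failing or noisy diff in this stage would signal drift between the versioned `kong.yaml` and the live gateway before `deck gateway sync` ever runs.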
🧬 Python Code Overview
The kong_config_tool.py script is a Python CLI tool designed to automate configuration exports and imports for Kong Gateway using deck, while ensuring robust logging.
📁 General Structure
```python
#!/usr/bin/env python3
import argparse
import subprocess
from datetime import datetime
from pathlib import Path
import sys
import logging
```
Uses only standard Python modules.
- `argparse`: handles command-line options.
- `subprocess`: runs `deck` commands.
- `logging`: provides structured output (console + file).
🧱 Logger Initialization
```python
logger = logging.getLogger("kong_config_tool")
```
Initializes a named logger, configurable based on whether a log file is requested.
📝 setup_logging(log_file=None)
This function:
- Creates handlers for console and/or file output.
- Redirects `sys.stdout` and `sys.stderr` to the log file if `--log` is provided.
🔎 This captures everything: Python logs, print(), errors, and also output from deck.
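A minimal, runnable sketch of the redirection technique (a simplification of `setup_logging`, not the tool's exact code): once `sys.stdout` is rebound to the log file handle, every plain `print()` call lands in the file.

```python
# Minimal sketch of the --log redirection: rebinding sys.stdout sends all
# subsequent print() output (and anything else written to stdout) to the file.
import sys, os, tempfile

log_path = os.path.join(tempfile.gettempdir(), "redirect_demo.log")

log_file_handle = open(log_path, "w")
original_stdout = sys.stdout
sys.stdout = log_file_handle      # everything print()ed now goes to the file
print("deck output would be captured here")
sys.stdout = original_stdout      # restore the console for the rest of the run
log_file_handle.close()

with open(log_path) as f:
    captured = f.read()
print(captured.strip())  # → deck output would be captured here
```

The real tool leaves the redirection in place for the whole run, so subprocess output and tracebacks are captured as well.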
📤 export_kong_config()
```python
deck_dir = Path.cwd()
output_file = deck_dir / f"kong-{timestamp}.yaml"
```
- Executes `deck gateway dump -o ...` to export the current configuration.
- Captures stdout and stderr and sends them to `logger.debug(...)`.
- Creates or updates a `kong.yaml` symlink pointing to the exported file, simplifying future imports.
- Logs and exits on failure.
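The symlink refresh step can be sketched on its own (the directory and file names below are illustrative stand-ins, not a real export):

```python
# Sketch of the symlink refresh: kong.yaml always points at the newest
# timestamped export, with any stale link removed first.
from pathlib import Path
import os, tempfile

deck_dir = Path(tempfile.mkdtemp())
output_file = deck_dir / "kong-2025-07-18_12-34-56.yaml"
output_file.write_text("_format_version: '3.0'\n")  # stand-in for a real dump

symlink_path = deck_dir / "kong.yaml"
if symlink_path.exists() or symlink_path.is_symlink():
    symlink_path.unlink()                  # remove a stale or broken link
symlink_path.symlink_to(output_file.name)  # relative target keeps the dir relocatable

print(os.readlink(symlink_path))  # → kong-2025-07-18_12-34-56.yaml
```

Using the file *name* (not the absolute path) as the symlink target means the whole directory can be moved or checked out elsewhere without breaking the link.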
📥 import_kong_config()
- Checks for the presence of the `kong.yaml` file (symlink or actual file).
- Runs `deck gateway sync kong.yaml`.
- Captures and logs full output.
- Handles errors via `CalledProcessError`.
🔁 This logic mirrors the export process.
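The failure path in both functions relies on `check=True`: a non-zero exit from `deck` raises `CalledProcessError`, whose captured `stderr` can be logged before exiting. A runnable sketch, using a stand-in command instead of `deck` so it works anywhere:

```python
# check=True turns a non-zero exit status into an exception; capture_output
# keeps stderr available for the audit log.
import subprocess, sys

try:
    subprocess.run(
        [sys.executable, "-c", "import sys; sys.stderr.write('sync failed'); sys.exit(1)"],
        check=True, capture_output=True, text=True,
    )
except subprocess.CalledProcessError as e:
    message = f"Sync failed (exit {e.returncode}): {e.stderr}"
    print(message)  # → Sync failed (exit 1): sync failed
```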
🚀 main()
The main entry point that:
- Handles the `--export`, `--import`, and `--log` arguments.
- Calls the appropriate functions.
Example usage:
```shell
python kong_config_tool.py --export --log
python kong_config_tool.py --import --log
```
💡 If --log is omitted, output goes to console only.
🧪 Typical CI/CD Execution
Export
```shell
python kong_config_tool.py --export --log
```
Results:
- `kong-2025-07-18_12-34-56.yaml` (versionable content)
- `kong.yaml` (useful symlink for import)
- `kong_config_tool.log` (audit log)
Import
```shell
python kong_config_tool.py --import --log
```
Results:
- Applies the configuration to a new gateway (staging, prod, etc.)
- `kong_config_tool.log` to prove what was done
✅ Code Summary Table
| Feature | Implementation |
| --- | --- |
| Intuitive CLI interface | `argparse` with help descriptions |
| Clean export | `deck gateway dump` + timestamp |
| Controlled import | `deck gateway sync kong.yaml` |
| Full logging | `logging` + stdout/stderr redirection |
| Resilience | Error handling via `try/except` |
| CI/CD ready | Simple interface, no external dependencies |
kong_config_tool.py
```python
#!/usr/bin/env python3
import argparse
import subprocess
from datetime import datetime
from pathlib import Path
import sys
import logging

logger = logging.getLogger("kong_config_tool")


def setup_logging(log_file=None):
    logger.setLevel(logging.DEBUG)
    formatter = logging.Formatter("[%(asctime)s] %(levelname)s: %(message)s", "%Y-%m-%d %H:%M:%S")
    handlers = []

    # Console handler
    console_handler = logging.StreamHandler()
    console_handler.setFormatter(formatter)
    handlers.append(console_handler)

    if log_file:
        file_handler = logging.FileHandler(log_file)
        file_handler.setFormatter(formatter)
        handlers.append(file_handler)

        # Redirect all stdout and stderr to the log file
        log_file_handle = open(log_file, "a")
        sys.stdout = log_file_handle
        sys.stderr = log_file_handle

    for handler in handlers:
        logger.addHandler(handler)


def export_kong_config():
    deck_dir = Path.cwd()
    current_date = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
    output_file = deck_dir / f"kong-{current_date}.yaml"
    try:
        logger.info(f"Exporting Kong config to: {output_file}")
        result = subprocess.run(
            ["deck", "gateway", "dump", "-o", str(output_file)],
            check=True,
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
            text=True
        )
        # Log and print deck output
        logger.debug(result.stdout)
        logger.debug(result.stderr)
        print(result.stdout)
        print(result.stderr)

        symlink_path = deck_dir / "kong.yaml"
        if symlink_path.exists() or symlink_path.is_symlink():
            symlink_path.unlink()
        symlink_path.symlink_to(output_file.name)
        logger.info(f"Symlink created/updated: {symlink_path} -> {output_file.name}")
    except subprocess.CalledProcessError as e:
        logger.error(f"Export failed: {e}")
        logger.error(e.stderr)
        print(e.stderr)
        sys.exit(1)


def import_kong_config():
    deck_file = Path.cwd() / "kong.yaml"
    if not deck_file.exists():
        logger.error(f"Configuration file {deck_file} not found.")
        sys.exit(1)
    try:
        logger.info("Syncing kong.yaml to gateway...")
        result = subprocess.run(
            ["deck", "gateway", "sync", str(deck_file)],
            check=True,
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
            text=True
        )
        # Log and print deck output
        logger.debug(result.stdout)
        logger.debug(result.stderr)
        # print(result.stdout)
        # print(result.stderr)
        logger.info("Sync completed successfully.")
    except subprocess.CalledProcessError as e:
        logger.error(f"Sync failed: {e}")
        logger.error(e.stderr)
        print(e.stderr)
        sys.exit(1)


def main():
    parser = argparse.ArgumentParser(
        description=(
            "Tool to export or import Kong Gateway configuration using deck.\n\n"
            "Exactly one of the options --export or --import is required.\n\n"
            "Examples:\n"
            "  python kong_config_tool.py --export --log\n"
            "  python kong_config_tool.py --import --log\n\n"
            "For help, run:\n"
            "  python kong_config_tool.py --help"
        ),
        formatter_class=argparse.RawTextHelpFormatter
    )
    group = parser.add_mutually_exclusive_group(required=True)
    group.add_argument("-e", "--export", dest="export_config", action="store_true",
                       help="Export Kong configuration to a timestamped YAML file and create/update kong.yaml symlink.")
    group.add_argument("-i", "--import", dest="import_config", action="store_true",
                       help="Import (sync) Kong configuration from kong.yaml into Kong Gateway.")
    parser.add_argument(
        "--log",
        dest="log_file",
        nargs="?",
        const="kong_config_tool.log",
        help="Enable logging to a file. Use default 'kong_config_tool.log' if no path is given."
    )
    args = parser.parse_args()
    setup_logging(args.log_file)

    if args.export_config:
        export_kong_config()
    elif args.import_config:
        import_kong_config()


if __name__ == "__main__":
    main()
```
Announcement
Mohammad Ali · Jul 25
We are looking for experienced InterSystems IRIS for Health developers to join a long-term healthcare technology project. You must be USA-based and available to work full-time remotely.
🔹 Requirements:
- Strong experience with InterSystems IRIS for Health
- Solid understanding of HL7, FHIR, and healthcare integration workflows
- Proficiency in ObjectScript and Python/Java/SQL
- Prior experience with EHR, LIS, or other clinical systems is a big plus
- Must be based in the United States

Interested candidates can DM here or email: ali.ceo@softhawker.com
Announcement
Fabiano Sanches · Aug 19
Build 2025.1.0.1.24372U.f00326d.
Overview
This release delivers expanded Azure support for InterSystems Data Fabric Studio, enhanced subscription flexibility, major module updates, and multiple improvements to networking, security, and API responsiveness.
New Features and Enhancements
| Category | Feature/Improvement | Details |
| --- | --- | --- |
| Azure Support | Enhanced InterSystems Data Fabric Studio (IDFS) on Azure | Improved stability, compatibility, and performance for IDFS deployments in Azure environments. Deployment update required. |
| Subscription | More Granular Usage Options | Subscription model updated to provide finer-grained usage configuration, enabling better cost control and service alignment. |
| Modules | Data Fabric Studio - Supply Chain 1.1.0 | Released version 1.1.0 of the Supply Chain module, introducing functional enhancements and performance optimizations. Deployment update required. |
| Networking | Conflict Notification for Network Routes | Network Connect now provides clearer, more actionable notifications when conflicting network routes are detected, improving troubleshooting speed. |
| Networking | Enhanced Network Connectivity Test | Connectivity testing now delivers more detailed results, helping identify and resolve network issues faster. |
| Security | Improved SSL/TLS Configuration | Updated SSL/TLS defaults and configuration processes to ensure stronger encryption and compliance with best practices. |
| Custom APIs | Accurate Asynchronous Job Responses | Custom APIs now correctly reflect the final outcome of asynchronous jobs in responses, improving reliability for client integrations. |
Recommended Actions
If you're using Data Fabric Studio - Supply Chain, request an update to v1.1.0 to leverage its latest capabilities.
Support
For assistance or to learn more about these updates, open a support case via iService or through the InterSystems Cloud Service Portal.
©2025 InterSystems Corporation. All Rights Reserved.
Article
Murray Oldfield · Nov 12, 2016
# Index
This is a list of all the posts in the Data Platforms capacity planning and performance series, in order, plus a general list of my other posts. I will update it as new posts in the series are added.
> You will notice that I wrote some posts before IRIS was released and refer to Caché. I will revisit the posts over time, but in the meantime the advice for configuration is generally the same for Caché and IRIS. Some command names may have changed; the most obvious example is that anywhere you see the `^pButtons` command, you can replace it with `^SystemPerformance`.
---
> While some posts are updated to preserve links, others will be marked as strikethrough to indicate that the post is legacy. Generally, I will say, "See: some other post" if it is appropriate.
---
#### Capacity Planning and Performance Series
Generally, posts build on previous ones, but you can also just dive into subjects that look interesting.
- [Part 1 - Getting started on the Journey, collecting metrics.][1]
- [Part 2 - Looking at the metrics we collected.][2]
- [Part 3 - Focus on CPU.][3]
- [Part 4 - Looking at memory.][4]
- [Part 5 - Monitoring with SNMP.][5]
- [Part 6 - Caché storage IO profile.][6]
- [Part 7 - ECP for performance, scalability and availability.][7]
- [Part 8 - Hyper-Converged Infrastructure Capacity and Performance Planning][8]
- [Part 9 - Caché VMware Best Practice Guide][9]
- [Part 10 - VM Backups and IRIS freeze/thaw scripts][10]
- [Part 11 - Virtualizing large databases - VMware cpu capacity planning][11]
#### Other Posts
This is a collection of posts generally related to Architecture I have on the Community.
- [Understanding free memory on a Linux IRIS database server][34]
- [Access IRIS database ODBC or JDBC using Python.][33]
- [Ansible modules and IRIS demo.][32]
- [AWS Capacity Planning Review Example.][29]
- [Using an LVM stripe to increase AWS EBS IOPS and Throughput.][28]
- [YASPE - Parse and chart InterSystems Caché pButtons and InterSystems IRIS SystemPerformance files for quick performance analysis of Operating System and IRIS metrics.][27]
- [SAM - Hacks and Tips for set up and adding metrics from non-IRIS targets][12]
- [Monitoring InterSystems IRIS Using Built-in REST API - Using Prometheus format.][13]
- [Example: Review Monitor Metrics From InterSystems IRIS Using Default REST API][14]
- [InterSystems Data Platforms and performance – how to update pButtons.][15]
- [Extracting pButtons data to a csv file for easy charting.][16]
- [Provision a Caché application using Ansible - Part 1.][17]
- [Windows, Caché and virus scanners.][18]
- [ECP Magic.][19]
- [Markdown workflow for creating Community posts.][20]
- [Yape - Yet another pButtons extractor (and automatically create charts)][21] See: [YASPE](https://community.intersystems.com/post/yaspe-yet-another-system-performance-extractor).
- [How long does it take to encrypt a database?][22]
- [Minimum Monitoring and Alerting Solution][23]
- [LVM PE Striping to maximize Hyper-Converged storage throughput][24]
- [Unpacking pButtons with Yape - update notes and quick guides][25]
- [Decoding Intel processor models reported by Windows][26]
- [AWS storage. High write IOPS. Compare gp3 and io2][30]
- [[Video] Best Practices for InterSystems IRIS System Performance in the Cloud][31]
Murray Oldfield
Principal Technology Architect
InterSystems
Follow the community or @murrayoldfield on Twitter
[1]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-part-1
[2]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-part-2
[3]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-part-3-focus-cpu
[4]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-part-4-looking-memory
[5]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-part-5-monitoring-snmp
[6]: https://community.intersystems.com/post/data-platforms-and-performance-part-6-cach%C3%A9-storage-io-profile
[7]: https://community.intersystems.com/post/data-platforms-and-performance-part-7-ecp-performance-scalability-and-availability
[8]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-part-8-hyper-converged-infrastructure-capacity
[9]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-part-9-cach%C3%A9-vmware-best-practice-guide
[10]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-vm-backups-and-cach%C3%A9-freezethaw-scripts
[11]: https://community.intersystems.com/post/virtualizing-large-databases-vmware-cpu-capacity-planning
[12]: https://community.intersystems.com/post/sam-hacks-and-tips-set-and-adding-metrics-non-iris-targets
[13]: https://community.intersystems.com/post/monitoring-intersystems-iris-using-built-rest-api
[14]: https://community.intersystems.com/post/example-review-monitor-metrics-intersystems-iris-using-default-rest-api
[15]: https://community.intersystems.com/post/intersystems-data-platforms-and-performance-%E2%80%93-how-update-pbuttons
[16]: https://community.intersystems.com/post/extracting-pbuttons-data-csv-file-easy-charting
[17]: https://community.intersystems.com/post/provision-cach%C3%A9-application-using-ansible-part-1
[18]: https://community.intersystems.com/post/windows-cach%C3%A9-and-virus-scanners
[19]: https://community.intersystems.com/post/ecp-magic
[20]: https://community.intersystems.com/post/markdown-workflow-creating-community-posts
[21]: https://community.intersystems.com/post/yape-yet-another-pbuttons-extractor-and-automatically-create-charts
[22]: https://community.intersystems.com/post/how-long-does-it-take-encrypt-database
[23]: https://community.intersystems.com/post/minimum-monitoring-and-alerting-solution
[24]: https://community.intersystems.com/post/lvm-pe-striping-maximize-hyper-converged-storage-throughput
[25]: https://community.intersystems.com/post/unpacking-pbuttons-yape-update-notes-and-quick-guides
[26]: https://community.intersystems.com/post/decoding-intel-processor-models-reported-windows
[27]: https://community.intersystems.com/post/yaspe-yet-another-system-performance-extractor
[28]: https://community.intersystems.com/post/using-lvm-stripe-increase-aws-ebs-iops-and-throughput
[29]: https://community.intersystems.com/post/aws-capacity-planning-review-example
[30]: https://community.intersystems.com/post/aws-storage-high-write-iops-compare-gp3-and-io2
[31]: https://www.youtube.com/watch?v=2nUzXjymRXs
[32]: https://community.intersystems.com/post/ansible-modules-and-iris-demo
[33]: https://community.intersystems.com/post/access-iris-database-odbc-or-jdbc-using-python
[34]: https://community.intersystems.com/post/understanding-free-memory-linux-database-server
Announcement
Evgeny Shvarov · Sep 20, 2023
Hi Developers!
Here is the bonus results for the applications in InterSystems Python Programming Contest 2023:
Nominal bonus points per category:

| Bonus | Nominal points |
| --- | --- |
| Embedded Python | 3 |
| Python Native API | 3 |
| Python Pex Interoperability | 4 |
| Python libs: sqlalchemy and dbt | 2 |
| LLM AI or LangChain | 4 |
| NoObjectScriptLine | 5 |
| Questionnaire | 2 |
| Find a bug in Embedded Python | 2 |
| Docker | 2 |
| ZPM | 2 |
| Online Demo | 2 |
| Community Idea Implementation | 4 |
| First Article on DC | 2 |
| Second Article on DC | 1 |
| First Time Contribution | 3 |
| Video on YouTube | 3 |
| **Total** | **44** |

Awarded bonuses per project:

| Project | Awarded bonuses | Total Bonus |
| --- | --- | --- |
| native-api-command-line-py-client | 3, 3, 5, 2, 2, 2, 2, 1, 3 | 23 |
| iris-size-django | 5, 2, 2, 3 | 12 |
| apptools-django | 3, 5, 2, 2, 2, 2 | 16 |
| iris-recorder-helper | 2, 4, 5, 2, 2, 3 | 18 |
| irisapitester | 2 | 2 |
| iris-python-lookup-table-utils | 3, 5, 2, 3, 3 | 16 |
| BardPythonSample | 3, 4, 5, 2, 2, 3 | 19 |
| native-api-py-demo | 3, 3 | 6 |
| iris-vector | 3, 5, 2, 2, 2, 2, 2 | 18 |
| flask-iris | 2, 5, 2, 3 | 12 |
| password-app-iris-db | 5, 2 | 7 |
| Face Login | 3, 2, 2 | 7 |
| iris-GenLab | 3, 4, 2, 4, 5, 2, 2, 2, 2, 1, 3 | 30 |
| iris-python-machinelearn | 3, 2, 2, 3, 3 | 13 |
| IRIS-Cloudproof-Encryption | 3, 3, 3 | 9 |
Please apply with your comments for new implementations and corrections to be made here in the comments or in Discord.

Thanks @Evgeny.Shvarov for sharing the technological bonuses results. Please note that the iris-GenLab application has Embedded Python and (LangChain or AI LLM) functionality. PEX Production and a 2nd article have also been added. Regards

In my project iris-vector, there are a few things which make it impossible to make it ObjectScript-less, at least for now.
I've already mentioned the issues of running Python-based SQL functions in my article, so I had to add an ObjectScript implementation too.
And any new DataType in IRIS which implements custom serialization based on methods like DisplayToStorage currently accepts only ObjectScript, due to its tight coupling to the SQL compilation engine.

Thanks @Evgeny Shvarov for sharing the technological bonuses results. Please note that the iris-python-machinelearn application has Docker functionality and, as of the last update, NoObjectScriptLine too. Thanks again.

Hi @Muhammad.Waseem! Bonuses are implemented.

Hi @Dienes! But here is the ObjectScript in your repo.

Thanks, @Dmitry.Maslennikov! Yes, you used Python and avoided ObjectScript where possible, finding new limits - thanks for that! The bonus point is provided.

Hi @Evgeny.Shvarov, the points are still not updated.

Hello @Evgeny.Shvarov.
I think the apptools-django project implements this community idea:
https://ideas.intersystems.com/ideas/DP-I-149
What do you think? And also, I updated the first article. Please add points.

Yes, there are both implementations; in the Python folder, the file sample.py is the construction in Python only, without ObjectScript. By the way, I also made an article; I think I linked it to the app.

Hello @Evgeny Shvarov, thanks for sharing the bonuses results.
Please, can you check the IrisApitester score?
It uses:
- Embedded Python
- Docker
- ZPM
Or maybe it’s because it’s an update to an existing app?
Thanks!

Sergey, it is a good attempt, but you need to justify every aspect of the current Management Portal being replaced by yours. From what I saw, you introduce interesting new functionality, but sometimes you use links to the current portal.
The article is considered.

Yes, if you apply again with the same app, "previous" bonuses don't count; only the new functionality does.

Hi Daniel! You already received these bonuses in the previous contest. We do not re-award bonuses, to ensure that already-participating apps do not have an advantage over new ones. Could you please delete the files from the repo that are not related to your solution, to avoid confusion?

My initial idea was to create a kind of template for future changes, where the SRC folder holds the project with Python embedded in IRIS, and the Python folder holds two files: one that calls IRIS through Python, and the other just the pure script that runs directly locally or inside Docker with IRIS. But I believe the idea of doing something more comprehensive ended up generating more confusion than solution. This is the first time I have participated in a contest; I will refine it more next time.

Hi Andre! If we consider all the code you wrote, it includes ObjectScript. Considering that you merged the two templates intersystems-iris-dev-template and iris-embedded-python-template and wrote your code on top, it is a little difficult to find exactly your changes. I checked all your commits and found that you were the one who wrote the ObjectScript code. You can easily fix this by rewriting this method in Python.

I understand; it would be more interesting to leave just the Python part, like this file, without the ObjectScript, but I would have to generate a new Docker image to remove it, and unfortunately I'm very pressed for time right now. For a first contest it will serve as a learning experience and a lesson. If you can consider something as it is, great; if you can't, that's fine, I'll dedicate myself more to the next one and change this one as soon as I can. Thank you very much for your attention and explanations.

Ok, I understand, no problem, thank you!

Hi @Evgeny.Shvarov, thanks for the update.
Only the Questionnaire points are not updated, which I already completed yesterday.

Done! Your Questionnaire points were added.

Thanks!

Hi Semion, I made some changes: I added a video and the link to the article in the contest, I renamed the sample file to NoObjectScript.py, and in the video I show its execution, demonstrating the usability of the libraries in a front end.

Hello @Evgeny Shvarov, thanks for sharing the bonuses results.
For the password-app-iris-db application, a second article was written, but no points were awarded for it.

Hi! Your bonuses were accepted.

Hi Oleksandr! You have already received all bonuses for articles. The first article, which you wrote for a previous contest, does not apply. We count only articles written for the current contest; they should be fresh.

Thanks Semion, one last cry: won't NoObjectScript be considered? Thank you very much for your attention and sorry for the inconvenience.

Hi Andre! Sorry, but no. I already explained why and how to fix it.
Announcement
Olga Zavrazhnova · Jun 19, 2019
It’s no secret that the InterSystems Global Masters program is integrated with Developer Community, Open Exchange, and Ideas Portal. Whenever you contribute to any of these platforms, you automatically earn points and badges on Global Masters.
We’ve created a short guide to help you discover the best ways to earn points on Global Masters:
Please note that points are automatically awarded on the 4th day after you make a contribution on DC, OEX, or the Ideas Portal (activities made outside of the Global Masters platform).
HOW TO EARN POINTS ON GLOBAL MASTERS

Activities on Developer Community

👣 First Steps

| Activity | Points | Badge(s) |
| --- | --- | --- |
| Register on Developer Community | 50 | |
| First article | 1500 | DC Author |
| First question | 500 | InterSystems Researcher |
| First comment or answer | 300 | DC Commenter |
| Certified Specialist badge | 200 | |

📑 Articles on Developer Community

| Activity | Points | Badge(s) |
| --- | --- | --- |
| Each published post on Developer Community | 200 | |
| Published post on DC in Chinese | 400 | |
| Published post on DC in Spanish | 400 | |
| Published post on DC in French | 400 | |
| Published post on DC in Japanese | 400 | |
| Published post on DC in Portuguese | 400 | |
| 5 Articles on DC | 7500 | Reporter |
| 10 Articles on DC | 15000 | Blogger |
| 25 Articles on DC | 40000 | Influencer |
| 50 Articles on DC | 75000 | Opinion Maker |

💬 Comments on Developer Community

| Activity | Points | Badge(s) |
| --- | --- | --- |
| Each comment on the Developer Community | 30 | |
| Comment on DC in Chinese | 60 | |
| Comment on DC in Spanish | 60 | |
| Comment on DC in French | 60 | |
| Comment on DC in Japanese | 60 | |
| Comment on DC in Portuguese | 60 | |
| Accepted answer on Developer Community | 150 | |

❓ Questions

| Activity | Points | Badge(s) |
| --- | --- | --- |
| 5 Questions on DC | 2000 | Curious Member |
| 10 Questions on DC | 5000 | Thorough Member |
| 25 Questions on DC | 15000 | Inquisitive Member |
| 50 Questions on DC | 30000 | Socratic Member |

🙋‍♂️ Answers

| Activity | Points | Badge(s) |
| --- | --- | --- |
| 1 Accepted Answer | 1000 | DC Problem Solver |
| 5 Accepted Answers | 4000 | Master of Answers |
| 10 Accepted Answers | 8000 | Bronze Master of Answers |
| 25 Accepted Answers | 20000 | Silver Master of Answers |
| 50 Accepted Answers | 40000 | Gold Master of Answers |

There is also a special set of badges and additional points for reaching these numbers of Accepted Answers on the Developer Community.

🌐 Translations of the DC Articles

| Activity | Points | Badge(s) |
| --- | --- | --- |
| Translate an article (read how to translate articles here) | 150 | DC Translator (1 completion), Advanced (5), Bronze (15), Silver (25), Gold (50) |
| Translate a question | 30 | |

🌟 Best Practices

| Activity | Points | Badge(s) |
| --- | --- | --- |
| 1 Best Practices article | 500 | Best Practices Author |
| 2 Best Practices articles | 1500 | Advanced Best Practices Author |
| 3 Best Practices articles | 3500 | Bronze Best Practices Author |
| 4 Best Practices articles | 5000 | Silver Best Practices Author |
| 5 Best Practices articles | 7500 | Gold Best Practices Author |

👀 Post Views (on your post)

| Activity | Points | Badge(s) |
| --- | --- | --- |
| 750 views on a DC post | 200 | Popular Writer |
| 2000 views on a DC post | 500 | Notable Writer |
| 5000 views on a DC post | 1000 | Famous Writer |
| 15000 views on a DC post | 3000 | Gold Writer |

👍 Likes (on your posts)

| Activity | Points | Badge(s) |
| --- | --- | --- |
| 50 likes on DC posts | 500 | Insightful Author |
| 100 likes on DC posts | 1000 | Expert Author |
| 500 likes on DC posts | 5000 | Recognizable Author |
| 1000 likes on DC posts | 10000 | Powerful Author |

Activities on Open Exchange

💿 Downloads of Your Application

| Activity | Points | Badge(s) |
| --- | --- | --- |
| 50 downloads of an OEX app | 500 | Popular App |
| 100 downloads of an OEX app | 1000 | Bronze Popular App |
| 250 downloads of an OEX app | 2500 | Silver Popular App |
| 500 downloads of an OEX app | 5000 | Gold Popular App |

🧑‍💻 Applications

| Activity | Points | Badge(s) |
| --- | --- | --- |
| Each app on Open Exchange | 800 | |
| IPM application on Open Exchange | 400 | |
| 1 app on OEX | 1000 | Open Exchange Developer |
| 5 apps on OEX | 10000 | Bronze Open Exchange Developer |
| 10 apps on OEX | 25000 | Silver Open Exchange Developer |
| 25 apps on OEX | 50000 | Gold Open Exchange Developer |

📝 OEX Reviews

| Activity | Points | Badge(s) |
| --- | --- | --- |
| Each Open Exchange review | 200 | |
| 1 OEX review | 200 | Open Exchange Reviewer |
| 5 OEX reviews | 500 | Advanced Open Exchange Reviewer |
| 10 OEX reviews | 1000 | Bronze Open Exchange Reviewer |
| 25 OEX reviews | 2500 | Silver Open Exchange Reviewer |
| 50 OEX reviews | 5000 | Gold Open Exchange Reviewer |

Activities on Ideas Portal

💡 Product Ideas Submission

| Activity | Points | Badge(s) |
| --- | --- | --- |
| Product idea submitted | 100 | Idea Creator (1 idea), Advanced (5), Bronze (10), Silver (25), Gold (50) |
| Product idea in progress | 500 | |
| Product idea implemented | 3000 | |

Points for submitted ideas are awarded automatically for ideas submitted under the "InterSystems Products" category, after the idea has passed moderation. Read more in this post.
Complete challenges, get badges, and climb up the levels: Insider > Advocate > Specialist > Expert > Ambassador > Legend. **Please note the level system is not available on the new Global Masters platform starting from April 2024. We are working on bringing it back!
The higher level you are, the more interesting prizes available!
And...
Please check the additional information about Global Masters:
What is Global Masters? Start Here
Global Masters Badges Descriptions
Global Masters Levels Descriptions
Additionally, you can join our Referral Program here and earn 1,000 points for each friend who joined Developer Community!
If you have not joined InterSystems Global Masters Advocacy Hub yet, let's get started right now!
Feel free to ask your questions in the comments to this post.
*Post last updated: 21 July 2025

Thanks, Anastasia! Very helpful! I believe we also have a series based on the number of accepted answers, like 10, 25, 50, 100 accepted answers. Do we?

Thank you for this quick reference table (and for my *looks up amount of points for comments* 30 points!)

Hi Evgeny, let me answer - we do not so far, and I think it would be good to have such a series & badges to recognize the authors.

Are these automated in any way? Wondering if mine is bugged, because I've certainly posted questions and comments before but those badges were never unlocked. Their descriptions below say "first" question/comment and I don't know if mine are being detected: https://community.intersystems.com/post/changes-global-masters-program-new-level-new-badges-new-possibilities

Hi David! This should be automatic. We'll investigate.

I wrote a post on DC in 2017. Do I have to 'register' it to get points on Global Masters? Kind regards, Stephen

Hi David! We have fixed this issue. Thank you for the feedback!

Thank you! You're very quick!

Hi Stephen, I see you have joined Global Masters recently, that is so great! This post is not counted in the "Write a post on Developer Community" challenge (100 points), as it was published before you registered; it is counted in all other types of challenges listed above, e.g. "Write 10 posts on DC".

This was really helpful
Thank you
This is an excellent article and is worth bumping the thread :)

Great! This is very helpful!

This is helpful. Thank you!

"Invite your colleague to Developer Community" - is there a formal way to do this via the D.C. interface? I looked around and couldn't seem to find an 'invite a friend' option or anything like that. I have some colleagues whom I think would benefit from getting involved in the D.C. (CC: @Anastasia.Dyubaylo / @Evgeny.Shvarov )

Hi @Benjamin.Spead you can do that via this Global Masters challenge (this challenge is currently in your "Later" tab)

Thank you @Olga.Zavrazhnova2637! I knew I had seen it somewhere at some point. I just had a conversation with a new colleague yesterday about the value of the D.C. and Global Masters, so I will send her an invite :)

It's a good idea! Do you mean to have a UI on DC that prepares an email invitation to join DC for a developer friend, with a standard invitation text?

This was more to figure out the proper way to do this so that it is tracked for the badge, etc. on the G.M. platform. It makes sense that it needs to originate in a challenge (and thank you to Olga for pointing that out). I don't think that just having a form on the D.C. to invite a friend necessarily makes sense, as anyone can just shoot a friend an email with the link. If others would like to see this as a new feature I won't object though.

Hello, please can you explain how to translate an article/question?
Regards

You can see the language of the article in the upper left side of the window; click on it and a list of languages will be displayed. Select the language to translate the article into, and a new window will be shown with 2 options: translate and request translation. Select the first and you will be able to translate the article.

Oh, I see.
Unfortunately, Italian is not available.

You can make a suggestion in the Ideas Portal. I think there is enough support there.

So?? @Luca.Ravazzolo ??

Thank you for the table!

Thanks, Anastasia!
Very helpful! Thanks for the help.

Hello! Thank you very much for remembering these points! I was reading and noticed that the option "Share an article / video in Social Networks" is no longer available in publications; I think this referred to the old platform, right?

Hi Marcelo! Yes, the option to share was available for any article and video on the old platform. On the new platform, we still have some articles/videos for sharing, but only for selected content. Social sharing "asks" are tagged with the "social share" tag on the platform when available.

Thank you!!

Thanks for the tips.

Thanks for the information, it is very useful.

Very helpful, thanks.

I was curious, how long should it take for there to be an update to your points after reading an article / posting a reply? Is this something that should happen immediately, or take some time?

This is so cool! Can't wait to get active in the InterSystems community and earn some points.

Hi Henry! The points are awarded on the 4th day after you post a comment or article — this delay is intentional, for moderation purposes.
However, if you notice a delay longer than that, please let me know. That could indicate a possible issue with the integration between your profiles on DC and GM that we may need to look into 😊

Hi Olga! Thank you for the quick response. That definitely makes sense.

Hi Olga,
I had a question about the point structure regarding comments. Do you get 300 points for being the first to comment on a post, or is it 300 points for your first ever comment, then 30 points for every subsequent one?
Thanks for your help! That’s a good question! You get 300 points for your first-ever comment on DC — and it comes with a badge too. Then, 30 points for every subsequent one. We’ll update the table to make this clearer.
Now I really like the idea of awarding bonus points to the author of the first answer to a question 😄 — maybe we should introduce something like that! Thanks for the clarification! And I agree, that would be a good way to incentivize initiation on posts.
I also think you should give points to those who suggested great ideas on how to score points 😄

And then the question is, does it have to be a correct answer? Because if not, I'm going to be getting lots of points very quickly ;)

It doesn't have to be correct, but make it your best shot :) P.S. Our moderators are wide awake — just saying 😄😜

Sensational

Wow, great question Henry!

This guide is really helpful — it makes it easy to understand how to earn points and how many points each activity gives. I've saved it to my sticky notes so I can quickly check it without having to come back here every time.
Announcement
Celeste Canzano · Sep 18
Hi, Community!
Have you thought about becoming a subject matter expert (SME) for InterSystems Certification?
The benefits are many—but you can hear directly from five members of our SME community as they discuss:
How did participating in exam development boost your own expertise?
What about you?
If you're already an SME, what did you gain from the experience?
If you are not part of our SME community - but are interested - what questions do you have about the process?
Acting as an SME was a fantastic experience. My paired expert and I were able to really dive into some interesting avenues of discussion, combining our different approaches and experiences and distilling them down to a common understanding. It was a truly valuable exercise to expand and reinforce my knowledge of the great products we offer.
Announcement
Celeste Canzano · May 12
Hello IRIS community,
InterSystems Certification is currently developing a certification exam for InterSystems IRIS SQL professionals, and if you match the exam candidate description given below, we would like you to beta test the exam! The exam will be available for beta testing starting May 19, 2025.
Please note: Only candidates with the pre-existing InterSystems IRIS SQL Specialist certification are eligible to take the beta. Interested in the beta but don’t have the SQL Specialist certification? Take the SQL Specialist exam!
Eligible candidates will receive an email from the certification team with instructions on scheduling the exam.
Beta testing will be completed October 30, 2025.
What are my responsibilities as a beta tester?
You will schedule and take the exam by July 15th. The exam will be administered in an online proctored environment free of charge (the standard fee of $150 per exam is waived for all beta testers). The InterSystems Certification team will then perform a careful statistical analysis of all beta test data to set a passing score for the exam. The analysis of the beta test results will take 6-8 weeks, and once the passing score is established, you will receive an email notification from InterSystems Certification informing you of the results. If your score on the exam is at or above the passing score, you will have earned the certification!
Note: Beta test scores are completely confidential.
How is this exam different from the InterSystems IRIS SQL Specialist exam?
This new exam - InterSystems IRIS SQL Professional - covers higher-level SQL topics and is recommended for candidates with 4 to 6 years of relevant experience, compared to the 1 to 2 years recommended for the SQL Specialist exam.
Interested in participating? Read the Exam Details below.
Exam Details
Exam title: InterSystems IRIS SQL Professional
Candidate description: A developer or solutions architect who
Designs IRIS SQL applications
Manages IRIS SQL operations
Uses IRIS SQL
Loads and efficiently queries datasets stored in IRIS SQL
Number of questions: 38
Time allotted to take exam: 2 hours
Recommended preparation: Review the content below before taking the exam.
Online Learning:
Using SQL in InterSystems IRIS (learning path, 3h 45m)
Recommended practical experience:
4 to 6 years of experience developing and managing IRIS SQL applications is recommended.
At least 2 years of experience working with ObjectScript and globals in InterSystems IRIS is recommended.
Exam practice questions
A set of practice questions is provided here to familiarize candidates with question formats and approaches.
Exam format
The questions are presented in two formats: multiple choice and multiple response. Access to InterSystems IRIS Documentation will be available during the exam.
DISCLAIMER: Please note this exam has a 2-hour time limit. While InterSystems documentation will be available during the exam, candidates will not have time to search the documentation for every question. Thus, completing the recommended preparation before taking the exam, and searching the documentation only when absolutely necessary during the exam, are both strongly encouraged!
System requirements for beta testing
Working camera & microphone
Dual-core CPU
At least 2 GB of available RAM
At least 500 MB of available disk space
Minimum internet speed:
Download - 500kb/s
Upload - 500kb/s
Exam topics and content
The exam contains questions that cover the areas for the stated role as shown in the exam topics chart immediately below.
Topic
Subtopic
Knowledge, skills, and abilities
1. Designs IRIS SQL applications
1.1 Designs a SQL schema
Distinguishes use cases for row vs columnar table layout
Distinguishes use cases for different index types
1.2 Designs advanced schemas
Recalls anatomy of Globals (subscript and value)
Interprets relationship between table structure and Globals
Distinguishes the (Globals) level at which mirroring/journaling operates from the SQL layer
Distinguishes the differences between date/time data types
Interprets the overhead associated with stream data
Identifies use cases for text search
1.3 Writes business logic
Identifies use cases for UDFs, UDAFs, and SPs
1.4 Develops Object/Relational applications
Recalls SQL best practices when defining classes
Uses Object access to interact with individual rows
Identifies SQL limitations with class inheritance
Uses serial and object properties
Identifies use cases for collection properties
Distinguishes class relationships from Foreign Keys
1.5 Deploys SQL applications
Determines what needs to be part of a deployment
2. Uses IRIS SQL
2.1 Manages IRIS query processing
Identifies benefits of the universal query cache
Lists considerations made by the optimizer
Differentiates client and server-side problems
Uses Statement Index to find statement metadata
Distinguishes between the use of parameters and constants in a query
Distinguishes between transaction and isolation levels
2.2 Interprets query plans
Identifies the use of indices in a query plan
Identifies vectorized (columnar) query plans
Uses hints to troubleshoot query planning
Identifies opportunities for indices, based on a query plan
2.3 Uses IRIS SQL in applications
Distinguishes use cases for Dynamic SQL and Embedded SQL
2.4 Uses IRIS-specific SQL capabilities
Uses arrow syntax for implicit joining
Determines use cases for explicit use of collation functions
3. Manages IRIS SQL operations
3.1 Manages SQL operations
Identifies use cases for purging queries and rebuilding indices
Recalls impact of purging queries and rebuilding indices
Identifies use cases for un/freezing query plans, including automation
Identifies use cases for (bitmap) index compaction
Uses the runtime stats in the Statement Index to find statements with optimization opportunities
3.2 Configures InterSystems SQL options
Recalls relevant system configuration options (e.g. lock threshold)
Differentiates scale-out options, ECP, and sharding
3.3 Manages SQL security
Recalls the need to apply SQL privilege checking when using Embedded SQL
3.4 Uses PTools for advanced performance analysis
Identifies use cases for using PTools
Interested in participating? Eligible candidates will receive an email from the certification team with instructions on how to schedule and take the exam.

Hello Celeste! This is really interesting. How are the eligible candidates chosen? Is there a way to apply? Thank you.

Hi Pietro! Unlike prior certification exam betas, only folks who hold the InterSystems IRIS SQL Specialist certification are eligible. There is no application process; rather, the certification team will be reaching out directly to eligible individuals on May 19th. Anyone who holds an active SQL Specialist certification will receive an email next Monday with instructions on how to access and take the beta exam. The email will be sent to the address associated with your account on Credly, our digital badging platform.
If you do not yet have the SQL Specialist certification, I encourage you to consider taking the InterSystems IRIS SQL Specialist certification exam. Once you pass this exam and obtain the certification, you will receive an email from the certification team regarding the beta.
Please let me know if I can clarify anything!

Thank you for the clarifications Celeste!

Dear @Celeste.Canzano,
Can anyone take this beta test, or are only those who hold an active SQL Specialist certification eligible?
Regards,
Harshitha

Hi Harshitha! Only those who hold an active SQL Specialist certification are eligible for the SQL Professional beta. Please let me know if you have any additional questions.
Question
Murali krishnan · May 3, 2017
InterSystems is all about namespaces. Each namespace can be mapped to one or more databases and vice versa. On my desktop with InterSystems installed, can I have DEV and TEST environments pointing to different namespaces at the same time? If I am right here, then the DEV environment is nothing but the namespace that we work on... Please let me know.

Slight correction: the system mode is per instance and not per namespace.
Well spotted, I've updated my comment.

Hi Murali, you're perfectly right. You can have multiple namespaces on the same Caché instance for different purposes. These should have a naming convention to identify their purpose; that convention is really down to you. I normally postfix the name with -DEV and -TEST, e.g. FOO-DEV & FOO-TEST. These namespaces will share commonly mapped code from the main library, but unless configured otherwise they are completely independent from each other. You can dev and test in them respectively without fear of polluting one another.

Tip: You can mark the mode of an instance via the management portal > System > Configuration > Memory and Startup. On that configuration page you will see a drop-down for "system mode" with the options...

Live System
Test System
Development System
Failover System

The options are mostly inert, but what they will do is paint a box on the top of your management portal screens. If you mark it as Live then you will get a red box that you can't miss.

Sean
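To complement the answer above, namespaces can also be switched programmatically in a terminal session. A minimal ObjectScript sketch, where FOO-DEV and FOO-TEST are the hypothetical names from the convention described:

```objectscript
 // NEW the $namespace special variable so the original
 // namespace is restored automatically when the frame exits
 new $namespace
 set $namespace = "FOO-DEV"    // hypothetical dev namespace
 write "Now in: ", $namespace, !
 // Work done here touches only FOO-DEV's databases (unless mappings say otherwise)
 set $namespace = "FOO-TEST"   // hypothetical test namespace
 write "Now in: ", $namespace, !
```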
Question
Esther Guite · Sep 27, 2018
Has anyone successfully executed the INSTFREEZE and INSTATHAW scripts in the Atelier tool?
We are having issues with the FREEZE script, whereas Thaw works fine.
Would appreciate any inputs!

I'm not familiar with INSTFREEZE and INSTATHAW, can you describe where those scripts come from? However, if these are calling into the database and ultimately call the ExternalFreeze and ExternalThaw methods, the permissions for running an ExternalFreeze are more strict than those for running an ExternalThaw, which may be the root of the difference as to why thaw is succeeding and freeze is failing.

Thanks for providing inputs. Yes, agreed - we invoke the external freeze and external thaw operations to suspend and resume the system. The thaw operation works as expected, but the ExternalFreeze operation fails; I believe this is an authentication issue. Is there a way to address the authentication issue through Caché ObjectScript using the $ZF(-1) function or any other option? We enabled OS-level authentication through the admin console and it succeeded when run from the Windows command prompt, but the same fails from Caché ObjectScript code.

@Thomas Granger - Thank you for your response. You said it right! That's exactly the problem - the specific permissions for ExternalFreeze.

@John Murray - Thank you for your response and for sharing the article. Really appreciate it! Will definitely take a look and try that!

Building off the comment by @Thomas.Granger about this perhaps being a permissions issue, I suggest you check out this previous DC article by me: https://community.intersystems.com/post/who-does-windows-think-i-am
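For context, such freeze/thaw scripts typically end up calling the documented %SYS methods ##class(Backup.General).ExternalFreeze() and ExternalThaw(). A minimal sketch of that call pattern follows; the error handling and the placement of the backup step are illustrative:

```objectscript
 // Run in %SYS with sufficient privileges; ExternalFreeze requires
 // stricter permissions than ExternalThaw, which is why thaw can
 // succeed while freeze fails for the same user.
 set sc = ##class(Backup.General).ExternalFreeze()
 if $system.Status.IsError(sc) {
     do $system.Status.DisplayError(sc)
 } else {
     // ... take the external snapshot/backup here ...
     set sc = ##class(Backup.General).ExternalThaw()
 }
```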
Question
Guillaume Lepretre · Jun 20, 2018
Hello,
I used the operation EnsLib.EMail.AlertOperation to send mail when errors occur. However, I want to include more information about the error (the session ID, the date, the namespace, etc.).
What is the best way to do it?
I tried to add information in the OnAlertRequest method as below, but then I would need to change the mail operation in all namespaces...
Thanks

I need to get the URL of the message viewer screen with the session ID. For example: http://localhost:57772/csp/svcptl/EnsPortal.VisualTrace.zen?SESSIONID=40241. Is it possible?

You already have the value of SessionId, so just concatenate the rest. To get the port and host, call:
set sc=##class(%Studio.General).GetWebServerPort(.port, .server)
Some ideas:

session ID - you're getting it with pAlertRequest.SessionId, no?
date - get it from pAlertRequest.AlertTime
namespace - wouldn't it always be the current namespace? Get it with $namespace

What other data do you need? Also, please post your code as text.

You can do it in three ways:

1 - set ^mtemperro($ZNSPACE,..%PackageName()_"."_..%ClassName(),$horolog) = mensagem_" Erro CACHE: "_$zerror_" - "_$SYSTEM.OBJ.DisplayError()

2 - Throw ##class(%Exception.General).%New("Failed to flag the PDA dispatch as tracked.",1,..%ClassName()_".upByRastreado","Report the problem to support.").Log()

3 - Declare an exception variable in your class:

#dim exception as %Exception.AbstractException
catch exception { do exception.Log() }

Then just check the Caché administration portal: System Operation -> System Logs -> Application Error Log
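Putting those pieces together, a sketch of building the Visual Trace link inside OnAlertRequest might look like this; the CSP application path varies by instance, so treat the URL format as an assumption to verify:

```objectscript
 // Host and port of the instance's web server
 set sc = ##class(%Studio.General).GetWebServerPort(.port, .server)
 // Build the Visual Trace URL for the alerting session
 // (the /csp/... application path below is illustrative)
 set url = "http://"_server_":"_port_"/csp/"_$zconvert($namespace,"L")
 set url = url_"/EnsPortal.VisualTrace.zen?SESSIONID="_pAlertRequest.SessionId
```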
Article
Athanassios Hatzis · Jul 23, 2018
Hi, this is a public announcement of the first release of the InterSystems Caché Object-Relational Mapper in Python 3. The project's main repository is located at GitHub (healiseu/IntersystemsCacheORM).

About the project

The CacheORM module is an enhanced OOP porting of the InterSystems Caché-Python binding. There are three classes implemented:

CacheClient - the super class of the module. It wraps two functions from the intersys.pythonbind module: pythonbind3.connection() and pythonbind3.database().
CacheQuery - a subclass of CacheClient that wraps methods and adds extra functionality to the intersys.pythonbind.database and intersys.pythonbind.query classes.
CacheClass - a subclass of CacheClient that wraps methods and adds extra functionality to the intersys.pythonbind.database and intersys.pythonbind.object classes.

The intersys.pythonbind package is a Python C extension that provides Python applications with transparent connectivity to the objects stored in the Caché database.

Source Code

The project's code that is released to the public was originally written and used as a module of the TRIADB project.

Tests and Demos

There are two folders in this release:

testCacheORM contains Python Jupyter notebook files that demonstrate CacheQuery and CacheClass
testCacheBinding contains tests written for the InterSystems Caché Python binding

One can simply compare the tests with the demos to appreciate how this project leverages the InterSystems Caché Python binding. For example:
# Intersystems Cache Python binding for queries
import intersys.pythonbind3
# Create a connection
user="_SYSTEM";
password="123";
host = "localhost";
port = "1972";
url = host+"["+port+"]:SAMPLES"
conn = intersys.pythonbind3.connection()
# Connect Now to SAMPLES namespace
conn.connect_now(url, user, password, None)
# Create a database object
samplesDB = intersys.pythonbind3.database(conn)
# create a query object
cq = intersys.pythonbind3.query(samplesDB)
# prepare and execute query
sql = "SELECT ID, Name, DOB, SSN FROM Sample.Person"
cq.prepare(sql)
cq.execute()
# Fetch rows
for x in range(0,10):
print(cq.fetch([None]))
Same code in only 4 lines using CacheORM python module
from CacheORM import CacheQuery
samples_query = CacheQuery(namespace='SAMPLES', username='_SYSTEM', password='SYS', dl=99)
samples_query.execute_sql('SELECT ID, Name, DOB, SSN FROM Sample.Person')
samples_query.print_records(10)
You can view the output from this Python Jupyter notebook at my Microsoft Azure CacheORM library.
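The boilerplate reduction comes from the wrapper owning the connection and query objects internally. The pattern can be sketched in plain Python without a Caché instance; StubQuery below stands in for intersys.pythonbind3.query (which needs a live database), and SimpleCacheQuery is a hypothetical, simplified analogue of CacheORM's CacheQuery:

```python
class StubQuery:
    """Stands in for intersys.pythonbind3.query in this sketch."""
    def __init__(self, rows):
        self._rows = rows
        self._i = 0
    def prepare(self, sql):
        self._sql = sql
    def execute(self):
        self._i = 0
    def fetch(self, _placeholder):
        # The binding returns one row per call, None when exhausted
        if self._i >= len(self._rows):
            return None
        row = self._rows[self._i]
        self._i += 1
        return row

class SimpleCacheQuery:
    """Hides the prepare/execute/fetch ceremony behind two calls."""
    def __init__(self, query):
        self._q = query
    def execute_sql(self, sql):
        self._q.prepare(sql)
        self._q.execute()
    def fetch_records(self, n):
        records = []
        for _ in range(n):
            row = self._q.fetch(None)
            if row is None:
                break
            records.append(row)
        return records

q = SimpleCacheQuery(StubQuery([(1, "Hatzis"), (2, "Kalamari")]))
q.execute_sql("SELECT ID, Name FROM Sample.Person")
print(q.fetch_records(10))   # [(1, 'Hatzis'), (2, 'Kalamari')]
```

The real module does the same thing on top of the live binding: the caller supplies credentials and SQL, and the wrapper manages everything in between.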
Another example, this time with Cache-Python Objects
# Demo of Intersystems Cache Python binding with Samples namespace and Sample.Person class
import intersys.pythonbind3
conn = intersys.pythonbind3.connection( )
conn.connect_now('localhost[1972]:SAMPLES', '_SYSTEM', '123', None)
samplesDB = intersys.pythonbind3.database(conn)
#%% Create a new instance of Sample.Person to be husband
husband = samplesDB.create_new("Sample.Person", None)
ssn1 = samplesDB.run_class_method("%Library.PopulateUtils","SSN",[])
dob1 = samplesDB.run_class_method("%Library.PopulateUtils","Date",[])
husband.set("Name","Hatzis, Athanassios I")
husband.set("SSN",ssn1)
husband.set("DOB",dob1)
# Save husband
husband.run_obj_method("%Save",[])
print ("Saved id: "+str(husband.run_obj_method("%Id",[])))
#%% Create a new instance of Sample.Person to be wife
wife = samplesDB.create_new("Sample.Person", None);
ssn2 = samplesDB.run_class_method("%Library.PopulateUtils","SSN",[])
dob2 = samplesDB.run_class_method("%Library.PopulateUtils","Date",[])
wife.set("Name","Kalamari, Panajota");
wife.set("SSN",ssn2)
wife.set("DOB",dob2)
# Save wife
wife.set("Spouse",husband);
wife.run_obj_method("%Save",[]);
print ("Saved id: " + str(wife.run_obj_method("%Id",[])))
#%% Relate them
husband.set("Spouse",wife);
husband.run_obj_method("%Save",[]);
wife.set("Spouse",husband);
wife.run_obj_method("%Save",[]);
# Open an instance of the Sample.Person object
athanID=217
athanPerson = samplesDB.openid("Sample.Person",str(athanID),-1,-1)
# Open another instance
otherID=3
otherPerson = samplesDB.openid("Sample.Person",str(otherID),-1,-1)
# Fetch some properties
print ("ID: " + otherPerson.run_obj_method("%Id",[]))
print ("Name: " + otherPerson.get("Name"))
print ("SSN: " + otherPerson.get("SSN"))
print ("DOB: " + str(otherPerson.get("DOB")))
print ("Age: " + str(otherPerson.get("Age")))
Same code using CacheORM python module, i.e. object-relational mapping
from CacheORM import CacheClass
# Create an instance of PopulateUtils to call built-in CACHE class method
populateUtils = CacheClass(namespace='%SYS', cachepackage='%Library', cacheclass='PopulateUtils', username='_SYSTEM', password='SYS')
# Create CacheClass Instance
husband = CacheClass(username='_SYSTEM', password='SYS', dl=99)
# Create and populate a new CacheClass Object
husband.new()
husband.set_value("SSN",populateUtils.class_method("SSN"))
husband.set_value("Name", "Hatzis, Athanassios I")
husband.set_value("DOB", populateUtils.class_method("Date"))
# Save husband
husband.save()
# Create another CacheClass object
wife = CacheClass(username='_SYSTEM', password='SYS')
wife.new()
wife.set_value("SSN",populateUtils.class_method("SSN"))
wife.set_value("Name", "Kalamari, Panajota")
wife.set_value("DOB", populateUtils.class_method("Date"))
wife.save()
# Relate them
wife.set_refobj("Spouse", husband._cache_id)
wife.save()
husband.set_refobj("Spouse", wife._cache_id)
husband.save()
# Get Object References
husband.get("Spouse").get("Name")
wife.get("Spouse").get("Name")
# Open an existing object with id=3 and read cache properties
person = CacheClass(username='_SYSTEM', password='SYS', objectID='3')
print(f"ID:{person.id}\nSSN: {person.get('SSN')}\nName:{person.get('Name')}\nDateOfBirth:{person.get('DOB')}")
You can view the output from this Python Jupyter notebook at my Microsoft Azure CacheORM library.

Cool stuff, Athanassios! The Caché queries demo doesn't work, though.

Thanks Evgeny, you cannot execute my Jupyter notebooks on the Azure cloud, and I think you have to log in first in order to view them. In any case my CacheORM module depends on the intersys.pythonbind module. One has to install this first and verify that it works, then start playing with my demos. I wrote guidelines about installation in the GitHub README file.
Announcement
Tony Coffman · Dec 2, 2019
BridgeWorks is pleased to announce a VDM, v9.1.0.1. This release includes the following updates:
Updates
Historical Linking is now based on the connection profile name
Saved Formatting is now based on the connection profile name
Tables and Fields column headers no longer hide based on connection type
Bug Fixes
Cross tab would not load data correctly in Finished Reports Viewer if it was in a report footer
Fixed an issue where refreshing logs would not work correctly after viewing a SQL statement
Views were not visible for available schemas on the connection wizard
New
Load Selected Connections
Only load connections selected from a list
Announcement
Anastasia Dyubaylo · Feb 12, 2020
Hi Developers,
The new video from Global summit 2019 is already on InterSystems Developers YouTube:
⏯ InterSystems IRIS Kubernetes Operator
This video introduces the InterSystems IRIS Kubernetes operator, which enables InterSystems IRIS containers to function as "first-class citizens" of the Kubernetes ecosystem. We recommend that you be familiar with basic Kubernetes concepts: Introduction to Kubernetes Video
Takeaway: You will understand and appreciate the value proposition of the InterSystems IRIS Kubernetes operator.
Presenters: 🗣 @Luca.Ravazzolo, Product Manager, InterSystems🗣 @Steven.Lubars, InterSystems Software Developer
Additional materials to this video you can find in this InterSystems Online Learning Course.
Check out the Cloud Deployment Resource Guide.
Enjoy watching this video! 👍🏼

Hi guys,

Can you please post the link to the previous session that you mentioned in this video? I am still not sure how we will use Kubernetes with IRIS. It would be great if you could shed some light on that too.

Thanks

Hi Neerav Adam,

Are you looking for the Introduction to Kubernetes session?

Does anyone know how to download this operator? I'm excited to start using it.

Hi @Jonathan.Keam, I hope you found the answer back in January. If not, head over to containers.intersystems.com
HTH
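For orientation, the operator extends Kubernetes with an IrisCluster custom resource that declares the desired deployment. A minimal sketch is below; the field names follow the IKO documentation but should be checked against your operator version, and the secret and image names are placeholders:

```yaml
apiVersion: intersystems.com/v1alpha1
kind: IrisCluster
metadata:
  name: sample
spec:
  licenseKeySecret:
    name: iris-key-secret      # placeholder: secret containing the iris.key license
  topology:
    data:
      image: containers.intersystems.com/intersystems/iris:2023.1   # placeholder tag
```

Once applied with kubectl apply -f, the operator creates and manages the underlying StatefulSets and services for you.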
Question
Scott Roth · Jan 17, 2024
I downloaded containers.intersystems.com/intersystems/healthshare_providerdirectory:2023.2 to evaluate; however, when I try to run the container it keeps exiting.
>docker run --name providerdirectory --user=irisowner --env=ISC_DATA_DIRECTORY=/intersystems/irisdata --runtime=runc -d containers.intersystems.com/intersystems/healthshare_providerdirectory:2023.2
I keep getting the following, and I am not sure why...
2024-01-17 20:36:20 [INFO] Executing command /home/irisowner/irissys/startISCAgent.sh 2188...
2024-01-17 20:36:20 [INFO] Writing status to file: /home/irisowner/irissys/iscagent.status
2024-01-17 20:36:20 Reading configuration from file: /home/irisowner/irissys/iscagent.conf
2024-01-17 20:36:20 ISCAgent[17]: Starting
2024-01-17 20:36:20 ISCAgent[18]: Starting ApplicationServer on *:2188
2024-01-17 20:36:20 [INFO] ...executed command /home/irisowner/irissys/startISCAgent.sh 2188
2024-01-17 20:36:20 [INFO] Starting InterSystems IRIS instance IRIS...
2024-01-17 20:36:20 [ERROR] iris: instance 'IRIS' not found
2024-01-17 20:36:20 [ERROR] Command "iris start IRIS quietly" exited with status 256
2024-01-17 20:36:20 [ERROR] See the above messages or /intersystems/irisdata/mgr/messages.log for more information
2024-01-17 20:36:20 [FATAL] Error starting InterSystems IRIS
Can someone point me to the correct docker run command to get this to run?
Thanks
Scott

Hi,
I've got the same issue. I started the container without running IRIS on startup (-u=false) and found that only HSPD is installed, so here is the command that worked for me:
docker run -d --name providerdirectory -p 52773:52773 -p 1972:1972 -p 2187:2187 containers.intersystems.com/intersystems/healthshare_providerdirectory:2023.2 -i HSPD
hth
Eyal

Thanks, I got it running, but I am not sure what to do next. When I try to access HealthShare within the Management Portal, it tells me the Service is Unavailable. I want to be able to see what the system can do.