Announcement
Anastasia Dyubaylo · Jun 19, 2021
Hey Developers,
We're pleased to announce the next InterSystems online programming competition:
🏆 InterSystems AI Programming Contest 🏆
Duration: June 28 - July 25, 2021
Total prize: $8,750
Landing page: https://contest.intersystems.com
Prizes
1. Experts Nomination - a specially selected jury will determine winners:
🥇 1st place - $4,000
🥈 2nd place - $2,000
🥉 3rd place - $1,000
2. Community winners - applications that will receive the most votes in total:
🥇 1st place - $1,000
🥈 2nd place - $500
🥉 3rd place - $250
If several participants score the same number of votes, they are all considered winners, and the prize money is shared among them.
Who can participate?
Any Developer Community member, except for InterSystems employees. Create an account!
👥 Developers can team up to create a collaborative application. Teams of 2 to 5 developers are allowed.
Do not forget to highlight your team members in the README of your application – DC user profiles.
Contest Period
🛠 June 28 - July 18: Application development and registration phase.
✅ July 19 - July 25: Voting period.
🎉 July 26: Winners announcement.
Note: Developers can improve their apps throughout the entire registration and voting period.
The topic
🤖 Artificial Intelligence and Machine Learning 🤖
Develop an AI/ML solution using InterSystems IRIS. Your application could be a library, package, tool, or any AI/ML solution which uses InterSystems IRIS.
Here are the requirements:
Accepted applications: apps new to Open Exchange, or existing ones with a significant improvement. Our team will review all applications before approving them for the contest.
Build an app that uses AI/ML capabilities with InterSystems IRIS.
The application should work on IRIS Community Edition, IRIS for Health Community Edition, or IRIS Advanced Analytics Community Edition.
The application should be Open Source and published on GitHub.
The README file of the application should be in English, contain the installation steps, and include a video demo and/or a description of how the application works.
Source code of the InterSystems ObjectScript part (if any) should be available in UDL format (not XML). Example.
The requirements above are subject to change.
➡️ Some ideas for contestants.
Use Embedded Python to join the current contest!
Embedded Python is a new feature of InterSystems IRIS that gives you the option to use Python as a "first-class citizen" in back-end business logic development with InterSystems classes.
Embedded Python can be used in "on-demand" images delivered via the InterSystems Early Access Program (EAP) if you write to python-interest@intersystems.com. More details here.
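For a flavor of what "first-class citizen" means here, below is a minimal sketch of an IRIS class with a method written directly in Python (class and method names are hypothetical, and an Embedded Python-enabled IRIS image from the EAP is assumed):

```objectscript
Class Demo.Scoring Extends %RegisteredObject
{

/// Business logic implemented in Python, callable from ObjectScript
/// like any other class method
ClassMethod Clamp(value As %Double) As %Double [ Language = python ]
{
    # Plain Python, executing inside the IRIS process
    return max(0.0, min(1.0, value))
}

}
```

From ObjectScript, this would be invoked like any other method, e.g. `Write ##class(Demo.Scoring).Clamp(1.7)`.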
Helpful resources
1. Templates we suggest starting from:
InterSystems IntegratedML template
IRIS R Gateway template
2. Data import tools:
Data Import Wizard
CSVGEN - CSV import util
CSVGEN-UI - the web UI for CSVGEN
3. Documentation:
Using IntegratedML
4. Online courses & videos:
Learn IntegratedML in InterSystems IRIS
Preparing Your Data for Machine Learning
Predictive Modeling with the Machine Learning Toolkit
IntegratedML Resource Guide
Getting Started with IntegratedML
Machine Learning with IntegratedML & Data Robot
5. How to submit your app to the contest:
How to publish an application on Open Exchange
How to apply for the contest
Judgment
Voting rules will be announced soon. Stay tuned!
So!
We're waiting for YOUR great project – join our coding marathon to win!
❗️ Please check out the Official Contest Terms here.❗️
Here are some ideas for contestants:
New ML language. Interoperability with numerical computing languages, or even proper CASs (computer algebra systems), is great and offers freedom of choice. These math-oriented languages also allow faster problem-space search/traversal than more generalized languages such as Python, and several classes of supporting ML problems can be solved with them. The Callout interface makes the implementation process easy (reference community implementations: PythonGateway, RGateway, JuliaGateway). Suggested languages: Octave, Scilab.
New showcases in IoT, real-time predictions, RPA. The Convergent Analytics group provides many starting templates in these fields, as InterSystems IRIS capabilities are an especially good fit for them. I'm always interested in more examples, especially real-life machine learning examples.
Data Deduplication solutions. Do you have a dataset with a lot of dirty data and know how to clean it? Great. Make a showcase out of it.
Reinforcement learning showcases. Examples of Partially observable Markov decision process or other reinforcement learning technologies.
Hey Developers!
The competition starts on Monday and we have prepared Helpful resources for you:
InterSystems IntegratedML template
IRIS R Gateway template
Feel free to check them out! Developers! The InterSystems AI Programming Contest is open. The first week of registration has begun. Make the most of the tips and materials to create the best app! 😉
We'll wait for you on our Contest board. Hey developers,
If you haven't seen it yet, don't miss our official contest landing page: https://contest.intersystems.com/ 🔥
Stay tuned! Hey Developers,
The recording of the InterSystems AI Contest Kick-Off Webinar is available on InterSystems Developers YouTube! Please welcome:
⏯ InterSystems AI Contest Kick-Off Webinar
Thanks to our speakers! 🤗 Developers!
We are waiting for your great apps!
Don't forget to participate! Hey Devs!
Two weeks before the end of registration! ⏳ Upload your cool apps and register for the competition! 🚀
Here is some information on how to submit your app to the contest:
How to publish an application on Open Exchange
How to apply for the contest
Hey Developers,
💥 Use Embedded Python to join the current contest!
Embedded Python is a new feature of InterSystems IRIS that gives you the option to use Python as a "first-class citizen" in back-end business logic development with InterSystems classes.
⏯ Short demo of Embedded Python by @Robert.Kuszewski
Embedded Python can be used in "on-demand" images delivered via the InterSystems Early Access Program (EAP) if you write to python-interest@intersystems.com. More details here.
⬇️ Template package on how to use Embedded Python deployable with ZPM. Don't forget to change the image to the one you get from the EAP.
Stay tuned! Developers!
We are waiting for your solutions!
Please register on our Landing page. Future Participants!
Don't forget about our special technology bonuses that will give you extra points in the voting. 🤩
Try them and start to upload your cool apps! 😃
Happy coding!
Dear Developers!
We have one additional idea for an application. Often, new UI components and pages are relevant for providing customers additional information within an app. We would like to inject "AI Apps" into our HealthShare Clinical Viewer (HS CV).
A custom app wrapper component needs to be implemented in the HS CV. We will provide you with a HealthShare environment together with an AI App on DataRobot's AI platform. https://www.datarobot.com/blog/introducing-no-code-ai-app-builder/
For additional details please reach out to Eduard or myself. We are ready to chat! Dear Developers,
The last week of the registration has begun!
If you still don't have ideas for your app, you can choose some prepared by @Eduard.Lebedyuk https://community.intersystems.com/post/intersystems-ai-programming-contest#comment-159576
and @Thomas.Nitzsche https://community.intersystems.com/post/intersystems-ai-programming-contest#comment-160796
We'll wait for your solutions on our Contest board. 😃
Hey Developers!
We remind you that you can register with one click 😉, just press the button "I want to participate" on our
Landing page: https://contest.intersystems.com
Who will be the first? 🤗 Hey Developers,
The first application is already on the Contest Board!
ESKLP by @Aleksandr.Kalinin6636
Who's next? Hey Devs!
Only 2 days left till the end of the registration. ⏳
Hurry up to upload your cool apps to our Contest Board!
Announcement
Anastasia Dyubaylo · Jun 7, 2021
Hi Community,
Please welcome the new video from #VSummit20:
⏯ Special Sauce: InterSystems IRIS Overview
See what makes InterSystems IRIS data platform so special, learn about the unique features behind the scenes, and identify what InterSystems IRIS can do for you. Follow #InterSystemsIRIS for more.
🗣 Presenter: @Harry.Tong, Senior Solutions Architect, InterSystems
Subscribe to InterSystems Developer YouTube and stay tuned! 👍🏼
Announcement
Anastasia Dyubaylo · Jun 25, 2021
Hi Community,
Enjoy watching the new video on InterSystems Developers YouTube:
⏯ Installing InterSystems API Manager
See the installation process for InterSystems API Manager 1.5. Learn how to prepare an InterSystems IRIS database platform instance and host system, and see how to run scripts provided in the API Manager installation kit.
Related chapters in InterSystems Documentation:
Installation instructions: IAM Guide
Configuring secure connections: Using TLS with InterSystems IRIS
In this video:
Preparing an InterSystems IRIS instance
Preparing a host system for API Manager
Running API Manager setup scripts
Testing and viewing the Admin Portal
Enjoy and stay tuned! 👍🏼
Announcement
Anastasia Dyubaylo · Mar 26, 2021
Hi Developers!
As you may know, InterSystems Developer Community can help you to find a job. There are two options for developers: find a job for yourself or hire staff for your company. Just post an ad on the InterSystems Developer Community and find what you are looking for!
How does it work? Check the details below:
The "Jobs" section is available in the top menu here:
What can be found in this section?
Job opportunities:
Job opportunity announcements for any position that demands InterSystems technology skills.
Job wanted:
Announcements from specialists in InterSystems Data Platforms who are looking for a job. The list of skills should include InterSystems technologies.
So, if you:
want to offer a position that demands InterSystems technology skills, OR
have experience with InterSystems technology and are looking for a new job,
you can publish a post on the Developer Community either way. You just have to add the related tag to your announcement:
Job opportunity
Job wanted
And here's the easiest way to create a job opportunity:
Go to the Jobs section and click the "New job opportunity" button; an ad will be automatically created with a special "Job opportunity" tag. This tag turns your announcement into a vacancy and adds it to this section.
So!
You will be able to find new employees or a new job in a quick and easy way with InterSystems Developer Community! 😉
If you have any questions or need some help with the post, don't hesitate to contact us!
Precautionary measures:
InterSystems does not guarantee the accuracy of recruitment posts or other information posted on this website.
InterSystems assumes no responsibility whatsoever for any losses incurred as a result of the information posted on this website. Please confirm the contents and conditions directly with the recruiter or applicant.
For more, please refer to the Developer Community Code of Conduct.
Announcement
Anastasia Dyubaylo · Jun 18, 2021
Hi Community,
Enjoy watching the new session recording from InterSystems Virtual Summit 2020:
⏯ Performance Testing with InterSystems Tools
Learn about some of the tools used to determine the performance capabilities of your system. See how you can test these capabilities and analyze the results.
Additional resources:
InterSystems System Alerting and Monitoring (SAM) Documentation
SAM GitHub page (with instructions)
Yet Another pButtons Extractor
🗣 Presenter: @Pran.Mukherjee, Senior Technology Architect, InterSystems
Enjoy watching this video! 👍🏼
Announcement
Anastasia Dyubaylo · Mar 23, 2021
Hi Developers,
Please welcome the new voting rules for the InterSystems programming contests!
See details below:
You can select 3 projects: the 1st, the 2nd, and the 3rd place, as you decide. This is how it works for the Community leaderboard:
Community Leaderboard:

Conditions | 1st place | 2nd place | 3rd place
If you have an article posted on DC and an app uploaded to Open Exchange (OEX) | 9 | 6 | 3
If you have at least 1 article posted on DC or 1 app uploaded to OEX | 6 | 4 | 2
If you make any valid contribution to DC (posted a comment/question, etc.) | 3 | 2 | 1
For the Experts leaderboard, different levels of experts have more "points" power:
Experts Leaderboard:

Level | 1st place | 2nd place | 3rd place
VIP Global Masters level or ISC Product Managers | 15 | 10 | 5
Ambassador GM level | 12 | 8 | 4
Expert GM level or DC Moderators | 9 | 6 | 3
Specialist GM level | 6 | 4 | 2
Advocate GM level or ISC Employees | 3 | 2 | 1
For those who have any of the above expert levels, votes will be counted in both Expert and Community nominations automatically.
This is how it works:
To take part in the voting, you need to:
Sign in to Open Exchange – DC credentials will work.
Make any valid contribution to the Developer Community – answer or ask questions, write an article, contribute applications on Open Exchange – and you'll be able to vote. Check this post on the options to make helpful contributions to the Developer Community.
We hope this new system will be fairer and you'll have the option to give your votes to more projects that you like.
Comments and suggestions are very welcome! Are there any detailed rules on how a regular non-ISC user can become Specialist, Expert, Ambassador, or even VIP level? These ranks all relate to Global Masters (GM). Thx!
@Xuying.Zheng Pls help translate this. Thx! Hi Michael, Robert is absolutely right: these levels are related to the Global Masters Advocate Hub. You can see a description of all levels and how to achieve them in this post.
If you have an article posted on DC and an app downloaded on OEX
Did you really mean an app uploaded to OEX? Fixed. Thank you, Alexey!
Article
Sergey Lukyanchikov · Apr 7, 2021
What is Distributed Artificial Intelligence (DAI)?
Attempts to find a "bullet-proof" definition have not produced a result: it seems the term is slightly "ahead of its time". Still, we can analyze the term itself semantically, deriving that distributed artificial intelligence is the same AI (see our effort to suggest an "applied" definition), though partitioned across several computers that are not clustered together (neither data-wise, nor via applications, nor by providing access to particular computers in principle). That is, ideally, distributed artificial intelligence should be arranged in such a way that none of the computers participating in that "distribution" has direct access to the data or applications of another computer: the only alternative is the transmission of data samples and executable scripts via "transparent" messaging. Any deviation from that ideal leads to "partially distributed artificial intelligence" – an example being distributed data with a central application server, or its inverse. One way or the other, the result is a set of "federated" models (i.e., models each trained on their own data sources, or each trained by their own algorithms, or "both at once").
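To make "federated" concrete, here is a minimal sketch (illustrative only; data and the trivial "model" are made up) in which two sites each fit a local model on private data and exchange only the fitted artifacts, never the raw records:

```python
import random
import statistics

# Two "sites" each hold private measurements drawn from the same process.
# Each site shares only a local summary model (here: mean and count),
# never its raw data; the central party combines the transmitted artifacts.
random.seed(42)
site_data = [
    [random.gauss(10.0, 1.0) for _ in range(200)],
    [random.gauss(10.0, 1.0) for _ in range(300)],
]

# Local "training": each site fits its own trivial model (the sample mean)
local_models = [(statistics.fmean(d), len(d)) for d in site_data]

# Central aggregation: count-weighted average of the transmitted artifacts
total = sum(n for _, n in local_models)
federated_mean = sum(m * n for m, n in local_models) / total

# For this trivial model, the federated result matches the pooled result
pooled_mean = statistics.fmean(site_data[0] + site_data[1])
assert abs(federated_mean - pooled_mean) < 1e-9
```

The same exchange pattern generalizes to model weights (as in federated averaging), with the aggregation rule chosen to fit the model class.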
Distributed AI scenarios “for the masses”
We will not be discussing edge computing, confidential data operators, scattered mobile searches, or similar fascinating yet not (at this moment) widely applied scenarios. We will be much "closer to life" if, for instance, we consider the following scenario (a detailed demo of it can and should be watched here): a company runs a production-level AI/ML solution, and the quality of its functioning is systematically checked by an external data scientist (i.e., an expert who is not an employee of the company). For a number of reasons, the company cannot grant the data scientist access to the solution, but it can send him a sample of records from a required table on a schedule or triggered by a particular event (for example, termination of a training session for one or several models in the solution). With that, we assume that the data scientist owns some version of the AI/ML mechanisms already integrated in the production-level solution that the company is running – and it is likely that those mechanisms are developed, improved, and adapted to the concrete use cases of that concrete company by the data scientist himself. Deployment of those mechanisms into the running solution, monitoring of their functioning, and other lifecycle aspects are handled by a data engineer (a company employee).
An example of the deployment of a production-level AI/ML solution on the InterSystems IRIS platform that works autonomously with a flow of data coming from equipment was provided by us in this article. The same solution runs in the demo linked in the above paragraph. You can build your own solution prototype on InterSystems IRIS using the content (free with no time limit) in our Convergent Analytics repo (visit the sections Links to Required Downloads and Root Resources).
Which "degree of distribution" of AI do we get in such a scenario? In our opinion, we are rather close to the ideal, because the data scientist is "cut off" from both the data (only a limited sample is transmitted – although a crucial one as of a point in time) and the algorithms of the company (the data scientist's own "specimens" are never in 100% sync with the "live" mechanisms deployed and running as part of the real-time production-level solution), and he has no access at all to the company's IT infrastructure. Therefore, the data scientist's role reduces to a partial replay, on his local computational resources, of an episode of the company's production-level AI/ML solution functioning, getting an estimate of the quality of that functioning at an acceptable confidence level, and returning feedback to the company (formulated, in our concrete scenario, as "audit" results plus, possibly, an improved version of this or that AI/ML mechanism involved in the company's solution).
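One way the external "audit" can attach a confidence level to its quality estimate is a bootstrap interval over the transmitted sample. The sketch below is illustrative only (the label data and the 90%-accurate "production model" are simulated, not taken from the article's demo):

```python
import random

# Hypothetical audit input: (actual, predicted) pairs from a sample of
# production-model output; here we simulate a model that is right ~90%
# of the time on binary labels.
random.seed(7)
pairs = [(y, y if random.random() < 0.9 else 1 - y)
         for y in (random.randint(0, 1) for _ in range(500))]

def accuracy(sample):
    # Fraction of pairs where the prediction matches the actual label
    return sum(a == p for a, p in sample) / len(sample)

point = accuracy(pairs)

# Bootstrap: resample the pairs with replacement and collect the
# resulting accuracy distribution
boots = sorted(
    accuracy(random.choices(pairs, k=len(pairs))) for _ in range(1000)
)
low, high = boots[25], boots[974]  # ~95% percentile interval
print(f"accuracy {point:.3f}, 95% CI [{low:.3f}, {high:.3f}]")
```

The width of the interval tells the data scientist whether the transmitted sample is large enough to support a verdict about the production solution's quality.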
Figure 1 Distributed AI scenario formulation
We know that feedback does not necessarily need to be formulated and transmitted by humans during an AI artifact exchange; this follows from publications about modern instruments and from existing experience with implementations of distributed AI. However, the strength of the InterSystems IRIS platform is that it allows one to develop and launch, equally efficiently, both "hybrid" (a tandem of a human and a machine) and fully automated AI use cases – so we will continue our analysis based on the above "hybrid" example, while leaving the reader the possibility to elaborate on its full automation on their own.
How a concrete distributed AI scenario runs on InterSystems IRIS platform
The intro to our video with the scenario demo mentioned in the above section of this article gives a general overview of InterSystems IRIS as a real-time AI/ML platform and explains its support of DevOps macromechanisms. The "company-side" business process that handles the regular transmission of training datasets to the external data scientist is not covered explicitly in the demo, so we will start with a short overview of that business process and its steps.
The major "engine" of the sender business process is a while-loop (implemented using the InterSystems IRIS visual business process composer, which is based on the BPL notation interpreted by the platform), responsible for the systematic sending of training datasets to the external data scientist. The following actions are executed inside that "engine" (see the diagram; data consistency actions are skipped):
Figure 2 Main part of the “sender” business process
(a) Load Analyzer – loads the current set of records from the training dataset table into the business process and forms a dataframe from them in the Python session. This call-action triggers an SQL query to the InterSystems IRIS DBMS and a call to the Python interface to transfer the SQL result to it, so that the dataframe can be formed;
(b) Analyzer 2 Azure – another call-action; it triggers a call to the Python interface to transfer to it a set of Azure ML SDK for Python instructions that build the required infrastructure in Azure and deploy over that infrastructure the dataframe formed in the previous action;
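The core of the Load Analyzer step – an SQL result set handed to the Python session and shaped into a dataframe – can be sketched with a local stand-in (sqlite3 here instead of the IRIS DBMS; the table, columns, and sample rows are hypothetical, and a plain list of row dicts stands in for a pandas dataframe):

```python
import sqlite3

# Stand-in for the training dataset table in IRIS (names hypothetical)
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE TrainingSet (sensor_id INTEGER, reading REAL, label TEXT)"
)
conn.executemany(
    "INSERT INTO TrainingSet VALUES (?, ?, ?)",
    [(1, 0.42, "ok"), (2, 0.97, "alert"), (3, 0.55, "ok")],
)

# Load Analyzer: run the SQL query and hand the result set to the
# Python session as a dataframe-like structure
cursor = conn.execute("SELECT sensor_id, reading, label FROM TrainingSet")
columns = [c[0] for c in cursor.description]
dataframe = [dict(zip(columns, row)) for row in cursor.fetchall()]

# Analyzer 2 Azure would then serialize this data (e.g. to .csv) and
# push it to Azure storage via the Azure ML SDK for Python
print(len(dataframe), columns)
```

In the real solution, the query goes to the IRIS DBMS and the result becomes a pandas dataframe; the shape of the step is the same.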
As a result of the above business process actions executed, we obtain a stored object (a .csv file) in Azure containing an export of the recent dataset used for model training by the production-level solution at the company:
Figure 3 “Arrival” of the training dataset to Azure ML
With that, the main part of the sender business process is over, but we need to execute one more action keeping in mind that any computation resources that we create in Azure ML are billable (see the diagram, skip data consistency actions):
Figure 4 Final part of the “sender” business process
(c) Resource Cleanup – triggers a call to Python interface to transfer it a set of Azure ML SDK for Python instructions to remove from Azure the computational infrastructure built in the previous action.
The data required for the data scientist has been transmitted (the dataset is now in Azure), so we can proceed with launching the “external” business process that would access the dataset, run at least one alternative model training (algorithmically, an alternative model is distinct from the model running as part of the production-level solution), and return to the data scientist the resulting model quality metrics plus visualizations permitting to formulate “audit findings” about the company production-level solution functioning efficiency.
Let us now take a look at the receiver business process: unlike its sender counterpart (which runs among the other business processes comprising the autonomous AI/ML solution at the company), it does not require a while-loop; instead, it contains a sequence of actions related to training alternative models in Azure ML and in IntegratedML (the accelerator for using auto-ML frameworks from within InterSystems IRIS) and to extracting the training results into InterSystems IRIS (the platform is assumed to be installed locally at the data scientist's as well):
Figure 5 “Receiver” business process
(a) Import Python Modules – triggers a call to Python interface to transfer it a set of instructions to import Python modules that are required for further actions;
(b) Set AUDITOR Parameters – triggers a call to Python interface to transfer it a set of instructions to assign default values to the variables required for further actions;
(c) Audit with Azure ML – (we will be skipping any further reference to Python interface triggering) hands “audit assignment” to Azure ML;
(d) Interpret Azure ML – gets the data transmitted to Azure ML by the sender business process, into the local Python session together with the “audit” results by Azure ML (also, creates a visualization of the “audit” results in the Python session);
(e) Stream to IRIS – extracts the data transmitted to Azure ML by the sender business process, together with the “audit” results by Azure ML, from the local Python session into a business process variable in IRIS;
(f) Populate IRIS – writes the data transmitted to Azure ML by the sender business process, together with the “audit” results by Azure ML, from the business process variable in IRIS to a table in IRIS;
(g) Audit with IntegratedML – “audits” the data received from Azure ML, together with the “audit” results by Azure ML, written into IRIS in the previous action, using IntegratedML accelerator (in this particular case it handles H2O auto-ML framework);
(h) Query to Python – transfers the data and the “audit” results by IntegratedML into the Python session;
(i) Interpret IntegratedML – in the Python session, creates a visualization of the “audit” results by IntegratedML;
(j) Resource Cleanup – deletes from Azure the computational infrastructure created in the previous actions.
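For reference, the IntegratedML "audit" in step (g) boils down to the IntegratedML SQL extension; a sketch along these lines (table and model names hypothetical) could look as follows:

```sql
-- Create and train an alternative model over the data written into IRIS
CREATE MODEL AuditModel PREDICTING (Label) FROM ReceivedTrainingSet;
TRAIN MODEL AuditModel;

-- Score it: compare predictions with the actual labels
SELECT Label, PREDICT(AuditModel) AS Predicted
FROM ReceivedTrainingSet;

-- Quality metrics become queryable after a VALIDATE MODEL run
VALIDATE MODEL AuditModel FROM ReceivedTrainingSet;
SELECT * FROM INFORMATION_SCHEMA.ML_VALIDATION_METRICS;
```

The ML configuration determines which auto-ML provider (H2O in this particular case) actually trains the model behind these statements.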
Figure 6 Visualization of Azure ML “audit” results
Figure 7 Visualization of IntegratedML “audit” results
How distributed AI is implemented in general on InterSystems IRIS platform
InterSystems IRIS platform distinguishes among three fundamental approaches to distributed AI implementation:
· Direct exchange of AI artifacts, with their local and central handling based on rules and algorithms defined by the user
· AI artifact handling delegated to specialized frameworks (for example, TensorFlow or PyTorch), with exchange orchestration and various preparatory steps configured on the local and central instances of InterSystems IRIS by the user
· Both AI artifact exchange and handling done via cloud providers (Azure, AWS, GCP), with the local and central instances just sending input data to the cloud provider and receiving the end result back from it
Figure 8 Fundamental approaches to distributed AI implementation on InterSystems IRIS platform
These fundamental approaches can be modified and combined: in particular, in the concrete scenario described in the previous section of this article ("audit"), the third, "cloud-centric" approach is used, with the "auditor" part split into a cloud portion and a local portion executed on the data scientist's side (acting as the "central instance").
The theoretical and applied elements that are currently adding up to the "distributed artificial intelligence" discipline have not yet taken a "canonical form", which creates a huge potential for implementation innovations. Our team of experts follows the evolution of distributed AI as a discipline closely and builds accelerators for its implementation on the InterSystems IRIS platform. We would be glad to share our content and to help everyone who finds this domain useful to start prototyping distributed AI mechanisms. You can reach our AI/ML expert team at the following e-mail address: MLToolkit@intersystems.com.
Question
Raman Sailopal · Apr 8, 2021
Hi guys,
I've created an IRIS command-line utility to interface the Linux command line with InterSystems IRIS
https://github.com/RamSailopal/iriscmd
It may be of use to some of the community.
Ram
Hi @Raman.Sailopal
You can publish your contribution on OpenExchange
Check out this video about ZPM, so you can make your command-line utility even easier to install and use.
New Video: ObjectScript Package Manager ZPM: Installing, Building, Testing, (intersystems.com)
Hi Ram!
We see the request to Open Exchange, but you need to add a License to the repository - e.g. MIT License.
This is a mandatory thing to submit for Open Exchange.
Announcement
Anastasia Dyubaylo · May 4, 2021
Hey Community,
Please join the next InterSystems online programming competition:
🏆 InterSystems FHIR Accelerator Programming Contest 🏆
Submit an application that uses InterSystems FHIR-as-a-service on AWS or helps to develop solutions using InterSystems IRIS FHIR Accelerator.
Duration: May 10 - June 06, 2021
Total prize: $8,750
👉 Landing page 👈
Prizes
1. Experts Nomination - a specially selected jury will determine winners:
🥇 1st place - $4,000
🥈 2nd place - $2,000
🥉 3rd place - $1,000
2. Community winners - an application that will receive the most votes in total:
🥇 1st place - $1,000
🥈 2nd place - $500
🥉 3rd place - $250
If several participants score the same number of votes, they are all considered winners, and the prize money is shared among them.
Who can participate?
Any Developer Community member, except for InterSystems employees. Create an account!
👥 Developers can team up to create a collaborative application. Teams of 2 to 5 developers are allowed.
Do not forget to highlight your team members in the README of your application – DC user profiles.
Contest Period
🛠 May 10 - 30: Application development and registration phase.
✅ May 31 - June 6: Voting phase.
🎉 June 7: Winners announcement.
Note: Developers can improve their apps throughout the entire registration and voting period.
The topic
💡 InterSystems IRIS FHIR Accelerator as a service (FHIRaaS) 💡
FHIRaaS capabilities:
Support for FHIR R4, including the U.S. Core Implementation Guide
Developer portal for testing and understanding FHIR APIs
Multiple methods of authentication, including API Key and OpenID Connect
Batch import of FHIR bundles via sFTP
Logging of FHIR request data
Built on AWS infrastructure that is ISO 27001:2013 and HITRUST certified to support HIPAA and GDPR
The full list of supported services and operations.
➡️ Get your FREE access to InterSystems IRIS FHIR Accelerator Service (FHIRaaS) on AWS ⬅️
Submit an application that uses InterSystems FHIR-as-a-service on AWS or helps to develop solutions using InterSystems IRIS FHIR Accelerator.
Here are the requirements:
Accepted applications: apps new to Open Exchange, or existing ones with a significant improvement. Our team will review all applications before approving them for the contest.
The application can be built with ANY technology that uses InterSystems IRIS FHIR as a service.
The application should be Open Source and published on GitHub.
The README file of the application should be in English, contain the installation steps, and include a video demo and/or a description of how the application works.
Source code of the InterSystems ObjectScript part (if any) should be available in UDL format (not XML). Example.
The requirements above are subject to change.
Helpful resources
1. Template we suggest starting from:
coming soon
2. Documentation:
InterSystems IRIS FHIR Accelerator Service
3. Online courses on InterSystems FHIR support:
Learn FHIR for Software Developers
Building SMART on FHIR Apps with InterSystems FHIR Sandbox
Exploring FHIR Resource APIs
Using InterSystems IRIS for Health to Reduce Readmissions
Connecting Devices to InterSystems IRIS for Health
Monitoring Oxygen Saturation in Infants
FHIR Integration QuickStart
4. Videos:
Getting Started with the InterSystems IRIS FHIR Accelerator Service on AWS
Other FHIR related videos:
6 Rapid FHIR Questions
SMART on FHIR: The Basics
Developing with FHIR - REST APIs
FHIR in InterSystems IRIS for Health
FHIR API Management
Searching for FHIR Resources in IRIS for Health
Also, please check the related FHIR playlist on DC YouTube.
5. Q&A on FHIR:
Explore FHIR tag on DC
Ask questions on community.fhir.org
6. How to submit your app to the contest:
How to publish an application on Open Exchange
How to apply for the contest
Judgment
Voting rules will be announced soon. Stay tuned!
So!
We're waiting for YOUR great project – join our coding marathon to win!
❗️ Please check out the Official Contest Terms here.❗️
Hey guys,
We're pleased to invite you to join the upcoming kick-off webinar dedicated to the FHIR Accelerator Programming Contest!
➡️ InterSystems FHIR Accelerator Contest Kick-Off Webinar
🗓 Monday, May 10 — 01:00 PM EDT
✅ Register here! Thanks for everyone who joined the webinar today!
As announced, we will start providing the access codes to sign up for FHIRaaS on Thursday, the 14th of May! Learn more here.
Also, @Anton.Umnikov shared an example application that can use InterSystems FHIRaaS. Just change the base-url and provide your key to make it work with FHIRaaS. Hey Developers!
We started the InterSystems FHIR Accelerator Programming Contest!
Feel free to join us, we are waiting for your participation😎
Add your applications to our Contest board 🚀 Participants!
Whose application will be the first? 👀 Don't miss it!
⏯ InterSystems FHIR Accelerator Contest Kick-off Webinar
Hi Community!
The registration period has already begun! Follow our Contest Board and stay tuned.
Waiting for your cool projects! Hey Developers!
The second week of registration has started!
Hurry up to upload your applications! There were requests on what FHIRaaS provides. Here is the very precise document that describes that. The announcement is updated accordingly. Developers!We remind you that you have a great opportunity to get FREE access to the FHIRaaS on AWS! 🔥
Register on our FHIR Portal and become a master of FHIRaaS with InterSystems! Please use this link: 👉🏼 https://portal.trial.isccloud.io/account/signup
Feel free to ask any questions regarding the competition here or in the discord-contests channel.
Happy coding! 😊 Hi Developers!
Upload your applications to the Open Exchange and we'll see them on the Contest Board!
Let everyone know about your cool app! 💪 Hey Developers,
The first application is already on the Contest Board!
FHIR Data Studio Connector by @Dmitry Maslennikov
Who's next? Hey developers,
If you haven't seen it yet, don't miss our official contest landing page:
🔥 InterSystems FHIR Accelerator Programming Contest Landing 🔥
Only 3 days left until the end of registration. Hurry up to participate! Developers!
We are waiting for your solutions!
Don't forget to participate! Hey Developers!
❗️Important news❗️ We are prolonging the registration period till June 2nd inclusive.
Hurry up! 🔥You still have time to upload your app.
We wish you good luck!😉
✅ June 3 - June 6: Voting phase. Hey Developers!
Three applications are already on the Contest Board!
FHIR Data Studio Connector by @Dmitry.Maslennikov
iris-on-fhir by @Henrique.GonçalvesDias
FHIR Simple Demo Application by @Marcello.Correa
Last call! Registration for the InterSystems FHIR Accelerator Programming Contest ends today!
Hurry up to upload your application(-s) 😉
Announcement
Anastasia Dyubaylo · Apr 26, 2021
Hey community,
The InterSystems Developer Tools contest is over. Thank you all for participating in our exciting coding marathon!
And now it's time to announce the winners!
A storm of applause goes to these developers and their applications:
🏆 Experts Nomination - winners were determined by a specially selected jury:
🥇 1st place and $4,000 go to the Server Manager for VSCode project by @John.Murray
🥈 2nd place and $1,500 go to the Config-API project by @Lorenzo.Scalese
🥈 2nd place and $1,500 go to the zpm-explorer project by @Henrique.GonçalvesDias and @José.Pereira
🏆 Community Nomination - an application that received the most votes in total:
🥇 1st place and $750 go to the Server Manager for VSCode project by @John.Murray
🥈 2nd place and $500 go to the zpm-explorer project by @Henrique.GonçalvesDias and @José.Pereira
🥉 3rd place and $250 go to the Config-API project by @Lorenzo.Scalese
Congratulations to all the winners and participants!
Thank you all for your attention to the contest and the effort you put into this competition!
And what's next?
We will announce the next competition very soon – stay tuned! Big CONGRATULATIONS to the winners! I feel honoured that Server Manager 2.0 achieved first place in both categories. Building good tools for developers has been a major part of my professional life for nearly thirty years, so I am pleased this one is proving popular.
Thank you to everyone who voted for me, and to my employer George James Software for allowing me to work on this during office hours. Congratulations to Lorenzo, Henrique and José for their successes. There were some really great entries in the contest, so take a look at them if you haven’t already. Congrats @John.Murray for this deserved victory! Congrats to all the participants! This was an amazing contest!
And it's a real pity this time that we have only 3 winning positions. The community has won a lot more! Hey developers,
We would also like to thank all of our participants and highlight their cool apps:
💥 Grafana Plugin for InterSystems, IntelliJ InterSystems by @Dmitry.Maslennikov
💥 IRIS_REST_Documentation by @davimassaru.teixeiramuta
💥 gj :: locate by @George.James
💥 Git for IRIS by @Marcus.Wurlitzer
💥 zapm-editor by @Sergei.Mihaylenko
💥 helper-for-objectscript-language-extensions, IRIS-easy-ECP-workbench, and SSH for IRIS container by @Robert.Cemper1003
Thank you all for such a great contribution! Hope to see your new apps in the next contests! Congrats to the developers that dedicated the time of their lives to create incredible applications!
Congrats to the staff team that makes those contests possible and helps us make the Community greater every day.
Congratulations everyone! Congrats!!! Well deserved :) Thank you all for your contributions!! Congrats to the winners. Thank you for the Git for IRIS app, I plan to use it. It is a great piece of work, John. Well deserved.
Article
Brendan Bannon · Jul 15, 2021
Benjamin De Boe wrote this great article about Universal Cached Queries, but what the heck is a Universal Cached Query (UCQ), and why should I care about it if I am writing good old embedded SQL? In Caché and Ensemble, Cached Queries were generated to resolve xDBC and Dynamic SQL. Now, in InterSystems IRIS, embedded SQL has been updated to use Cached Queries as well, hence the "Universal" added to the name. Any SQL executed on IRIS is now run from a UCQ class.
Why did InterSystems do this? Good question! The big win here is flexibility in a live environment. In the past, if you added an index or ran TuneTable, the SQL in Cached Queries would make use of the new information right away, while embedded SQL would remain unchanged until the class or routine was compiled manually. If your application used deployed classes or only shipped OBJ code, recompiling on the customer system was not an option. Now all SQL statements on a system use the latest class definition and the latest tuning data available. In the future, InterSystems IRIS will have optional tools that can monitor and tune your production systems on a nightly basis, customizing the SQL plans based on how the tables are being queried. As this toolset grows, the power of the Universal Cached Query will grow as well.
Is my embedded SQL slower now? Yes and no. Calling a tag in a different routine is a little more expensive than calling a tag in the same routine, so that part is slower, but UCQ code generation differs from the old embedded code generation, and getting to use those improvements more than makes up for the expense of calling a different routine. Are there cases where the UCQ code is slower? Yes, but overall you should see better performance. I am an embedded SQL guy from way back, so I always like to point out that embedded SQL is faster than dynamic SQL. It still is faster, but with all the work that has been done to make objects faster, the margin between the two styles is small enough that I will not make fun of you for using dynamic SQL.
How do I check for errors now? Error handling for embedded SQL has not changed: SQLCODE will be set to a negative number if we hit an error, and %msg will be set to a description of that error. What has changed are the types of errors you can get. The default behavior now is that the SQL is not compiled until the first time the query is run. This means that if you misspell a field or table name in the routine, the error is not reported when you compile that routine; it is reported the first time you execute the SQL, the same as with dynamic SQL. SQLCODE is set for every SQL command, but if you are lazy like me, you only ever check SQLCODE after a FETCH. Well, now you might want to start checking on the OPEN as well.
&SQL(DECLARE cur CURSOR FOR SELECT Name,junk into :var1, :var2 FROM Sample.Person)
&SQL(OPEN cur)
write !,"Open Status: ",SQLCODE,?20,$G(%msg)
for {
    &SQL(FETCH cur)
    write !,"Fetch Status: ",SQLCODE,?20,$G(%msg)
    QUIT:SQLCODE'=0
    write !,var1
}
&SQL(CLOSE cur)
write !,"Close Status: ",SQLCODE,?20,$G(%msg)
QUIT
In the code above I have an invalid field in the SELECT. Because we do not compile the SQL when we compile the routine, this error is not reported at compile time. When I execute the code, the OPEN reports the compile error, while the FETCH and CLOSE report a "cursor not open" error. %msg is not overwritten, so whichever point you check it at, you will get helpful info:
USER>d ^Embedded
Open Status: -29 Field 'JUNK' not found in the applicable tables^DECLARE cur CURSOR FOR SELECT Name , junk INTO compiling embedded cached query from Embedded.mac
Fetch Status: -102 Field 'JUNK' not found in the applicable tables^DECLARE cur CURSOR FOR SELECT Name , junk INTO compiling embedded cached query from Embedded.mac
Close Status: -102 Field 'JUNK' not found in the applicable tables^DECLARE cur CURSOR FOR SELECT Name , junk INTO compiling embedded cached query from Embedded.mac
What if I don’t want my embedded SQL to change? You can still do that using Frozen Query Plans. A quick side note: every major IRIS upgrade you do will freeze all SQL Statements, so nothing will change if you don’t let it. You can read more about that here.
Now back to dealing with UCQ stuff. Here are 3 ways you could freeze embedded SQL plans in your application:
If you ship an IRIS.DAT:
Do $SYSTEM.OBJ.GenerateEmbedded() to generate the UCQ classes for embedded SQL
Freeze the plans: do $SYSTEM.SQL.Statement.FreezeAll()
Ship the IRIS.DAT
If you use xml files:
Do $SYSTEM.OBJ.GenerateEmbedded() to generate the UCQ classes for embedded SQL
Freeze the plans: do $SYSTEM.SQL.Statement.FreezeAll()
Export the frozen plans: do $SYSTEM.SQL.Statement.ExportAllFrozenPlans()
After loading your application, load the frozen plans: do $SYSTEM.SQL.Statement.ImportFrozenPlans()
Freeze UCQ plans on the customer site:
Load the code with embedded SQL on the customer system
Do $SYSTEM.OBJ.GenerateEmbedded() to generate the UCQ classes for embedded SQL
Freeze all the plans that got generated: do $SYSTEM.SQL.Statement.FreezeAll()
Can I go back to the old behavior? Nope, this is the way it is now. From a developer's point of view, though, you can get the old behavior back by adding the flag /compileembedded=1 to your compiler options. This tells the compiler to generate the UCQ class while compiling the class or routine. If there is an issue with the SQL, it will be reported at compile time, as it was in the past.
Compiling routine : Embedded.mac
ERROR: Embedded.mac(5) : SQLCODE=-29 : Field 'JUNK' not found in the applicable tables^DECLARE cur CURSOR FOR SELECT Name , junk INTO compiling embedded cached query from Embedded.mac
Detected 1 errors during compilation in 0.034s.
If you are concerned about the overhead of generating the UCQ classes the first time embedded SQL is run, you could add this step to your application install to generate them all in advance: do $SYSTEM.OBJ.GenerateEmbedded()
This is a very high-level overview of Universal Cached Queries and Embedded SQL. I did not get into any of the real details about what happens under the covers. I just tried to talk about stuff people would run into as they work with Embedded SQL on IRIS. Overall moving to UCQ should make SQL performance more consistent across all types of SQL and it should make updating SQL on a production system easier. There will be some adjustments. Adding the compiler flag will be a big help for me. Now I just need to get used to looking for the generated code in a new place. If you have any questions, comments, concerns about this, or anything else related to SQL on InterSystems IRIS please let me know.
Very nice article @brendan.bannon - thank you for boiling it down to a set of core things that developers will care most about!
Question
Ephraim Malane · Dec 22, 2022
Hi Community,
My IRIS.DAT file is corrupted on one of my Edge productions in the development environment and as a result, I cannot start production.
I would like to recover it if there is a way to do so, please assist.
Regards, You should ask WRC for help if you have support. If you are a supported customer (with a license under support - SUTA), please contact WRC. More details can be found at https://www.intersystems.com/support/ I haven't seen a database corruption for many years; I literally forgot the last time I saw one, it may be 15+ years or more. And in the past (last millennium) I've seen and dealt with db corruption.
Out of curiosity, what are the symptoms of your corruption? How did it happen?
To check if a database is corrupt: Do ^INTEGRIT
To repair a database: Do ^REPAIR (But if you don't know this utility or the internals of database blocks and pointers: don't use it!!!) ^INTEGRIT is the simplest way to check integrity. Run the integrity check output to a file and contact WRC, as others have said, if you have support. The most direct way to resolve database errors is to restore from a good backup and replay journals. If you can't do that, the other alternatives almost always involve loss of information. The WRC has specialists who understand database internals, and WRC always wants to investigate the root cause of any database problems. Ephraim,
When you say "corrupted", to better understand... - Did you try to mount the DB (from the SMP or with ^MOUNT)? Sometimes if IRIS/Caché was "forced down", then a *.lck file in the DB folder needs to be deleted to allow a successful mount. - If the DB is mounted, did you get a <DATABASE> (or other) error? If so, then what was said about using ^Integrity and ^Repair could help - but only if you fully understand how to use those tools (!) Most of the time, a corrupted DB is fixable using those tools, or at least 99% of the data can be recovered. It depends on the number of errors: if it's huge, then sometimes it is faster to recover the DB from a valid backup + journal files.
BTW - if this is a mirrored DB, then there are other considerations as well.
Happy new year! Thank you so much everyone,
I logged a ticket with WRC and managed to resolve the issue by restoring the IRIS.DAT file from another instance
Regards,
Ephraim Malane I'm glad to hear that you contacted WRC and got this resolved.
Question
Iryna Mykhailova · Jan 12, 2023
Hello,
I created my Learning Lab yesterday evening, and for the first hour or so everything was fine, but after that it just stopped working and I get "Server unavailable" for the Management Portal:
And nothing would load in VSCode
Is there anything to be done? I wanted to upload my student's test on it, and the system wouldn't allow me to create another server, because I already have one. I forwarded your request to onlinetraining@intersystems.com, which deals with the learning site and learning labs. And I cc'd the email address in your profile. Thank you, I got a new server in the morning.
Article
Vladimir Prushkovskiy · Oct 31, 2022
It has been asked a few times recently how one can make the Laravel Framework work with the InterSystems IRIS Data Platform. It's been a while since this post about Laravel and InterSystems Caché was published. To bring it up to date, this article gives a brief set of instructions on how to set up and configure a Laravel project for use with InterSystems IRIS through ODBC.
What is Laravel?
Laravel is a PHP framework based on the MVC architecture. Using Laravel simplifies and speeds up backend development while building modern, secure web applications. It is very beginner-friendly, widely used in the PHP world, and tends to be the most popular backend framework according to the github.com star rating measured in this video. Combining all that with the flexibility and performance capabilities delivered by InterSystems IRIS as a database is beneficial for both worlds.
This post is organised into 4 steps, which represent the sequence of actions one needs to complete in order to make the connection work. The specific way of completing each step may vary depending on the platform. Commands here are shown for Ubuntu 22.04 (x64).
Setup Driver Manager (unixODBC)
In order to make the connection work, we need to install a driver manager. The most commonly used driver managers are 'unixODBC' and 'iODBC'. This guide uses 'unixODBC', which may be downloaded from http://www.unixodbc.org/ . Please refer to the 'Download' section of the website to download and build it. Alternatively, build instructions can also be found here. Here we'll use packages from the 'apt' package manager for Ubuntu.
Install packages
Install the unixodbc package accompanied by libodbccr2, which provides the unixODBC Cursor library.
sudo apt update
sudo apt -y install unixodbc libodbccr2 odbcinst
Create a link for the Cursor Library. In certain cases there might be issues with shared object dependencies after the unixODBC installation. This shows up as a 'Can't open cursor lib' error. There are a few workarounds described on the internet. To resolve the issue, we make a symbolic link to the desired library. First, we locate the library:
sudo find / -type f -name "libodbccr*"
And then we create the link:
sudo ln -s /usr/lib/x86_64-linux-gnu/libodbccr.so.2.0.0 /etc/libodbccr.so
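The locate-then-link steps above can also be scripted. The sketch below demonstrates the same pattern in a scratch directory so it can run anywhere; on a real system the search root would be /, the link would be /etc/libodbccr.so, and both commands would need sudo, as shown above.

```shell
# Demonstration of the locate-then-link step in a scratch directory.
# ROOT stands in for / and LINK for /etc/libodbccr.so on a real system;
# the file name mirrors the cursor library located by the find above.
ROOT=$(mktemp -d)
LINK="$ROOT/libodbccr.so"
touch "$ROOT/libodbccr.so.2.0.0"                  # stand-in for the real library
LIB=$(find "$ROOT" -type f -name "libodbccr*" | head -n 1)
ln -sf "$LIB" "$LINK"                             # same ln -s as the manual step
readlink -f "$LINK"                               # prints the resolved target
```

Using find to capture the path first avoids hard-coding the library location, which can differ between distributions.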
Setup ODBC Driver for InterSystems IRIS
The ODBC driver for InterSystems IRIS can be obtained in various ways. For example, the ODBC driver is included in all InterSystems IRIS kits. Another option is the Distributions Portal on wrc.intersystems.com.
Alternatively, drivers for all supported platforms can be found here: https://intersystems-community.github.io/iris-driver-distribution/
Download, unpack and install ODBC Driver:
sudo mkdir -p /usr/lib/intersystems/odbc
cd /usr/lib/intersystems/odbc/
sudo wget -q https://raw.githubusercontent.com/intersystems-community/iris-driver-distribution/main/ODBC/lnxubuntu2004/ODBC-2022.1.0.209.0-lnxubuntu2004x64.tar.gz
sudo tar -xzvf /usr/lib/intersystems/odbc/ODBC-2022.1.0.209.0-lnxubuntu2004x64.tar.gz
sudo ./ODBCinstall
sudo rm -f ODBC-2022.1.0.209.0-lnxubuntu2004x64.tar.gz
After that, the driver will be located in the folder /usr/lib/intersystems/odbc/bin/.
Additional information on the drivers and their usage can be found in the docs. This guide uses libirisodbcur6435.so as the driver library.
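Optionally, the driver and a named data source can also be registered with unixODBC, which makes command-line testing with isql possible before any Laravel code is involved. The fragment below is a sketch only: the driver path matches the installation step above, while the DSN name "IRIS", the host/port, and the _SYSTEM/SYS credentials are assumptions that mirror the .env example later in this guide - adjust them for your instance.

```ini
; /etc/odbcinst.ini -- registers the driver with unixODBC
[InterSystems IRIS ODBC]
Description = InterSystems IRIS ODBC driver
Driver      = /usr/lib/intersystems/odbc/bin/libirisodbcur6435.so

; /etc/odbc.ini -- defines a DSN that uses the driver above
[IRIS]
Driver    = InterSystems IRIS ODBC
Host      = 127.0.0.1
Port      = 1972
Namespace = USER
UID       = _SYSTEM
Password  = SYS
```

With these entries in place, running isql IRIS _SYSTEM SYS is a quick connectivity check. The laracache package configured below connects through the driver library path directly, so registering a DSN is optional and mainly useful for troubleshooting.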
Setup Laravel project
The traditional and convenient way to interact with a database from Laravel is to use its Eloquent package. Eloquent is an object-relational mapper (ORM) that is included by default with the Laravel framework. Only a few DBMS vendors are supported out of the box, so in order to implement the connection and SQL query builder specifics for InterSystems IRIS (via ODBC), some additional PHP code needs to be written. Thanks to @Jean.Dormehl, this gap was covered for InterSystems Caché, and the same package can be used for InterSystems IRIS. So in this article we describe the steps to set up the connection for an existing Laravel project using the jeandormehl/laracache package, assuming that the installation and configuration of PHP, Composer, and Laravel was done beforehand.
Install php-odbc. Make sure that the php-odbc module is installed. You can check the list of installed modules with the following command:
php -m | grep odbc
To install the php-odbc extension, use the following command with the proper version of PHP installed in your environment:
sudo apt -y install php8.1-odbc
Setup the 'jeandormehl/laracache' package. Go to your Laravel project directory, install the package, and publish its config file:
composer require jeandormehl/laracache
php artisan vendor:publish --tag=isc
Configure IRIS connection
Edit your .env file to contain the settings needed to connect to the database. For Unix users it should look similar to this:
DB_CONNECTION=isc
DB_WIN_DSN=
DB_UNIX_DRIVER=/usr/lib/intersystems/odbc/bin/libirisodbcur6435.so
DB_HOST=127.0.0.1
DB_PORT=1972
DB_DATABASE=USER
DB_USERNAME=_SYSTEM
DB_PASSWORD=sys
After editing the .env file, you may find it useful to clear the application config cache:
php artisan config:cache
Usage
Let's try to retrieve some data using our new package. As an example, we'll create a Model inherited from Laracache\Cache\Eloquent\Model. Just for testing purposes, we will count the number of sent messages in an Interoperability-enabled namespace.
nano app/Models/EnsMessageHeader.php
<?php
namespace App\Models;
use Laracache\Cache\Eloquent\Model;
class EnsMessageHeader extends Model
{
protected $table = 'Ens.MessageHeader';
protected $fillable = [
'MessageBodyClassName'
];
}
To execute a query we may create an artisan console command like this:
nano routes/console.php
<?php
use Illuminate\Foundation\Inspiring;
use Illuminate\Support\Facades\Artisan;
use App\Models\EnsMessageHeader;
Artisan::command('iris:test', function () {
echo EnsMessageHeader::count() . PHP_EOL;
});
Then executing the following command should print the number of records:
php artisan iris:test
This scenario should work for a wide range of InterSystems IRIS based products. Great article - thanks, Vlad! Nice write-up, thanks Vlad! Thank you very much.
Article
Anastasia Dyubaylo · Jun 28, 2023
Hi Community,
If you wish to share with others your solution/tool and/or your company's services connected to our products, we will be happy to organize a webinar for you to promote it. We will organize your webinar without any fuss on your side; you just need to tell us what you want to talk about and when you want to do it.
For its part, the InterSystems team will:
set up an online webinar,
promote it on Developer Community and social media,
create a registration page in the InterSystems group on a special event platform,
do a dry-run before the webinar to check that everything is as it should be,
provide technical support during the webinar.
To make use of all this, you just need to redeem the reward on Global Masters called "Your Webinar supported by InterSystems" for just 5,000 points.
Redeem this reward and our team will reach out to get all the info, and we will do the rest.
Try your hand at presenting and share your thoughts in a more interactive format!
🫂🫂🫂🫂🫂🗣️🗣️🗣️