Announcement
Anastasia Dyubaylo · Jun 18, 2021

Video: Performance Testing with InterSystems Tools

Hi Community,

Enjoy watching the new session recording from InterSystems Virtual Summit 2020:

⏯ Performance Testing with InterSystems Tools

Learn about some of the tools used to determine the performance capabilities of your system. See how you can test these capabilities and analyze the results.

Additional resources:
InterSystems System Alerting and Monitoring (SAM) Documentation
SAM GitHub page (with instructions)
Yet Another pButtons Extractor

🗣 Presenter: @Pran.Mukherjee, Senior Technology Architect, InterSystems

Enjoy watching this video! 👍🏼
Announcement
Anastasia Dyubaylo · Mar 23, 2021

InterSystems Programming Contest: Voting Rules

Hi Developers,

Please welcome the new voting rules for the InterSystems programming contests! See details below.

You can select 3 projects: the 1st, the 2nd, and the 3rd place, as you decide.

This is how it works for the Community leaderboard:

Community Leaderboard (points for your 1st / 2nd / 3rd place picks):
If you have an article posted on DC and an app uploaded to Open Exchange (OEX) – 9 / 6 / 3
If you have at least 1 article posted on DC or 1 app uploaded to OEX – 6 / 4 / 2
If you make any valid contribution to DC (posted a comment/question, etc.) – 3 / 2 / 1

For example, a voter who has both published an article on DC and uploaded an app to OEX gives 9 points to their 1st-place choice, 6 to the 2nd, and 3 to the 3rd.

For the Experts leaderboard, different levels of experts have more "points" power:

Experts Leaderboard (points for your 1st / 2nd / 3rd place picks):
VIP Global Masters level or ISC Product Managers – 15 / 10 / 5
Ambassador GM level – 12 / 8 / 4
Expert GM level or DC Moderators – 9 / 6 / 3
Specialist GM level – 6 / 4 / 2
Advocate GM level or ISC Employees – 3 / 2 / 1

For those who have any of the above expert levels, votes will be counted in both the Expert and Community nominations automatically.

To take part in the voting, you need to:
Sign in to Open Exchange – DC credentials will work.
Make any valid contribution to the Developer Community – answer or ask questions, write an article, contribute applications on Open Exchange – and you'll be able to vote. Check this post for the options to make helpful contributions to the Developer Community.

We hope this new system will be fairer and that you'll be able to give your votes to more of the projects you like. Comments and suggestions are very welcome!

Are there any detailed rules about how a common non-ISC user can become Specialist, Expert, Ambassador, or even VIP level? These ranks all relate to Global Masters (GM) Thx! @Xuying.Zheng Pls help translate this. Thx! Hi Michael, Robert is absolutely right, these levels are related to the Global Masters Advocate Hub. You can see a description of all levels and how to achieve them in this post. "If you have an article posted on DC and an app downloaded on OEX" – did you really mean an app uploaded to OEX? Fixed. Thank you, Alexey!
Article
Sergey Lukyanchikov · Apr 7, 2021

Distributed Artificial Intelligence with InterSystems IRIS

What is Distributed Artificial Intelligence (DAI)?

Attempts to find a "bullet-proof" definition have not produced a result: the term seems to be slightly "ahead of its time". Still, we can analyze the term itself semantically, deriving that distributed artificial intelligence is the same AI (see our effort to suggest an "applied" definition), though partitioned across several computers that are not clustered together (neither data-wise, nor via applications, nor by providing access to particular computers in principle). Ideally, distributed artificial intelligence should be arranged in such a way that none of the computers participating in that "distribution" has direct access to the data or applications of another computer: the only alternative is the transmission of data samples and executable scripts via "transparent" messaging. Any deviation from that ideal leads to the advent of "partially distributed artificial intelligence" – an example being distributed data with a central application server. Or its inverse. One way or the other, the result is a set of "federated" models (i.e., models either trained each on their own data sources, or each trained by their own algorithms, or "both at once").

Distributed AI scenarios "for the masses"

We will not be discussing edge computations, confidential data operators, scattered mobile searches, or similar fascinating yet not (at this moment) widely applied scenarios. We will be much "closer to life" if, for instance, we consider the following scenario (its detailed demo can and should be watched here): a company runs a production-level AI/ML solution, and the quality of its functioning is systematically checked by an external data scientist (i.e., an expert who is not an employee of the company). For a number of reasons, the company cannot grant the data scientist access to the solution, but it can send him a sample of records from a required table on a schedule or on a particular event (for example, the termination of a training session for one or several models by the solution). With that, we assume that the data scientist owns some version of the AI/ML mechanisms already integrated in the production-level solution that the company is running – and it is likely that those mechanisms are developed, improved, and adapted to the concrete use cases of that concrete company by the data scientist himself. Deployment of those mechanisms into the running solution, monitoring of their functioning, and other lifecycle aspects are handled by a data engineer (a company employee).

An example of deploying a production-level AI/ML solution on the InterSystems IRIS platform that works autonomously with a flow of data coming from equipment was provided in this article. The same solution runs in the demo under the link provided in the above paragraph. You can build your own solution prototype on InterSystems IRIS using the content (free with no time limit) in our repo Convergent Analytics (visit sections Links to Required Downloads and Root Resources).

Which "degree of distribution" of AI do we get via such a scenario?
In our opinion, in this scenario we are rather close to the ideal, because the data scientist is "cut off" from both the data (just a limited sample is transmitted – although a crucial one as of a point in time) and the algorithms of the company (the data scientist's own "specimens" are never 100% in sync with the "live" mechanisms deployed and running as part of the real-time production-level solution), and he has no access at all to the company's IT infrastructure. Therefore, the data scientist's role reduces to a partial replay, on his local computational resources, of an episode of the company's production-level AI/ML solution functioning, obtaining an estimate of the quality of that functioning at an acceptable confidence level – and returning feedback to the company (formulated, in our concrete scenario, as "audit" results plus, maybe, an improved version of this or that AI/ML mechanism involved in the company solution).

Figure 1 Distributed AI scenario formulation

We know that feedback does not necessarily need to be formulated and transmitted by humans during an AI artifact exchange; this follows from publications about modern instruments and from the experience already gained with distributed AI implementations. However, the strength of the InterSystems IRIS platform is that it allows developing and launching both "hybrid" (a tandem of a human and a machine) and fully automated AI use cases with equal efficiency – so we will continue our analysis based on the above "hybrid" example, while leaving the reader to elaborate on its full automation on their own.

How a concrete distributed AI scenario runs on the InterSystems IRIS platform

The intro to our video with the scenario demo mentioned in the above section of this article gives a general overview of InterSystems IRIS as a real-time AI/ML platform and explains its support of DevOps macromechanisms. In the demo, the "company-side" business process that handles the regular transmission of training datasets to the external data scientist is not covered explicitly – so we will start with a short overview of that business process and its steps.

A major "engine" of the sender business process is the while-loop (implemented using the InterSystems IRIS visual business process composer, which is based on the BPL notation interpreted by the platform), responsible for systematically sending training datasets to the external data scientist. The following actions are executed inside that "engine" (see the diagram, skip data consistency actions):

Figure 2 Main part of the "sender" business process

(a) Load Analyzer – loads the current set of records from the training dataset table into the business process and forms a dataframe in the Python session based on it. The call-action triggers an SQL query to the InterSystems IRIS DBMS and a call to the Python interface to transfer the SQL result to it so that the dataframe can be formed;
(b) Analyzer 2 Azure – another call-action; triggers a call to the Python interface to transfer to it a set of Azure ML SDK for Python instructions to build the required infrastructure in Azure and to deploy over that infrastructure the dataframe data formed in the previous action;

As a result of the above business process actions, we obtain a stored object (a .csv file) in Azure containing an export of the recent dataset used for model training by the production-level solution at the company:

Figure 3 "Arrival" of the training dataset to Azure ML

With that, the main part of the sender business process is over, but we need to execute one more action, keeping in mind that any computation resources that we create in Azure ML are billable (see the diagram, skip data consistency actions):

Figure 4 Final part of the "sender" business process

(c) Resource Cleanup – triggers a call to the Python interface to transfer to it a set of Azure ML SDK for Python instructions to remove from Azure the computational infrastructure built in the previous action.

The data required by the data scientist has been transmitted (the dataset is now in Azure), so we can proceed with launching the "external" business process that will access the dataset, run at least one alternative model training (algorithmically, an alternative model is distinct from the model running as part of the production-level solution), and return to the data scientist the resulting model quality metrics plus visualizations that permit formulating "audit findings" about the efficiency of the company's production-level solution.

Let us now take a look at the receiver business process: unlike its sender counterpart (which runs among the other business processes comprising the autonomous AI/ML solution at the company), it does not require a while-loop; instead, it contains a sequence of actions related to training alternative models in Azure ML and in IntegratedML (the accelerator for using auto-ML frameworks from within InterSystems IRIS), and to extracting the training results into InterSystems IRIS (the platform is also considered installed locally at the data scientist's):

Figure 5 "Receiver" business process

(a) Import Python Modules – triggers a call to the Python interface to transfer to it a set of instructions to import the Python modules required for further actions;
(b) Set AUDITOR Parameters – triggers a call to the Python interface to transfer to it a set of instructions to assign default values to the variables required for further actions;
(c) Audit with Azure ML – (we will be skipping any further reference to Python interface triggering) hands the "audit assignment" to Azure ML;
(d) Interpret Azure ML – gets the data transmitted to Azure ML by the sender business process into the local Python session, together with the "audit" results from Azure ML (and also creates a visualization of the "audit" results in the Python session);
(e) Stream to IRIS – extracts the data transmitted to Azure ML by the sender business process, together with the "audit" results from Azure ML, from the local Python session into a business process variable in IRIS;
(f) Populate IRIS – writes the data transmitted to Azure ML by the sender business process, together with the "audit" results from Azure ML, from the business process variable in IRIS to a table in IRIS;
(g) Audit with IntegratedML – "audits" the data received from Azure ML, together with the "audit" results from Azure ML, written into IRIS in the previous action, using the IntegratedML accelerator (in this particular case it handles the H2O auto-ML framework);
(h) Query to Python – transfers the data and the "audit" results from IntegratedML into the Python session;
(i) Interpret IntegratedML – creates, in the Python session, a visualization of the "audit" results from IntegratedML;
(j) Resource Cleanup – deletes from Azure the computational infrastructure created in the previous actions.

Figure 6 Visualization of Azure ML "audit" results
Figure 7 Visualization of IntegratedML "audit" results

How distributed AI is implemented in general on the InterSystems IRIS platform

The InterSystems IRIS platform distinguishes among three fundamental approaches to distributed AI implementation:
· Direct exchange of AI artifacts, with their local and central handling based on the rules and algorithms defined by the user
· AI artifact handling delegated to specialized frameworks (for example: TensorFlow, PyTorch), with exchange orchestration and various preparatory steps configured on the local and central instances of InterSystems IRIS by the user
· Both AI artifact exchange and their handling done via cloud providers (Azure, AWS, GCP), with the local and central instances just sending input data to a cloud provider and receiving back the end result from it

Figure 8 Fundamental approaches to distributed AI implementation on the InterSystems IRIS platform

These fundamental approaches can be modified and combined: in particular, in the concrete scenario described in the previous section of this article ("audit"), the third, "cloud-centric" approach is used, with a split of the "auditor" part into a cloud portion and a local portion executed on the data scientist's side (acting as a "central instance").

The theoretical and applied elements that are adding up to the "distributed artificial intelligence" discipline right now have not yet taken a "canonical form", which creates huge potential for implementation innovations. Our team of experts closely follows the evolution of distributed AI as a discipline and constructs accelerators for its implementation on the InterSystems IRIS platform. We would be glad to share our content and help everyone who finds the domain discussed here useful to start prototyping distributed AI mechanisms. You can reach our AI/ML expert team using the following e-mail address – MLToolkit@intersystems.com.
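As an illustration of what the "Analyzer 2 Azure" step boils down to on the Python side, here is a minimal, hedged sketch of pushing an extracted training dataset into an Azure ML workspace. It assumes the azureml-core SDK and an existing workspace config.json; the dataframe contents and the dataset name are illustrative and not taken from the demo:

# Sketch of the "Analyzer 2 Azure" idea: register a training dataset extracted
# from InterSystems IRIS in an Azure ML workspace so that the "receiver" process
# can pick it up for alternative model training.
# Assumptions: azureml-core is installed, config.json for the workspace is present,
# and the dataframe/dataset names below are purely illustrative.
import pandas as pd
from azureml.core import Workspace, Dataset

# In the article, the dataframe is formed from an SQL query against IRIS;
# here a toy dataframe stands in for that result.
df = pd.DataFrame({"feature_1": [0.1, 0.7, 0.3], "label": [0, 1, 0]})

ws = Workspace.from_config()            # loads the workspace definition from config.json
datastore = ws.get_default_datastore()  # the (billable) storage behind the workspace

# Store the dataframe in the datastore and register it as a tabular dataset
Dataset.Tabular.register_pandas_dataframe(
    dataframe=df,
    target=datastore,
    name="training_dataset_sample",
)

The final "Resource Cleanup" action matters because any compute created in Azure ML is billable; this sketch only touches the workspace's default datastore and creates no compute targets.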
Question
Raman Sailopal · Apr 8, 2021

InterSystems IRIS command line utility

Hi guys, I've created an iris command line utility to interface the Linux command line with InterSystems IRIS: https://github.com/RamSailopal/iriscmd It may be of use to some of the community. Ram

Hi @Raman.Sailopal You can publish your contribution on Open Exchange. Check out this video about ZPM, so you can make your command line utility even easier to install and use: New Video: ObjectScript Package Manager ZPM: Installing, Building, Testing, (intersystems.com)

Hi Ram! We see the request to Open Exchange, but you need to add a License to the repository - e.g. the MIT License. This is a mandatory requirement for submitting to Open Exchange.
Announcement
Anastasia Dyubaylo · May 4, 2021

InterSystems FHIR Accelerator Programming Contest

Hey Community,

Please join the next InterSystems online programming competition:

🏆 InterSystems FHIR Accelerator Programming Contest 🏆

Submit an application that uses InterSystems FHIR-as-a-service on AWS or helps to develop solutions using the InterSystems IRIS FHIR Accelerator.

Duration: May 10 - June 06, 2021
Total prize: $8,750

👉 Landing page 👈

Prizes

1. Experts Nomination - a specially selected jury will determine the winners:
🥇 1st place - $4,000
🥈 2nd place - $2,000
🥉 3rd place - $1,000

2. Community winners - the applications that receive the most votes in total:
🥇 1st place - $1,000
🥈 2nd place - $500
🥉 3rd place - $250

If several participants score the same amount of votes, they are all considered winners, and the money prize is shared among them.

Who can participate?

Any Developer Community member, except for InterSystems employees. Create an account!

👥 Developers can team up to create a collaborative application. Teams of 2 to 5 developers are allowed. Do not forget to highlight your team members in the README of your application – DC user profiles.

Contest Period

🛠 May 10 - 30: Application development and registration phase.
✅ May 31 - June 6: Voting phase.
🎉 June 7: Winners announcement.

Note: Developers can improve their apps throughout the entire registration and voting period.

The topic

💡 InterSystems IRIS FHIR Accelerator as a service (FHIRaaS) 💡

FHIRaaS capabilities:
Support for FHIR R4, including the U.S. Core Implementation Guide
Developer portal for testing and understanding FHIR APIs
Multiple methods of authentication, including API Key and OpenID Connect
Batch import of FHIR bundles via sFTP
Logging of FHIR request data
Built on AWS infrastructure that is ISO 27001:2013 and HITRUST certified to support HIPAA and GDPR
The full list of supported services and operations.

➡️ Get your FREE access to the InterSystems IRIS FHIR Accelerator Service (FHIRaaS) on AWS ⬅️

Submit an application that uses InterSystems FHIR-as-a-service on AWS or helps to develop solutions using the InterSystems IRIS FHIR Accelerator. Here are the requirements:

Accepted applications: apps new to Open Exchange, or existing ones with a significant improvement. Our team will review all applications before approving them for the contest.
The application can be built with ANY technology that uses InterSystems IRIS FHIR as a service.
The application should be Open Source and published on GitHub.
The README file of the application should be in English, contain the installation steps, and contain either a video demo and/or a description of how the application works.
Source code of the InterSystems ObjectScript part (if any) should be available in UDL format (not XML). Example.
The requirements above are subject to change.

Helpful resources

1. Template we suggest to start from: coming soon
2. Documentation: InterSystems IRIS FHIR Accelerator Service
3. Online courses on InterSystems FHIR support:
Learn FHIR for Software Developers
Building SMART on FHIR Apps with InterSystems FHIR Sandbox
Exploring FHIR Resource APIs
Using InterSystems IRIS for Health to Reduce Readmissions
Connecting Devices to InterSystems IRIS for Health
Monitoring Oxygen Saturation in Infants
FHIR Integration QuickStart
4. Videos:
Getting Started with the InterSystems IRIS FHIR Accelerator Service on AWS
Other FHIR-related videos:
6 Rapid FHIR Questions
SMART on FHIR: The Basics
Developing with FHIR - REST APIs
FHIR in InterSystems IRIS for Health
FHIR API Management
Searching for FHIR Resources in IRIS for Health
Also, please check the related FHIR playlist on DC YouTube.
5. Q&A on FHIR:
Explore the FHIR tag on DC
Ask questions on community.fhir.org
6. How to submit your app to the contest:
How to publish an application on Open Exchange
How to apply for the contest

Judgment

Voting rules will be announced soon. Stay tuned!

So! We're waiting for YOUR great project – join our coding marathon to win!

❗️ Please check out the Official Contest Terms here. ❗️

Hey guys, We're pleased to invite you to join the upcoming kick-off webinar dedicated to the FHIR Accelerator Programming Contest! ➡️ InterSystems FHIR Accelerator Contest Kick-Off Webinar 🗓 Monday, May 10 — 01:00 PM EDT ✅ Register here!

Thanks to everyone who joined the webinar today! As announced, we will start providing the access codes to sign up for the FHIRaaS on Thursday, the 14th of May! Learn more here. Also, @Anton.Umnikov shared an example application that can use InterSystems FHIRaaS. Just change the base-url and provide your key to make it work with FHIRaaS.

Hey Developers! We have started the InterSystems FHIR Accelerator Programming Contest! Feel free to join us, we are waiting for your participation 😎 Add your applications to our Contest board 🚀 Participants! Whose application will be the first? 👀

Don't miss it! ⏯ InterSystems FHIR Accelerator Contest Kick-off Webinar

Hi Community! The registration period has already begun! Follow our Contest Board and stay tuned. Waiting for your cool projects!

Hey Developers! The second week of registration has started! Hurry up and upload your applications!

There were requests about what FHIRaaS provides. Here is a very precise document that describes it. The announcement has been updated accordingly.

Developers! We remind you that you have a great opportunity to get FREE access to the FHIRaaS on AWS! 🔥 Register on our FHIR Portal and become a master of FHIRaaS with InterSystems! Please use this link: 👉🏼 https://portal.trial.isccloud.io/account/signup Feel free to ask any questions regarding the competition here or in the discord-contests channel. Happy coding! 😊

Hi Developers! Upload your applications to Open Exchange and we'll see them on the Contest Board! Let everyone know about your cool app! 💪

Hey Developers, The first application is already on the Contest Board! FHIR Data Studio Connector by @Dmitry Maslennikov Who's next?

Hey developers, If you haven't seen it yet, don't miss our official contest landing page: 🔥 InterSystems FHIR Accelerator Programming Contest Landing 🔥 Only 3 days left until the end of registration. Hurry up to participate!

Developers! We are waiting for your solutions! Don't forget to participate!

Hey Developers! ❗️Important news❗️ We are prolonging the registration period until June 2nd, inclusive. Hurry up! 🔥 You still have time to upload your app. We wish you good luck! 😉 ✅ June 3 - June 6: Voting phase.

Hey Developers! Three applications are already on the Contest Board!
FHIR Data Studio Connector by @Dmitry.Maslennikov
iris-on-fhir by @Henrique.GonçalvesDias
FHIR Simple Demo Application by @Marcello.Correa

Last call! Registration for the InterSystems FHIR Accelerator Programming Contest ends today! Hurry up to upload your application(s) 😉
Announcement
Anastasia Dyubaylo · Apr 26, 2021

Winners for the InterSystems Developer Tools Contest

Hey Community,

The InterSystems Developer Tools contest is over. Thank you all for participating in our exciting coding marathon! And now it's time to announce the winners!

A storm of applause goes to these developers and their applications:

🏆 Experts Nomination - winners were determined by a specially selected jury:
🥇 1st place and $4,000 go to the Server Manager for VSCode project by @John.Murray
🥈 2nd place and $1,500 go to the Config-API project by @Lorenzo.Scalese
🥈 2nd place and $1,500 go to the zpm-explorer project by @Henrique.GonçalvesDias and @José.Pereira

🏆 Community Nomination - the application that received the most votes in total:
🥇 1st place and $750 go to the Server Manager for VSCode project by @John.Murray
🥈 2nd place and $500 go to the zpm-explorer project by @Henrique.GonçalvesDias and @José.Pereira
🥉 3rd place and $250 go to the Config-API project by @Lorenzo.Scalese

Congratulations to all the winners and participants! Thank you all for your attention to the contest and the effort you put into this competition!

And what's next? We will announce the next competition very soon – stay tuned!

big CONGRATULATIONS to the winners! I feel honoured that Server Manager 2.0 achieved first place in both categories. Building good tools for developers has been a major part of my professional life for nearly thirty years, so I am pleased this one is proving popular. Thank you to everyone who voted for me, and to my employer George James Software for allowing me to work on this during office hours. Congratulations to Lorenzo, Henrique and José for their successes. There were some really great entries in the contest, so take a look at them if you haven't already.

Congrats @John.Murray for this deserved victory! Congrats to all the participants! This was an amazing contest! And it's a real pity this time that we have only 3 winning positions. The community has won a lot more!

Hey developers, We would also like to thank all of our participants and highlight their cool apps:
💥 Grafana Plugin for InterSystems, IntelliJ InterSystems by @Dmitry.Maslennikov
💥 IRIS_REST_Documentation by @davimassaru.teixeiramuta
💥 gj :: locate by @George.James
💥 Git for IRIS by @Marcus.Wurlitzer
💥 zapm-editor by @Sergei.Mihaylenko
💥 helper-for-objectscript-language-extensions, IRIS-easy-ECP-workbench, and SSH for IRIS container by @Robert.Cemper1003
Thank you all for such a great contribution! Hope to see your new apps in the next contests!

Congrats to the developers who dedicated their time to create incredible applications! Congrats to the staff team that makes these contests possible and helps us make the Community greater every day. Congratulations everyone! Congrats!!! Well deserved :) Thank you all for your contributions!! Congrats to the winners. Thank you for the Git for IRIS app. I plan to use that. It is a great piece of work, John. Well deserved.
Article
Brendan Bannon · Jul 15, 2021

Embedded SQL new in InterSystems IRIS

Benjamin De Boe wrote this great article about Universal Cached Queries, but what the heck is a Universal Cached Query (UCQ), and why should I care about it if I am writing good old embedded SQL? In Caché and Ensemble, Cached Queries would be generated to resolve xDBC and Dynamic SQL. Now, in InterSystems IRIS, embedded SQL has been updated to use Cached Queries as well, hence the "Universal" added to the name. Any SQL executed on IRIS will now be run from a UCQ class.

Why did InterSystems do this?

Good question! The big win here is flexibility in a live environment. In the past, if you added an index or ran TuneTable, the SQL in Cached Queries would make use of this new information right away, while embedded SQL would remain unchanged until the class or routine was compiled manually. If your application used deployed classes or only shipped OBJ code, recompiling on the customer system was not an option. Now all SQL statements on a system will use the latest class definition and the latest tuning data available. In the future, InterSystems IRIS will have optional tools that can monitor and tune your production systems on a nightly basis, customizing the SQL plans based on how the tables are being queried. As this toolset grows, the power of the Universal Cached Query will grow as well.

Is my embedded SQL slower now?

Yes and no. Calling a tag in a different routine is a little more expensive than calling a tag in the same routine, so that is slower, but UCQ code generation was different from embedded, and getting to use those changes more than makes up for the expense of calling a different routine. Are there cases where the UCQ code is slower? Yes, but overall you should see better performance. I am an embedded SQL guy from way back, so I always like to point out that embedded SQL is faster than dynamic SQL. It still is faster, but with all the work that has been done to make objects faster, the margin between the two styles is small enough that I will not make fun of you for using dynamic SQL.

How do I check for errors now?

Error handling for embedded SQL has not changed. SQLCODE will be set to a negative number if we hit an error, and %msg will be set to the description of that error. What has changed are the types of errors you can get. The default behavior now is that the SQL will not be compiled until the first time the query is run. This means that if you misspell a field or table in the routine, the error will not get reported when you compile that routine; it will be reported the first time you execute the SQL, the same as dynamic SQL. SQLCODE is set for every SQL command, but if you are lazy like me you only ever check SQLCODE after a FETCH. Well, now you might want to start checking on the OPEN as well.

 &SQL(DECLARE cur CURSOR FOR SELECT Name,junk into :var1, :var2 FROM Sample.Person)
 &SQL(OPEN cur)
 write !,"Open Status: ",SQLCODE,?20,$G(%msg)
 for {
     &SQL(FETCH cur)
     write !,"Fetch Status: ",SQLCODE,?20,$G(%msg)
     QUIT:SQLCODE'=0
     w !,var1
 }
 &SQL(CLOSE cur)
 write !,"Close Status: ",SQLCODE,?20,$G(%msg)
 QUIT

In the code above, I have an invalid field in the SELECT. Because we do not compile the SQL when we compile the routine, this error is not reported. When I execute the code, the OPEN reports the compile error while the FETCH and CLOSE report a "cursor not open" error.
%msg does not get changed, so if you check that at any point you will get helpful info:

USER>d ^Embedded
Open Status: -29    Field 'JUNK' not found in the applicable tables^DECLARE cur CURSOR FOR SELECT Name , junk INTO compiling embedded cached query from Embedded.mac
Fetch Status: -102  Field 'JUNK' not found in the applicable tables^DECLARE cur CURSOR FOR SELECT Name , junk INTO compiling embedded cached query from Embedded.mac
Close Status: -102  Field 'JUNK' not found in the applicable tables^DECLARE cur CURSOR FOR SELECT Name , junk INTO compiling embedded cached query from Embedded.mac

What if I don't want my embedded SQL to change?

You can still do this using Frozen Query Plans. A quick side note: every major IRIS upgrade you do will freeze all SQL Statements, so nothing will change if you don't let it. You can read more about that here. Now back to dealing with UCQ stuff. Here are 3 ways you could freeze embedded SQL plans in your application:

If you ship an IRIS.DAT:
Do $SYSTEM.OBJ.GenerateEmbedded() to generate the UCQs for embedded SQL
Freeze the plans: do $SYSTEM.SQL.Statement.FreezeAll()
Ship the IRIS.DAT

If you use xml files:
Do $SYSTEM.OBJ.GenerateEmbedded() to generate the UCQs for embedded SQL
Freeze the plans: do $SYSTEM.SQL.Statement.FreezeAll()
Export the frozen plans: do $SYSTEM.SQL.Statement.ExportAllFrozenPlans()
After loading your application, load the frozen plans: do $SYSTEM.SQL.Statement.ImportFrozenPlans()

Freeze UCQ plans on the customer site:
Load the code with embedded SQL on the customer system
Do $SYSTEM.OBJ.GenerateEmbedded() to generate the UCQs for embedded SQL
Freeze all the plans that got generated: do $SYSTEM.SQL.Statement.FreezeAll()

Can I go back to the old behavior?

Nope, this is the way it is now. From a developer's point of view, you can get the old behavior back by adding the flag /compileembedded=1 to your compiler options. This will tell the compiler to generate the UCQ class while compiling the class or routine. If there is an issue with the SQL, it will be reported at compile time as it was in the past.

Compiling routine : Embedded.mac
ERROR: Embedded.mac(5) : SQLCODE=-29 : Field 'JUNK' not found in the applicable tables^DECLARE cur CURSOR FOR SELECT Name , junk INTO compiling embedded cached query from Embedded.mac
Detected 1 errors during compilation in 0.034s.

If you are concerned about the overhead of generating the UCQ classes the first time embedded SQL is run, you could add this step as part of your application install to generate them all in advance: do $SYSTEM.OBJ.GenerateEmbedded()

This is a very high-level overview of Universal Cached Queries and embedded SQL. I did not get into any of the real details about what happens under the covers. I just tried to talk about the things people would run into as they work with embedded SQL on IRIS. Overall, moving to UCQ should make SQL performance more consistent across all types of SQL, and it should make updating SQL on a production system easier. There will be some adjustments. Adding the compiler flag will be a big help for me. Now I just need to get used to looking for the generated code in a new place. If you have any questions, comments, or concerns about this, or anything else related to SQL on InterSystems IRIS, please let me know.

Very nice article @brendan.bannon - thank you for boiling it down to a set of core things that developers will care most about!
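If you prefer to script those deployment steps, here is a hedged sketch of the same freeze workflow driven from Embedded Python. The class and method names simply mirror the ObjectScript calls above ($SYSTEM.* maps to the %SYSTEM package), with argument details omitted exactly as in the article; treat it as a convenience sketch rather than a prescribed procedure:

# Hedged sketch: the freeze workflow above, driven from Embedded Python via the
# iris module (run with irispython, or from a Python method, in the target namespace).
# Class/method names mirror the ObjectScript calls in the article and are invoked
# without arguments, as shown there; check the documented signatures before relying on this.
import iris

# Generate the universal cached query (UCQ) classes for all embedded SQL up front
iris.cls("%SYSTEM.OBJ").GenerateEmbedded()

# Freeze every SQL statement plan so subsequent recompiles or tuning cannot change them
iris.cls("%SYSTEM.SQL.Statement").FreezeAll()

# For XML-based deployments: export the frozen plans on the build system ...
iris.cls("%SYSTEM.SQL.Statement").ExportAllFrozenPlans()

# ... and, after loading the application on the target system, import them again
iris.cls("%SYSTEM.SQL.Statement").ImportFrozenPlans()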
Question
Ephraim Malane · Dec 22, 2022

InterSystems IRIS.DAT file corrupted

Hi Community, My IRIS.DAT file is corrupted on one of my Edge productions in the development environment and, as a result, I cannot start the production. I would like to recover it if there is a way to do so, please assist. Regards,

You should ask WRC for help if you have support. If you are a supported customer (with a license under support - SUTA), please contact WRC. More details can be found at https://www.intersystems.com/support/

I haven't seen a database corruption for many years; I literally forgot the last time I saw it, it may be 15+ years or more. And in the past (last millennium) I have seen and dealt with db corruption. Out of curiosity, what are the symptoms of your corruption? How did it happen?

To check if a database is corrupt: Do ^INTEGRIT
To repair a database: Do ^REPAIR
(But if you don't know this utility or the internals of database blocks and pointers: don't use it!!!)

^INTEGRIT is the simplest way to check integrity. Run the integrity check output to a file and contact WRC, as others have said, if you have support. The most direct way to resolve database errors is to restore from a good backup and replay journals. If you can't do that, the other alternatives almost always involve loss of information. The WRC has specialists who understand database internals, and WRC always wants to investigate the root cause of any database problems.

Ephraim, When you say "corrupted", to better understand... - Did you try to mount the DB (from the SMP or with ^MOUNT)? Sometimes if IRIS/Cache was "forced", then a *.lck file in the DB folder needs to be deleted in order to allow a successful mount. - If the DB is mounted, did you get a <DATABASE> (or other) error? If so, then what was said about using ^Integrity and ^Repair could help - but only if you fully understand how to use those tools (!) Most of the time, a corrupted DB is fixable using those tools, or at least data can be 99% recovered. Depending on the number of errors: if it's huge, then sometimes it is faster to recover the DB from a valid backup + journal files. BTW - if this is a mirrored DB, then there are other considerations as well. Happy new year!

Thank you so much everyone, I logged a ticket with WRC and managed to resolve the issue by restoring the IRIS.DAT file from another instance. Regards, Ephraim Malane

I'm glad to hear that you contacted WRC and got this resolved.
Question
Iryna Mykhailova · Jan 12, 2023

My InterSystems Learning Lab is down

Hello, I created my Learning Lab yesterday evening, and for the first hour or so everything was fine, but after that it just stopped working and I get "Server unavailable" for the Management Portal, and nothing would load in VSCode. Is there anything to be done? Because I wanted to upload my student's test on it. And the system wouldn't allow me to create another server, because I already have one.

I forwarded your request to onlinetraining@intersystems.com, which deals with the learning site and learning labs. And I cc'd the email address in your profile.

Thank you, got a new server in the morning.
Article
Vladimir Prushkovskiy · Oct 31, 2022

InterSystems IRIS with Laravel (via ODBC)

It has been asked a few times recently how one can make the Laravel Framework work with the InterSystems IRIS Data Platform. It's been a while since this post about Laravel and InterSystems Caché was published. To bring it up to date, this article gives a brief instruction set on how to set up and configure a Laravel project for use with InterSystems IRIS through ODBC.

What is Laravel?

Laravel is a PHP framework based on the MVC architecture. Using Laravel simplifies and speeds up backend development when building modern, secure web applications. It is very beginner-friendly, widely used in the PHP world, and tends to be the most popular backend framework according to the github.com star rating measured in this video. Combining all that with the flexibility and performance capabilities delivered by InterSystems IRIS as a database is beneficial for both worlds.

This post is organised into 4 steps which represent the sequence of actions one needs to complete in order to make the connection work. Specific ways of completing each step may vary depending on the platform. Commands here are shown for Ubuntu 22.04 (x64).

Setup Driver Manager (unixODBC)

In order to make the connection work we need to install a Driver Manager. The most commonly used driver managers are 'unixODBC' and 'iODBC'. This guide uses 'unixODBC', which may be downloaded here: http://www.unixodbc.org/. Please refer to the 'Download' section of the website to download and build. Alternatively, build instructions can also be found here. Here we'll use packages from the 'apt' package manager for Ubuntu.

Install packages

Install the unixodbc package accompanied by libodbccr2, which provides the unixODBC Cursor library.

sudo apt update
sudo apt -y install unixodbc libodbccr2 odbcinst

Create a link for the Cursor library

In certain cases there might be issues with shared object dependencies after the unixODBC installation. This shows up as a 'Can't open cursor lib' error. There are a few workarounds described on the internet. In order to resolve this issue we make a symbolic link to the desired library. First, we locate the library:

sudo find / -type f -name "libodbccr*"

And then we create a link:

sudo ln -s /usr/lib/x86_64-linux-gnu/libodbccr.so.2.0.0 /etc/libodbccr.so

Setup ODBC Driver for InterSystems IRIS

The ODBC driver for InterSystems IRIS can be obtained in various ways. For example, the ODBC driver is included in all InterSystems IRIS kits. Another option is the Distributions portal on wrc.intersystems.com. Alternatively, drivers for all supported platforms can be found here: https://intersystems-community.github.io/iris-driver-distribution/

Download, unpack and install the ODBC driver:

sudo mkdir -p /usr/lib/intersystems/odbc
cd /usr/lib/intersystems/odbc/
sudo wget -q https://raw.githubusercontent.com/intersystems-community/iris-driver-distribution/main/ODBC/lnxubuntu2004/ODBC-2022.1.0.209.0-lnxubuntu2004x64.tar.gz
sudo tar -xzvf /usr/lib/intersystems/odbc/ODBC-2022.1.0.209.0-lnxubuntu2004x64.tar.gz
sudo ./ODBCinstall
sudo rm -f ODBC-2022.1.0.209.0-lnxubuntu2004x64.tar.gz

After that, the driver will be located in the following folder: /usr/lib/intersystems/odbc/bin/. Additional information on drivers and their usage may be found in the docs. This guide uses libirisodbcur6435.so as the driver library.

Setup Laravel project

The traditional and convenient way to interact with a database from Laravel is to use its Eloquent package. Eloquent is an object relational mapper (ORM) that is included by default with the Laravel framework.
Only a few DBMS vendors are supported out-of-the-box. So, in order to implement the connection and SQL query builder specifics for InterSystems IRIS (via ODBC), some additional PHP code needs to be written. Thanks to @Jean.Dormehl this gap was covered for InterSystems Caché. The same package can be used for InterSystems IRIS. So in this article we describe the steps to set up a connection for an existing Laravel project using the jeandormehl/laracache package, assuming that the installation and configuration of php, composer and Laravel was done prior to that.

Install php-odbc

Make sure that the php-odbc module is installed. You can check the list of installed modules with the following command:

php -m | grep odbc

To install the php-odbc extension, use the following command with the proper version of php installed on your environment:

sudo apt -y install php8.1-odbc

Setup 'jeandormehl/laracache' package

Go to your Laravel project directory, install the package and publish its config file.

composer require jeandormehl/laracache
php artisan vendor:publish --tag=isc

Configure IRIS connection

Edit your .env file to contain the settings needed to connect to the database. For Unix users it should look similar to this:

DB_CONNECTION=isc
DB_WIN_DSN=
DB_UNIX_DRIVER=/usr/lib/intersystems/odbc/bin/libirisodbcur6435.so
DB_HOST=127.0.0.1
DB_PORT=1972
DB_DATABASE=USER
DB_USERNAME=_SYSTEM
DB_PASSWORD=sys

After editing the .env file you may find it useful to rebuild the application config cache:

php artisan config:cache

Usage

Let's try to retrieve some data using our new package. As an example we'll create a Model inherited from Laracache\Cache\Eloquent\Model. Just for testing purposes we will count the number of sent messages in an Interoperability-enabled namespace.

nano app/Models/EnsMessageHeader.php

<?php

namespace App\Models;

use Laracache\Cache\Eloquent\Model;

class EnsMessageHeader extends Model
{
    protected $table = 'Ens.MessageHeader';

    protected $fillable = [
        'MessageBodyClassName'
    ];
}

To execute a query we may create an artisan console command like this:

nano routes/console.php

<?php

use Illuminate\Foundation\Inspiring;
use Illuminate\Support\Facades\Artisan;
use App\Models\EnsMessageHeader;

Artisan::command('iris:test', function () {
    echo EnsMessageHeader::count() . PHP_EOL;
});

Then executing the following command should retrieve the number of records:

php artisan iris:test

This scenario should work for a wide range of InterSystems IRIS based products.

Great Article - Thanks Vlad! Nice write-up, thanks Vlad! Thank you very much.
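As a side note (not part of the original walkthrough), when the Laravel connection misbehaves it can help to verify the ODBC layer on its own first. Below is a minimal, hypothetical probe using Python's pyodbc against the same driver path and credentials from the .env file above; the exact connection-string keywords may differ between driver versions:

# Hypothetical sanity check of the InterSystems ODBC driver outside Laravel.
# Assumes pyodbc is installed (pip install pyodbc); connection-string keyword
# names (SERVER/HOST, etc.) may vary with the driver version, so adjust as needed.
import pyodbc

conn_str = (
    "DRIVER=/usr/lib/intersystems/odbc/bin/libirisodbcur6435.so;"
    "SERVER=127.0.0.1;"
    "PORT=1972;"
    "DATABASE=USER;"
    "UID=_SYSTEM;"
    "PWD=sys;"
)

with pyodbc.connect(conn_str) as cnxn:
    cursor = cnxn.cursor()
    # Same check as the artisan command above: count Interoperability messages
    cursor.execute("SELECT COUNT(*) FROM Ens.MessageHeader")
    print(cursor.fetchone()[0])

If this prints a number, the driver manager, driver and credentials are fine, and any remaining problem is on the Laravel/laracache side.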
Article
Anastasia Dyubaylo · Jun 28, 2023

How to request a webinar hosted by InterSystems

Hi Community,

If you wish to share with others your solution/tool and/or your company's services connected to our products, we will be happy to organize a webinar for you to promote it. We will organize your webinar without any fuss on your side; you just need to tell us what you want to talk about and when you want to do it.

From its side, the InterSystems team will:
set up an online webinar,
promote it on the Developer Community and social media,
create a registration page in the InterSystems group on a special event platform,
do a dry run before the webinar to check that everything is as it should be,
provide technical support during the webinar.

To make use of all this, you just need to redeem the reward on Global Masters called "Your Webinar supported by InterSystems" for just 5,000 points. Redeem this reward, our team will reach out to get all the info, and we will do the rest.

Try your hand at presenting and share your thoughts in a more interactive format!

🫂🫂🫂🫂🫂🗣️🗣️🗣️
Announcement
Michelle Spisak · Sep 20, 2022

Looking to learn InterSystems ObjectScript?

If you’re on the fence about learning InterSystems ObjectScript, we’re making the decision a whole lot easier. We just updated the “Getting Started with InterSystems ObjectScript” learning path with 3 new 5-minute videos — and a capstone exercise to help you pull together everything you’ll learn. 🤝 Get an introduction to InterSystems ObjectScript 🤿 Dive deeper into commands and functions 🤔 Understand data types and variables 👨‍💻 Create a class definition Do it all in our updated learning path, Getting Started with InterSystems ObjectScript.
Announcement
Olga Zavrazhnova · Oct 6, 2022

InterSystems Developer Meetup in San Francisco

Hi Community, Join us for an InterSystems Developer Meetup during TechCrunch Disrupt 2022! We'll be meeting on Wednesday, October 19th at Bartlett Hall, located at 242 O'Farrell St. (just a few short blocks from the Moscone Center), from 6:00 pm to 8:30 pm PT, where speakers will discuss how developers can bring the code to the data, not the data to the code, with Embedded Python and IntegratedML on InterSystems IRIS. Food and drinks will be served, accompanied by discussions.

Agenda:
👉 "InterSystems Overview, Developer Resources, and Startup Program" by @Dean.Andrews2971, Head of Developer Relations, InterSystems
👉 "No Latency Python Apps Close to the Data" by @Raj.Singh5479, Product Manager - Developer Experience, InterSystems
👉 "Machine Learning without a Data Science Team: Fast and Easy AutoML on InterSystems IRIS" by @Akshat.Vora, Senior Systems Developer, InterSystems

>> Register here <<

Hope to see you there!
Article
Philipp Bonin · Oct 19, 2022

InterSystems IRIS integration for Node-RED

The concept of low-code development is getting more and more important across all industries. Everybody who starts to get into low-code programming will inevitably come across Node-RED. InterSystems IRIS is renowned for its interoperability, and so it should be accessible via Node-RED.

For those who have not heard of Node-RED yet: Node-RED is a low-code programming application based on so-called nodes that are connected with wires. Nodes process incoming messages and forward them to the next connected node. Thanks to its great community, Node-RED offers a wide variety of nodes for all kinds of applications. Given that, it would surely be useful to have a node that can interact with InterSystems IRIS, right? And that is exactly what I did!

Node-RED already has an InterSystems IRIS integration, but it was not secured against the threat of SQL injection. Therefore, I hardened it by parameterizing the statements. To see how to install Node-RED you can follow this guide, and for information about the installation of the InterSystems IRIS node, you are invited to review the respective node documentation.

Here is a quick demo of how to use the nodes:

First of all we want to create a class called "Demo.Person". It inherits from %Persistent and %Populate, so we can call the Populate method and fill the table with data.

Now we want to insert our custom data into InterSystems IRIS. We can do this by using an insert statement and passing it to the IRIS node, or by building a JSON object and passing it to the IRIS_OO node.

Of course we would also like to access our data in Node-RED. Note that the node can also parametrize the statement itself. If you want to try out the flow by yourself, you can download it here.

At the end I want to show you how fast you can create applications with Node-RED. In the following video you can see how I built an application that generates random machine data (temperature and pressure) and pushes the data to InterSystems IRIS. Afterwards it selects the data from InterSystems IRIS and displays it in two different ways in the graphical user interface.

View on npm
View on GitHub
View on nodered.org

Hi Philipp, Thanks for sharing! Congrats on your first contribution to the Developer Community ;) Thank you :)

Hi Philipp, did you see my project for Node-RED? And a very recent one for n8n, which is an alternative to Node-RED.

@Philipp.Bonin You are very welcome to publish your integration for Node-RED on Open Exchange. You would just need to log in here https://openexchange.intersystems.com/ and submit your application. Let me know if you have any questions about the Open Exchange platform.

Hi Dmitry, yes I saw your project and you did really great work there! You mentioned that your node allows SQL injection, therefore I thought my project would be a good addition to yours.
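To make the hardening concrete: the fix is that user input is bound as a parameter instead of being concatenated into the SQL text. A small illustration of that principle follows, written in Python with pyodbc purely for comparison (it is not the node's actual JavaScript implementation, and the DSN name is hypothetical):

# Illustration of the parameterization principle the hardened IRIS node relies on.
# Shown in Python/pyodbc purely for comparison; the node itself is JavaScript,
# and the DSN name below is hypothetical.
import pyodbc

cnxn = pyodbc.connect("DSN=IRIS_DEMO")   # hypothetical ODBC DSN pointing at IRIS
cursor = cnxn.cursor()

user_input = "O'Brien'; DROP TABLE Demo.Person --"   # hostile input

# Vulnerable pattern: splicing the input into the SQL text lets it alter the statement
# cursor.execute(f"SELECT ID, Name FROM Demo.Person WHERE Name = '{user_input}'")

# Safe pattern: the statement is fixed and the value is bound separately,
# so the driver never interprets the input as SQL
cursor.execute("SELECT ID, Name FROM Demo.Person WHERE Name = ?", user_input)
print(cursor.fetchall())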
Announcement
Anastasia Dyubaylo · May 30, 2023

InterSystems Grand Prix Contest 2023

Hi Developers,

The annual competition for InterSystems IRIS developers is fast approaching! We're super excited to invite you all to join the Grand Prix contest for building open-source solutions using the InterSystems IRIS data platform!

🏆 InterSystems Grand Prix Contest 2023 🏆

Duration: June 12th - July 9th, 2023
Prize pool: $26,000

The topic

The InterSystems Grand Prix is our annual programming contest to find the best application that uses InterSystems IRIS. We welcome applications on any topic! Present any application which uses InterSystems IRIS as a backend (API or database) with any type of InterSystems IRIS API or data model. You are welcome to improve applications which you presented in the InterSystems contest series last year and submit them for the Grand Prix. And you are welcome to submit a 100% new application.

General Requirements:
Accepted applications: apps new to Open Exchange, or existing ones with a significant improvement. Our team will review all applications before approving them for the contest.
The application should work either on IRIS Community Edition or IRIS for Health Community Edition. Both could be downloaded as host (Mac, Windows) versions from the Evaluation site, or can be used in the form of containers pulled from the InterSystems Container Registry or Community Containers: intersystemsdc/iris-community:latest or intersystemsdc/irishealth-community:latest .
The application should be Open Source and published on GitHub.
The README file of the application should be in English, contain the installation steps, and contain either a video demo and/or a description of how the application works.
Only 3 submissions from one developer are allowed.

Contest Prizes:

1. Experts Nomination – winners will be selected by the team of InterSystems experts:
🥇 1st place - $7,000
🥈 2nd place - $5,000
🥉 3rd place - $3,000
🏅 4th place - $2,000
🏅 5th place - $1,000
🌟 6-10th places - $200
🌟 11-20th places - $100

2. Community winners – applications that will receive the most votes in total:
🥇 1st place - $3,000
🥈 2nd place - $2,000
🥉 3rd place - $1,000

✨ Global Masters badges for all winners included!

Note: If several participants score the same amount of votes, they are all considered winners, and the money prize is shared among them.

Important Deadlines:

🛠 Application development and registration phase:
June 12, 2023 (00:00 EST): Contest begins.
July 2, 2023 (23:59 EST): Deadline for submissions.

✅ Voting period:
July 3, 2023 (00:00 EST): Voting begins.
July 9, 2023 (23:59 EST): Voting ends.

Note: Developers can improve their apps throughout the entire registration and voting period.

Who Can Participate?

Any Developer Community member, except for InterSystems employees (ISC contractors allowed). Create an account!

Developers can team up to create a collaborative application. Teams of 2 to 5 developers are allowed. Do not forget to highlight your team members in the README of your application – DC user profiles.
Helpful Resources:

✓ How to work with InterSystems IRIS (for beginners):
Build a Server-Side Application with InterSystems IRIS
Learning Path for beginners

✓ For beginners with ObjectScript Package Manager (ZPM):
How to Build, Test and Publish ZPM Package with REST Application for InterSystems IRIS
Package First Development Approach with InterSystems IRIS and ZPM

✓ How to submit your app to the contest:
How to publish an application on Open Exchange
How to submit an application for the contest

✓ Sample IRIS applications:
intersystems-iris-dev-template
iris-embedded-python-template
interoperability-embedded-python
isc-cloud-sql-python-demo
rest-api-template
integratedml-demo-template
iris-fhir-template
iris-fullstack-template
iris-interoperability-template
iris-analytics-template

Need Help?

Join the contest channel on InterSystems' Discord server or talk with us in the comments to this post.

We can't wait to see your projects! Good luck 👍

By participating in this contest, you agree to the competition terms laid out here. Please read them carefully before proceeding.

Another amazing and rich contest :-) Thank you amazing team! Looking forward to the submissions!

An example idea for a Grand Prix entry: Demonstration of a "Language Interface" (chat interface) to replace the functionality of a traditional online shop (HTML UI).
Use IRIS to hold a product catalogue (Images, Description, Cost, Stock)
Use IRIS to present the "Language Interface". Can support via Embedded Python
User Experience:
Describe what you want instead of search
Filter products based on preferences identified in conversation
Automatically locate products via catalogue metadata from conversation content
CSP or other web session for the current shopping basket (Item, Quantity, Total)
Persist and reuse preferences identified from previous shopping chats

Can current ISC interns participate? (not during work hours)

Developers! Only two days left until the start of the contest! Get ready to upload your apps!

Hi @Jingyu.Lee! We didn't do this before. I sent you an email.

Hey Devs! ❗️ We are extending the contest period until July 9th. More time for your app! Don't miss your chance to win 😉

Hey Devs! The recording of the "Kick-off webinar Grand Prix 2023" is on InterSystems Developers YouTube! 🔥

Hey Developers! One application has already been added to the contest! 🤩 oex-mapping by @Robert.Cemper1003 Upload your apps and join the contest!

Dear Developers! Please use technology bonuses to collect more votes and get closer to winning the main prize. 🥳 Happy coding! ✌

The links to the InterSystems IRIS Community Editions are updated. The application should work either on IRIS Community Edition or IRIS for Health Community Edition. Both could be downloaded as host (Mac, Windows) versions from the Evaluation site, or can be used in the form of containers pulled from the InterSystems Container Registry or Community Containers: intersystemsdc/iris-community:latest or intersystemsdc/irishealth-community:latest . Thanks for reporting!

Hey, Community! One more application has been added to the contest! appmsw-warm-home by @MikhailenkoSergey Check it out!

Hi Devs! Registration for the 🏆 InterSystems Grand Prix Contest 2023 🏆 ends next week! Hurry up to upload your app! 😎 FYI: you can also edit the application during the voting period.

Hey, Devs! Another application has been added to the contest! RDUH Interface Analyst HL7v2 Browser Extension by Rob Ellis

Tips for those who haven't applied yet 😎 You can upload an old app from previous contests, just update it a bit (add a new feature). Happy coding!

Developers! Four new applications have been added to the contest! 🤩
oex-vscode-snippets-template by @John.Murray
IRIS FHIR Transcribe Summarize Export by Ikram Shah
IntegratedMLandDashboardSample by @珊珊.喻
irisapitester by @Daniel.Aguilar

Only 2 days left until the end of the registration phase! Don't miss your chance to join the competition! 🚀

Devs! Only a few hours left until the end of the registration phase! Hurry up and upload your apps to join the contest!

Hi Irina, I've been trying to submit DevBox into the competition but I'm not getting a submit option for it. Figured it out!