Article
Eduard Lebedyuk · Aug 3, 2020
InterSystems IRIS currently limits classes to 999 properties.
But what to do if you need to store more data per object?
This article answers that question (with an additional cameo of Community Python Gateway, showing how you can transfer wide datasets into Python).
The answer is actually very simple - InterSystems IRIS limits classes to 999 properties, but not to 999 primitives. A property in InterSystems IRIS can itself be an object with 999 properties, and so on - so the limit can easily be worked around.
Approach 1.
Store 100 properties per serial property. First, create a serial class that holds a hundred properties.
Class Test.Serial Extends %SerialObject
{
Property col0;
...
Property col99;
}
And in your main class, add as many properties as you need:
Class Test.Record Extends %Persistent
{
Property col00 As Test.Serial;
Property col01 As Test.Serial;
...
Property col63 As Test.Serial;
}
This immediately raises your limit to 99,900 properties.
This approach offers uniform access to all properties via both the SQL and object layers (we always know a property's reference by its number).
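For example, here is how one could reach overall property #142 (the 43rd property inside the second serial block) on both layers - a minimal sketch, assuming the classes above:

// object access: navigate through the serial property
set obj = ##class(Test.Record).%OpenId(1)
write obj.col01.col42
// SQL access: serial properties project as underscore-delimited columns
// SELECT col01_col42 FROM Test.Record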
Approach 2.
One $lb property.
Class Test.Record Extends %Persistent
{
Property col As %List;
}
This approach is simpler but does not provide explicit column names.
Use SQL $LIST* Functions to access list elements.
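A minimal sketch of both access paths, assuming a Test.Record row with ID 1 ($LIST in SQL plays the role of $LISTGET in ObjectScript):

set obj = ##class(Test.Record).%OpenId(1)
write $listget(obj.col, 5)  // object access: fifth list element
&sql(SELECT $LIST(col, 5) INTO :val FROM Test.Record WHERE ID = 1)
write val  // SQL access via the $LIST* functions (check SQLCODE in real code)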
Approach 3.
Use Collection (List Of/Array Of) property.
Class Test.Record Extends %Persistent
{
Property col As List Of %Integer;
}
This approach also does not provide explicit column names for individual values (but do you really need them?). Use property parameters to project the property as an SQL column/table - see the sketch below.
Docs for collection properties.
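For instance, adding a parameter like this to the property above projects the list both as a column and as a queryable child table (Test.Record_col) - the parameter value is from the collection-property docs; adjust to your needs:

Property col As List Of %Integer(SQLPROJECTION = "table/column");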
Approach 4.
Do not create properties at all and expose them via SQL Stored procedure/%DispatchGetProperty.
Class Test.Record Extends %Persistent
{
Parameter GLVN = {..GLVN("Test.Record")};
/// SELECT Test_Record.col(ID, 123)
/// FROM Test.Record
///
/// w ##class(Test.Record).col(1, 123)
ClassMethod col(id, num) As %Decimal [ SqlProc ]
{
#define GLVN(%class) ##Expression(##class(Test.Record).GLVN(%class))
quit $lg($$$GLVN("Test.Record")(id), num + 1)
}
/// Refer to properties as: obj.col123
Method %DispatchGetProperty(Property As %String) [ CodeMode = expression ]
{
..col(..%Id(), $e(Property, 4, *))
}
/// Get data global
/// w ##class(Test.Record).GLVN("Test.Record")
ClassMethod GLVN(class As %Dictionary.CacheClassname = {$classname()}) As %String
{
return:'$$$comClassDefined(class) ""
set strategy = $$$comClassKeyGet(class, $$$cCLASSstoragestrategy)
return $$$defMemberKeyGet(class, $$$cCLASSstorage, strategy, $$$cSDEFdatalocation)
}
}
The trick here is to store everything in the main $lb and use unallocated schema storage spaces to store your data. Here's an article on global storage.
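A tiny sketch to make the trick concrete, assuming default storage (data nodes in ^Test.RecordD, with list position 1 reserved for %%CLASSNAME):

// data node layout: $lb(%%CLASSNAME, col1, col2, col3, ...)
set ^Test.RecordD(1) = $lb("", "a", "b", "c")
write ##class(Test.Record).col(1, 2)  // returns "b"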
With this approach, you can also easily transfer the data into Python environment with Community Python Gateway via the ExecuteGlobal method.
This is also the fastest way to import CSV files due to the similarity of the structures.
Conclusion
The 999-property limit can easily be worked around in InterSystems IRIS.
Do you know other approaches to storing wide datasets? If so, please share them!
The question is how csvgen could be upgraded to consume CSV files with 1000+ columns.

While I always advertise CSV2CLASS methods for generic solutions, wide datasets often possess an (un)fortunate characteristic of also being long. In that case a custom object-less parser works better. Here's how it can be implemented:
1. Align storage schema with CSV structure
2. Modify this snippet for your class/CSV file:
Parameter GLVN = {..GLVN("Test.Record")};
Parameter SEPARATOR = ";";
ClassMethod Import(file = "source.csv", killExtent As %Boolean = {$$$YES})
{
set stream = ##class(%Stream.FileCharacter).%New()
do stream.LinkToFile(file)
kill:killExtent @..#GLVN
set i=0
set start = $zh
while 'stream.AtEnd {
set i = i + 1
set line = stream.ReadLine($$$MaxStringLength)
set @..#GLVN($i(@..#GLVN)) = ..ProcessLine(line)
write:'(i#100000) "Processed:", i, !
}
set end = $zh
write "Done",!
write "Time: ", end - start, !
}
ClassMethod ProcessLine(line As %String) As %List
{
set list = $lfs(line, ..#SEPARATOR)
set list2 = ""
set ptr=0
// NULLs and numbers handling.
// Add generic handlers here.
// For example translate "N/A" value into $lb() if that's how source data rolls
while $listnext(list, ptr, value) {
set list2 = list2 _ $select($g(value)="":$lb(), $ISVALIDNUM(value):$lb(+value), 1:$lb(value))
}
// Add specific handlers here
// For example convert date into horolog in column4
// Add %%CLASSNAME
set list2 = $lb() _ list2
quit list2
}

Thanks, Ed! Could you make a PR?

I have no concrete ideas on how to automate this. This is more of a case-by-case basis.

After more than 42 years of M programming, and 48 years of programming experience in total, I would say: if you need a class with about 1000 or more properties, then something is wrong with your (database) design. There is nothing more to say. Period.

Wide datasets are fairly typical for:
Industrial data
IoT
Sensor data
Mining and processing data
Spectrometry data
Analytical data
Most datasets after one-hot encoding is applied
NLP datasets
Any dataset where we need to raise dimensionality
Media featuresets
Social Network/modelling schemas
I'm fairly sure there are more areas, but I have not encountered them myself.
Recently I delivered a PoC with classes more than 6,400 columns wide, and that's where I got the inspiration for this article (I chose Approach 4).
@Renato.Banzai also wrote an excellent article on his project with more than 999 properties.
Overall, I'd like to say that a class with more than 999 properties is a correct design in many cases.

You're probably right for the majority of tasks. But how do you manage AI tasks which NEED thousands of features per entity? And features are properties/fields from a data storage perspective. Anyway, I'm really curious how you deal with AI/ML tasks in IRIS or Caché.

The Entity–attribute–value model is usually used for this purpose. I have already written about this at the time: SQL index for array property elements.

That's all well and good for sparse datasets (where, say, you have a record with 10,000 possible attributes but on average only 50 are filled). EAV does not help in dense cases where every record actually has 10,000 attributes.

My EAV implementation is the same as your Approach 3, so it will work fine even with a fully filled 4,000,000 attributes. Since a string has a limit of 3,641,144 characters, the approaches with serial and %List are ruled out. All other things being equal, everything depends on the specific technical task: speed, support for objects/SQL, the ability to name each attribute, the number of attributes, and so on.
Announcement
Anastasia Dyubaylo · Dec 12, 2018
Hi Community!

We are pleased to invite you to the upcoming webinar "Using Blockchain with InterSystems IRIS" on the 20th of December at 10:00 (Moscow time)!

Blockchain is a technology of distributed information storage with mechanisms to ensure its integrity. Blockchain is becoming more common in various areas, such as the financial sector, government agencies, healthcare, and others. InterSystems IRIS makes it easy to integrate with one of the most popular blockchain networks – Ethereum.

At the webinar we will talk about what a blockchain is and how you can start using it in your business. We will also demonstrate the capabilities of the Ethereum adapter for creating applications using the Ethereum blockchain. The following topics are planned:

Introduction to Blockchain
Ethereum
Smart contracts in Ethereum
InterSystems IRIS adapter for Ethereum
Application example using the adapter

Presenter: @Nikolay.Soloviev

Audience: The webinar is designed for developers.

Note: The language of the webinar is Russian.

We are waiting for you at our webinar! Register now!

It is tomorrow! Don't miss it! Register here!

And now this webinar recording is available in a dedicated Webinars in Russian playlist on InterSystems Developers YouTube. Enjoy it!
Question
Nikhil Pawaria · Jan 25, 2019
How can we reduce the size of the CACHE.DAT file? Even after deleting the globals of a particular database from the Management Portal, the size of its CACHE.DAT file is not reduced.

This is the way to do it, but make sure you are on a version where this won't cause problems. See:
https://www.intersystems.com/support-learning/support/product-news-alerts/support-alert/alert-database-compaction/
https://www.intersystems.com/support-learning/support/product-news-alerts/support-alerts-2015/
https://www.intersystems.com/support-learning/support/product-news-alerts/support-alerts-2012/

You need to do these three steps in order:
Compact Globals in a Database (optional)
Compact a Database
Truncate a Database

It can be done via the ^DATABASE utility or in the Management Portal, as sketched below.

CACHE.DAT (or IRIS.DAT) can only grow during normal work, but you can shrink it manually. It is not as easy as it may sound, and it depends on the version you use; only the past few versions include the compact tool. On very old versions you have to copy the data from the old database to a new one. You can read my articles about the internal structure of CACHE.DAT, just to know what is inside, and about the database with visualization, where you can see how to compact a database and how it actually works.
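For reference, a minimal sketch of running the utility (it is interactive, and the menu options vary by version):

zn "%SYS"
do ^DATABASE  // then pick, in order: Compact Globals in a Database (optional), Compact a Database, Truncate a Database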
Question
Stephan Gertsobbe · Jul 13, 2019
Hi all, we are wondering if anybody has a reporting tool that is capable of using IRIS objects. I know there are things like Crystal Reports and others out there that can read the SQL data through ODBC, but we need the capability of using object methods while running the report. Until now we were using a Java-based report generator (ReportWeaver), but since the object binding for Java doesn't exist anymore in the IRIS data platform, do any of you have an alternative report generator? Looking forward to any answers. Cheers, Stephan

No, that's not really what I meant. My question was much more generic, about data projection of object data like listOfObjects. When you look at the projected data of those "object" collections, they are projected as $LISTBUILD lists in SQL. So the question was: is there a reporting tool out there in use that can handle the object data from IRIS? For IRIS there is no object binding anymore like there was for Caché. For Java there is the cachedb.jar, and that binding doesn't exist for IRIS.

"Using object methods while running the report" - this is a rather generic statement. If you are using CLASS METHODS (as I'd assume), you can project each class method as a stored SQL procedure too. By this, you can make them available to be used over JDBC. This could be an eventual workaround - see the sketch below.
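To illustrate that workaround, here is a hedged sketch (the class, property, and method names are hypothetical): a class method that uses object access internally and is projected as a stored procedure callable over JDBC:

Class Demo.Report Extends %RegisteredObject
{

/// Projected to SQL; callable over JDBC, e.g. SELECT Demo.Report_ItemCount(1)
ClassMethod ItemCount(id As %String) As %Integer [ SqlProc ]
{
    // object access (a collection method) that plain SQL cannot express
    set obj = ##class(Demo.Data).%OpenId(id)
    quit $select($isobject(obj):obj.Items.Count(), 1:0)
}

}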
Question
Evgeny Shvarov · Feb 12, 2019
Hi Community!

What's the limit for namespaces and databases for one InterSystems IRIS installation? Yes, I checked the documentation but cannot find it at once.

To my understanding, there is no technical limit, though I believe to remember that it used to be ~16,000 some time in the past. Class SYS.Database maps to ^SYS("CONFIG","IRIS","Databases",<DBNAME>) and has NO limit there. Similarly, namespaces are stored in ^SYS("CONFIG","IRIS","Namespaces",<NSPCE>) and are covered by %SYS.Namespace. If there is any limit, it must be related to internal memory structures (gmheap?).
Announcement
Anastasia Dyubaylo · Dec 6, 2018
Hey Developers!

Good news! Just in time for the holidays, Gartner Peer Insights is offering customers a $25 digital Visa gift card for an approved review of InterSystems IRIS or Caché this month! We decided to support and double the stakes. So! In December '18 you can get a second $25 digital Visa gift card for a Gartner review of Caché or InterSystems IRIS on the InterSystems Global Masters Advocacy Hub!

See the rules below.

Step #1: To get the $25 Visa card from Gartner Peer Insights, follow this unique link and submit a review. Make a screenshot for Step #2 so that we can see that you reviewed InterSystems IRIS or Caché.

Note: The survey takes about 10-15 minutes. Gartner will authenticate the identity of the reviewer, but the published reviews are anonymous. You can check the status of your review and gift card in your Gartner Peer Insights reviewer profile at any time.

Step #2: To get the $25 Visa card from InterSystems, complete a dedicated challenge on the InterSystems Global Masters Advocacy Hub — upload a screenshot from Step #1.

Don't forget:
• This promotion is only for reviews entered in the month of December 2018.
• InterSystems IRIS and Caché reviews only.
• Use the unique link mentioned above in order to qualify for the gift cards.

Done? Awesome! Your card is on its way! To join Global Masters, leave a comment on the post and we'll send the invite!

Hurry up to get your $100 from the December Caché and IRIS campaign from Gartner and InterSystems! ;) Only 12 days left! The recipe is the following:
1. You are our current customer of Caché and/or InterSystems IRIS.
2. Make the review using this link.
3. Get your $25 for a Caché or InterSystems IRIS review ($50 for both).
4. Save the screenshots of the reviews and submit them on Global Masters - get another $25 for every Caché and InterSystems IRIS review from Global Masters.
5. Merry Christmas and have a great new year 2019!

This is a good idea, hopefully everyone will do this, but I did have a problem. Perhaps I have done this incorrectly, but I could not see a way to submit screenshots in the challenge, and when you click the "let's review" button, or whatever the actual text was, it closes it as completed and there seems no way to submit a screenshot. Also, the link to the challenge is for the same challenge number as it appears in, and it takes you to the Global Masters front page. Also, you don't seem able to review both as suggested; if you use the link again or search for the platform you haven't reviewed yet, it will simply state you have already submitted a review. I suspect this is because using the link you have to choose between IRIS or Caché, and so the offer is for one or the other but not both.

Hi David! Thanks for reporting this. Our support team will contact you via GM direct messaging.

Dear Community Members! Thank you so much for making reviews! You made InterSystems Data Platforms Caché and InterSystems IRIS a Gartner Customers' Choice 2019 in Operational Database Management Systems!
Announcement
Anastasia Dyubaylo · Apr 12, 2019
Hi Community!
We're pleased to invite you to the DockerCon 2019 – the #1 container industry conference for all things Kubernetes, microservices, and DevOps. The event will be held at the Moscone Center in San Francisco from April 29 to May 2.
In addition, there will be a special session "Containerized Databases for Enterprise Applications" presented by @Thomas.Carroll, Product Specialist at InterSystems.
See the details below.
Containerized Databases for Enterprise Applications | Wednesday, May 1, 12:00 PM - 12:40 PM – Room 2001
Session Track: Using Docker for IT Infra & Ops
Containers are now being used in organizations of all sizes. From small startups to established enterprises, data persistence is necessary in many mission critical applications. “Containers are not for database applications” is a misconception and nothing could be further from the truth. This session aims to help practitioners navigate the minefield of database containerization and avoid some of the major pitfalls that can occur. Discussion includes traditional enterprise database concerns surrounding data resilience, high availability, and storage and how they mesh with a containerized deployment.
Speaker Bio: Joe is a Product Specialist at InterSystems, a passionate problem solver, and a container evangelist. He started his career as a solution architect for enterprise database applications before transitioning to product management. Joe is in the trenches of InterSystems' transformation to a container-first, cloud-first product strategy. When he isn't at a Linux shell he enjoys long walks on the beach, piña coladas (hold the rain), woodworking, and BBQ.
Be the first to register now!

It's really great news. And so cool that InterSystems has started to participate more in developer conferences. I wish to participate in all of them :)
Announcement
Anastasia Dyubaylo · Oct 26, 2020
Hey Community,
We're pleased to invite you all to the Virtual Summit 2020 session dedicated to InterSystems online programming contests, best winning projects, and their developers! Please join:
⚡️ "Best applications of InterSystems programming contest series: Best IntegratedML, FHIR, REST API, Native API, ObjectScript solutions" session ⚡️
Please check the details below.
We will talk about the series of online contests for InterSystems developers. This session will focus on the contest winners and the top applications. Our developers will share their experience of participating in the exciting InterSystems coding marathon and will show demos of their winning projects.
Speakers:
🗣 @Anastasia.Dyubaylo, Community Manager, InterSystems
🗣 @Henrique.GonçalvesDias, System Management Specialist / Database Administrator, Sao Paulo Federal Court
🗣 @José.Pereira, Business Intelligence Developer, Shift Consultoria e Sistemas Ltda
🗣 @Henry.HamonPereira, System Analyst, BPlus Tecnologia
🗣 @Dmitry.Maslennikov, Co-founder, CTO and Developer Advocate, CaretDev Corp
🗣 @Renato.Banzai, Machine Learning Engineer Coordinator, Itaú Unibanco
Date & Time:
➡️ Day 1: Tuesday, October 27 (Boston starts Monday, October 26)
| Region | Session | UTC Time | Boston Time |
|---|---|---|---|
| APAC | Best Applications of InterSystems Programming Contest Series | 2:50 AM | 10:50 PM |
| NA/LATAM/EMEA | Best Applications of InterSystems Programming Contest Series | 3:50 PM | 11:50 PM |
So!
We will be happy to answer your questions in a virtual chat on the conference platform – please join! We'll start in 15 minutes!

📍 https://intersystems.6connex.com/event/virtual-summit/en-us/contents/433176/share?rid=FocusSessions&nid=804450

💥 Join us NOW here: https://intersystems.6connex.com/event/virtual-summit/en-us/contents/433253/share?rid=FocusSessions&nid=804450
Article
John Murray · Oct 27, 2020
Now that 1.0 has shipped and is featured in various sessions at Virtual Summit 2020, it seems like a good time to offer some guidance on how to report problems.
InterSystems ObjectScript for VS Code consists of three collaborating VS Code extensions. For ease of installation and management there's a fourth entity, the InterSystems ObjectScript Extension Pack. It's a great way to get started with minimum clicks, and handy to have even if you have already installed the other extensions.
This modular architecture also means there are three different GitHub repositories where issues can be created. Fortunately VS Code itself helps with the task. Here's how to use it:
1. From the Help menu in VS Code choose Report Issue. Alternatively, open the Command Palette (I typically do this by pressing the F1 key) and run Help: Report Issue... (Pro Palette Tip: try typing just hri and see how fast it gets you to the right command)
2. A dialog like this appears:
3. Use the first field to classify your issue:
Bug Report
Feature Request
Performance Issue
4. In the second field pick "An extension".
5. The third dropdown lets you pick one of your installed extensions. You can also type a few characters to find the right entry. For example, isls quickly selects "InterSystems Language Server" for me.
Which one to choose? Here's a rough guide:
InterSystems Language Server
code colo(u)ring
Intellisense
InterSystems ObjectScript
export, import and compile
ObjectScript Explorer (browsing namespace contents)
direct server-side editing using isfs:// folders in a workspace
integration with server-side source control etc
InterSystems Server Manager
password management in local keychain
definition and selection of entries in `intersystems.servers`
If you can't decide, pick InterSystems ObjectScript.
6. Type a descriptive one-line summary of your issue. The dialog may offer a list of existing issues which could be duplicates. If you don't find one that covers yours, proceed.
7. Begin to enter details. At this stage I usually type just one character, then click "Preview on GitHub" to launch a browser page where I can use the familiar GH issue UI to complete my report. Tips for use there:
Paste images from your clipboard directly into the report field on GH. For hard-to-describe issues an animated GIF gets bonus points.
Link to other issues by prefixing the target number with #
Remember that whatever you post here is visible to anyone on the Internet. Mask/remove confidential information. Be polite.
8. When you are happy with what you have written (tip - use the Preview tab) click "Submit new issue".
Using Help: Report Issue... means your version numbers are automatically added.
Announcement
Anastasia Dyubaylo · Oct 27, 2020
Hey Developers,
We remind you about a great opportunity to make a live conversation with InterSystems Product Managers Team on Live Q&A Sessions at Virtual Summit 2020!
🗓 TODAY at 12:40 PM EDT at https://intersystems.6connex.com/event/virtual-summit/en-us/contents/434370/share?rid=FocusSessions&nid=804450
And now we've added more options to make it even easier for you to ask questions upfront:
✅ Submit your questions in the comments to this post
✅ Submit your question to our Discord Channel: discord.gg/WqVjtD
✅ Submit your questions to VS2020questions@InterSystems.com
✅ Send your question personally to @Anastasia.Dyubaylo or @Evgeny.Shvarov in Direct Messages on the community
✅ Submit your question to Q&A Chat on the conference platform during the session
Note: We will pass all your questions to the PM team, and you'll receive answers during the Live Q&A Sessions.
And let me introduce the whole InterSystems Product Managers Team:
@Jeffrey.Fried, Director of Product Managers
@Andreas.Dieckow, Principal Product Manager
@Robert.Kuszewski, Product Manager - Developer Experience
@Raj.Singh5479, Product Manager - Developer Experience
@Carmen.Logue, Product Manager - AI and Analytics
@Thomas.Dyar, Product Specialist - Machine Learning
@Steven.LeBlanc, Product Specialist - Cloud Operations
@Patrick.Jamieson3621, Product Manager - Health Informatics Platform
@Benjamin.DeBoe, Product Manager
@Stefan.Wittmann, Product Manager
@Luca.Ravazzolo, Product Manager
@Craig.Lee, Product Specialist
So!
Please don't hesitate to ask your questions! Our PM team will be happy to answer you!
➡️ Our Live Q&A Sessions last from November 27 to 29! Schedule in this post. Please join us now!
📍 https://intersystems.6connex.com/event/virtual-summit/en-us/contents/433280/share?rid=FocusSessions&nid=804450

🗓 TODAY at 12:40 PM EDT at https://intersystems.6connex.com/event/virtual-summit/en-us/contents/434195/share?rid=FocusSessions&nid=804450
Please feel free to submit your questions to our PMs team! Don't miss today's Live Q&A Session:
🗓 TODAY at 12:40 PM EDT at https://intersystems.6connex.com/event/virtual-summit/en-us/contents/434370/share?rid=FocusSessions&nid=804450
Don't hesitate to ask your questions!
Article
Yuri Marx · Dec 21, 2020
Today it is important to analyze the content of portals and websites to stay informed, analyze competitors, follow trends, and assess the richness and scope of website content. To do this, you can allocate people to read thousands of pages and spend a lot of money, or use a crawler to extract website content and run NLP on it. You will get all the insights needed to analyze and make precise decisions in a few minutes.
Gartner defines web crawler as: "A piece of software (also called a spider) designed to follow hyperlinks to their completion and to return to previously visited Internet addresses".
There are many web crawlers that can extract all relevant website content. In this article I present to you Crawler4J. It is one of the most widely used tools to extract website content and has an MIT license. Crawler4J needs only the root URL, the depth (how many child pages will be visited), and the total pages (if you want to limit the pages extracted). By default only textual content is extracted, but you can configure the engine to extract all website files!
I created a PEX Java service that allows you to use an IRIS production to extract the textual content of any website. The content is stored in a local folder, and IRIS NLP reads these files and shows you all the text analytics insights!
To see it in action follow these procedures:
1 - Go to https://openexchange.intersystems.com/package/website-analyzer and click the Download button to go to the app's GitHub repository.
2 - Create a local folder on your machine and execute: git clone https://github.com/yurimarx/website-analyzer.git.
3 - Go to the project directory: cd website-analyzer.
4 - Execute: docker-compose build (wait a few minutes)
5 - Execute: docker-compose up -d
6 - Open your local InterSystems IRIS: http://localhost:52773/csp/sys/UtilHome.csp (user _SYSTEM and password SYS)
7 - Open the production and start it: http://localhost:52773/csp/irisapp/EnsPortal.ProductionConfig.zen?PRODUCTION=dc.WebsiteAnalyzer.WebsiteAnalyzerProduction
8 - Now, go to your browser to initiate a crawler: http://localhost:9980?Website=https://www.intersystems.com/ (to analyze intersystems site, any URL can be used)
9 - Wait between 40 and 60 seconds. A message will be returned ("extracted with success"). See the sample above.
10 - Now go to Text Analytics to analyze the content extracted: http://localhost:52773/csp/IRISAPP/_iKnow.UI.KnowledgePortal.zen?$NAMESPACE=IRISAPP&domain=1
11 - Return to the production and see the Depth and TotalPages parameters; increase the values if you want to extract more content. Change Depth to analyze sub-links and TotalPages to analyze more pages.
12 - Enjoy! And if you liked it, vote for my app website-analyzer in the current contest (https://openexchange.intersystems.com/contest/current).
I will write a part 2 with implementation details, but all the source code is available on GitHub.

Hi Yuri! Very interesting app! But as I am not a developer, could you please tell us more about the results the analyzer will give to a marketer or a website owner? Which insights could be extracted from the analysis?

Hi @Elena.E
I published a new article about marketing and this app: https://community.intersystems.com/post/marketing-analysis-intersystems-website-using-website-analyzer
The possible results allow you to:

1. Get the most popular words, terms, and sentences written on the website, so you can discover the business focus, editorial line, and marketing topics.
2. Run sentiment analysis on the sentences: does the content have a positive or negative focus?
3. Get rich word clouds for the whole website - rich because it is a semantic analysis, with links between words and sentences.
4. Analyze dominance and frequency, to spot trends.
5. Follow connection paths between sentences, to analyze the depth and coverage of editorial topics.
6. Search the topics covered: does the website discuss a topic? How many times?
7. Product analysis: the app segments product names and links them to all the other analyses, so you can know whether the website talks about your products and services, and how frequently.

Hi Yuri!
This is a fantastic app!
And works!
But the way to set up the crawler is not that convenient and not very user-friendly.
You never know if the crawler works and if you placed the URL right.
Is it possible to add a page which will let you place the URL, start/stop crawler and display some progress if any?
Maybe I ask a lot :)
Anyway, this is a really great tool to perform IRIS NLP vs ANY site:
Article
Mihoko Iijima · Mar 5, 2021
**This article is a continuation of this post.**
The purpose of this article is to explain how the Interoperability menu works for system integration.

The left side of the figure is the window for accepting information sent from external systems.
There are various ways to receive information, such as monitoring a specified directory at regular intervals to read files, periodically querying a database, waiting for input, or being called directly by applications in other systems that pass the data in.
In the system integration mechanism created in the IRIS Interoperability menu, the received information is stored in an object called a **message**. The **message** is sent to the component responsible for the subsequent processing.
A **message** can be created using all the received information or only a part of it.
Suppose you want to send the information contained in the **message** to an external system. In that case, you need to send the message to the component responsible for requesting the external network to process it (the right side of the figure). The component that receives the **message** will request the external system to process it.
Besides, suppose a **message** requires human review, data conversion, or data appending. In that case, the **message** is sent to the component in the middle of the diagram (BPM), which is responsible for coordinating the processing flow.
**Messages** are used to send and receive data between each component. When a **message** is sent or received, the message is automatically stored in the database.
Since **messages** are stored in the database, it is possible to check the difference before and after a data conversion, examine the **message** that was the source of a problem during operation, or start over (resend) from the middle of the process, and verify the status using **messages** at each stage of development, testing, and operation.
A simple picture of system integration would be divided into three components (business services, business processes, and business operations), as shown in the figure below.
There is also a definition called "**production**" that stores information about the components to be used (e.g., connection information).

The role of each component is as follows:
**Business Services**
Responsible for receiving information from external sources, creating **messages**, and sending **messages** to other components.
**Business Processes**
This role is activated when a **message** is received and is responsible for coordinating the process (calling components in the defined order, waiting for responses, waiting for human review results, etc.).
**Business Operations**
This function is activated when a **message** is received and has a role in requesting the external system to process the message.
**Messages** are used to send and receive data between components.
Components other than business services initiate processing when they receive a **message**.
The question is, what is the purpose of creating and using this **message**?
**Messages** are created by extracting the information you want to relay to the external system from the data entered into the business service.
Since not all external systems connected to IRIS use the same type of data format for transmission, and the content to be relayed varies, the production can freely define message classes according to the information.
There are two types of **messages**: request (= request message) and response (= response message). The **message** that triggers the component's activation is called request (= request message), and the **message** that the component responds to after processing is called response (= response message).
These **messages** will be designed while considering the process of relaying them.
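As a concrete illustration, a minimal request/response pair could look like the following sketch (the class and property names here are illustrative, not from the sample):

```objectscript
/// Request: carries the data a component needs to start working
Class Demo.WeatherRequest Extends Ens.Request
{

/// City name to look up
Property Area As %String;

}
```

```objectscript
/// Response: carries the result back to the caller
Class Demo.WeatherResponse Extends Ens.Response
{

Property MaxTemperature As %Numeric;

Property MinTemperature As %Numeric;

}
```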
In the following articles, we will use a study case to outline the creation of **productions**, **messages**, and components.
Nice article. To be perfect, IRIS interoperability could implement BPMN for BPM tasks, but it is done using BPEL.
Article
Mihoko Iijima · Mar 5, 2021
**This article is a continuation of this post.**
In the previous article, we discussed the creation of business operations, one of the components required for system integration.
In this article, you will learn how to create a business process that calls the two business operations you have defined, in sequence.
* Production
* Message
* **Components**
* Business Services
* **Business Processes**
* Business Operations (previous post)
The business process acts as the coordinator (command center) of the process.
The processing adjustments you may want to implement in the sample include the following:
Step 1: Provide the city name to an external Web API and request weather information.
Step 2: Register the result of the query (weather information) from Step 1, together with the name of the purchased product received at the start of the production.
In the sample business process, we wait for the answer to Step 1 and then arrange for Step 2 to run.
In the process of waiting for a response (i.e., synchronization), for instance, what happens if step 1) doesn't respond for a few days?
If new messages are delivered to the business process while waiting for a response for a few days, the messages will not be dismissed since they are stored in a queue. However, the business process will not process new messages, and there will be a delay in the operation.
Note: Business processes and business operations have queues.
Therefore, in production, when there is a synchronous call, there are two ways for the business process to move: **A) to synchronize perfectly**, and B) to save the state of the business process itself in the database and hand over the execution environment so that other processes can run while waiting for a response.
**A) How to synchronize perfectly:**
While a synchronous call is being made, the business process keeps running and does not process the next message until all processing is completed. ➡ This method is used when the order of processing needs to be guaranteed, first-in, first-out.
B) The method of saving the state of the business process itself in the database and handing over the execution environment so that other processes can run while waiting for a response:
When a synchronous call is made, the process saves its state in the database. When a response message is received, and it is time to process it, the process is reopened from the database and executes the next step (IRIS manages the storage and reopening of business processes in the database). ➡ Used when it is acceptable to switch the processing order of messages (i.e., when it is allowed to process other messages received while waiting for a response).
In the sample, **B)** is used.
There are two types of editors for creating business processes: a Business Process Editor that allows you to place processing boxes (activities) and implement them while defining their execution, and a method for creating them using ObjectScript in Studio or VSCode.
If you use the Business Process Editor, you will use the call activity to invoke the component, but this activity is **implemented** in the **B)** way. **Of course, you can also implement the** **A)** method in the Business Process Editor, except that you will not use the call activity in that case (you will use the code activity).
In this section, I will explain how to create it.
If you use the Business Process Editor, you write them in the Management Portal.
You can also open the business process from the production configuration page. The figure below shows the procedure.

The icons in this editor are called activities, and some of them can invoke other components.
A marker on an activity indicates that a response message will be returned (i.e., a synchronous call will be made). The activity defaults to the asynchronous call setting, which can be changed as needed.
Now let's look at business processes, which are components that are invoked upon receiving a request message, as well as business operations.
In the sample, the business process is set to start when it receives a Start.Request request message and does not return a response message.

In the business process, messages appear in various situations.
Request messages that are sent to business processes.
Request message (+ response message) to be sent when calling another component using the activity.
In the Business Process Editor, the names of the objects that store messages are clearly separated to be able to see which message was sent from which destination.

* request (basic request)
The message that triggered the start of the business process; in our example, Start.Request (the message specified in the Request setting on the Context tab of the Business Process Editor)
* response (basic response)
The response message to return to the caller of the business process; not used in the sample (the message specified in the Response setting on the Context tab of the Business Process Editor)
* callrequest (request message)
The request message sent when calling the component specified by the activity.
* callresponse (response message)
The response message returned from the component specified by the activity.
**callrequest and callresponse are objects that will be deleted when the call processing of the activity is completed.**
All other objects will not disappear until the business process is finished.
Now comes the problem when callresponse disappears.
That's because, as you can see in this sample,
**When calling a component, if you want to use the response result of a previously called component, the response message will be lost, and the information that was to be used in the next component will be erased.**
It is a problem 😓
What should we do?...
In such a case, you can use the context object.
The context object, like request/response, is an object that survives until the end of the business process.
Moreover, since context is a generic object, it can be defined in the process editor.
In addition to the context object, the response object can also be used if it has properties that match the data you want to carry over.
Now, let's go over the steps again.

Response message in the light blue balloon: Start.Response is an object that will be deleted when the process is finished.
Since we want to use the response message (Start.Response) that contains the weather information as the message to be sent to the next [Business Operation for DB Update], we have to implement the context object in a way that all the property values of the response message (Start.Response) can be assigned to it.
Then what is the setting for the context property?
The properties are defined in "Context Properties" in the Context tab of the Business Process Editor.
In this case, we would like to save all the properties of the response message (Start.Response) to the context object. Therefore, the property type specification is set to Start.Response.

Following that, check the settings in the activity.

The request and response messages have a button called ○○ Builder.
Clicking on this button will launch a line-drawing editor that allows you to specify what you want to register in the properties of each message.

After this, the business operation for requesting a database update (Start.SQLInsertOperation or Start.InsertOperation) is called in the same way with the activity, and you are all set.
(For more information, see Configuring Business Processes.)
Once you have completed the verification, you can test it. The testing method is the same as the one used for testing business operations (see this article).
The trace after the test is as follows:

Since the business process is the coordinator, we could see that it invoked the defined components sequentially, keeping the synchronous execution.
Note 1: The sample only deals with the call activity, but various other activities, such as data transformation, are available.
Note 2: Business processes created by ObjectScript alone, other than the Business Process Editor, inherit from the Ens.BusinessProcess class. If it is created in the Business Process Editor, it inherits from the Ens.BusinessProcessBPL class.
The business process is the coordinator of the system integration process.
The Business Process Editor provides several types of variables for messages (request/response/callrequest/callresponse/context).
A business process created with the Business Process Editor can work in a way that does not delay other messages, even if there is a synchronous call to a component.
In the next section, we will finally show you how to develop the last component: business services.
Article
Mihoko Iijima · Mar 5, 2021
**This article is a continuation of this post.**
In the previous article, we reviewed how to create and define messages used to send and receive data between components.
In this article, I will explain how to create a business operation from the component creation methods.
* Production
* Message(previous article)
* **Components**
* Business Services
* Business Processes
* **Business Operations**
We will quickly check the code by referring to the sample.

| Component Name | Role |
|---|---|
| Start.FileBS | A business service that uses a file inbound adapter to read files placed in a specified directory at regular intervals. |
| Start.NonAdapterBS | A business service that allows applications and users to input information directly, without using an adapter. |
| Start.WS.WebServiceBS | A business service that allows people to enter information using web services. |
| Start.WeatherCheckProcess | A business process that controls the procedure of acquiring weather information and then registering it in a database. |
| Start.GetKionOperation | A business operation that passes the city name to the web service providing weather information and returns the result. |
| Start.SQLInsertOperation | A business operation using the SQL outbound adapter to request registration of weather and purchase information into the database. |
| Start.InsertOperation | A business operation that updates tables in InterSystems IRIS without using an adapter. |
Note: BS stands for Business Services, BP for Business Processes, and BO for Business Operations.
Business services and business operations are written in ObjectScript and can be created in VS Code or Studio. Business processes can also be created in the Management Portal (see this article for more information on using VS Code).
There is no particular order of creation, but in this sample the external site to be connected to is public and can be used immediately, so it is convenient to start with the business operation to make testing easier.
After creating the components, there are test pages in the production for business processes and business operations.
However, testing is disabled by default in the production definition to avoid random testing in the production environment.
For details on how to enable testing in a production, use the following settings (the sample production has been set to "Testing Enabled" in advance):

### 1) Business Operations
In the sample, two types of business operations are provided.
One operation is to pass the city’s name to an external Web API via REST and request the acquisition of weather information. The other operation is to give the weather information and the name of the purchased product to the InterSystems IRIS database and ask for the update process.
#### 1)-1 REST Business Operations
Let’s start by creating an operation that calls an external Web API via REST.
This operation starts the GetKion() method when a Start.Request message is entered, queries an external site, and returns the weather information in a Start.Response message.
See here for code details.
To create a business operation for REST, inherit from **EnsLib.REST.Operation** .
```objectscript
Class Start.GetKionOperation Extends EnsLib.REST.Operation
```
Inheritance of this class provides the following methods in IRIS that match the HTTP methods. Please refer to the documentation for details.
GetURL() — used for HTTP GET operations.
PostURL() — used for HTTP POST operations.
PutURL() — used for HTTP PUT operations.
DeleteURL() — used for HTTP DELETE operations.
For REST, use the adapter **EnsLib.HTTP.OutboundAdapter**. Set the adapter name to the **ADAPTER** parameter and the Adapter property, as shown in the example.
The INVOCATION parameter configures the **Queue**.
```objectscript
Parameter ADAPTER = "EnsLib.HTTP.OutboundAdapter";
Property Adapter As EnsLib.HTTP.OutboundAdapter;
Parameter INVOCATION = "Queue";
```
The OpenWeather API key needs to be specified at runtime. Settings that vary by environment can be exposed on the production settings page.
The procedure is as follows:
1. Define the properties
2. Specify the name of the property you created in the SETTINGS parameter (if there are multiple properties, separate them with commas). Optionally, you can also specify a category (use “property name: category name”).
An example code is shown below.
```objectscript
/// Specify the API key
Property appid As %String;
/// specify lang option for OpenWeather API (default = ja = japanese)
Property lang As %String [ InitialExpression = "ja" ];
Parameter SETTINGS = "lang:OpenWeatherMap,appid:OpenWeatherMap";
```
The Production Settings page displays the following. The description in the line immediately before the property definition is also displayed in the production settings page, as shown in the figure.

Then, we will review the message map, which is an essential setting for business operations.
.png)
The above definition specifies that the GetKion() method runs when a **Start.Request** message is received.
In the GetKion() method, the city name can be obtained from the request message’s Area property passed as input information.
By setting the city name as a parameter of the URL published by the external Web API and calling it, you can obtain weather information.
The HTTP server and URL settings are configured in the Production page of the Management Portal. To obtain the settings, use the **Adapter** property provided by the HTTP outbound adapter.
Example) to specify a URL, use ..Adapter.URL
Use the GetURL() method provided by the REST business operation to call the external site. The first parameter is the URL to be executed (i.e., the URL with the required parameters, such as the city name, filled in). The second parameter is the HTTP response, passed by reference.
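For instance, the call might look roughly like this (a sketch; the query-parameter names follow the OpenWeather API, and the properties are the ones defined earlier):

```objectscript
set url = ..Adapter.URL_"?q="_pRequest.Area_"&appid="_..appid_"&lang="_..lang
// first argument: the URL to call; second: the HTTP response, passed by reference
set tSC = ..GetURL(url, .tHttpResponse)
quit:$$$ISERR(tSC) tSC
```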
Since the weather information is stored in JSON format in the HTTP response, the operation is complete when the data is registered in the response message (=pResponse).

The response message class is specified as the second parameter of the method, passed by reference.
```objectscript
Method GetKion(pRequest As Start.Request, Output pResponse As Start.Response) As %Status
```
To return a response message to the caller, create an instance of the response message, store it in the second parameter variable (_**pResponse**_), and set the necessary information in the properties.
```objectscript
set pResponse.AreaDescription=weatherinfo.weather.%Get(0).description
set pResponse.KionMax=weatherinfo.main."temp_max"
set pResponse.KionMin=weatherinfo.main."temp_min"
set pResponse.Area=weatherinfo.name
// adjust to Japan time (UTC+9 = 32400 seconds), because weatherinfo.dt is UTC
set unixEpochFormat=weatherinfo.dt+32400
set dt=$system.SQL.Functions.DATEADD("s",unixEpochFormat,"1970-01-01 00:00:00")
set pResponse.AreaPublicTime=dt
```
Since HTTP responses from external sites are returned in JSON format, the stream that could be obtained from the HTTP response is used to convert it into a dynamic object that is convenient for JSON operations within IRIS.
```objectscript
set weatherinfo={}.%FromJSON(tHttpResponse.Data)
```
An example of a returned JSON string is shown below:
```json
{
"coord": {
"lon": 135.5022,
"lat": 34.6937
},
"weather": [
{
"id": 803,
"main": "Clouds",
"description": "broken clouds",
"icon": "04d"
}
],
"base": "stations",
"main": {
"temp": 11.38,
"feels_like": 8.33,
"temp_min": 11,
"temp_max": 12.22,
"pressure": 1007,
"humidity": 62
},
"visibility": 10000,
"wind": {
"speed": 2.57,
"deg": 220
},
"clouds": {
"all": 75
},
"dt": 1611820991,
"sys": {
"type": 1,
"id": 8032,
"country": "JP",
"sunrise": 1611784750,
"sunset": 1611822143
},
"timezone": 32400,
"id": 1853909,
"name": "Osaka",
"cod": 200
}
```
The maximum temperature, minimum temperature, and weather can be obtained as follows:
```objectscript
set pResponse.AreaDescription=weatherinfo.weather.%Get(0).description
set pResponse.KionMax=weatherinfo.main."temp_max"
set pResponse.KionMin=weatherinfo.main."temp_min"
```
If you would like to learn more about JSON manipulation in IRIS, please refer to this article and documentation.
Now, let’s use the production testing tool to see if we can get the weather information properly.
Open the Production page (**Management Portal> Interoperability> Configuration> Production**), click Start.GetKionOperation, and then click the **"Test" button** on the "**Action**" tab.
Specify a city name (Naha, Sapporo, Nagano, Shinjuku, etc.) for **Area**, and click the “**Run Test Service**” button.
You can see the test results below, with the maximum and minimum temperatures and the weather listed.

Continue to learn how to use the Trace page.

Selecting a horizontal rectangle in the left screen causes the information in the right screen to change.
Messages sent and received during the system integration process are automatically saved in the database. Using the message Visual Trace page, you can see in detail what messages were passed to which components in chronological order and whether there was a response or not.
Besides, if an error occurs, a message such as "An error occurred while sending/receiving message □ from component ○ to component △" is shown, and a red mark appears where the error occurred so that you can spot it. Of course, in addition to tracing, there is also an event log page.
**(Management Portal > [Interoperability] > [View] > [Event Log])**
Moving on, let’s check out the operation to request an update to the database.
#### 1)-2 Business operations that request updates to the database
The sample provides two types of operations: Start.SQLInsertOperation and Start.InsertOperation.
Each of them is an operation to request a database update, but Start.SQLInsertOperation uses the SQL outbound adapter, while Start.InsertOperation has no adapter.
The difference between the two: an operation using the SQL outbound adapter is assumed to access the database via ODBC/JDBC connections, so the database connection destination can be switched in the production settings. An operation that does not use an adapter assumes that the database to update is directly visible from the production configuration and that no switching of the connection destination occurs.
The IRIS database can be used to store arbitrary data during system integration. However, suppose the system configuration changes for some reason a few years later, and the need to connect to a database on a different server arises. In that case, the operation without the adapter cannot be continued.
On the other hand, an operation using the SQL outbound adapter keeps working as long as the processing content of the destination does not change (if the SQL statement to be executed still applies, it can even connect to databases from different products).
During system integration, there may be cases where connection information changes due to external system reasons. Therefore it is vital to have a design that can flexibly respond to changes. For this reason, it is recommended to create components that support external connections in a loosely coupled manner.
However, suppose there is no change in the configuration in the future. In that case, you can access the database in IRIS without using the ODBC/JDBC connection, so you can choose to use the adapter or not, depending on your usage.
Let’s take a look at the Start.SQLInsertOperation code that uses the adapter.
The adapter used in the sample is an SQL outbound adapter, which allows you to request the database to execute SQL statements.
Different adapters provide different methods. Please refer to the documentation for details on the methods provided by the adapters.
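Since the sample code itself appears only as an image, here is a rough sketch of what such an operation can look like (the table, message types, and column names are assumptions, not the sample's). EnsLib.SQL.OutboundAdapter provides, among others, an ExecuteUpdate() method:

```objectscript
Method Insert(pRequest As Start.InsertRequest, Output pResponse As Ens.Response) As %Status
{
    set pResponse = ##class(Ens.Response).%New()
    // the adapter runs the statement over the ODBC/JDBC connection configured in the production
    set sql = "INSERT INTO SQLUser.Weather (Area, KionMax, KionMin) VALUES (?, ?, ?)"
    quit ..Adapter.ExecuteUpdate(.nrows, sql, pRequest.Area, pRequest.KionMax, pRequest.KionMin)
}
```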

Then review the code for Start.InsertOperation, without using the adapter.
Whether you use an adapter or not, the message map and method definitions for the operation are required. If you do not use an adapter, you do not need to define the adapter's “Parameter” and “Property”.
.png)
In the adapterless business operation Start.InsertOperation, SQL is executed using ObjectScript (the commented-out statement shows the same update done via object access).
This implementation is fine as long as the database to be updated is not separated from IRIS - see the sketch below.
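A sketch of the adapterless flavor, using plain embedded SQL against the local database (again, the table, message types, and column names are illustrative):

```objectscript
Method Insert(pRequest As Start.InsertRequest, Output pResponse As Ens.Response) As %Status
{
    set pResponse = ##class(Ens.Response).%New()
    // plain embedded SQL: works only while the target table lives in this IRIS instance
    &sql(INSERT INTO SQLUser.Weather (Area, KionMax, KionMin)
         VALUES (:pRequest.Area, :pRequest.KionMax, :pRequest.KionMin))
    quit $select(SQLCODE=0:$$$OK, 1:$$$ERROR($$$GeneralError, "SQLCODE "_SQLCODE))
}
```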
We found out that operations using adapters provide a reliable method to request processing from the destination. We also confirmed that it is possible to create operations without using adapters and freely write code for them.
Next, I would like to explain how to create a business process that calls the operations for getting weather information and updating the database in the correct order.
Article
Evgeny Shvarov · Feb 9, 2021
Hi developers!
Recently we announced the preview of Embedded Python technology in InterSystems IRIS.
Check the Sneak Peak video by @Robert.Kuszewski.
Embedded Python gives you the option to load and run Python code in the InterSystems IRIS server. You can either use library modules from pip, like numpy and pandas, or write your own Python modules in the form of standalone .py files.
So once you are happy with the development phase of the IRIS Embedded Python solution there is another very important question of how the solution could be deployed.
One of the options you can consider is using the ZPM Package manager which is described in this article.
I want to introduce you a template repository that introduces a deployable ZPM module and shows how to build such a module.
The example is very simple: it contains one sample.py that demonstrates the usage of the pandas and NumPy Python libraries, and the test.cls ObjectScript class that makes calls to it.
The solution could be installed with ZPM as:
zpm "install iris-python-template"
NB: Make sure the IRIS instance you install the module into contains the Embedded Python preview code. E.g., you can use the image:
intersystemsdc/iris-ml-community:2020.3.0.302.0-zpm
With commands:
docker run --rm --name my-iris -d --publish 9091:1972 --publish 9092:52773 intersystemsdc/iris-ml-community:2020.3.0.302.0-zpm
docker exec -it my-iris iris session IRIS
USER>zpm "install iris-python-template"
[iris-python-template] Reload START
...
[iris-python-template] Activate SUCCESS
The module installs sample.py python file and titanic.csv sample file along with test.cls to the system.
E.g. sample.py exposes a meanage() function, which accepts a CSV file path and calculates the mean value using the numpy and pandas libraries.
test.cls objectscript class loads the python module with the following line code:
set tt=##class(%SYS.Python).Import("sample")
then provides the path to csv file and collects the result of the function.
Here is how you can test the installed module:
USER>d ##class(dc.python.test).Today()
2021-02-09
USER>d ##class(dc.python.test).TitanicMeanAge()
mean age=29.69911764705882
USER>
OK! Next: how do you deploy Embedded Python modules?
You can add the following line to module.xml:
<FileCopy Name="python/" Target="${mgrdir}python/"/>
This line copies all Python files from the python folder of the repository to the python folder inside the mgr folder of the IRIS installation.
This lets the Python modules be imported from ObjectScript via the ##class(%SYS.Python).Import() method.
Also, if you want data files to be packed into the ZPM module, check another FileCopy line in the module, which imports the data folder from the repository (along with titanic.csv) into the package:
<FileCopy Name="data/" Target="${mgrdir}data/"/>
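Putting the two FileCopy targets together, calling the deployed module could look like this (a sketch: the meanage() signature is as described above, and the path assumes the default mgr directory):

set sample = ##class(%SYS.Python).Import("sample")
set path = ##class(%File).ManagerDirectory()_"data/titanic.csv"
write "mean age=", sample.meanage(path)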
That's it!
Feel free to use the template as a foundation for your projects with Embedded Python for IRIS!
Any questions and comments are appreciated!

Hi Evgeny! I tried Embedded Python in my multi-model contest app but used an ugly approach to deploy the Python code. I didn't realize that ZPM could do this for me... Nice tip!

Thanks, Jose! Yes, indeed, the ZPM option of delivering files to a target IRIS installation looks elegant and robust. Maybe it could be used not only for Embedded Python but e.g. for jar-file delivery and data. @Yuri.Gomes, what do you think?

Nice option!

OK. I did node.js, @Yuri.Gomes, Java is yours.

A suggestion: allow ZPM to copy from an HTTP URL, like a GitHub address.