Article
Nikita Savchenko · Feb 12, 2019
ˮ This is one of my articles which was never published in English. Let's fix it!

Hello! This article is about a quite practical way of developing InterSystems solutions without using the integrated tools like Studio or Atelier. All the code of the project can be stored in the form of "traditional" source code files, edited in your favorite development environment (for example, Visual Studio Code), indexed by any version control system and arbitrarily combined with many external tools for code analysis, preprocessing, packaging and so on.

The approach described in this article is suitable for any type of project on top of InterSystems products. In my case, I developed a couple of my applications (WebTerminal, Visual Editor, Class Explorer) using this approach. This article demonstrates a development cycle which is not traditional for InterSystems, but rather a practical one, which you may prefer to use for some of your developments.

TL;DR: Here are some examples of projects that utilize the approach described in this article: WebTerminal, Class Explorer, Visual Editor, Entity Browser (possibly, some other projects have picked up this idea - comment below!). If you want to check the file structure of these projects, click "Open" located right after the "Repository" label, and you'll be redirected to GitHub. I've been developing these projects completely without Studio/Atelier!

Below is a description of several of the simplest ways to organize such a project development technique.
Each method can be modified and expanded into a full-fledged tool for importing, assembling, or even debugging projects in InterSystems Caché; however, the purpose of this article is to provide the basics only and to show that it can work.

The described approach to development features the following:

- The entire project (its source code) is located in the file system, with any arbitrary directory structure.
- The project directory is indexed by the Git version control system, and has a readme file, configs and scripts required for importing/compiling the project.
- The source code of the classes is in CLS format (as it appears in Studio/Atelier).
- Work on the project is carried out entirely in the file system; code writing - in any external text editor or IDE.

The main feature of this approach is that you can connect any additional tools, for example, code preprocessing (like stub replacements at the compilation stage), front-end assembly and so on.

This article will not cover ObjectScript routines, CSP and other files, but only ObjectScript class files. Work with the former can be organized in the same way as with ObjectScript classes. When necessary, by analogy with the presented example, you can implement support for importing ObjectScript routines yourself. Regarding CSP files, these are just files on disk, so you don't need to import them at all. To make CSP files work with your InterSystems application, just copy them to the directory of your application.

The method described in the article does not require any additional tools or platforms, except for an installed version 2016.2+ of InterSystems Caché (Ensemble, HealthShare) or InterSystems IRIS. Additional assembly and preprocessing of the client code in this article is built with Node.js; however, you can use any other technology you like.
Node.js is an open-source and easy-to-use platform, which is chosen here because there are many ready-to-go packages already built for the tasks we are about to perform.

Motivation Behind Development in Non-InterSystems IDEs

The question arises: why not just continue to develop in Studio, or switch to the "new studio", Atelier? What is the point of not using these IDEs at all?

The ObjectScript programming language is very different from other common languages such as C#, Java, JavaScript, Python, Golang and others. The key difference here is that the language is "closed" in itself. Out of the box, many tools come directly from InterSystems, which is slowly changing with the introduction of InterSystems Open Exchange, a collection of community-created applications and tools for InterSystems products, and the policy of the corporation to make InterSystems more open. In my opinion, these changes are necessary to make ObjectScript a world-class player in the list of programming languages.

Moreover, historically, ObjectScript programs, as well as their source code, are stored directly in the DBMS itself. Before UDL support was introduced in InterSystems Caché 2016.2 (or CDL in version 2013.2 - read below), in order to extract the source code from the database, it was necessary to write a considerable program to export plain-text sources to files, and put even more effort into getting the code back into the DBMS. Now, exporting and importing plain-text source code is possible with just a single command, so that you can easily organize a "traditional" model of solution development: editing source code files - compiling - getting results.

Before Atelier, it simply wasn't possible to develop InterSystems applications on Linux/macOS without a VM, since Caché Studio was supported only on Windows. Now that Atelier is based on the Eclipse IDE, you can develop on any platform supported by Eclipse.
However, the method described in this article is completely cross-platform.

Some projects have many other sources and files besides ObjectScript classes. The question here is how to properly organize the source code of the entire project. Today, the following development cycle is used for projects built with InterSystems technologies: you work on sources in Studio/Atelier, and then you can export XML/CLS files to a VCS-indexed file system with the help of embedded tools. These exported files are not intended for modification. In the case of Atelier, the development cycle is designed around Atelier only, and each and every extension has to be supported by the IDE. There is little support for external tools, build tools, code analyzers and preprocessors, and there is no support for an arbitrary project structure and so on. To sum up, mostly only what was designed initially is supported.

Finally, the most important motivation, taking into account all of the above, is to open the ObjectScript programming language to the whole world. This has already begun: InterSystems introduced InterSystems Open Exchange, ObjectScript support was developed for the Sublime Text editor, Atom and Visual Studio Code, and so on. See? That's what it is about!

Syntax highlighting in Visual Studio Code - an "external" IDE for ObjectScript

Introduction

Exporting ObjectScript program sources to UDL (Universal Definition Language) format landed completely only in InterSystems Caché 2016.2. In previous versions, starting with InterSystems Caché 2013.2, class code export was also possible, using the methods of the class %Compiler.UDL.TextServices. It is also worth mentioning that, starting from version 2016.2, the Atelier REST API is also available for importing/exporting class definitions for InterSystems products.

Before UDL, it was only possible to export and import the XML representation of classes, which was just a big mess for version control systems.
Though the XML class definition also contained plain-text code you could edit, you weren't able to see clean commit diffs (example: one of my projects which still has some XMLs in it), do merge requests and so on. UDL cleans this up and opens new possibilities for project development on top of InterSystems products.

The result of this article is the simplest possible project organized entirely in the file system, and several scripts that create a single command to build and import the whole thing into the DBMS.

Prerequisites

Let's assume that we have an ObjectScript project which consists of class definitions (as well as, possibly, some routine code and a static front end). This is a necessary and sufficient condition to start applying the development method described in this article to an existing or a new project.

It is assumed that the machine you work on has a locally installed DBMS: IRIS/Caché/Ensemble/HealthShare version 2016.2+. To implement this method of development in earlier versions of InterSystems Caché (starting with 2013.2), you will need to adapt the suggested examples using the %Compiler.UDL.TextServices methods. If you don't have any InterSystems products installed, you can try one out here. During the installation, specify Unicode encoding instead of 8-bit, and leave all the other items suggested by the installation wizard unchanged.

The article uses the Git version control system. If you do not have Git installed, you have to install it.

Creating a Project

The directory structure of the demonstration project is as follows:

Wherein:

- The source code of the project is located in the "source" directory and, in the corresponding "cls" subdirectory, there is a tree of packages and classes. In the screenshot of the project structure above, as an example, the DevProject package is located, along with the Robot class (DevProject.Robot) and the REST subpackage.
- The import.* script imports the project into the DBMS.

The project code shown above is available on GitHub.
It is suggested to clone the project to the local machine by following the instructions below:

git clone https://github.com/ZitRos/cache-dev-project

The project contains the source/cls directory, which contains the usual package hierarchy. For the demonstration, the simplest class was created, containing the Message class method, which displays the message "Welcome, Anonymous!":

Class DevProject.Robot
{

ClassMethod Message(name As %String = "Anonymous")
{
    write "Welcome, ", name, "!"
}

}

To import this and other classes into the DBMS, you can use one of the following ways:

1. Use Atelier:

It doesn't make sense to perform all these steps each time we would like to test our project. Hence, we are going to automate this.

2. Execute the following command in the terminal window:

do $system.OBJ.ImportDir("D:/Path/To/the/Project/source/cls",,"ck /checkuptodate=all",,1)

This command recursively loads all files from the D:/Path/To/the/Project/source/cls directory into the current namespace, and also compiles only those classes that have changed since the last import. Thus, reloaded classes without changes will not take time to compile.

The second option also isn't the most convenient solution - every time work on the project starts, you need to open a Caché terminal, enter a login-password pair (on instances with a normal security level enabled), switch to the desired namespace and finally execute the command saved somewhere in a notebook. Finally, it is possible to automate this using the third option.

3.
Create a script to automate all the routine stuff and use just this:

import

Calling the latter command, in the case of development in almost any external IDE, can be simplified even more, to the click of a single button or running a program which will watch files and re-import each time something changes.

Thus, the entire project is in the file system, work is being done with plain-text files, and if necessary, just a single command imports and compiles the whole project without a hassle.

The Import Script

Let's take a closer look at the script that imports a project into the DBMS. In order to do this, it needs some additional information about your InterSystems instance, namely, its install location, the import namespace, as well as the username and password to log in to the system. This data is coded directly into the script; however, it could be separated into a config file.

The source code of the script is available on GitHub for Windows and *nix systems. All that needs to be done is to change several variables in the script once before starting work on the project.

The script executes the cache.exe executable file, which is located in the /bin/ directory of the installed DBMS, and passes two arguments to it: the database directory and the namespace. Then, the script sends a username, a password, and a few simple ObjectScript commands via the terminal interface, importing classes and reporting a successful import or an error.

Thus, the user gets all the necessary information about the import and compilation of the classes, as well as any errors that may have occurred during the compilation process.
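To illustrate the idea, here is a minimal sketch of such an import script for *nix. The install location, database directory and the use of a piped terminal session are assumptions for the sketch; the real scripts on GitHub differ in detail.

```shell
#!/bin/sh
# Sketch of an import script: it builds the ObjectScript commands and would
# pipe credentials plus commands into the Caché terminal session.
CACHE_BIN="/usr/cachesys/bin/cache"   # assumption: default *nix install location
CACHE_MGR="/usr/cachesys/mgr"         # database directory passed to the executable
NAMESPACE="USER"
SOURCES="$(pwd)/source/cls"

# The ObjectScript commands the script sends after logging in:
COMMANDS=$(printf '%s\n' \
  "do \$system.OBJ.ImportDir(\"$SOURCES\",,\"ck /checkuptodate=all\",,1)" \
  "halt")
echo "$COMMANDS"

# Actual invocation (commented out so the sketch runs anywhere):
# printf '%s\n' "$USERNAME" "$PASSWORD" "$COMMANDS" | "$CACHE_BIN" -s "$CACHE_MGR" -U "$NAMESPACE"
```

Only the variables at the top need to change per machine, which is exactly the one-time setup described above.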
Here's an example of the output of the import.bat script:

Importing project...
Node: DESKTOP-ILGFMGK, Instance: ENSEMBLE20162
USER>
Load of directory started on 06/29/2016 22:59:10
Loading file C:\Users\ZitRo\Desktop\cache-dev-project\source\cls\DevProject\Robot.cls as udl
Loading file C:\Users\ZitRo\Desktop\cache-dev-project\source\cls\DevProject\REST\Index.cls as udl
Compilation started on 06/29/2016 22:59:10 with qualifiers 'ck /checkuptodate=all'
Class DevProject.REST.Index is up-to-date.
Compiling class DevProject.Robot
Compiling routine DevProject.Robot.1
Compilation finished successfully in 0.003s.
Load finished successfully.
IMPORT STATUS: OK

Now we can ensure that the project was indeed imported:

USER > do ##class(DevProject.Robot).Message()
Welcome, Anonymous!

More Complex Example

To maximize the benefits of developing a project using InterSystems technologies, we will try to do something more attractive by adding a graphical interface and building the project with the use of the Node.js platform and the Gulp task runner. The result is a web page, an image of which is shown below.

Emphasis will be placed on how it is possible to organize the development of such a project. First, let's look at the architecture of the suggested solution.

The project consists of static client code (HTML, CSS, JavaScript), a class on the server that describes the REST interface, and one persistent class.

A client with a GET request gets the list of robots that are located on the server. Also, when you click on the "Spawn a new robot!" button, the client sends a GET request to the server, as a result of which a new instance of the Robot class is created and added to the display list. (Note: the robot creation request should actually be a POST request, but we won't complicate things much in this example.)

Technical implementation of the project can be viewed on GitHub (in the "extended" branch).
In this article, further attention will be paid to the method of developing such projects.

Here, unlike the previous example, the client part of the application is added, which is located in the source/static directory, and the project is built using Node.js and Gulp.

For example, besides the other code in the project, you can find some special comments like these:

<div class="version">v<!-- @echo package.version --></div>

When building the project, this comment will be replaced with the project version, which is listed in the package.json file. The build script also minimizes CSS and JavaScript code, and copies all the processed code into the `build` directory.

In the import script, unlike the previous example, the following changes were made:

- Before importing, the project is assembled (bundled).
- Files are now imported from the build directory, as they pass through the preprocessor.
- The files are copied from the build/static directory to Caché's csp/user directory with the CSP files. Thus, after importing, the application immediately becomes available.

Detailed instructions for installing and running this project are available in the description of the repository.

The result is a project that needs to be set up only once, by modifying several variables in the import.* file.

The considered development cycle is used in several of my own projects: WebTerminal, Class Explorer, Visual Editor, Entity Browser. Soon it may be used in other projects, including your own ones :)

IDE and Debugging

This development method does not provide any debugging utilities; it is as simple as that.
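The @echo version-stamp substitution mentioned above can be reproduced outside of Gulp as well; here is a sketch using sed, with a made-up file name and version just for illustration (the actual project uses a Gulp preprocessor plugin).

```shell
# Reproduce the @echo substitution: replace the marker comment with the
# project version (in the real build the version comes from package.json).
VERSION="1.4.2"   # hypothetical version for the sketch

printf '%s\n' '<div class="version">v<!-- @echo package.version --></div>' > index.html
sed "s|<!-- @echo package.version -->|$VERSION|" index.html > build-index.html
cat build-index.html
# -> <div class="version">v1.4.2</div>
```

The same pattern extends to any stub replacement performed at build time.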
If you use more comprehensive debugging tools for your ObjectScript code than simply logging something to globals, you still have to use the integrated debugging tools in InterSystems products.

However, besides that, the described development method has a big advantage: the ability to use your favorite development environment for writing ObjectScript code. Whether it is vim or a simple notepad, macOS or *nix, or whichever other programming language is used - you get the same workflow. On the other hand, ObjectScript does not have such comprehensive support outside Studio/Atelier. This means that even syntax highlighting is currently not quite well handled by external editors, let alone autocompletion. But all this is about to change in the near future, as more and more effort is being put into open-source initiatives.

In the meantime, you can use elementary syntax highlighting by keywords that some IDEs offer, such as IntelliJ or Visual Studio Code:

If IntelliJ IDEA is your favorite IDE, you can try it right now - here is the settings file you need to import using the File -> Import Settings menu. The highlighting is quite simple and incomplete; any additions are welcome.

Conclusion

The purpose of this article is to introduce something new into the world of development of InterSystems applications, present another approach to development to the public and contribute to the spread of ObjectScript as a programming language as a whole. Any feedback, ideas and discussions are very welcome!

Thank you!

Nice writing, Nikita!

Just want to mention that there is a new community option to code ObjectScript you've probably never tried - the VSCode plugin for ObjectScript by @Dmitry.Maslennikov.

A lot of developers can name VSCode as their "favorite" IDE, and the plugin can do really a lot for InterSystems IRIS developers today.

This is an interesting approach, and I do like it.
One question that comes to mind is handling the concept of refactoring packages or class names/files. For example, if your cache-dev-project gets restructured in a branch that would be used for testing, and the DevProject package turns into something more descriptive like RobotProject, then when the import script runs the code, the server will have both a DevProject and a RobotProject package if it was switching between the branches. If one of the packages or classes becomes obsolete, it would be nice to have a way to delete the class from the server code.

I don't think VSCode has a way to handle this. Just a little food for thought.

I would not be so sure in your doubts about VSCode. VSCode itself supports refactoring stuff, we just do not have it in the ObjectScript extension yet. Deleting obsolete classes is, for sure, a very interesting and quite difficult task. But it is better to solve it another way, with just a clean rebuild. Or, for example, I can add a delete action to the context menu in the server explorer, so a developer will be able to manually delete any class/routine on the server from VSCode.

Hello Matthew! Thank you for your feedback.

Indeed a good point. One idea that comes to my mind for this case is to improve the import script to file the list of classes which were ever imported and those which are used now. By using this list, the import script can resolve which classes to delete and which to keep. However, deleting classes can always introduce unwanted side effects, but in terms of a project this should be consistent.

Dmitry, this sounds great, to have a way to do refactoring. I would love to get some updates on coming features to VSCode, as I do use it often. As for adding the action to delete in the context menu, that would be a very helpful feature. There are times a class is created for new functionality and then gets deleted if a better solution is found. I would much rather manually delete one class than perform a clean rebuild.
I will admit sometimes a clean rebuild is needed, but depending on the circumstances it can take longer.

File the issue, please, so I will keep it saved.
Article
Evgeny Shvarov · Sep 19, 2019
Hi Developers!
Recently we launched InterSystems Package Manager - ZPM. One of the intentions of ZPM is to let you package your solution and submit it into the ZPM registry to make its deployment as simple as an "install your-package" command.
To do that, you need to introduce a module.xml file into your repository which describes what your InterSystems IRIS package consists of.
This article describes the different parts of module.xml and will help you craft your own.
I will start with the samples-objectscript package, which installs the Sample ObjectScript application into IRIS and can be installed with:
zpm: USER>install samples-objectscript
It is probably the simplest package ever and here is the module.xml which describes the package:
<?xml version="1.0" encoding="UTF-8"?>
<Export generator="Cache" version="25">
<Document name="samples-objectscript.ZPM">
<Module>
<Name>samples-objectscript</Name>
<Version>1.0.0</Version>
<Packaging>module</Packaging>
<SourcesRoot>src</SourcesRoot>
<Resource Name="ObjectScript.PKG"/>
</Module>
</Document>
</Export>
Let's go line-by-line through the document.
<Export generator="Cache" version="25">
module.xml belongs to the family of Cache/IRIS XML documents, so this line states this relation to let internal libraries recognize the document.
The next section is <Document>:
<Document name="samples-objectscript.ZPM">
Your package should have a name. The name can contain letters in lowercase and the "-" sign, e.g. samples-objectscript in this case. Please put the name of your package in the name clause of the Document tag, with the .ZPM extension.
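As a quick illustration of the naming rule above, here is a hypothetical helper (not part of ZPM) that checks a candidate name against "lowercase letters and the '-' sign":

```shell
# Hypothetical name check matching the rule stated above:
# lowercase letters and "-" only, starting with a letter.
is_valid_name() {
  printf '%s' "$1" | grep -Eq '^[a-z][a-z-]*$'
}

is_valid_name "samples-objectscript" && echo "valid"
is_valid_name "Samples.ObjectScript" || echo "invalid"
```

If the real registry accepts a wider character set (e.g. digits), the pattern would need to be relaxed accordingly.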
Inner elements of the Document are:
<Name> - the name of your package. In this case:
<Name>samples-objectscript</Name>
<Version> - the version of the package. In this case:
<Version>1.0.0</Version>
<Packaging>module</Packaging> - the type of packaging. Put the module parameter here.
<Packaging>module</Packaging>
<SourcesRoot> - a folder where zpm will look for ObjectScript to import.
In this case we tell it to look for ObjectScript in the /src folder:
<SourcesRoot>src</SourcesRoot>
<Resource Name> - elements of ObjectScript to import. This could be packages, classes, includes, globals, dfi, etc.
The structure under SourceRoot folder should be the following:
/cls - all the ObjectScript classes in Folder=Package, Class=file.cls form. Subpackages are subfolders
/inc - all the include files in file.inc form.
/mac - all the mac routines.
/int - all the "intermediate" routines (AKA "other" code: the result of compiling mac code, or ObjectScript without classes and macros).
/gbl - all the globals in xml form of export.
/dfi - all the DFI files in xml form of export. Each pivot comes in pivot.dfi file, each dashboard comes in dashboard.dfi file.
E.g. here we import the ObjectScript package. This tells ZPM to look in the /src/cls/ObjectScript folder and import all the classes from it:
<Resource Name="ObjectScript.PKG"/>
So! To prepare your solution for packaging, put the ObjectScript classes into a /cls folder inside some folder of your repository, and place all packages and classes in package=folder, class=file.cls form.
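The layout described above can be sketched as follows; the package and class names here are hypothetical, chosen only to match the ObjectScript.PKG resource from the sample:

```shell
# Prepare the package=folder, class=file.cls layout expected under SourcesRoot.
mkdir -p src/cls/ObjectScript
cat > src/cls/ObjectScript/Example.cls <<'EOF'
Class ObjectScript.Example
{

ClassMethod Hello() As %String
{
    quit "Hello"
}

}
EOF
ls src/cls/ObjectScript
```

With this structure in place, a resource element like `<Resource Name="ObjectScript.PKG"/>` picks up everything under src/cls/ObjectScript.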
If you store classes in your repo differently and don't want to do manual work to prepare the proper folder structure for ObjectScript, there are plenty of tools which do the work: Atelier and VSCode ObjectScript export classes this way, and there is also the isc-dev utility, which exports all the artifacts from a namespace ready for packaging.
Packaging mac routines
This is very similar to classes. Just put the routines under the /mac folder. Example:
<?xml version="1.0" encoding="UTF-8"?>
<Export generator="Cache" version="25">
<Document name="DeepSeeButtons.ZPM">
<Module>
<Name>DeepSeeButtons</Name>
<Version>0.1.7</Version>
<Packaging>module</Packaging>
<SourcesRoot>src</SourcesRoot>
<Resource Name="DeepSeeButtons.mac"/>
</Module>
</Document>
</Export>
Some other elements
There are also optional elements, like <Author>, which can contain <Organization> and <CopyrightDate> elements.
Example:
<Author>
<Organization>InterSystems</Organization>
<CopyrightDate>2019</CopyrightDate>
</Author>
Packaging CSP/Web applications
ZPM can deploy web applications too.
To make it work, introduce a CSPApplication element with the clauses of the CSP application parameters.
For example, take a look at the CSPApplication tag of the DeepSeeWeb module.xml:
<CSPApplication
Url="/dsw"
DeployPath="/build"
SourcePath="${cspdir}/dsw"
ServeFiles="1"
Recurse="1"
CookiePath="/dsw"
/>
This setting will create a web application with the name /dsw and will copy all the files from the /build folder of the repository into the ${cspdir}/dsw folder, which is a folder under the IRIS CSP directory.
REST API application
If this is a REST API application, the CSPApplication element will contain the dispatch class and could look like the one in the MDX2JSON module.xml:
<CSPApplication
Path="/MDX2JSON"
Url="/MDX2JSON"
CookiePath="/MDX2JSON/"
PasswordAuthEnabled="1"
UnauthenticatedEnabled="1"
DispatchClass="MDX2JSON.REST"
/>
Dependencies
Your module could expect the presence of another module installed on the target system. This can be described by a <Dependencies> element inside the <Document> element, which can contain several <ModuleReference> elements. Each of these has a <Name> and a <Version>, and states which other modules, of which versions, should be installed before yours. This will cause ZPM to check whether those modules are installed and, if not, to perform the installation.
Here is an example of dependency DSW module on MDX2JSON module:
<Dependencies>
<ModuleReference>
<Name>MDX2JSON</Name>
<Version>2.2.0</Version>
</ModuleReference>
</Dependencies>
Another example, where ThirdPartyPortlets depends on Samples BI (holefoods):
<Dependencies>
<ModuleReference>
<Name>holefoods</Name>
<Version>0.1.0</Version>
</ModuleReference>
</Dependencies>
There are also options to run your own arbitrary code to set up data and the environment; we will talk about these in the next articles.
How to build your own package
Ok! Once you have a module.xml, you can try to build the package and test whether the module.xml structure is accurate.
You can test via the zpm client. Install zpm on an IRIS system and load the package code with the load command:
zpm: NAMESPACE>load path-to-the-project
The path points to the folder which contains the resources for the package and has module.xml in the root folder.
E.g. you can test package building with this project. Check it out and build a container with docker-compose-zpm.yml.
Open a terminal in the SAMPLES namespace and call ZPM:
zpm: SAMPLES>
zpm: SAMPLES>load /iris/app
[samples-objectscript] Reload START
[samples-objectscript] Reload SUCCESS
[samples-objectscript] Module object refreshed.
[samples-objectscript] Validate START
[samples-objectscript] Validate SUCCESS
[samples-objectscript] Compile START
[samples-objectscript] Compile SUCCESS
[samples-objectscript] Activate START
[samples-objectscript] Configure START
[samples-objectscript] Configure SUCCESS
[samples-objectscript] Activate SUCCESS
The path is "/iris/app" because in docker-compose-zpm.yml we map the root of the project to the /iris/app folder in the container. So we can use this path to tell zpm where to load the project from.
So! The load performed successfully, and this means that the module.xml can be used to submit a package to the developers' community repository.
Now you know how to make a proper module.xml for your application.
How to submit the application to InterSystems Community repository
As of today, there are two requirements:
1. Your application should be listed on Open Exchange
2. Request me in a direct message or in the comments to this post if you want your application to be submitted to the Community Package Manager repository.
And you should have a working module.xml!)

Updated module documents from .MODULE to .ZPM.

Hi @Evgeny.Shvarov! I'm creating a module.xml for iris-history-monitor, and during the process a question came up. When you run docker-compose up in my project, the Installer has an invoke tag to execute a class method. But how can I make this work in ZPM?

Here is the objectscript package template, which has an example module.xml with almost everything that could happen in a package.
Take a look on invoke tag:
<Invokes>
<Invoke Class="community.objectscript.PersistentClass" Method="CreateRecord"></Invoke>
<Invoke Class="community.objectscript.ClassExample" Method="SetToTheGlobal">
<Arg>42</Arg>
<Arg>Text Data</Arg>
</Invoke>
</Invokes>
Place the <Invoke> elements inside the <Invokes> tag. You can pass parameters if you need to. This article describes all the details about <Invoke> elements.

Perfect!
Thanks @Evgeny.Shvarov!

ZPM forces you to use categories in the folder structure... perhaps, to make it easier, the VS Code ObjectScript extension should be configured with that option by default... just an idea.
Also, is there any place with full documentation about module.xml? The articles are full of really useful info, but having to navigate through all of them is a bit confusing.

Hi Salva!
Thanks, but not anymore. Check the article.
Also, is there any place with full documentation about module.xml? The articles are full of really useful info, but having to navigate through all of them is a bit confusing.
Sure. Here is the ZPM documentation.

Oh my... I didn't see the Wiki...
Thanks!

Aggggghhh... OK... coming back to the previous structure. At least... I can confirm that it was a good idea... but I was not the first one.

Yes. We first introduced that one because it is exactly how the Atelier API exports ObjectScript artifacts by default, but IMHO the simplified one is much better.
And that's why we have it in the ZPM basic template.

What is the rule for VERSION in Dependencies? Is it an EQUAL or a MINIMUM version?
e.g. <Version>0.0.0</Version> would mean any version
<Dependencies>
<ModuleReference>
<Name>holefoods</Name>
<Version>0.1.0</Version>
</ModuleReference>
</Dependencies>
Thanks !
Announcement
Benjamin De Boe · Jan 8, 2019
Hi,

As we announced at our Global Summit in October, we are developing dedicated connectors for a number of third-party data visualization tools for InterSystems IRIS. With these connectors, we want to combine an excellent user experience with optimal performance when using those tools to visualize data managed on the InterSystems IRIS Data Platform.

If you are already using Tableau or Power BI products to access our data platform through their respective generic ODBC connectors today, we're interested in learning more about your experiences thus far and would be very grateful if you could spend a few minutes on our survey.

survey for Tableau users
survey for Microsoft Power BI users

Thanks,
Benjamin

@Benjamin.DeBoe
Have you made any progress on this?
We just starting playing around with using Web Data Connectors in Tableau to call APIs into Cache.
Best,
Mike

@Mike.Davidovich I am working with Benjamin on an alpha version of the Tableau connector. I am interested in your experience with using an ODBC or JDBC connection to Tableau. Have you used that, and what else would you like to see in a Tableau connector?
Feel free to post here or email directly -- carmen.logue@intersystems.com

@Carmen.Logue Thanks, Carmen! I'm not sure I can add too much to your alpha development. I personally haven't been using ODBC to connect to Caché. What I do know is that some other groups have used ODBC to connect to Caché with SQL projections.

My team wants to avoid projections specifically (at this point at least) because we tend to only use them to get data from Caché to our data warehouse (Oracle). Other than that, we still traverse our database via globals and good old MUMPS programming. The project I'm working on is taking those many, many routines that we have that traverse globals for reporting, and transforming the data into JSON streams. A %CSP.REST API will call those routines and provide the data to a Tableau web data connector, so we can get instant, live data without projecting our whole database.
I'm just getting started with Tableau and Caché, so I may have more input in the future.
Best,
Mike
Announcement
Andreas Dieckow · Jan 9, 2019
InterSystems has completed the verification process for OpenJDK 8. Customers now have the option to use either the Oracle JDK or OpenJDK with all InterSystems products and versions that support Java 8. Future versions will continue to be supported on both of these Java Development Kits.

Nice, another step to much more visibility.
Announcement
Anastasia Dyubaylo · May 22, 2019
Hi Community!
Please welcome a new video on InterSystems Developers YouTube Channel:
InterSystems IRIS from Spark to Finish
This video demonstrates spinning up a cluster that combines InterSystems IRIS' powerful data management with Apache Spark's unique approach to parallelizing complex data engineering and analytics. Together, they let you make the best use of your distributed environment.
Takeaway: The combination of InterSystems IRIS and Apache Spark enables me to build powerful analytical solutions.
Presenter: @Amir.Samary
And...
Additional materials to the video you can find in this InterSystems Online Learning Course.
Don't forget to subscribe to our InterSystems Developers YouTube Channel.
Enjoy and stay tuned!
Announcement
Anastasia Dyubaylo · May 13, 2019
Hey Community!
It's time again for good tidings for you!
For the first time, InterSystems will be part of the WeAreDevelopers World Congress in Berlin, Germany, which brings together developers, IT experts and digital innovators to discuss and shape the future of application development.
From 6 to 7 June, we’re ready to welcome you at our booth #A5 and show you how InterSystems technologies enable intelligent interoperability and accelerate the creation of powerful, data-driven applications.
Schedule your individual meeting with InterSystems @ WeAreDevelopers in Berlin by quickly filling out the form on our event page [https://dach.intersystems.de/WeAreDevelopers2019] – the first three applicants will receive a FREE ticket!
To make it more challenging for you, we’ve decided to provide the webpage & form in German only.
So...Do not miss your chance!
Join the World’s Largest Developers Congress with InterSystems!
Article
Sergey Kamenev · May 23, 2019
PHP, from its very beginning, has been renowned (and criticized) for supporting integration with a lot of libraries, as well as with almost all the databases on the market. However, for some mysterious reason, it did not support hierarchical databases based on globals.
Globals are structures for storing hierarchical information. They are somewhat similar to a key-value database, with the only difference being that the key can be multi-level:
Set ^inn("1234567890", "city") = "Moscow"
Set ^inn("1234567890", "city", "street") = "Req Square"
Set ^inn("1234567890", "city", "street", "house") = 1
Set ^inn("1234567890", "year") = 1970
Set ^inn("1234567890", "name", "first") = "Vladimir"
Set ^inn("1234567890", "name", "last") = "Ivanov"
In this example, multi-level information is saved in the global ^inn using the built-in ObjectScript language. The global ^inn is stored on the hard drive (this is indicated by the “^” sign in its name).
In order to work with globals from PHP, we will need new functions that will be added by the PHP module, which will be discussed below.
Globals support many functions for working with hierarchies: tree traversal on a fixed level and in depth, deleting, copying and pasting entire trees and individual nodes, and also ACID transactions, as in any quality database. All this happens extremely quickly (roughly 10^5-10^6 inserts per second on a regular PC) for two reasons:
Globals are a lower level abstraction when compared to SQL,
Databases built on globals have been in production for decades, and during this time they were polished and their code was thoroughly optimized.
Learn more about globals in the series of articles titled "Globals Are Magic Swords For Managing Data.":
Part 1. Trees. Part 2. Sparse arrays. Part 3.
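Conceptually, a global behaves like a persistent, automatically sorted, multi-level key-value store. As a rough analogy only (plain Python with an in-memory dict, not InterSystems code), the ^inn example above could be modeled like this:

```python
# Rough in-memory analogy of a global: a mapping from multi-level
# keys (tuples of subscripts) to values. A real global is persisted
# on disk and keeps its keys sorted at all times.
inn = {}

def g_set(key, value):
    """Analog of: Set ^inn(sub1, ..., subN) = value"""
    inn[key] = value

g_set(("1234567890", "city"), "Moscow")
g_set(("1234567890", "year"), 1970)
g_set(("1234567890", "name", "first"), "Vladimir")
g_set(("1234567890", "name", "last"), "Ivanov")

# Enumerating keys in sorted order mimics how a global stores nodes:
for key in sorted(inn):
    print(key, "=", inn[key])
```

The analogy is loose (no persistence, no transactions), but it captures the essential idea: arbitrary-depth keys mapping to scalar values, kept in sorted order.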
In this world, globals are primarily used in storage systems for unstructured and sparse information, such as medical, personal, and banking data.
I love PHP (and I use it in my development work), and I wanted to play around with globals. There was no PHP module for IRIS and Caché. I contacted InterSystems and asked them to create it. InterSystems sponsored the development as part of an educational grant and my graduate student and I created the module.
Generally speaking, InterSystems IRIS is a multi-model DBMS, and that's why from PHP you can work with it via ODBC using SQL, but I was interested in globals, and there was no such connector.
So, the module is available for PHP 7.x (was tested for 7.0-7.2). Currently it can only work with InterSystems IRIS and Caché installed on the same host.
Module page on OpenExchange (a directory of projects and add-ons for developers at InterSystems IRIS and Caché).
There is a useful DISCUSS section where people share their related experiences.
Download here:
https://github.com/intersystems-community/php_ext_iris
Download the repository from the command line:
git clone https://github.com/intersystems-community/php_ext_iris
Installation instructions for the module in English and Russian.
Module Functions:
PHP function
Description
Working with data
iris_set($node, value)
Setting a node value.
iris_set($global, $subscript1, ..., $subscriptN, $value); iris_set($global, $value);
Returns: true or false (in the case of an error). All parameters of this function are strings or numbers. The first one is the name of the global, then there are the indexes, and the last parameter is the value.
iris_set('^time',1);
iris_set('^time', 'tree', 1, 1, 'value');
ObjectScript equivalent:
Set ^time = 1
Set ^time("tree", 1, 1) = "value"
iris_set($arrayGlobal, $value);
There are just two parameters: the first one is the array in which the name of the global and all its indexes are stored, and the second one is the value.
$node = ['^time', 'tree', 1, 1];
iris_set($node,'value');
iris_get($node)
Getting a node value.
Returns: a value (a number or a string), NULL (the value is not defined), or FALSE (in the event of an error).
iris_get($global, $subscript1, ..., $subscriptN); iris_get($global);
All parameters of this function are strings or numbers. The first one is the name of the global, and the rest are subscripts. The global may have no subscripts.
$res = iris_get('^time');
$res1 = iris_get('^time', 'tree', 1, 1);
iris_get($arrayGlobal);
The only parameter is the array in which the name of the global and all its subscripts are stored.
$node = ['^time', 'tree', 1, 1];
$res = iris_get($node);
iris_zkill($node)
Deleting a node value.
Returns: TRUE or FALSE (in the event of an error).
It is important to note that this function only deletes the value in the node and does not affect lower branches.
iris_zkill($global, $subscript1, ..., $subscriptN); iris_zkill($global);
All parameters of this function are strings or numbers. The first one is the name of the global, and the rest are subscripts. The global may have no subscripts.
$res = iris_zkill('^time'); // Lower branches are not deleted.
$res1 = iris_zkill('^time', 'tree', 1, 1);
iris_zkill($arrayGlobal);
The only parameter is the array in which the name of the global and all its subscripts are stored.
$a = ['^time', 'tree', 1, 1];
$res = iris_zkill($a);
iris_kill($node)
Deleting a node and all descendant branches.
Returns: TRUE or FALSE (in the case of an error).
iris_kill($global, $subscript1, ..., $subscriptN); iris_kill($global);
All parameters of this function are strings or numbers. The first one is the name of the global, and the rest are subscripts. The global may have no subscripts, in which case it is deleted in full.
$res1 = iris_kill('^example', 'subscript1', 'subscript2');
$res = iris_kill('^time'); // The global is deleted in full.
iris_kill($arrayGlobal);
The only parameter is the array in which the name of the global and all its subscripts are stored.
$a = ['^time', 'tree', 1, 1];
$res = iris_kill($a);
iris_order($node)
Traverses the branches of the global on a given level.
Returns: an array containing the full name of the next node of the global on the same level, or FALSE (in the case of an error).
iris_order($global, $subscript1, ..., $subscriptN);
All parameters of this function are strings or numbers. The first one is the name of the global, and the rest are subscripts. Form of usage in PHP and ObjectScript equivalent:
iris_order('^ccc','new2','res2'); // $Order(^ccc("new2", "res2"))
iris_order($arrayGlobal);
The only parameter is the array in which the name of the global and the subscripts of the initial node are stored.
$node = ['^inn', '1234567890', 'city'];
for (; $node !== NULL; $node = iris_order($node))
{
echo join(', ', $node).'='.iris_get($node)."\n";
}
Returns:
^inn, 1234567890, city=Moscow
^inn, 1234567890, year=1970
iris_order_rev($node)
Traverses the branches of the global on a given level in reverse order.
Returns: an array containing the full name of the previous node of the global on the same level, or FALSE (in the case of an error).
iris_order_rev($global, $subscript1, ..., $subscriptN);
All parameters of this function are strings or numbers. The first one is the name of the global, and the rest are subscripts. Form of usage in PHP and ObjectScript equivalent:
iris_order_rev('^ccc','new2','res2'); // $Order(^ccc("new2", "res2"), -1)
iris_order_rev($arrayGlobal);
The only parameter is the array in which the name of the global and the subscripts of the initial node are stored.
$node = ['^inn', '1234567890', 'name', 'last'];
for (; $node !== NULL; $node = iris_order_rev($node))
{
echo join(', ', $node).'='.iris_get($node)."\n";
}
Returns:
^inn, 1234567890, name, last=Ivanov
^inn, 1234567890, name, first=Vladimir
iris_query($node)
Depth-first traversal of the global.
Returns: an array containing the full name of the next node in depth-first order: the first child node (if available), or the next node of the global (if there is no child node).
iris_query($global, $subscript1, ..., $subscriptN);
All parameters of this function are strings or numbers. The first one is the name of the global, and the rest are subscripts. Form of usage in PHP and ObjectScript equivalent:
iris_query('^ccc', 'new2', 'res2'); // $Query(^ccc("new2", "res2"))
iris_query($arrayGlobal);
The only parameter is the array in which the name of the global and the subscripts of the initial node are stored.
$node = ['^inn', 'city'];
for (; $node !== NULL; $node = iris_query($node))
{
echo join(', ', $node).'='.iris_get($node)."\n";
}
Returns:
^inn, 1234567890, city=Moscow
^inn, 1234567890, city, street=Req Square
^inn, 1234567890, city, street, house=1
^inn, 1234567890, name, first=Vladimir
^inn, 1234567890, name, last=Ivanov
^inn, 1234567890, year=1970
The order differs from the order in which we inserted the values because the global automatically keeps its nodes sorted in ascending key order on insertion.
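The depth-first order that iris_query produces can be sketched in plain Python (an analogy, not the module itself): because a global keeps its nodes sorted by subscripts, visiting the keys in sorted order reproduces exactly the traversal shown above.

```python
# The ^inn global from the examples, modeled as subscript tuples.
nodes = {
    ("1234567890", "city"): "Moscow",
    ("1234567890", "city", "street"): "Req Square",
    ("1234567890", "city", "street", "house"): 1,
    ("1234567890", "year"): 1970,
    ("1234567890", "name", "first"): "Vladimir",
    ("1234567890", "name", "last"): "Ivanov",
}

# $Query-style depth-first traversal: simply walk keys in sorted
# order; a parent node always sorts immediately before its children.
for key in sorted(nodes):
    print("^inn, " + ", ".join(key) + "=" + str(nodes[key]))
```

Running this prints the nodes in the same order as the iris_query loop above: city, then its street and house children, then name/first, name/last, and finally year.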
Service functions
iris_set_dir($FullPath)
Setting up a directory with a database
Returns: TRUE or FALSE (in the case of an error).
iris_set_dir('/InterSystems/Cache/mgr');
This must be performed before connecting to the database.
iris_exec($CmdLine)
Execute database command
Returns: TRUE or FALSE (in the case of an error).
iris_exec('kill ^global(6)'); // The ObjectScript command for deleting a global
iris_connect($login, $pass)
Connect to database
iris_quit()
Close connection with DB
iris_errno()
Get error code
iris_error()
Get text description of error
If you want to play around with the module, check, for example, the Docker container implementation:
git clone https://github.com/intersystems-community/php_ext_iris
cd php_ext_iris/iris
docker-compose build
docker-compose up -d
Test the demo page at localhost:52080 in the browser.
PHP files that can be edited and played with are in the php/demo folder, which is mounted inside the container.
To test IRIS use the admin login with the SYS password.
To get into the IRIS settings, use the following URL:http://localhost:52773/csp/sys/UtilHome.csp
To get into the IRIS console of this container, use the following command:
docker exec -it iris_iris_1 iris session IRIS
Especially for the Developer Community and those who want to try it out, we have set up a virtual machine with the Caché PHP module installed.
Demo page in English. Demo page in Russian. Login: habr_test Password: burmur#@8765
For self-installation of the module for InterSystems Caché
You need Linux. I tested on Ubuntu; the module should also compile and work under Windows, but I haven't tested it.
Download the free version:
InterSystems Caché (registration required). On Linux, Red Hat and SUSE are supported out of the box, but you can also install it on other distributions.
Install the cach.so module in PHP according to the instructions.
Just out of interest, I ran two primitive tests to check the speed of inserting new values into the database in the docker container on my PC (AMD FX-9370@4700Mhz 32GB, LVM, SATA SSD).
Insertion of 1 million new nodes into the global took 1.81 seconds or 552K inserts per second.
Updating a value in the same global 1,000,000 times took 1.98 seconds, or 505K updates per second. An interesting fact is that insertion is faster than updating. Apparently this is a consequence of the database's initial optimization for fast insertion.
Obviously, these tests cannot be considered 100% accurate or useful, since they are primitive and are done in the container. On more powerful hardware with a disk system on a PCIe SSD, tens of millions of inserts per second can be achieved.
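The quoted throughput figures follow directly from the measurements above; a quick arithmetic check:

```python
# Throughput = operations / elapsed time, using the figures above.
insert_rate = 1_000_000 / 1.81   # new-node insertions
update_rate = 1_000_000 / 1.98   # in-place updates

print(round(insert_rate / 1000), "K inserts/s")  # 552 K inserts/s
print(round(update_rate / 1000), "K updates/s")  # 505 K updates/s
```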
What remains to be done, and the current state
Useful functions for working with transactions can be added (you can still use them with iris_exec).
A function returning the whole global structure at once (to avoid traversing the global from PHP) is not implemented.
The function of saving a PHP array as a subtree is not implemented.
Access to local database variables is not implemented; currently it is only possible via iris_exec, although iris_set support would be better.
Global traversal in depth in the opposite direction is not implemented.
Access to the database via an object using methods (similar to current functions) is not implemented.
The current module is not quite ready for production: it has not been tested for high loads and memory leaks. However, should someone need it, please feel free to contact me at any time (Sergey Kamenev sukamenev@gmail.com).
Bottom line
For a long time, the worlds of PHP and hierarchical databases on globals practically did not overlap, even though globals provide strong and fast functionality for specific data types (medical, personal).
I hope that this module will motivate PHP programmers to experiment with globals, and ObjectScript programmers to try simple development of web interfaces in PHP.
P.S. Thank you for your time!
Nice! Just tried this with the docker container on my local machine. And got 1 million insertions (1,000,000) in 1.45 sec on my Mac Pro. Cool!
Announcement
Anastasia Dyubaylo · Aug 26, 2019
Hi Developers!
InterSystems Developers Community today unites more than 7,000 developers from all over the world. Since 2016, our community has been growing and improving for you, our dear developers!
Together we've done a lot over these years, and much more is planned for the future!
So, who makes our community better every day? Who tries for all of us and improves the space for developers?
Let's warmly greet our team:
@Evgeny.Shvarov – founder of InterSystems Developers community, Startups and Community manager at InterSystems.
@David.Reche – founder & manager of Spanish Developers Community & Senior Sales Engineer at InterSystems.
@Anastasia.Dyubaylo – our Community & Social Media Manager at InterSystems. She leads Global Masters Advocacy Hub and all InterSystems Developers social networks. Anastasia also reviews many of InterSystems' events on the community and on social media.
@Olga.Zavrazhnova2637 – our Global Masters Advocacy Hub Manager at InterSystems. She has been managing Global Masters since its launch in 2016. Now Olga is creating engagement campaigns and exploring new reward ideas for Global Masters.
@Julia.Fedoseeva – Educational and Logistics manager at InterSystems and also our Global Masters Advocacy Hub Manager. She organizes the delivery of GM Rewards around the whole world.
Gamification and Community Management – it's about these guys. They're supporting you on your way with InterSystems Global Masters Advocacy Hub!
And...
Of course, our remarkable Moderators in Developers Community team.
Please welcome:
@Eduard.Lebedyuk – Sales Engineer at InterSystems in Moscow, Russia.
@Dmitry.Maslennikov – Developer Advocate, co-founder of CaretDev corp.
@Sean.Connelly – Managing Director / Software Engineer at MemCog LTD.
@John.Murray – Senior Product Engineer at George James Software.
@Scott.Roth – Senior Applications Development Analyst at the Ohio State University Wexner Medical Center.
@Jeffrey.Drumm – Vice President and Chief Operating Officer at Healthcare Integration Consulting Group (HICG).
@Henrique.GonçalvesDias – System Management Specialist, Database Administrator at Sao Paulo Federal Court.
And our Moderators of the Spanish Community Team:
@Francisco.López1549 – Project Manager & Head of Interoperability at Salutic Soluciones, S.L.
@Nancy.Martinez – Solution Consultant at Ready Computing.
So!
Now you know all InterSystems Developer Community heroes!
Stay tuned with us! 🚀
Awesome!!! I've always got superb guidance and direction to all my questions. Thank you guys!!
Nice to put faces to the people :) Great job guys!!
Thanks, @Eric.David!
Thank you! Happy to work with such people! Great team, perfect Community!
A nice overview -- I like those "cheat sheets". Applause for you all!
Keep up the good work all! Like the community very much!
Thanks to all of you, guys, for your effort and help!! Great job!! (And remember, with great power comes great responsibility )
Thanks, Udo! Another version of KYC - Know Your Community )
Thanks, Marco! See you on GS2019!
Article
Evgeny Shvarov · Mar 14, 2019
Hi Community!
I think everyone keeps the source code of the project in a repository nowadays: GitHub, GitLab, Bitbucket, etc. The same goes for InterSystems IRIS projects: check any of them on Open Exchange.
What do we do every time we start or continue working with a certain repository with InterSystems Data Platform?
We need a local InterSystems IRIS instance, with the environment for the project set up and the source code imported.
So every developer performs the following:
Check out the code from repo
Install/Run local IRIS installation
Create a new namespace/database for a project
Import the code into this new namespace
Setup all the rest environment
Start/continue coding the project
If you dockerize your repository, this list can be shortened to these 3 steps:
Check out the code from repo
Run docker-compose build
Start/continue coding the project
Profit: no manual work for steps 3-5, which can take minutes and sometimes cause headaches.
You can dockerize (almost) any of your InterSystems repos with the few steps below. Let’s go!
How to dockerize the repo and what does this mean?
Basically, the idea is to have Docker installed on your machine, which will build the code and environment into a container that then runs in Docker and works exactly the way the developer intended in the first place. No more "What is the OS version?" or "What else did you have on this IRIS installation?".
Every time it's a clean page (or a clean IRIS container) which we use to set up the environment (namespaces, databases, web apps, users/roles) and import code into a clean, just-created database.
Will this "dockerize" procedure greatly harm your current repo?
No. You just need to add 2-3 new files to the root of the repo and follow a few rules, which you can set up on your own.
Prerequisites
Download and Install docker.
Download and install the IRIS Docker image. In this example, I will use the full InterSystems IRIS preview: iris:2019.1.0S.111.0, which you can download from WRC-preview; see the details.
If you work with an instance that needs a key, place the iris.key in a location you will use all the time. I put it in the home directory on my Mac.
Dockerising the repo
To dockerise your repo you need to add three files to the root folder of your repo.
Here is an example of a dockerized repo - the ISC-DEV project, which helps to import/export source code from an IRIS database. This repo has the additional Dockerfile, docker-compose.yml and Installer.cls files, which I will describe below.
First is the Dockerfile, which will be used by the docker-compose build command:
Dockerfile
# needs to be the same image as installed
FROM intersystems/iris:2019.1.0S.111.0
WORKDIR /opt/app
COPY ./Installer.cls ./
COPY ./cls/ ./src/
RUN iris start $ISC_PACKAGE_INSTANCENAME quietly EmergencyId=sys,sys && \
/bin/echo -e "sys\nsys\n" \
# giving %ALL to the user admin
" Do ##class(Security.Users).UnExpireUserPasswords(\"*\")\n" \
" Do ##class(Security.Users).AddRoles(\"admin\", \"%ALL\")\n" \
# importing and running the installer
" Do \$system.OBJ.Load(\"/opt/app/Installer.cls\",\"ck\")\n" \
" Set sc = ##class(App.Installer).setup(, 3)\n" \
" If 'sc do \$zu(4, \$JOB, 1)\n" \
# introducing OS Level authorization (to remove login/pass prompt in container)
" Do ##class(Security.System).Get(,.p)\n" \
" Set p(\"AutheEnabled\")=p(\"AutheEnabled\")+16\n" \
" Do ##class(Security.System).Modify(,.p)\n" \
" halt" \
| iris session $ISC_PACKAGE_INSTANCENAME && \
/bin/echo -e "sys\nsys\n" \
| iris stop $ISC_PACKAGE_INSTANCENAME quietly
CMD [ "-l", "/usr/irissys/mgr/messages.log" ]
This Dockerfile copies Installer.cls and the source code from the /cls folder of the repo into the /src folder in the container.
It also runs some config settings: it gives the admin user the %All role, unexpires the 'SYS' password, introduces OS-level authorization, and runs the %Installer.
What’s in %Installer?
Class App.Installer
{
XData MyInstall [ XMLNamespace = INSTALLER ]
{
<Manifest>
<Default Name="NAMESPACE" Value="ISCDEV"/>
<Default Name="DBNAME" Value="ISCDEV"/>
<Default Name="APPPATH" Dir="/opt/app/" />
<Default Name="SOURCESPATH" Dir="${APPPATH}src" />
<Default Name="RESOURCE" Value="%DB_${DBNAME}" />
<Namespace Name="${NAMESPACE}" Code="${DBNAME}-CODE" Data="${DBNAME}-DATA" Create="yes" Ensemble="0">
<Configuration>
<Database Name="${DBNAME}-CODE" Dir="${APPPATH}${DBNAME}-CODE" Create="yes" Resource="${RESOURCE}"/>
<Database Name="${DBNAME}-DATA" Dir="${APPPATH}${DBNAME}-DATA" Create="yes" Resource="${RESOURCE}"/>
</Configuration>
<Import File="${SOURCESPATH}" Recurse="1"/>
</Namespace>
</Manifest>
}
ClassMethod setup(ByRef pVars, pLogLevel As %Integer = 3, pInstaller As %Installer.Installer, pLogger As %Installer.AbstractLogger) As %Status [ CodeMode = objectgenerator, Internal ]
{
Return ##class(%Installer.Manifest).%Generate(%compiledclass, %code, "MyInstall")
}
}
It creates the namespace/database ISCDEV and imports the code from the source folder /src.
Next is the docker-compose.yml file, which will be used when we run the container with the docker-compose up command:
version: '2.4'
services:
  iris:
    build: .
    restart: always
    ports:
      - 52773:52773
    volumes:
      - ~/iris.key:/usr/irissys/mgr/iris.key
This config tells Docker on which port we expect IRIS to work on our host. The first number (52773) is the host port; the second (52773) is the container's internal port.
In the volumes section, docker-compose.yml makes the iris key on your machine available inside the container, in the place where IRIS looks for it:
- ~/iris.key:/usr/irissys/mgr/iris.key
To start coding with this repo you do the following:
1. Clone/git pull the repo into any local directory.
2. Open the terminal in this directory and run
user# docker-compose build
this will build the container.
3. Run the IRIS container with your project
user# docker-compose up -d
Open your favorite IDE, connect to the server on localhost://52773 and develop your success with InterSystems IRIS Data Platform ;)
You can use these 3 files to dockerize your repository. Just put the right name for the source code in the Dockerfile, the right namespace(s) in Installer.cls and the place for iris.key in docker-compose.yml, and use the benefits of Docker containers in your day-to-day development with InterSystems IRIS.
Nice one Evgeny! I like it! I'm sure it'll help all those that want to leverage the agility of containers and our quarterly container releases.
Thank you, Luca! Besides agility, I like the saving of the developer's time on environment setup. Docker IMHO is the fastest way for a developer to start compiling the code. And what's even better - it's a standard way from project to project:
docker-compose build #(when needed)
docker-compose up -d #(always)
And forgot to add, that to open IRIS terminal just call the following:
user$ docker-compose exec iris iris session iris
Very nice tip, it worked fine, and with this I can use the IRIS terminal from the VS Code terminal. VS Code, when configured to work with Docker, has a quick action to open the terminal through the menu on the connection status.
Hello,
And a great tutorial, thanks! One question though. Do you know how to pass environment variables to the %Installer? In my scenario, I would like to configure a CI/CD pipeline and define some environment variables when building the Docker container and reference those variables within the %Installer to create, for example, a specific user account (username and password as env variables). How can I achieve this?
I've tried setting the env variables within the Dockerfile with ENV someEnv="1234" and try to get the variable with %System.Util.GetEnviron("someEnv") within %Installer, but it just returns an empty string.
If you have any insight or tips, it would be appreciated.
Cheers! Kari Vatjus-Anttila
Not sure why this doesn't work. Calling for experts @Dmitry.Maslennikov @Eduard.Lebedyuk
Yeah, working with environment variables is quite tricky; a variable may not be available in the place where you would expect it. I would not recommend it for %Installer; you should focus on the Variables feature there, and pass variables to the setup method when you call it. It should work. Here's an example:
Installer
Dockerfile
Please consider providing sample code.
💡 This article is considered an InterSystems Data Platform Best Practice.
Announcement
Anastasia Dyubaylo · Apr 17, 2019
Hi Community!
Please welcome a new video on InterSystems Developers YouTube Channel:
Implementing vSAN for InterSystems IRIS
Specific examples of using VMware and vSAN will illustrate practical advice for deploying InterSystems IRIS, whether on premises or in the cloud.
Takeaway: I know how to deploy InterSystems IRIS using VMware and vSAN.
Presenter: @Murray.Oldfield
And...
Additional materials to the video you can find in this InterSystems Online Learning Course.
Don't forget to subscribe to our InterSystems Developers YouTube Channel.
Enjoy and stay tuned!
Article
Yuri Marx · Dec 22, 2020
Hi Community, I used website-analyzer - an app that uses InterSystems NLP and Crawler4J to extract all website content and do NLP on it. I limited it to 200 pages and discovered this:
Top 10 Concepts - business and content topics in the InterSystems site:
Other frequent concepts; note the focus on IRIS speed, scale and data value:
Top 10 Concept Dominance - business relations in the InterSystems site. See the InterSystems slogans ("first data platform", "intersystems data platform" and "healthcare data platform")
See the related concepts - the marketing message is "customer choice", "leading provider", "excellence", and InterSystems' geo presence in the world. Very well!
It is possible to see word clouds around the word "InterSystems", for example (it is important to qualify the brand):
Word clouds along paths are very impressive for seeing the general speech on the website, see:
It is possible to do a benchmark against business competitors too: run website-analyzer and see!
Announcement
Anastasia Dyubaylo · Dec 25, 2020
Hey Developers,
See how to use the DTL Generator in InterSystems IRIS to create a data transformation:
⏯ Using the DTL Generator in InterSystems IRIS
👉🏼 Subscribe to InterSystems Developers YouTube.
Enjoy and stay tuned!
Hi all,
I was wondering in which version I can find the DTL Generator, or is it still in development?
In version 'IRIS for Windows (x86-64) 2020.1.1 (Build 408U) Sun Mar 21 2021 22:04:53 EDT' I cannot find it. @Stefan.Wittmann maybe you can help answer this question?
Hi @Menno.Voerman, the DTL Generator is still in development and was presented at the last Virtual Summit to gather some feedback, which we are still incorporating and fine-tuning with some partners. Are you interested in the DTL Generator for migrating interfaces to InterSystems IRIS or for other reasons?
Hi @Stefan.Wittmann,
I was just curious about this functionality. For a project I need to convert ORU~R01 to MDM~T02 messages.
For now it's fine to write the DTL by myself (small messages). I would be happy to see this functionality in a future release. It can indeed be very useful for migration projects.
Article
Eduard Lebedyuk · Jan 12, 2021
DataGrip is a multi-engine database environment targeting the specific needs of professional SQL developers. DataGrip makes working with databases an enjoyable and productive experience.
To work with InterSystems IRIS from DataGrip you'll need to add InterSystems JDBC driver first (once per DataGrip) and after that add all your InterSystems IRIS connections.
Part 1: Add InterSystems IRIS JDBC Driver
1. Go to File → Data Sources
2. Go to + → Driver
3. Set Driver properties:
Name: InterSystems IRIS
Class: com.intersystems.jdbc.IRISDriver
Add JDBC Driver file: path to /<IRIS>/dev/java/lib/JDK18/intersystems-jdbc-3.2.0.jar (version can change, also if you don't have InterSystems IRIS installed, you can download the drivers here)
Add URL Template: Basic jdbc:IRIS://{host}[:{port}]/{database} (you can add additional parameters as described here)
4. Click OK to save the driver definition.
Part 2: Add InterSystems IRIS Instance
1. In the same Data Sources window, go to + → InterSystems IRIS
2. Specify connection settings depending on your instance and click Test Connection. You'll see an error or connection successful.
If it's a new local installation, the defaults likely are:
Host: localhost
Port: 1972
User: _SYSTEM
Password: SYS
Database: USER
3. Click OK to save.
That's it! Now you can query and explore InterSystems IRIS through DataGrip.
Announcement
Evgeny Shvarov · Jan 11, 2021
Hi Developers!
Here are the technology bonuses for the InterSystems Multi-Model Contest that will give you extra points in the voting:
InterSystems Globals (key-value)
InterSystems SQL
InterSystems Objects
Your data model
ZPM Package deployment
Docker container usage
See the details below.
InterSystems Globals (key-value) - 2 points
InterSystems Globals are multidimensional sparse arrays that are used to store any data in InterSystems IRIS. Each global node can be considered a key for which you can set a value. InterSystems IRIS provides a set of APIs, including ObjectScript commands and the Native API, to manage globals.
Tools:
Managing globals in the management portal
Documentation:
Using Multidimensional Storage (Globals)
Using Globals
Articles:
Globals are Magic Swords for managing data
The art of mapping Globals to Classes
Videos:
Globals QuickStart
You can collect 2 points for using Globals via ObjectScript or the Native API in your application.
InterSystems SQL - 2 points
InterSystems IRIS provides SQL access to data via ObjectScript, REST API, JDBC.
Tools:
VSCode SQL Tools
DBeaver
SQL in Management Portal
Other SQL tools
Documentation:
SQL Access
InterSystems SQL Reference
Articles:
Class Queries in ObjectScript
Videos:
SQL Things you should know
Collect 2 bonus points by using InterSystems SQL in your application.
InterSystems Objects - 2 points
InterSystems IRIS provides a way to store and change instances of objects in globals via ObjectScript/REST API, the Native API for Java/.NET/Node.js/Python, and XEP for Java/.NET.
Documentation:
Object Access
Get 2 bonus points for the usage of Object Access in your application.
Your data model - 2 points
InterSystems IRIS can be used as a data platform that exposes your own data model API. You are able to use ObjectScript, REST API or Native API to expose your own API which provides any special data model, like Time-Series, Spatial, Graph, RDF/Triple, Column store, Document store.
Introduce any of the new data-model API and collect 2 bonus points.
ZPM Package deployment - 2 points
You can collect the bonus if you build and publish a ZPM (ObjectScript Package Manager) package for your full-stack application so it can be deployed with:
zpm "install your-multi-model-solution"
on any IRIS instance with the ZPM client installed.
ZPM client. Documentation.
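A minimal module.xml for a ZPM package might look roughly like the sketch below (the module name, version, and resource are placeholders; check the ZPM documentation for the exact format):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Export generator="Cache" version="25">
  <Document name="your-multi-model-solution.ZPM">
    <Module>
      <Name>your-multi-model-solution</Name>
      <Version>1.0.0</Version>
      <Packaging>module</Packaging>
      <SourcesRoot>src</SourcesRoot>
      <Resource Name="dc.Sample.PKG"/>
    </Module>
  </Document>
</Export>
```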
Docker container usage - 2 points
The application gets a 'Docker container' bonus if it uses InterSystems IRIS running in a Docker container.
You can collect the bonus if you use any of the following docker templates:
IRIS Interoperability Template
Feel free to ask any questions about using the listed technologies.
Good luck in the competition!
P.S. The current tech bonus list is subject to change before the contest starts - stay tuned for updates.

Hi participants!
In order to simplify the assessment for bonuses could you please include in the readme the links where your app uses a certain model type?
E.g.:
Object: https://github.com/intersystems-community/iris-multi-model-api-template/blob/8b3247f9dcd1e6e70274fde6bfdbe3f54ca0b95d/src/dc/Sample/MultiModelREST.cls#L85
SQL: https://github.com/intersystems-community/iris-multi-model-api-template/blob/8b3247f9dcd1e6e70274fde6bfdbe3f54ca0b95d/src/dc/Sample/MultiModelREST.cls#L95
Key-value: https://github.com/intersystems-community/iris-multi-model-api-template/blob/8b3247f9dcd1e6e70274fde6bfdbe3f54ca0b95d/src/dc/Sample/MultiModelREST.cls#L104

Hi Manager!
As you requested, I have added the corresponding model links to the readme file! I hope to receive the technical bonus points in the competition.
Application name: HealthInfoQueryLayer
Readme file: https://github.com/ZBT-95/-IRIS-/blob/main/README.md
Thanks, Botai Zhang!
Article
Mihoko Iijima · Mar 5, 2021
**This article is a continuation of this post.**
The previous article explained how the Interoperability menu works for system integration.
In this article, I would like to explain how to develop a system integration using the Interoperability menu.
To begin with, think about what kind of process you want to create, and then build the following content:
* Production
* Message
* Components
* Business Services
* Business Processes
* Business Operations
A Production is a definition used to specify the components required for system integration and to store the component settings; it is configured using the Management Portal (and stored internally as a class definition).
For example, suppose you are creating a business service that processes files placed in a specified directory at regular intervals. In that case, it is necessary to set up exactly which directories to monitor and which files to process. A Production is prepared to store these settings.
The settings depend on the **adapter** used by the component that sends and receives data.
Adapters are classes that simplify connections to external systems. Some are protocol-specific, such as Mail/File/SOAP/FTP/HTTP/SQL/TCP, and some are specific to standards such as HL7.
For more information on adapters, please refer to the documentation (protocol-specific adapters and adapters related to EDI documents).
Because the **Production** defines the necessary components, "Start Production" starts the system integration and "Stop Production" stops it.
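As a hedged sketch of how a component uses an adapter (all class, method, and message names here are illustrative, not the sample's actual code), a business operation built on the HTTP outbound adapter might look like this:

```objectscript
/// Illustrative business operation that calls an external Web API
/// through the HTTP outbound adapter.
Class Demo.WeatherOperation Extends Ens.BusinessOperation
{
Parameter ADAPTER = "EnsLib.HTTP.OutboundAdapter";

Property Adapter As EnsLib.HTTP.OutboundAdapter;

Method GetWeather(pRequest As Ens.StringRequest, Output pResponse As Ens.StringResponse) As %Status
{
    // The adapter uses the [HTTP Server] and [URL] settings stored in the Production
    set status = ..Adapter.Get(.httpResponse, "q", pRequest.StringValue)
    quit:$$$ISERR(status) status
    set pResponse = ##class(Ens.StringResponse).%New()
    set pResponse.StringValue = httpResponse.Data.Read()
    quit $$$OK
}

XData MessageMap
{
<MapItems>
  <MapItem MessageType="Ens.StringRequest">
    <Method>GetWeather</Method>
  </MapItem>
</MapItems>
}
}
```

The Production supplies the connection settings at runtime, so the same operation class can point at different servers in different environments without code changes.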
The development required to complete the Production is the creation of the components necessary for system integration, specifically the following contents:
* Message
* Components (Business Services, Business Processes, Business Operations)
* Data conversion, etc.
The topics above will be explained step by step in the articles that follow.
First of all, let's start the sample **Production**, review the component settings, and process some data to check how messages flow.
Sample can be downloaded from https://github.com/Intersystems-jp/selflearning-interoperability.
To use a container, download the sample code with git clone, navigate to the cloned directory, and run docker-compose up -d. It's that easy!
See here for the procedure (it will take some time to create a container).
If you do not use containers, create a new namespace after downloading the sample, and import all class definition files (extension .cls) under the src folder into the created namespace.
For more information on the process of creating a namespace, please refer to the video after 07:03 of this article.
Please refer to the README for more details on the sample code.
When you are ready, access the management portal (change the web server's port number to match your environment).
Go to the **Management Portal > Interoperability > Configuration > Production**.
If you are using a method other than containers, connect to the namespace where you imported the source code, access [Configuration] > [Production], click the [Open] button, select [Start] > [Production], and then click the [Start] button.
※ If you are using something other than a container, you will need to make some initial settings first. Please complete the setup described below before trying the following steps.

The Production page displays each "Service", "Process", and "Operation" component as **[**● **Component Name]**.
Click on a component name to change the contents of the "Settings" tab on the right side of the screen.
For example, when you click on **Start.GetKionOperation** (single click), the display is as follows.

This component has the [HTTP Server] and [URL] settings for connecting to the Web API.
There is an [appid] field at the bottom of the settings, where you can enter the API key that you obtained.
There is a [lang] field near [appid], set to "ja" ("ja" = Japanese); it sets the language of the response from OpenWeather. For English, set it to "en".
When you have finished configuring these settings, click the "Apply" button.

If you are using a container, the setup is complete. For more information, please click [here](#datasend).
* * *
#### If you are experimenting with something other than containers
Please make the following two settings in advance:
1) Configure the SSL client.
Since the Web API to be connected to will be communicated using HTTPS, configure the SSL client on the IRIS side in advance.
To match the settings of the sample production, we will use the name **[openweather]**. The settings in the Production are as follows:

**Click the Management Portal > [System Administration] > [Security] > [SSL/TLS Configuration] > [Create New Configuration]** button, enter **"openweather"** in the "Configuration Name" field, and then click the "Save" button to finish.

2) Create a base URL for REST
In the sample production, we have configured it so that the information can be entered via REST, and the base URL for REST needs to be configured on the IRIS side.
In the sample, we set /start as the base URL. Since the Start.REST class exists in the namespace where the sample was imported, we will specify this class as the dispatch class and add %All as the application role to omit authentication at the time of access.
**Management Portal > System Administration > Security > Applications > Web Applications > Click the "Create new web application"** button.
In the Name field, specify **/start**; in the Namespace field, specify the namespace from which the sample was imported; in the Dispatch Class field, specify **Start.REST**; in the Allowed Authentication Method field, select **"Unauthenticated"**, and save.
After saving, add the **%All** role to the **application role** on the "Application Roles" tab.
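For reference, a REST dispatch class such as the sample's Start.REST extends %CSP.REST and declares its routes in a UrlMap XData block. Below is a minimal illustrative sketch of that shape (the class name, route, and method are assumptions, not the sample's actual code):

```objectscript
/// Minimal illustrative REST dispatch class; the sample's Start.REST
/// follows this general shape.
Class Demo.REST Extends %CSP.REST
{
XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
  <Route Url="/check/:areaname/:product" Method="GET" Call="Check"/>
</Routes>
}

ClassMethod Check(areaname As %String, product As %String) As %Status
{
    // In the sample, this is where a request message is sent into the Production
    write "{""area"":"""_areaname_""", ""product"":"""_product_"""}"
    quit $$$OK
}
}
```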


* * *
### Try to send data
Once you are all set up, let's use a business service to send information via REST.
The above example is a URL that supposes that someone has purchased "Takoyaki" in Osaka City.
The screen after execution is as follows.

Check the messages that have been sent to the **Production**.
In the **Management Portal > Interoperability > Configuration > Production**, click on the service component.
Select the **"Messages"** tab on the right side of the screen and click any number below the header field column. If you do not see it, reload your browser.

Using the Visual Trace page, you can see the information of **messages** sent and received between components. You can see that the weather information is retrieved from the Web API and sent back in the **light blue frame**.
In this way, you can use tracing to see what data was being sent and received at that time and in what order.
Throughout this article, by referring to the sample code and its settings, we have confirmed that a **Production** defines the components and settings necessary for system integration.
We also confirmed that we could refer to the messages flowing through the **Production** in chronological order by using the Visual Trace page.
In the next articles, we will discuss the concept behind creating the **"message"** shown in this trace and how to actually define it.