Announcement
Anastasia Dyubaylo · Aug 29, 2019
Hi Everyone!
A new video recorded by @Benjamin.DeBoe is already on InterSystems Developers YouTube:
Scaling Fluidly with InterSystems IRIS
In this video, @Benjamin.DeBoe, InterSystems Product Manager, explains how sharding capabilities in InterSystems IRIS provide a more economical approach to fluidly scaling your systems than traditional vertical scaling.
And...
To learn more about InterSystems IRIS and its scalability features, you can browse more content at http://www.learning.intersystems.com.
Enjoy watching the video!
Article
Evgeny Shvarov · Aug 29, 2019
Hi Developers!
Often when we develop a library, tool, or package in InterSystems ObjectScript, the question arises: how do we deploy this package on the target machine?
Also, we often expect that certain other libraries are already installed, so our package depends on them, and often on a particular version.
When you code in JavaScript, Python, etc., the role of package deployment with dependency management is handled by a package manager.
So, I'm pleased to announce that the InterSystems ObjectScript Package Manager is available!
CAUTION!
Official Disclaimer.
The InterSystems ObjectScript Package Manager server hosted at pm.community.intersystems.com and the InterSystems ObjectScript Package Manager client (installable from pm.community.intersystems.com or from GitHub) are not supported by InterSystems Corporation and are presented as-is under the MIT License. Use it, develop it, and contribute to it at your own risk.
How does it work?
InterSystems ObjectScript Package Manager consists of two parts. There is a Package Manager server, which hosts ObjectScript packages and exposes an API for ZPM clients to deploy and list packages. Today we have a Developers Community Package Manager server available at pm.community.intersystems.com.
You can install any package into InterSystems IRIS via the ZPM client, which is installed into the IRIS system first.
How to use InterSystems Package Manager?
1. Check the list of available packages
Open https://pm.community.intersystems.com/packages/-/all to see the list of currently available packages.
[{"name":"analyzethis","versions":["1.1.1"]},{"name":"deepseebuttons","versions":["0.1.7"]},{"name":"dsw","versions":["2.1.35"]},{"name":"holefoods","versions":["0.1.0"]},{"name":"isc-dev","versions":["1.2.0"]},{"name":"mdx2json","versions":["2.2.0"]},{"name":"objectscript","versions":["1.0.0"]},{"name":"pivotsubscriptions","versions":["0.0.3"]},{"name":"restforms","versions":["1.6.1"]},{"name":"thirdpartychartportlets","versions":["0.0.1"]},{"name":"webterminal","versions":["4.8.3"]},{"name":"zpm","versions":["0.0.6"]}]
Every package has a name and a version.
If you want to install one on InterSystems IRIS, you need to have the InterSystems ObjectScript Package Manager client, a.k.a. the ZPM client, installed first.
2. Install Package Manager client
Get the release of the ZPM client from ZPM server: https://pm.community.intersystems.com/packages/zpm/latest/installer
It is an ObjectScript package in XML format, so it can be installed by importing it as classes via the Management Portal, or via the terminal:
USER>Do $System.OBJ.Load("/yourpath/zpm.xml","ck")
Once installed, it can be called from any namespace because it installs itself into %SYS as a Z-package.
3. Working with ZPM client
The ZPM client has a CLI interface. Call zpm in any namespace like:
USER>zpm
zpm: USER>
Call help to see the list of all the available commands.
Check the list of currently available packages on ZPM server (pm.community.intersystems.com):
zpm: USER>repo -list-modules -n registry
deepseebuttons 0.1.7
dsw 2.1.35
holefoods 0.1.0
isc-dev 1.2.0
mdx2json 2.2.0
objectscript 1.0.0
pivotsubscriptions 0.0.3
restforms 1.6.1
thirdpartychartportlets 0.0.1
webterminal 4.8.3
zpm 0.0.6
Installing a Package
To install a package, call:
install package-name version
This will install the package with all its dependencies. You can omit the version to get the latest package. Here is how to install the latest version of Web Terminal:
zpm: USER> install webterminal
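To pin a specific version instead, pass it after the package name, following the install command template above. For example, using the webterminal version from the package list:

zpm: USER> install webterminal 4.8.3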
How to know what is already installed?
Call the list command:
zpm:USER> list
zpm 0.0.6
webterminal 4.8.3
Uninstalling a Package
zpm: USER> uninstall webterminal
Supported InterSystems Data Platforms
Currently, ZPM supports InterSystems IRIS and InterSystems IRIS for Health.
I want my package to be listed on Package Manager
It's possible. The requirements are:
Code should work in InterSystems IRIS
You need to have a module.xml file in the root of your repository.
Module.xml is the file that describes the structure of the package and what needs to be set up during the deployment phase. Examples of module.xml can be very simple, e.g.:
ObjectScript Example
Or relatively simple:
Samples BI (previously known as HoleFoods),
Web Terminal
Module with dependencies:
DeepSee Web expects MDX2JSON to be installed, and this is how it is described in DeepSeeWeb's module.xml.
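For orientation, here is a minimal, hypothetical module.xml sketch. Its overall shape follows the examples linked above, but treat the exact element and attribute names (especially the dependency's VersionString) as assumptions, and check a real package such as Web Terminal for the authoritative structure:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Export generator="Cache" version="25">
  <Document name="my-package.ZPM">
    <Module>
      <Name>my-package</Name>
      <Version>0.1.0</Version>
      <Packaging>module</Packaging>
      <!-- the ObjectScript package shipped by this module -->
      <Resource Name="MyPackage.PKG"/>
      <!-- a dependency, resolved from the registry at install time -->
      <Dependencies>
        <ModuleReference>
          <Name>mdx2json</Name>
          <VersionString>2.2.0</VersionString>
        </ModuleReference>
      </Dependencies>
    </Module>
  </Document>
</Export>
```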
If you want your application to be listed on the Community Package Manager, comment on this post or DM me.
Collaboration and Support
The ZPM server source code is not available at the moment; it will be published soon.
The ZPM client source code is available here. It is currently supported by the InterSystems Developer Community and is not supported by InterSystems Corporation. You are welcome to submit issues and pull requests.
Roadmap
The current roadmap is:
introduce Open Exchange support;
introduce automation for package updating and uploading;
open-source the ZPM server.
Stay tuned and develop your InterSystems ObjectScript packages on InterSystems IRIS!

Interesting... however, I have some questions:
1 - Are there any plans to automate the `module.xml` generation by using something like a wizard?
2 - Are there any plans to support non-specific dependency versions like NPM does?
3 - Is it possible to run pre/post-install scripts as well? Kind of like what installer classes do.
4 - Is it also possible to use the `module.xml` to provide a contextual root? This way it could be used to run unit tests without the need to define (or overwrite) the UnitTestRoot global. I already did it with Port, so it's not hard to implement, as you basically have to overwrite the `Root` method:
```objectscript
Class Port.UnitTest.Manager Extends %UnitTest.Manager
{

ClassMethod Root() As %String
{
    // This provides us the capability to search for tests unrelated to ^UnitTestRoot.
    return ##class(%File).NormalizeFilename(##class(Port.Configuration).GetWorkspace())
}

ClassMethod RunTestsFromWorkspace(projectName As %String, targetClass As %String = "", targetMethod As %String = "", targetSpec As %String = "") As %Status
{
    set recursive = "recursive"
    set activeProject = $get(^||Port.Project)
    set ^||Port.Project = projectName

    if targetClass '= "" set target = ##class(Port.UnitTest.Util).GetClassTestSpec(targetClass)
    else  set target = ##class(Port.Configuration).GetTestPath()

    if targetMethod '= "" {
        set target = target_":"_targetMethod
        set recursive = "norecursive"
    }

    set sc = ..RunTest(target_$case(targetSpec, "": "", : ":"_targetSpec), "/"_recursive_"/run/noload/nodelete")
    set ^||Port.Project = activeProject
    return sc
}

}
```

> 1 - Are there any plans to automate the module.xml generation by using something like a wizard?

Submit an issue? Moreover, craft a module which supports that! And a PR: it's a Community Package Manager.

> 3 - Is it possible to run pre/post-install scripts as well? Kind of like what installer classes do.

I think this is already in place. @Dmitry.Maslennikov, who contributed a lot, will comment.

> 4 - Is it also possible to use the module.xml to provide a contextual root?

We could maybe use the code! Thanks! @Dmitry.Maslennikov?

And it is worth mentioning that credit for the development goes to: @Timur.Safin6844, @Timothy.Leavitt, @Dmitry.Maslennikov. Thank you very much, guys! I hope the list of contributors will be much longer soon!

> 1 - Are there any plans to automate the module.xml generation by using something like a wizard?

Any reasons for it? Are you so lazy that you can't write this simple XML by hand? Just kidding; not yet. I think the best and fastest thing I can do is add IntelliSense for such files in VS Code, to make them easier to write. Any UI, at the moment, is just a waste of time; it is not so important. And anyway, is there any wizard from NPM?

> 2 - Are there any plans to support non-specific dependency versions like NPM does?

It is already there; it should work the same as semver in npm.

> 3 - Is it possible to run pre/post-install scripts as well? Kind of like what installer classes do.

There is already something like this, but I would like to change the way it works.

> 4 - Is it also possible to use the module.xml to provide a contextual root?

Not sure about a contextual root. But if we are talking about unit tests: yes, there are actually many things which should be changed in the original %UnitTest engine. But it already has a way to run tests without caring about the UnitTestRoot global. ZPM itself has its own module.xml; look there and you will find lines about UnitTests. With this definition, you can run these commands, and they will run the tests in different phases:
zpm: USER>packagename test
zpm: USER>packagename verify
> Any reasons for it? Are you so lazy, that you can't write this simple XML by hand? Just kidding, not yet, I think best and fastest what I can do it, add Intellisense in vscode for such files, so you can help to do it easier. Any UI, at the moment, is just a waste of time, it is not so important. And anyway, is there any wizard from NPM?
Haha, I'll overlook that first line. I really meant something like a CLI wizard that asks you for the steps, but maybe this can be something separate. You know Yeoman, don't you?
NPM does have the command `npm init` which asks you the basic information about your package and generates a package.json.
> It is already there, should work the same as semver in npm
Nice! Does it follow the same format as the one from NPM (symbolically speaking, ^ and *)?
> Not sure about contextual root. But if saying about UnitTests, yes actually there are many things which should be changed in the original %UnitTests engine. But in this case, it has some way to run tests without care about UnitTestRoot global. ZPM itself has own module.xml, and look there. You will find lines about UnitTests. with this definition, you run these commands, and it will run tests in different phases.
Yeah, that's exactly what I meant about the contextual root. My wording looked wrong because I intended for this feature to be used outside of unit tests, but now I see that there isn't much use for it outside of unit testing.

> NPM does have the command npm init which asks you the basic information about your package and generates a package.json.

Yes, that kind of init command sounds useful. You know, we have many differences with npm anyway. For instance, zpm works inside the database with nothing on disk, while npm works in the OS close to the source files. But I think we can find a way to achieve the best of both.

> It is already there, should work the same as semver in npm
> Nice! Does it follow the same format as the one from NPM?

Yes, the same way.

This brought me to another question: how does ZPM handle two modules that use the same dependency but require different versions while being used as peer dependencies?
Example:
Module A uses dependency C and requires major version 1.
Module B also uses dependency C but requires the newer major version 2.
The requirement in this case is that you must have both, because you're making a module D that uses both modules A and B.

In our case, it should actually fail the installation, as we don't have any way to install two versions at the same time. But I'm not sure we already have such a check.

Hi @Evgeny.Shvarov,
In Step "2. Install Package Manager client", it says to run the following code:
Do $System.OBJ.Load("/yourpath/zpm.xml")
It also needs to be compiled and should probably be:
Do $System.OBJ.Load("/yourpath/zpm.xml","ck")

Can ZPM run in Caché?
Tried to install ZPM on Caché but failed:
%SYS>D $system.OBJ.Load("/tmp/deps/zpm.xml","ck")

Load started on 04/21/2021 11:30:27
Loading file /tmp/deps/zpm.xml as xml
Imported class: %ZPM.Installer
Compiling class %ZPM.Installer
Compiling routine : %ZPM.Installer.G1.MAC
ERROR #6353: Unexpected attributes for element Import (ending at line 9 character 45): Recurse
  > ERROR #5490: Error reported while running generator for method '%ZPM.Installer:setup'
  > ERROR #5030: An error occurred while compiling class %ZPM.Installer
Compiling routine %ZPM.Installer.1
ERROR: Compiling method/s: ExtractPackage
ERROR: %ZPM.Installer.1(3) : MPP5610 : Referenced macro not defined: 'FileTempDir'
  TEXT: Set pFolder = ##class(%File).NormalizeDirectory($$$FileTempDir)
ERROR: Compiling method/s: Make
ERROR: %ZPM.Installer.1(7) : MPP5610 : Referenced macro not defined: 'FileTempDir'
  TEXT: Set tmpDir = ##class(%File).NormalizeDirectory($$$FileTempDir)
%ZPM.Installer.1.INT(79) ERROR #1002: Invalid character in tag : '##class(%Library.File).NormalizeDirectory($$$FileTempDir)' : Offset:61 [zExtractPackage+1^%ZPM.Installer.1]
  TEXT: Set pFolder = ##class(%Library.File).NormalizeDirectory($$$FileTempDir)
%ZPM.Installer.1.INT(117) ERROR #1002: Invalid character in tag : '##class(%Library.File).NormalizeDirectory($$$FileTempDir)' : Offset:62 [zMake+4^%ZPM.Installer.1]
  TEXT: Set tmpDir = ##class(%Library.File).NormalizeDirectory($$$FileTempDir)
Detected 5 errors during load.
ZPM works in IRIS only, but you can search around on DC and OEX; there were attempts to provide alternative support for ZPM in Caché. I hope this ZPM will be yet another reason to upgrade to IRIS ;)

When I try to install webterminal, I am getting the following...
IRIS for Windows (x86-64) 2022.1 (Build 209U) Tue May 31 2022 12:16:40 EDT
zpm:USER>install webterminal

[USER|webterminal] Reload START (C:\InterSystems\HealthConnect_2022_1\mgr\.modules\USER\webterminal\4.9.2\)
[USER|webterminal] Reload SUCCESS
[webterminal] Module object refreshed.
[USER|webterminal] Validate START
[USER|webterminal] Validate SUCCESS
[USER|webterminal] Compile START
Installing WebTerminal application to USER
Creating WEB application "/terminal"...
WEB application "/terminal" is created.
Assigning role %DB_CACHESYS to a web application; resulting roles: :%DB_CACHESYS:%DB_USER
Creating WEB application "/terminalsocket"...
WEB application "/terminalsocket" is created.
ERROR #879: Target role %DB_CACHESYS does not exist.
[webterminal] Compile FAILURE
ERROR! Target role %DB_CACHESYS does not exist.
  > ERROR #5090: An error has occurred while creating projection WebTerminal.Installer:Reference.

Hi Scott! Which version of IRIS is it? What is the version of the ZPM client?

I have seen this on both... HealthShare Health Connect
IRIS for Windows (x86-64) 2022.1 (Build 209U) Tue May 31 2022 12:16:40 EDT
IRIS for UNIX (Red Hat Enterprise Linux 8 for x86-64) 2022.1 (Build 209U) Tue May 31 2022 12:13:24 EDT
using the link from https://pm.community.intersystems.com/packages/zpm/latest/installer
zpm:USER>version
%SYS> zpm 0.5.0
https://pm.community.intersystems.com - 1.0.6

This was fixed by me on https://github.com/intersystems-community/webterminal/pull/149 in July 2022.
The ZPM package is outdated.
I recommend you use the instructions at https://intersystems-community.github.io/webterminal/#docs to install the latest version.

@Nikita.Savchenko7047 made a ZPM release; it can now be installed via ZPM too.
Thanks @John.Murray! The ZPM package is updated.

When a config item, for instance a CSPApplication, already exists on the server and the module.xml has a CSPApplication section, can you opt not to override the server config? For instance, if the client made changes to the application's roles, those changes are missing after an upgrade.
Announcement
Evgeny Shvarov · Sep 4, 2019
Hi InterSystems Developers!

Here are the release notes for the new Developer Community features added in August 2019. What's new?

Site performance fix;
new features for job opportunity announcements;
a feature to embed DC posts into other sites;
minor enhancements.

See the details below.

DC Performance Fix

In August we fixed a major performance problem and think that no more tweaks are needed. If you feel that some DC functionality needs performance fixes, please submit an issue.

New Features for Job Opportunities on the InterSystems Developers Community

Developers! We want to let you know about all the job opportunities for InterSystems developers, so with this release we've introduced a few new features to make job opportunity announcements on DC more visible and handier to use.

Every post with a job opportunity now has an "I'm interested" button which, when you click it, sends a Direct Message to the member who submitted the vacancy with the following text: "Hi! I'm interested in your job opportunity "name of job opportunity"(link). Send me more information please".

Job opportunities now have a new icon: a suitcase (for your laptop?).

Job posters! If a job opportunity is closed, you can now click a Close button, which will hide it and let developers know that this opportunity is no longer available. This action moves the opportunity into drafts.

DC Articles Embedded into Other Sites

With this release, we introduced a feature to embed a DC post (an article, event, or question) into another site. Just add the
"?iframe"
parameter to an article URL, and this gives you a link ready to be embedded into another site as an iframe.
Example:
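For instance, a site could embed a post with an iframe like this (the post URL is illustrative, not a real article):

```html
<iframe src="https://community.intersystems.com/post/your-article?iframe" width="100%" height="600"></iframe>
```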
Minor changes
Again, we've fixed a lot of bugs and introduced some nice features, e.g. better support for tables in markdown and new options for sorting members: by posts rating, comments rating, or answers rating.
Also, we now add an extra link to the app at the bottom of an article if a link to an Open Exchange app is provided.
We also added a few enhancements to translated article management; see the new articles in the Spanish Community.
See the full list of changes in this release and submit your issues and feature requests in the new kanban.
Stay tuned!
Article
Sean Connelly · Sep 10, 2019
In this article, we will explore the development of an IRIS client for consuming RESTful API services that have been developed to the OData API standard.

We will be exploring a number of built-in IRIS libraries for making HTTP requests, reading and writing to JSON payloads, and seeing how we can use them in combination to build a generic client adaptor for OData. We will also explore the new JSON adapter for deserializing JSON into persistent objects.

Working with RESTful APIs

REST is a set of engineering principles that were forged from the work on standardizing the world wide web. These principles can be applied to any client-server communication and are often used to describe an HTTP API as being RESTful.

REST covers a number of broad principles that include stateless requests, caching, and uniform API design. It does not cover implementation details, and there are no general API specifications to fill in these gaps. The side effect of this ambiguity is that RESTful APIs can lack some of the understanding, tools, and libraries that often build up around stricter ecosystems. In particular, developers must construct their own solutions for the discovery and documentation of RESTful APIs.

OData

OData is an OASIS specification for building consistent RESTful APIs. The OASIS community is formed from a range of well-known software companies that include Microsoft, Citrix, IBM, Red Hat, and SAP. OData 1.0 was first introduced back in 2007, and the most recent version, 4.1, was released this year.

The OData specification covers things like metadata, consistent implementations of operations, queries, and exception handling. It also includes additional features such as actions and functions.

Exploring the TripPinWS OData API

For this article we'll be using the TripPinWS API, which is provided as an example by OData.org. As with any RESTful API, we would typically expect a base URL for the service. Visiting this base URL in OData will also return a list of API entities.

https://services.odata.org:443/V4/(S(jndgbgy2tbu1vjtzyoei2w3e))/TripPinServiceRW

We can see that the API includes entities for Photos, People, Airlines, Airports, Me, and a function called GetNearestAirport. The response also includes a link to the TripPinWS metadata document.

https://services.odata.org/V4/(S(djd3m5kuh00oyluof2chahw0))/TripPinServiceRW/$metadata

The metadata is implemented as an XML document and includes its own XSD document. This opens up the possibility of consuming metadata documents using code generated from the IRIS XML schema wizard. The metadata document might look fairly involved at first glance, but it's just describing the properties of types that are used to construct entity schema definitions.

We can get back a list of People from the API by using the following URL.

https://services.odata.org/V4/(S(4hkhufsw5kohujphemn45ahu))/TripPinServiceRW/People

This returns a list of 8 people, 8 being a hard limit for the number of entities per result. In the real world, we would probably use a much larger limit.
It does, however, provide an example of how OData includes additional hypertext links, such as @odata.nextLink, which we can use to fetch the next page of People in the search results.

We can also use query string values to narrow down the results list, such as selecting only the top 1 result.

https://services.odata.org/V4/(S(4hkhufsw5kohujphemn45ahu))/TripPinServiceRW/People?$top=1

We can also try filtering requests by FirstName.

https://services.odata.org/V4/(S(4hkhufsw5kohujphemn45ahu))/TripPinServiceRW/People?$filter=FirstName eq 'Russell'

In this instance, we used the eq operator to filter on all FirstNames that equal 'Russell'. Note the importance of wrapping strings in single quotes. OData provides a variety of different operators that can be used in combination to build up highly expressive search queries.

IRIS %Net Package

IRIS includes a comprehensive standard library. We'll be using the %Net package, which includes support for protocols such as FTP, Email, LDAP, and HTTP.

To use the TripPinWS service we will need to use HTTPS, which requires us to register an HTTPS configuration in the IRIS management portal. There are no complicated certificates to install, so it's just a few steps:

1. Open the IRIS management portal.
2. Click on System Administration > Security > SSL/TLS Configurations.
3. Click the "Create New Configuration" button.
4. Enter the name "odata_org" and hit save.

You can choose any name you'd like, but we'll be using odata_org for the rest of the article.

We can now use the HttpRequest class to get a list of all people. If the Get() worked, then it will return 1 for OK. We can then access the response object and output the result to the terminal:

DC>set req=##class(%Net.HttpRequest).%New()
DC>set req.SSLConfiguration="odata_org"
DC>set sc=req.Get("https://services.odata.org:443/V4/(S(jndgbgy2tbu1vjtzyoei2w3e))/TripPinServiceRW/People")
DC>w sc
1
DC>do req.HttpResponse.OutputToDevice()

Feel free to experiment with the base HttpRequest before moving on. You could try fetching Airlines and Airports, or investigate how errors are reported if you enter an incorrect URL.

Developing a generic OData Client

Let's create a generic OData client that will abstract the HttpRequest class and make it easier to implement various OData query options. We'll call it DcLib.OData.Client and it will extend %RegisteredObject. We'll define several subclasses that we can use to define the names of a specific OData service, as well as several properties that encapsulate runtime objects and values such as the HttpRequest object.
To make it easy to instantiate an OData client, we will also override the %OnNew() method (the class's constructor) and use it to set up the runtime properties.

```objectscript
Class DcLib.OData.Client Extends %RegisteredObject
{

Parameter BaseURL;

Parameter SSLConfiguration;

Parameter EntityName;

Property HttpRequest As %Net.HttpRequest;

Property BaseURL As %String;

Property EntityName As %String;

Property Debug As %Boolean [ InitialExpression = 0 ];

Method %OnNew(pBaseURL As %String = "", pSSLConfiguration As %String = "") As %Status [ Private, ServerOnly = 1 ]
{
    set ..HttpRequest=##class(%Net.HttpRequest).%New()
    set ..BaseURL=$select(pBaseURL'="":pBaseURL,1:..#BaseURL)
    set ..EntityName=..#EntityName
    set sslConfiguration=$select(pSSLConfiguration'="":pSSLConfiguration,1:..#SSLConfiguration)
    if sslConfiguration'="" set ..HttpRequest.SSLConfiguration=sslConfiguration
    quit $$$OK
}

}
```

We can now define a client class that is specific to the TripPinWS service by extending DcLib.OData.Client and setting the BaseURL and SSLConfiguration parameters in one single place.

```objectscript
Class TripPinWS.Client Extends DcLib.OData.Client
{

Parameter BaseURL = "https://services.odata.org:443/V4/(S(jndgbgy2tbu1vjtzyoei2w3e))/TripPinServiceRW";

Parameter SSLConfiguration = "odata_org";

}
```

With this base client in place, we can now create a class for each entity type that we want to use in the service. By extending the new client class, all we need to do is define the entity name in the EntityName parameter.

```objectscript
Class TripPinWS.People Extends TripPinWS.Client
{

Parameter EntityName = "People";

}
```

Next, we need to provide some more methods on the base DcLib.OData.Client class that will make it easy to query the entities.

```objectscript
Method Select(pSelect As %String) As DcLib.OData.Client
{
    do ..HttpRequest.SetParam("$select",pSelect)
    return $this
}

Method Filter(pFilter As %String) As DcLib.OData.Client
{
    do ..HttpRequest.SetParam("$filter",pFilter)
    return $this
}

Method Search(pSearch As %String) As DcLib.OData.Client
{
    do ..HttpRequest.SetParam("$search",pSearch)
    return $this
}

Method OrderBy(pOrderBy As %String) As DcLib.OData.Client
{
    do ..HttpRequest.SetParam("$orderby",pOrderBy)
    return $this
}

Method Top(pTop As %String) As DcLib.OData.Client
{
    do ..HttpRequest.SetParam("$top",pTop)
    return $this
}

Method Skip(pSkip As %String) As DcLib.OData.Client
{
    do ..HttpRequest.SetParam("$skip",pSkip)
    return $this
}

Method Fetch(pEntityId As %String = "") As DcLib.OData.ClientResponse
{
    if pEntityId="" return ##class(DcLib.OData.ClientResponse).%New($$$ERROR($$$GeneralError,"Entity ID must be provided"),"")
    set pEntityId="('"_pEntityId_"')"
    if $extract(..BaseURL,*)'="/" set ..BaseURL=..BaseURL_"/"
    set sc=..HttpRequest.Get(..BaseURL_..EntityName_pEntityId,..Debug)
    set response=##class(DcLib.OData.ClientResponse).%New(sc,..HttpRequest.HttpResponse,"one")
    quit response
}

Method FetchCount() As DcLib.OData.ClientResponse
{
    if $extract(..BaseURL,*)'="/" set ..BaseURL=..BaseURL_"/"
    set sc=..HttpRequest.Get(..BaseURL_..EntityName_"/$count")
    set response=##class(DcLib.OData.ClientResponse).%New(sc,..HttpRequest.HttpResponse,"count")
    quit response
}

Method FetchAll() As DcLib.OData.ClientResponse
{
    #dim response As DcLib.OData.ClientResponse
    if $extract(..BaseURL,*)'="/" set ..BaseURL=..BaseURL_"/"
    set sc=..HttpRequest.Get(..BaseURL_..EntityName,..Debug)
    set response=##class(DcLib.OData.ClientResponse).%New(sc,..HttpRequest.HttpResponse,"many")
    if response.IsError() return response
    //if the response has a nextLink then we need to keep going back to fetch more data
    while response.Payload.%IsDefined("@odata.nextLink") {
        //stash the previous value array, push the new values on to it and then
        //set it back to the new response and create a new value iterator
        set previousValueArray=response.Payload.value
        set sc=..HttpRequest.Get(response.Payload."@odata.nextLink",..Debug)
        set response=##class(DcLib.OData.ClientResponse).%New(sc,..HttpRequest.HttpResponse)
        if response.IsError() return response
        while response.Value.%GetNext(.key,.value) {
            do previousValueArray.%Push(value)
        }
        set response.Payload.value=previousValueArray
        set response.Value=response.Payload.value.%GetIterator()
    }
    return response
}
```

We've added nine new methods. The first six are instance methods for defining query options, and the last three are methods for fetching one, all, or a count of all entities. Notice that the first six methods are essentially wrappers for setting parameters on the HTTP request object. To make implementation coding easier, each of these methods returns an instance of this object so that we can chain the methods together.

Before we explain the main Fetch() method, let's see the Filter() method in action.

```objectscript
set people=##class(TripPinWS.People).%New().Filter("UserName eq 'ronaldmundy'").FetchAll()
while people.Value.%GetNext(.key,.person) { write !,person.FirstName," ",person.LastName }
```

If we use this method, it returns:

Ronald Mundy

The example code creates an instance of the TripPinWS People object. This sets the base URL and certificate configuration in its base class. We can then call its Filter() method to define a filter query and then FetchAll() to trigger the HTTP request.

Note that we can directly access the people results as a dynamic object, not as raw JSON data. This is because we are also going to implement a ClientResponse object that makes exception handling simpler. We also generate dynamic objects depending on the type of result that we get back.

First, let's discuss the FetchAll() method. At this stage, our implementation classes have defined the OData URL in their base class configuration, the helper methods are setting additional parameters, and the FetchAll() method needs to build the URL and make a GET request. Just as in our original command-line example, we call the Get() method on the HttpRequest class and create a ClientResponse from its results.

The method is complicated because the API only returns eight results at a time. We must handle this in our code and use the previous result's nextLink value to keep fetching the next page of results until there are no more pages. As we fetch each additional page, we store the previous results array and then push each new result onto it.

The Fetch(), FetchAll() and FetchCount() methods return an instance of a class called DcLib.OData.ClientResponse.
Let's create that now to handle both exceptions and auto-deserialize valid JSON responses.

```objectscript
Class DcLib.OData.ClientResponse Extends %RegisteredObject
{

Property InternalStatus As %Status [ Private ];

Property HttpResponse As %Net.HttpResponse;

Property Payload As %Library.DynamicObject;

Property Value;

Method %OnNew(pRequestStatus As %Status, pHttpResponse As %Net.HttpResponse, pValueMode As %String = "") As %Status [ Private, ServerOnly = 1 ]
{
    //check for immediate HTTP error
    set ..InternalStatus = pRequestStatus
    set ..HttpResponse = pHttpResponse
    if $$$ISERR(pRequestStatus) {
        if $SYSTEM.Status.GetOneErrorText(pRequestStatus)["<READ>" set ..InternalStatus=$$$ERROR($$$GeneralError,"Could not get a response from HTTP server, server could be uncontactable or server details are incorrect")
        return $$$OK
    }
    //if mode is count, then the response is not JSON, its just a numeric value
    //validate that it is a number and return all ok if true, else let it fall through
    //to pick up any errors that are presented as JSON
    if pValueMode="count" {
        set value=pHttpResponse.Data.Read(32000)
        if value?1.N {
            set ..Value=value
            return $$$OK
        }
    }
    //serialise JSON payload, catch any serialisation errors
    try {
        set ..Payload={}.%FromJSON(pHttpResponse.Data)
    } catch err {
        //check for HTTP status code error first
        if $e(pHttpResponse.StatusCode,1)'="2" {
            set ..InternalStatus = $$$ERROR($$$GeneralError,"Unexpected HTTP Status Code "_pHttpResponse.StatusCode)
            if pHttpResponse.Data.Size>0 return $$$OK
        }
        set ..InternalStatus=err.AsStatus()
        return $$$OK
    }
    //check payload for an OData error
    if ..Payload.%IsDefined("error") {
        do ..HttpResponse.Data.Rewind()
        set error=..HttpResponse.Data.Read(32000)
        set ..InternalStatus=$$$ERROR($$$GeneralError,..Payload.error.message)
        return $$$OK
    }
    //all ok, set the response value to match the required modes (many, one, count)
    if pValueMode="one" {
        set ..Value=..Payload
    } else {
        set iterator=..Payload.value.%GetIterator()
        set ..Value=iterator
    }
    return $$$OK
}

Method IsOK()
{
    return $$$ISOK(..InternalStatus)
}

Method IsError()
{
    return $$$ISERR(..InternalStatus)
}

Method GetStatus()
{
    return ..InternalStatus
}

Method GetStatusText()
{
    return $SYSTEM.Status.GetOneStatusText(..InternalStatus)
}

Method ThrowException()
{
    Throw ##class(%Exception.General).%New("OData Fetch Exception","999",,$SYSTEM.Status.GetOneStatusText(..InternalStatus))
}

Method OutputToDevice()
{
    do ..HttpResponse.OutputToDevice()
}

}
```

Given an instance of the ClientResponse object, we can first test to see if there was an error. Errors can happen on several levels, so we want to return them in a single, easy-to-use solution.

```objectscript
set response=##class(TripPinWS.People).%New().Filter("UserName eq 'ronaldmundy'").FetchAll()
if response.IsError() write !,response.GetStatusText() quit
```

The IsOK() and IsError() methods check the object for errors. If an error occurred, we can call GetStatus() or GetStatusText() to access the error, or use ThrowException() to pass the error to an exception handler.

If there is no error, then the ClientResponse will assign the raw payload object to the response Payload property:

```objectscript
set ..Payload={}.%FromJSON(pHttpResponse.Data)
```

It will then set the response Value property to the main data array within the payload, either as a single instance or as an array iterator to traverse many results.

I've put all of this code together in a single project on GitHub https://github.com/SeanConnelly/IrisOData/blob/master/README.md which will make more sense when reviewed as a whole.
All of the following examples are included in the source GitHub project.

Using the OData Client

There is just one more method we should understand on the base Client class: the With() method. If you don't want to create an instance of every entity, you can instead use the With() method with just one single client class. The With() method will establish a new client with the provided entity name:

```objectscript
ClassMethod With(pEntityName As %String) As DcLib.OData.Client
{
    set client=..%New()
    set client.EntityName=pEntityName
    return client
}
```

We can now use it to fetch all people using the base Client class:

```objectscript
/// Fetch all "People" using the base client class and .With("People")
ClassMethod TestGenericFetchAllUsingWithPeople()
{
    #dim response As DcLib.OData.ClientResponse
    set response=##class(TripPinWS.Client).With("People").FetchAll()
    if response.IsError() write !,response.GetStatusText() quit

    while response.Value.%GetNext(.key,.person) {
        write !,person.FirstName," ",person.LastName
    }
}
```

Or, using an entity-per-class approach:

```objectscript
/// Fetch all "People" using the People class
ClassMethod TestFetchAllPeople()
{
    #dim people As DcLib.OData.ClientResponse
    set people=##class(TripPinWS.People).%New().FetchAll()
    if people.IsError() write !,people.GetStatusText() quit

    while people.Value.%GetNext(.key,.person) {
        write !,person.FirstName," ",person.LastName
    }
}
```

As you can see, they're very similar. The correct choice depends on how important autocomplete is to you with concrete entities, and whether you want a concrete entity class to add more entity-specific methods.

DC>do ##class(TripPinWS.Tests).TestFetchAllPeople()
Russell Whyte
Scott Ketchum
Ronald Mundy
… more people

Next, let's implement the same for Airlines:

```objectscript
/// Fetch all "Airlines"
ClassMethod TestFetchAllAirlines()
{
    #dim airlines As DcLib.OData.ClientResponse
    set airlines=##class(TripPinWS.Airlines).%New().FetchAll()
    if airlines.IsError() write !,airlines.GetStatusText() quit

    while airlines.Value.%GetNext(.key,.airline) {
        write !,airline.AirlineCode," ",airline.Name
    }
}
```

And from the command line...

DC>do ##class(TripPinWS.Tests).TestFetchAllAirlines()
AA American Airlines
FM Shanghai Airline
… more airlines

And now airports:

```objectscript
/// Fetch all "Airports"
ClassMethod TestFetchAllAirports()
{
    #dim airports As DcLib.OData.ClientResponse
    set airports=##class(TripPinWS.Airports).%New().FetchAll()
    if airports.IsError() write !,airports.GetStatusText() quit

    while airports.Value.%GetNext(.key,.airport) {
        write !,airport.IataCode," ",airport.Name
    }
}
```

And from the command line...

DC>do ##class(TripPinWS.Tests).TestFetchAllAirports()
SFO San Francisco International Airport
LAX Los Angeles International Airport
SHA Shanghai Hongqiao International Airport
… more airports

So far we've been using the FetchAll() method.
We can also use the Fetch() method to fetch a single entity using the entity's primary key:

```objectscript
/// Fetch a single "People" entity using the person's ID
ClassMethod TestFetchPersonWithID()
{
    #dim response As DcLib.OData.ClientResponse
    set response=##class(TripPinWS.People).%New().Fetch("russellwhyte")
    if response.IsError() write !,response.GetStatusText() quit

    //lets use the new formatter to pretty print to the output (latest version of IRIS only)
    set jsonFormatter = ##class(%JSON.Formatter).%New()
    do jsonFormatter.Format(response.Value)
}
```

In this instance, we are using the new JSON formatter class, which can take a dynamic array or object and output it to formatted JSON.

DC>do ##class(TripPinWS.Tests).TestFetchPersonWithID()

```json
{
  "@odata.context":"http://services.odata.org/V4/(S(jndgbgy2tbu1vjtzyoei2w3e))/TripPinServiceRW/$metadata#People/$entity",
  "@odata.id":"http://services.odata.org/V4/(S(jndgbgy2tbu1vjtzyoei2w3e))/TripPinServiceRW/People('russellwhyte')",
  "@odata.etag":"W/\"08D720E1BB3333CF\"",
  "@odata.editLink":"http://services.odata.org/V4/(S(jndgbgy2tbu1vjtzyoei2w3e))/TripPinServiceRW/People('russellwhyte')",
  "UserName":"russellwhyte",
  "FirstName":"Russell",
  "LastName":"Whyte",
  "Emails":[ "Russell@example.com", "Russell@contoso.com" ],
  "AddressInfo":[
    {
      "Address":"187 Suffolk Ln.",
      "City":{ "CountryRegion":"United States", "Name":"Boise", "Region":"ID" }
    }
  ],
  "Gender":"Male",
  "Concurrency":637014026176639951
}
```

Persisting OData

In the final few examples, we will demonstrate how the OData JSON can be deserialized into persistent objects using the new JSON adapter class. We will create three classes, Person, Address, and City, which will reflect the Person data structure in the OData metadata. We will set %JSONIGNOREINVALIDFIELD to 1 so that additional OData properties such as @odata.context do not throw a deserialization error.

```objectscript
Class TripPinWS.Model.Person Extends (%Persistent, %JSON.Adaptor)
{

Parameter %JSONIGNOREINVALIDFIELD = 1;

Property UserName As %String;

Property FirstName As %String;

Property LastName As %String;

Property Emails As list Of %String;

Property Gender As %String;

Property Concurrency As %Integer;

Relationship AddressInfo As Address [ Cardinality = many, Inverse = Person ];

Index UserNameIndex On UserName [ IdKey, PrimaryKey, Unique ];

}

Class TripPinWS.Model.Address Extends (%Persistent, %JSON.Adaptor)
{

Property Address As %String;

Property City As TripPinWS.Model.City;

Relationship Person As Person [ Cardinality = one, Inverse = AddressInfo ];

}

Class TripPinWS.Model.City Extends (%Persistent, %JSON.Adaptor)
{

Property CountryRegion As %String;

Property Name As %String;

Property Region As %String;

}
```

Next, we will fetch Russell Whyte from the OData service, create a new instance of the Person model, then call the %JSONImport() method using the response value.
This will populate the Person object, along with the Address and City details.

```objectscript
ClassMethod TestPersonModel()
{
    #dim response As DcLib.OData.ClientResponse
    set response=##class(TripPinWS.People).%New().Fetch("russellwhyte")
    if response.IsError() write !,response.GetStatusText() quit

    set person=##class(TripPinWS.Model.Person).%New()
    set sc=person.%JSONImport(response.Value)
    if $$$ISERR(sc) write !!,$SYSTEM.Status.GetOneErrorText(sc) return

    set sc=person.%Save()
    if $$$ISERR(sc) write !!,$SYSTEM.Status.GetOneErrorText(sc) return
}
```

We can then run a SQL command to see that the data is persisted.

SELECT ID, Concurrency, Emails, FirstName, Gender, LastName, UserName
FROM TripPinWS_Model.Person

ID | Concurrency | Emails | FirstName | Gender | LastName | UserName
russellwhyte | 637012191599722031 | Russell@example.com Russell@contoso.com | Russell | Male | Whyte | russellwhyte

Final Thoughts

As we've seen, it's easy to consume RESTful OData services using the built-in %Net classes. With a small amount of additional helper code, we can simplify the construction of OData queries, unify error reporting, and automatically deserialize JSON into dynamic objects.

We can then create a new OData client just by providing its base URL and, if required, an HTTPS configuration. We then have the option to use this one class and the .With('entity') method to consume any entity on the service, or create named subclasses for the entities that we are interested in.

We have also demonstrated that it's possible to deserialize JSON responses directly into persistent classes using the new JSON adaptor. In the real world, we might consider denormalizing this data first and ensuring that the JSON adapter class works with custom mappings.

Finally, working with OData has been a real breeze. The consistency of the service implementation required much less code than I often experience with bespoke implementations. Whilst I enjoy the freedom of RESTful design, I would certainly consider implementing a standard in my next server-side solution.

Awesome! Thanks Sean,
This helped me understand the OData specification in a quick way.
Paul

Excellent post. My app allows you to expose IRIS as an OData server, see: https://openexchange.intersystems.com/package/OData-Server-for-IRIS
Announcement
Olga Zavrazhnova · Sep 22, 2022
Hi Community! InterSystems will be a technical sponsor at the CalHacks hackathon by UC Berkeley, October 14-16, 2022. We can't reveal our challenge at the moment, but as a little spoiler - it is related to healthcare ;)

The team from our side: @Evgeny.Shvarov, @Dean.Andrews2971, @Regilo.Souza, @Akshat.Vora, and more!

Join InterSystems in this fun and inspirational event - apply to be a mentor, volunteer, or judge here.
Announcement
Anastasia Dyubaylo · Feb 14, 2023
Hi Developers,
Enjoy watching the new video on InterSystems Developers YouTube:
⏯ InterSystems Startup Accelerator Pitching Sessions @ Global Summit 2022
Check out the innovative solutions introduced by cutting-edge healthcare startups. Members of InterSystems 2022 FHIR Startup Accelerator - Caelestinus - will present their solutions, all built on InterSystems FHIR Cloud and Health Connect services.
Presenters:
🗣 @Evgeny.Shvarov, Startups and Community Manager, InterSystems
🗣 @Dean.Andrews2971, Head of Developer Relations, InterSystems
🗣 Martin Zubek, Business Development Manager
🗣 Tomas Studenik, Caelestinus Incubator cofounder
Enjoy watching and stay tuned! 👍
Article
Chad Severtson · Apr 12, 2023
Spoilers: Daily Integrity Checks are not only a best practice, but they also provide a snapshot of global sizes and density.
Tracking the size of the data is one of the most important activities for understanding system health, tracking user activity, and for capacity planning ahead of your procurement process. InterSystems products store data in a tree-structure called globals. This article discusses how to determine global sizes – and therefore the size of your data. The focus is on balancing impact versus precision.
SQL tables are simply a projection of underlying globals, so determining a table's size currently means looking at the corresponding global sizes. A more efficient sampling-based mechanism is currently being developed. Understanding the relationship between tables and globals may require some additional steps, discussed below.
Data
The specific data that needs to be collected varies depending on the specific question you're trying to answer. There is a fundamental difference between the space "allocated" for a global and the space "used" by a global which is worth considering. In general, the allocated space is usually sufficient as it corresponds to the space used on disk. However, there are situations where the used space and packing data are essential -- e.g. when determining if a global is being stored efficiently following a large purge of data.
Allocated Space - These are units of 8KB blocks. Generally, only one global can use a given block, so even the smallest global occupies at least 8KB. This is also, functionally, the global's size on disk. Determining allocated space only requires examining bottom-pointer blocks (and data blocks that contain big strings). Except in rare or contrived scenarios, there are typically multiple orders of magnitude fewer pointer blocks than data blocks. This metric is usually sufficient to understand growth trends if collected on a regular basis.
Used Space – “Used” is the sum of the data stored within the global and the necessary overhead. Globals often allocate more space on disk than is actually “used” as a function of usage patterns and our block structure.
Packing: Calculating the actual space used will also provide information about the global "packing" – how densely the data is stored. It can sometimes be necessary or desirable to store globals more efficiently, especially if they are not frequently updated. For systems with random updates, inserts, or deletes, a packing of 70% is generally considered optimal for performance. This value fluctuates based on activity. Sparseness most often correlates with deletions.
IO Cost: Unfortunately, with great precision comes great IO requirements. Iterating 8KB block by 8KB block through a large database will not only take a long time, but it may also negatively impact performance on systems that are already close to their provisioned limits. This is much more expensive than determining whether a block is allocated. The operation will take on the order of (database size – free space) × (read latency) / (# of parallel processes) to return an answer.
InterSystems provides several tools for determining the size of globals within a particular database. Generally, both the global name and the full path of the underlying database directory need to be known in order to determine the size. For more complex deployments, math is required to determine the total size of a global spread across multiple databases via subscript level mapping.
Determining Global Names:
Use the Extent Manager to list the globals associated with a table:
SQL: Call %ExtentMgr.GlobalsUsed('Package.Class.cls')
Review the storage definition within the Management Portal, within VS Code (or Studio), or by querying %Dictionary.StorageDefinition.
SQL: SELECT DataLocation FROM %Dictionary.StorageDefinition WHERE parent = 'Package.ClassName'
ObjectScript: write ##class(%Dictionary.ClassDefinition).%OpenId("Package.ClassName").Storages.GetAt(1).DataLocation
Hashed global names are common when tables are defined using DDL, i.e. CREATE TABLE. This behavior can be modified by specifying USEEXTENTSET and DEFAULTGLOBAL. Using hashed global names and storing only one index per global has shown performance benefits. I use the following query to list the non-obvious globals in a namespace:
SQL for All Classes:
SELECT Parent, DataLocation, IndexLocation, StreamLocation
FROM %Dictionary.StorageDefinition
WHERE Parent->System = 0 AND DataLocation IS NOT NULL
SQL for Specific Classes:
CALL %ExtentMgr.GlobalsUsed('Package.Class.cls')
Determining Database Path:
For the simplest deployments, where a namespace does not have additional global mappings for application data, it is often possible to substitute "." for the directory. That syntactic sugar tells the API to look at the current namespace's default directory.
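For example, using the GetGlobalSize API described further below, a quick check might look like this (a sketch; the global name MyAppData is illustrative):

```objectscript
// "." resolves to the default globals database of the current namespace;
// with the fast flag set, only the allocated size is computed
set sc = ##class(%Library.GlobalEdit).GetGlobalSize(".", "MyAppData", .allocated, .used, 1)
if $system.Status.IsOK(sc) write "Allocated: ", allocated
```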
For SQL oriented deployments, CREATE DATABASE follows our best practices and creates TWO databases -- one for code and one for data. It’s best to verify the default globals database for the given Namespace in the Management Portal or in the CPF.
It is possible to programmatically determine the destination directory for a particular global (or subscript) in the current namespace:
ObjectScript:
set global = "globalNameHere"
set directory = $E(##class(%SYS.Namespace).GetGlobalDest($NAMESPACE, global),2,*)
For more complex deployments with many mappings, it may be necessary to iterate through Config.MapGlobals in the %SYS Namespace and sum the global sizes:
SQL: SELECT Namespace, Database, Name FROM Config.MapGlobals
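A rough ObjectScript sketch of that summation, run from %SYS (the namespace and global names are illustrative, and mapping names that include subscript ranges would need extra parsing; Config.Databases resolves each database name to its directory):

```objectscript
// Sum the allocated size of one global across every database it is
// mapped to in namespace APP. Run in %SYS; names are illustrative.
set total = 0
set stmt = ##class(%SQL.Statement).%New()
set sc = stmt.%Prepare("SELECT Database FROM Config.MapGlobals WHERE Namespace = 'APP' AND Name = 'MyAppData'")
if $system.Status.IsOK(sc) {
    set rs = stmt.%Execute()
    while rs.%Next() {
        // resolve the database name to its directory
        set sc = ##class(Config.Databases).Get(rs.%Get("Database"), .props)
        if $system.Status.IsOK(sc) {
            set sc = ##class(%Library.GlobalEdit).GetGlobalSize(props("Directory"), "MyAppData", .allocated, .used, 1)
            if $system.Status.IsOK(sc) set total = total + allocated
        }
    }
    write "Total allocated for ^MyAppData: ", total
}
```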
Determining Global Sizes:
Once the name of the global and the destination database path are determined, it is possible to collect information on the global size. Here are a few options:
Integrity Check – Nightly Integrity Checks are a good practice. An even better practice is to perform them against a restored backup to also verify the backup and restore process while offloading the IO to another system. This process verifies the physical integrity of the database blocks by reading each allocated block. It also tabulates both the allocated space of all the globals AND tracks the average packing of the blocks along the way.
See Ray’s great post on Integrity Check performance.
In IRIS 2022.1+, Integrity Checks can now even multi-process a single global.
Example Integrity Check Output:
Global: Ens.MessageHeaderD 0 errors found
Top Pointer Level: # of blocks=1 8kb (2% full)
Pointer Level: # of blocks=25 200kb (19% full)
Bottom Pointer Level: # of blocks=3,257 25MB (79% full)
Data Level: # of blocks=2,630,922 20,554MB (88% full)
Total: # of blocks=2,634,205 20,579MB (88% full)
Elapsed Time = 238.4 seconds, Completed 01/17/2023 23:41:12
%Library.GlobalEdit.GetGlobalSize – The following APIs can be used to quickly determine the allocated size of a single global. This may still take some time for multi-TB globals.
ObjectScript: w ##class(%Library.GlobalEdit).GetGlobalSize(directory, globalName, .allocated, .used, 1)
Embedded Python:
import iris
allocated = iris.ref("")
used =iris.ref("")
fast=1
directory = "/path/to/database"
global = "globalName"
iris.cls('%Library.GlobalEdit').GetGlobalSize(directory, global, allocated, used, fast)
allocated.value
used.value
%Library.GlobalEdit.GetGlobalSizeBySubscript – This is helpful for determining the size of a subscript or subscript range, e.g. determining the size of one index. It will include all descendants within the specified range. Warning: as of IRIS 2023.1 there is no "fast" flag to return only the allocated size; it will read all of the data blocks within the range.
ObjectScript: ##class(%Library.GlobalEdit).GetGlobalSizeBySubscript(directory, startingNode, EndingNode, .size)
%SYS.GlobalQuery.Size – This API is helpful for surveying multiple globals within a database, with or without filters. A SQL stored procedure is available for customers that primarily interact with IRIS via SQL.
SQL: CALL %SYS.GlobalQuery_Size('database directory here', '','*',,,1)
^%GSIZE – Executing this legacy utility and choosing to “show details” will read each data block to determine the size of the data. Without filtering the list of globals, it may read through almost the entire database block by block with a single thread.
Running ^%GSIZE with details is the slowest option for determining global sizes. It is much slower than our heavily optimized Integrity Checks!
There is an additional entry point that will return the allocated size for a particular global – including when scoped to a subscript. Unfortunately, it does not work on subscript ranges.
ObjectScript: write $$AllocatedSize^%GSIZE("global(""subscript"")")
Database Size – The easiest case for determining global size is when there is only a single global within a single database. Simply subtract the total free space within the database from the total size of the database. The database size is available from the OS or via SYS.Database. I often use a variation of this approach to determine the size of a disproportionately large global by subtracting the sum of all the other globals in the database.
ObjectScript: ##class(%SYS.DatabaseQuery).GetDatabaseFreeSpace(directory, .FreeSpace)
SQL: call %SYS.DatabaseQuery_FreeSpace()
Embedded Python:
import iris
freeSpace = iris.ref("")
directory = "/path/to/database"
iris.cls('%SYS.DatabaseQuery').GetDatabaseFreeSpace(directory, freeSpace)
freeSpace.value
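And here is a sketch of the subtraction approach in ObjectScript. It assumes SYS.Database (opened from %SYS by directory, as mentioned above) reports Size in the same MB units as GetDatabaseFreeSpace; the directory is illustrative:

```objectscript
// Run in %SYS: database size minus free space approximates the size
// of a single dominant global stored in that database
set directory = "/path/to/database"
set db = ##class(SYS.Database).%OpenId(directory)
set sc = ##class(%SYS.DatabaseQuery).GetDatabaseFreeSpace(directory, .freeMB)
if $IsObject(db) && $system.Status.IsOK(sc) write "Used space (MB): ", db.Size - freeMB
```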
Process Private Globals - PPGs are special process-scoped globals stored within IRISTEMP. They are often not enumerated by the other tools. When IRISTEMP is expanding rapidly or reporting low freespace, PPGs are frequently the explanation. Consider examining the per process usage of PPGs via %SYS.ProcessQuery.
SQL: SELECT PID, PrivateGlobalBlockCount FROM %SYS.ProcessQuery ORDER BY PrivateGlobalBlockCount DESC
Questions for the readers:
How often do you track your global sizes?
What do you do with global size information?
For SQL focused users, do you track the size of individual indices?
Chad, thank you for the complete explanation of the available options. As to your questions:
1. We have a TASKMGR task which calculates the size of each global in all databases. It's usually scheduled by our customers for a daily run.
2. The main purpose of collecting such info is the ability to quickly answer questions like: "Why is my database growing so fast?". Integrity Check is not used for this purpose because it can't be scheduled for a daily run due to its relative slowness in our versions of Caché and IRIS.

Wow, what a useful thread! Thank you.
I especially like the fact that you offer a solution in each IRIS language: SQL, Python, and ObjectScript.

Great article, Chad!
FWIW, we're working on a faster version of ^%GSIZE (with an API more fitting the current century ;-) ) that uses stochastic sampling similar to the faster table stats gathering introduced in 2021.2. I'll also take the opportunity for a shameless plug of my SQL utilities package, which incorporates much of what's described in this article and will take advantage of that faster global size estimator as soon as it's released.

I plan to delete this article the second that the version containing stochastic sampling is installed everywhere. In the meantime, I can think of a few customers that need alternatives.
Your SQL utilities are great and also available via Open Exchange.
Article
Shanshan Yu · Apr 19, 2023
With the improvement of living standards, people pay more and more attention to physical health, and the healthy development of children has become a growing concern for parents. A child's physical development is reflected in the child's height and weight, so it is of great significance to predict them in a timely manner and to pay attention to the child's developmental state through scientific prediction and comparison.
The project is backed by InterSystems IRIS Cloud SQL: a large amount of weight- and height-related data is entered, and an AutoML model based on IntegratedML is built for predictive analysis. Given the parents' heights, it can quickly predict a child's future height, and based on the current height and weight it judges whether the child's body mass index is in the normal range.
Key Applications: InterSystems IRIS Cloud SQL, IntegratedML
Function:
By applying this program, the height of children in a normal developmental state can be quickly predicted. From the results, parents can judge whether the child's development is normal and whether clinical intervention is required, which helps them understand the child's future height. Based on the current weight status, the program determines whether the child's BMI is normal, giving parents a picture of the child's current health.
Application Scenario
1. Children's height prediction
2. Monitoring of child development
Application Principles
The client and server of the application were built with Vue and Java respectively, while the database layer uses InterSystems Cloud SQL, a cloud database platform from InterSystems.
The main prediction function of the application uses the IntegratedML feature of InterSystems Cloud SQL. It effectively helped me quickly create and train data models and successfully implement the prediction functions.
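As an illustration of the IntegratedML workflow this relies on, the steps look roughly like this; the table and column names are hypothetical, not taken from the project:

```sql
-- create a model that predicts adult height from family growth data
CREATE MODEL HeightModel PREDICTING (AdultHeight) FROM ChildGrowthData;

-- train it on the collected height/weight records
TRAIN MODEL HeightModel;

-- predict for new rows, e.g. given the parents' heights
SELECT PREDICT(HeightModel) AS PredictedHeight FROM NewMeasurements;
```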
Test Flow
① Select the module
② Fill in the relevant data. If there is adult sibling data, you can click add to fill in the information.
③ Click Submit and wait for the prediction result to appear in a while.
Article
Vadim Aniskin · Apr 28, 2023
Hey Community!
Here is a short article on how to create an idea on InterSystems Ideas.
0. Register on Ideas Portal if you aren't a member yet or log in. You can easily register using your InterSystems Developer Community ID.
1. Click on the "Add a new idea" button
and you will see the form to add the idea.
2. First, provide a one-sentence summary of the idea; this is a required field. When you start typing, you will see a list of ideas with similar words in their names or tags. In case a similar idea has already been created, vote or comment on that idea instead. The optimal length of an idea summary is 4-12 words.
3. Next, describe the idea in the "Please add more details" field.
In addition to text, you can attach screenshots or other files and insert tables and links. There is a full-screen mode that helps you see the whole description of your idea without scrolling.
4. Then you need to fill in the required field "Category". The correct category will help to assign your idea to the appropriate expert in the InterSystems team.
If you first sorted ideas by category and then pushed the "Add a new idea" button, the idea's category will be filled in automatically.
5. Optionally, you can add tags to your idea so other users can find it easily. The list starts with tags whose names begin with "InterSystems"; all other tags follow in alphabetical order.
6. Click on "Add idea" to submit.
Hope this helps you share your ideas with others! If you have any questions, please send a direct message to @Vadim.Aniskin.
---------------------
* Please take into account that ideas and comments should be in English.
* Ideas Portal admins can ask questions using Developer Community direct messages to clarify the idea and its category. Please answer these questions to make your idea visible to all users.
* When you create an idea, you automatically subscribe to e-mail notifications related to it, including:
changes in the status of your idea
comments on your idea posted by portal admins (you don't get notifications about comments from other users)
Announcement
Raj Singh · May 10, 2023
InterSystems is committed to providing a high-quality developer experience, including a great IDE (integrated development environment). For the past several years we have been evolving Visual Studio Code's ObjectScript tooling in parallel with our long-standing IDE, InterSystems Studio. There have been over 46,000 downloads of the VSCode-ObjectScript plugin, and the feedback from developers is that it offers a great developer experience, now superior to InterSystems Studio.
With our 2023.2 release we are deprecating InterSystems Studio (deprecated designates a feature or technology that InterSystems no longer actively develops, and for which better options exist). We will continue to maintain, release, and support InterSystems Studio for quite some time, and we appreciate that it is still an important tool for some customers. However, customers should be advised that we are not investing in it. Our focus is on VSCode.
I've appreciated all the customer input as we've grown our ecosystem of VS Code extensions, keeping them open source while offering full support, and maintaining regular and rapid release cycles. As we continue this journey, I look forward to continuing to hear your feedback, ideas, and code.
Feel free to comment below or direct message me. As always, your voice plays a major role in this important process. For development, VS Code is great. Indentation is a lot better, and so is the syntax checking. One small piece of functionality I'm missing is a shortcut to open classes quickly. We're using HealthConnect, and for an item the class name is shown in the Management Portal:
I can copy the class name, go to Studio, press Ctrl-O, and paste the class name:
In VS Code I can also open with Ctrl-O, but then I need to fill in /HS/FHIRServer/Interop.HTTPOperation.cls
Could this be a small improvement? I still can't see my CSP files or .bas routines in VS Code. What do I have wrong? @David.Hockenbroch We just improved the UI for creating a new server-side editing workspace folder. If you have no workspace open, you can follow the steps here to create a new one. If you do, you can add a new folder to your workspace by right-clicking in the file explorer and selecting "Add Server Namespace to Workspace...". That command will follow steps 4 and on. To see CSP files, select "Web Application Files" in the menu from step 8. To see Basic files, select "code files in <NS>", then select "Filter", and make sure your custom filter contains the "*.bas" pattern. It can include other file types as well. @Menno.Voerman Unfortunately I don't think this is possible. That command is for opening files, and technically the name of the file is "HS/FHIRServer/Interop.HTTPOperation.cls", not "HS.FHIRServer.Interop.HTTPOperation.cls". VS Code is great for development, while the InterSystems extensions' behavior is sometimes disappointing.
E.g., since some update was installed, <Ctrl + mouse pointer> stopped referencing the methods of another class. <Right-button menu -> Go To Definition> stopped working as well. Is it a bug or a feature? @Alexey.Maslov That sounds like a bug to me. Can you file a GitHub issue with steps to reproduce? It would also help to know the versions of the extensions you have installed, the version of IRIS you're connected to, and the text of the file that you see the bug in. Done, expecting that InterSystems ObjectScript was the right choice for the kind of issue. The export capabilities in VSCode are very poor compared to Studio. There should be a possibility to export multiple classes/routines in XML format.
VSCode also lacks the different wizards (SOAP/XML) and the guided way (File/New...) to create classes, processes, DTLs, BPLs and so on. You can't edit BPLs and DTLs with VSCode. Why would you need XML? Hi @Mikko.Taittonen! Why would you need XML export for classes/routines? Why not UDL? UDL is much more readable. @Mikko.Taittonen The vscode-objectscript extension does provide New File commands for Interoperability classes. You can read the documentation here. The SOAP wizard can be accessed from the Server Actions menu. BPL and DTL classes can be edited textually. Support for the graphical editors will be added when they are rewritten using Angular. For a preview of how that would work, you can try out the new Angular Rule Editor in VS Code. For what it is worth: I don't agree. Studio is a dedicated ObjectScript IDE (editor), rather clean and unobtrusive. It could be enhanced quite simply. The only problem is that it is Windows-only (and Windows is still 57.37% of all desktops). Fully support this! It's a senseless attack on the traditional installed base!
With our 2023.2 release we are deprecating InterSystems Studio (deprecated designates a feature or technology that InterSystems no longer actively develops, and for which better options exist). We will continue to maintain, release, and support InterSystems Studio for quite some time, and we appreciate that it is still an important tool for some customers. However, customers should be advised that we are not investing in it. Our focus is on VSCode.
In what sense is this an "attack"? C'mon! We heard this too often: WebLink, Caché, ... "We will continue to maintain, release, and support..." and every IRIS version needs a new Studio version. Hi:
This is possible. See How to quickly open a class in VS Code | InterSystems Developer Community. The answer is here: if you are using the server-side editing paradigm, which matches what you're used to with Studio, make sure you follow the instructions in the "Enable Proposed APIs" section of the extension's README (also available here). You can then use Ctrl+P to find the item you need.
My setup instructions:
The additional features (and the APIs used) are:
Server-side searching across files being accessed using isfs (TextSearchProvider)
Quick Open of isfs files (FileSearchProvider).
Download and install a beta version from GitHub. This is necessary because Marketplace does not allow publication of extensions that use proposed APIs.
Go to https://github.com/intersystems-community/vscode-objectscript/releases
Locate the beta immediately above the release you installed from Marketplace. For instance, if you installed 2.4.3, look for 2.4.4-beta.1. This will be functionally identical to the Marketplace version apart from being able to use proposed APIs.
Download the VSIX file (for example vscode-objectscript-2.4.4-beta.1.vsix) and install it. One way to install a VSIX is to drag it from your download folder and drop it onto the list of extensions in the Extensions view of VS Code.
From Command Palette choose Preferences: Configure Runtime Arguments.
In the argv.json file that opens, add this line (required for both Stable and Insiders versions of VS Code):
"enable-proposed-api": ["intersystems-community.vscode-objectscript"]
Exit VS Code and relaunch it.
Verify that the ObjectScript channel of the Output panel reports this:
intersystems-community.vscode-objectscript version X.Y.Z-beta.1 activating with proposed APIs available.
After a subsequent update of the extension from Marketplace you will only have to download and install the new vscode-objectscript-X.Y.Z-beta.1 VSIX. None of the other steps above are needed again. To see CSP files with server-side editing, hold Shift before clicking the pencil icon; it will load a web files view. Is there a recent and detailed article comparing the VSCode plugin and the latest Studio? Something like: this is what Studio still does better, and this is what the plugin does better. Hi @jaroslav.rapp. Attack may be a strong word, but I understand the feeling when beloved tools get less attention than others. We'd love to never leave a technology behind, but the reality is that with limited resources we sometimes have to devote more effort to technologies that will have bigger benefits for our users going forward. It's not always an easy decision, but I believe the short-term pain is well worth the long-term benefits. Hi @Anna.Golitsyna. This recent discussion, First Community Roundtable: VSCode vs Studio, may be useful. I've never used VS Code, and I don't know anything about it. I am wondering whether there are equivalents for a couple things we often do in InterSystems Studio:
Export (to XML) and import classes.
Find in Files.
Also, is there an equivalent to the Namespace Workspace that is available in Studio? That's the arrangement we use the most. Yeah, moving code between Studio instances to other namespaces on the same or a different server simply by drag & drop. Yes, you can do this in VS Code. Even better, when using the server-side editing paradigm a single VS Code instance can connect simultaneously to multiple namespaces on one or more servers, and you can drag/drop and copy/paste between them. We use the TrackWare SCCS for both our Cache/IRIS and Delphi application development.
TrackWare has the unique distinction of being written in both Delphi and Cache. It integrates well with Studio SCCS.
We now own the TrackWare package and I have had to make updates to the Cache/IRIS side to keep it current.
(It was formerly owned by Jorma Sinnoma (sp?) who is now a member of the InterSystems team.)
It works for us and we would want to stay with it for as long as Studio functionality and SCCS integration stays intact.
A real bugaboo for us, besides the ease with which we can check out and check in our Cache/IRIS code elements, is that we would have to come up with another product to maintain source control on the Delphi side.
So, please do not stray from what Studio offers us now both as an IDE and its SCCS integration.
Thank you
Rich Filoramo
InTempo Software
I feel that I have not explained clearly enough the Studio features that we use. We often export classes to XML files. Similarly, we import XML files, compiling them as classes during import. I didn't mean to imply that this was done simply to transfer classes to other namespaces or servers. Rather, it's a way to archive classes as XML files. (We don't use a source control system.)
We use the Find in Files utility to determine the classes and other files in which a given string exists.
I mentioned the Namespace variation of the Studio Workspace pane because that is the arrangement of files (classes, lookup tables, message schemas, etc.) that we most often use. @Richard.Filoramo The server-side editing paradigm in VS Code is conceptually similar to Studio in that the files you edit are "virtual" and do not need to be exported to the file system. This mode supports Studio source control classes, so you can continue using them if you want. We have a documentation page describing some useful features for migrating from Studio, including where the Studio source control integration can be found in the VS Code UI. @Anna.Golitsyna We have a documentation page that describes some useful features for migrating from Studio that you may find useful. @Larry.Overkamp VS Code does not support exporting or importing source code as XML files. There are existing tools in the terminal and SMP for that. The server-side editing paradigm that John mentioned above is conceptually similar to Studio in that the files you edit are "virtual" and do not need to be exported to the file system to be edited. It supports viewing all files in a namespace like in Studio, and it also supports importing local .cls, .mac, .int and .inc files into that namespace.
Searching for text across those virtual files is supported and the UI is much better than Studio since you can click on the match and jump right to that location in the file. Enabling that feature requires some extra steps due to a VS Code core limitation, but this is something that we anticipate will be resolved in the future. @Larry.Overkamp my reply was focused on what @jaroslav.rapp had posted.
You also wrote:
(We don't use a source control system.)
Have you considered changing this situation? If so please take a look at Deltanji from George James Software (my employer). Some videos are available at https://www.youtube.com/playlist?list=PLGnA3ZIngG4oOUCEmplIYgSknQpNOtF08 Thanks, @Brett.Saviano . I'd like to assess what are plusses and minuses of migrating first. I'll watch it, thanks, Raj. I have to say though that I really prefer searchable text that can be visually scanned diagonally in 5 minutes as opposed to watching a 45-minute video. Oh well... What is the benefit of exporting to XML vs exporting to CLS? A thought about moving from *.xml source controlled files to *.CLS (UDL) file content with VSCode.
The exported source content is different from what exists in source control
The export name is different and may be seen as an unrelated resource
A while back I shared the community application ompare.
I mention it because this utility allows comparing what is installed in a namespace, regardless of whether the external format was XML or UDL. Some may find it useful to see the real differences between their own historic product releases and a current release, at least until they have a few releases / a baseline in their new source repository.
I've noticed other community apps like xml-to-udl that might be appropriate for the need. I'll add to this: we use the same "embedded" source control behavior across Studio and VSCode for my team within InterSystems, and haven't had issues.
@Richard.Filoramo , one question re: TrackWare - do you know offhand which "Actions" and "Other Studio actions" it uses in the UserAction method from %Studio.Extension.Base (see class reference)? There are some limitations/differences between VSCode and Studio but they're on things we see as either less common or undesirable to support from VSCode. One such case we've previously deemed "undesirable" is Action = 3, "Run an EXE on the client. The Target is the name of an executable file on the client machine. It is the responsibility of the customer to ensure this EXE is installed in a suitable location." Your statement that your source control system is written in ObjectScript and Delphi makes me think this might matter to you.
More generally, @Brett.Saviano , there may be other aspects of complete support for the interface defined in %Studio.Extension.Base to consider. @Alexander.Woodhead
Just to clarify for everyone, Studio source control classes that import/export XML files will continue to work and be supported. There just won't be a menu option in VS Code to export files as XML. @Anna.Golitsyna The big plus is that VS Code is in active development, while Studio hasn't seen enhancements in years and now is deprecated. Other than that, here are some benefits of VS Code:
Supports Mac, Linux and Alpine Linux in addition to Windows.
Much faster release cycles so you get new features and fixes faster.
Always forward compatible without needing to install a new version.
Much better IntelliSense support (a large list of features can be found here).
Modern UI with fully customizable themes.
Can be connected to multiple server-namespaces in the same window.
Debugging supports expanding OREFs to see their properties.
Thanks Brett!
I will add the support of:
Docker
Git source control
GitHub/GitLab interactive plugins
Embedded Python
Copilot (when VS Code writes ObjectScript for you)
Fully functional IRIS terminal
And the option to use 3rd-party plugins, e.g. from George James Software, which @John.Murray mentioned earlier. Find in Files works with the proposed APIs enabled and a valid extension version installed.
I was not aware of drag/drop for export/import.
I find that even though a lot of the discussions around source control are great, and it's easier using VS Code, I still don't feel there is an easy jumping-in point to really get started with a source control system. As for the InterSystems IRIS Studio vs. VS Code discussion, I understand the initial reaction of thinking you are losing something by using VS Code. Will you miss some things? I'm sure you will, but measured on the whole I feel VS Code is a much better environment, even if all you ever do is ObjectScript work. That's the critical thing to remember: there may be some differences, but measured on the whole it is better, so give it a try and give it a chance, especially by learning its full capabilities.
I agree. Adopting source control is like committing to a workout regimen. It's hard to get started and feels like a big hassle, but eventually you can't imagine living any other way, and you love the way your body of code looks ;) @Brett.Saviano @Evgeny.Shvarov You listed VSCode plusses, thanks. How about what Studio does that VSCode plus free plugins still do not? One of our programmers complained about debugging as of July 2022. Is it on par now? @Anna.Golitsyna I think the debugging experience in VS Code is actually better than Studio since you can view the properties of OREF's in the variables view in VS Code but not Studio. You can have that developer file an issue report on GitHub, or contact the WRC if you have a support contract.
As for features in Studio but not VS Code, the biggest one is the Inspector. It is very unlikely that it will ever be implemented in VS Code. VS Code also does not support integration with the legacy Zen BPL and DTL editors. VS Code will support integration with the new Angular versions of those editors when they are implemented. VS Code also doesn't support syntax coloring for Cache Basic or MultiValue Basic, but you can still edit those files. Will you miss some things?Now working 20 years with Studio I'm kind of "married" knowing all good and bad features.I dislike the idea to face a "forced divorce". Dictated from outside. It smells like very bad times in past. The opposite of freedom of decision. some context about me.. I've been working with InterSystems technologies since 1991 so I've been thru all of the changes for the past 32 years. Unless you give it a chance you won't have an opportunity to see what are the good features in VS code. There is no doubt that working with VSCode is more productive than working with ISC Studio. I've been working with ISC Cache since 1997 and thank God ISC finally has a normal development IDE. But there are small things that need to be fine-tuned:1. When overriding, be able to include the original implementation as well2. The terminal cannot process Home End keys, etc., and it throws lines.3. Close automatically opened routines/classes when debugging is finished4. In the debug console, the zwrite option5. Option to switch from read only file to edit file on GIT disk6. Debugging INT source codes does not work well7. jump to label+offset
and a lot of little things will certainly appear.
Maybe the problem is that I'm still working on Ensemble 2018
Josef Hi @Josef.Zvonicek, I'm glad that VS Code is making you more productive, and thanks for the feedback. I have some comments about your fine-tuning list:
The "override class members" functionality is implemented by the Language Server extension. If you file an issue on its GitHub repository I would be happy to consider this enhancement request.
The VS Code integrated terminal is part of the core product, and not our extensions, so I'm not sure we can do anything about this. Can you provide more details about how you started the terminal and the expected vs actual behavior?
Newer versions of the vscode-objectscript extension should avoid opening that extra copy of the file when debugging. If you're using a recent version like 2.8.0 or 2.8.1 and this isn't working, please file a GitHub issue in that extension's repository and I will take a look at it.
The debug console can only evaluate expressions. It's not a full terminal and cannot execute commands, so this isn't possible unfortunately.
I'm not sure what a GIT disk is. Are you editing files on your local file system?
Can you describe what doesn't work well, and what we could do to make things better?
There is a command called "Open Error Location..." that you can execute from the command palette. It prompts you to enter a "label+offset^routine" string and then opens that location. It only works for INT routines though.
And just today I spotted https://blog.postman.com/introducing-the-postman-vs-code-extension/. It isn't fully formed yet, as they report:
We will continuously ship features and improvements—such as support for Postman Collections and environments—so be on the lookout. We value your input, so if you have any feedback on this beta version, please feel free to leave it in the comments.
but it's just one more example of why, measured on the whole, VS Code really is a better environment IMHO. For me this is horrible news 😭 I really prefer to use Studio when explaining how to create properties (particularly relationships) and queries (particularly class queries based on COS) to students who see IRIS for the first and last time during my classes. And when something goes wrong (and it does a lot of the time), it's usually easier to ask them to delete the code that produces the error and rewrite it while I'm watching than to figure out what's wrong with it. And if it's something more complicated than simple properties, it can take a lot of time.
Besides, not all students know (and want/need to learn) how to use VS Code and look for the proper plug-ins, extensions, etc. It will really make my life that much harder. One of my (former) customers suggested this approach:
Train COS not on IRIS but on some Caché/ENS 2018 instances with Studio
or on some older IRIS version
As they run pure COS, they have Studio for training, and no need for new features.
Once all the logic works, they may move the result to some final IRIS
(if they ever migrate).
With #2 (at least for me anyway), the issue seems to be related to running iris session when using the Windows version of ssh.exe (called from VS Code, configured in settings under node terminal.integrated.profiles.windows). Home and End work normally at the Linux shell prompt, but when running iris session the effect is that either key produces the same result as pressing the Enter key. The current command is executed and a new IRIS prompt is generated.
It doesn't seem to be a VS Code problem so much as an ISC problem, at least on Windows. Please give VSCode a try.
Regarding extensions, you can give students a repository with a .vscode/extensions.json that already contains recommendations. E.g., here is what my extensions.json looks like:
{
"recommendations": [
"eamodio.gitlens",
"georgejames.gjlocate",
"github.copilot",
"intersystems-community.servermanager",
"intersystems-community.sqltools-intersystems-driver",
"intersystems-community.vscode-objectscript",
"intersystems.language-server",
"mohsen1.prettify-json",
"ms-azuretools.vscode-docker",
"ms-python.python",
"ms-python.vscode-pylance",
"ms-vscode-remote.remote-containers"
]
}
This will not install extensions automatically, but they will be shown as recommended for the workspace, like that:
Here is the template they can start from and here is an example extensions.json file. @Irene.Mikhaylova please also look at VS Code's recently-introduced Profiles feature. I imagine this could be useful in learning contexts. Maybe my VS Code setup isn't correct, but if I need to explore % class files to learn more about behavior of some methods (let's face it, the documentation isn't always revealing), I use Studio. A recent example was learning what methods of %CSP.Page I could override and which order the override-ed methods were called in the code.
I know in the past I've referenced other things. It's not often but it's helpful when we can. I haven't found a way to view % classes in VS Code.
Maybe someone can help me if that's something I've missed! Hi @Michael.Davidovich, I can show you how to configure VS Code to see system classes. Are you using client-side or server-side editing? @Michael.Davidovich you might find this extension useful for exploring inherited methods etc:
https://marketplace.visualstudio.com/items?itemName=georgejames.objectscript-class-view One quick/dirty(?) way is to do a "View Code in Namespace" for "%SYS". Under the %SYS namespace the percent classes are not filtered out. In the interest of accuracy, Studio does allow looking at OREFs to see their properties (View As > Object, or Dump object).
I've also used Studio for 20+ years. I can still remember how much better it was than what we had before. We can all still use Studio if we want; it's not a forced divorce. But we hope that VS Code -- ObjectScript's features will make you comfortable enough to decide to do a conscious uncoupling. And as Frank Sinatra sang: "Love's much lovelier, the second time around." And he could have sung that at the Diplomat Hotel in Fort Lauderdale in 1974, where coincidentally InterSystems is hosting our Global Summit this year! We sometimes export classes from one environment and import them to another. I don't see .cls as a supported file type for those activities. Hi @Larry.Overkamp !
You can use $system.OBJ.ExportUDL() to export in CLS, MAC, or INC format,
and $system.OBJ.Load() or $system.OBJ.LoadDir() to import CLS, XML, or MAC files.
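For example, a minimal export/import round trip in the terminal might look like this (the class name and path are hypothetical):
USER>Do $system.OBJ.ExportUDL("MyApp.Person.cls","/tmp/MyApp.Person.cls")
USER>Do $system.OBJ.Load("/tmp/MyApp.Person.cls","ck")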
As for transferring code between systems I'd recommend to maintain code in repositories (e.g. git) and deploy code via InterSystems Package Manager.
The format doesn't matter as long as you can easily package multiple classes/routines in a single file. By easily I mean that you should be able to choose the exported classes/routines, e.g. from a class tree. If I can guess, you need to export several classes/routines/macros in one file, e.g. to deploy to another server.
I'd recommend using InterSystems Package Manager for it. @Brett.Saviano Sorry I didn't see this until just now. We are editing client-side. Is there documentation for this? I searched "package manager" in the documentation and didn't get anything in the results....
s r=##class(%Net.HttpRequest).%New(),r.Server="pm.community.intersystems.com",r.SSLConfiguration="ISC.FeatureTracker.SSL.Config" d r.Get("/packages/zpm/latest/installer"),$system.OBJ.LoadStream(r.HttpResponse.Data,"c")
Caution! This is for IRIS only.
The package manager has documentation, a bunch of videos, and there is a tag here. For Interoperability productions, it may also be appropriate to weigh up the existing capabilities for production deployment. See: https://docs.intersystems.com/irisforhealth20232/csp/docbook/DocBook.UI.Page.cls?KEY=EGDV_deploying
It provides functionality for detecting the implementation used by a production.
I have created an idea for production exports to generate IPM modules as well as the usual XML export deployment: https://ideas.intersystems.com/ideas/DPI-I-382
https://community.intersystems.com/post/studio-vs-code-migration-addressing-exportimport-pain-point
Article
Michael Braam · Aug 17, 2022
Being interoperable is more and more important nowadays. InterSystems IRIS 2022.1 comes with a new messaging API to communicate with event streaming platforms like Kafka, AWS SQS/SNS, JMS, and RabbitMQ. This article shows how you can connect to Kafka and AWS SQS easily. We start with a brief discussion of the basic concepts and terms of event streaming platforms.
Event streaming platforms purpose and common terms
Event streaming platforms like Kafka or AWS SQS are capable of consuming an unbounded stream of events at a very high frequency and reacting to them. Consumers read the data from streams for further processing. They are often used in IoT environments.
Common terms in this arena are:
Topic/Queue, the place where data is stored
Producer, creates and sends data (events, messages) to a topic or queue
Consumer, reads events/messages from one or more topics or queues
Publish/Subscribe, producers send data to a queue/topic (publish), consumers subscribe to a topic/queue and get automatically notified if new data arrives
Polling, consumers have to actively poll a topic/queue for new data
Why are they used?
Decoupling of producers and consumers
Highly scalable for real time data
Do I really need them? As an InterSystems IRIS developer, probably not, but you are not alone...
The external messaging API
The new API classes are located in the %External.Messaging package. It contains generic Client, Settings, and Message classes. The specialized classes for Kafka, AWS SQS/SNS, JMS, and RabbitMQ are subclasses of these generic classes.
The basic communication flow is:
Create a settings object for your target platform. This is also responsible for the authentication against the target platform.
Create a specific client object and pass the settings object to it
Create a message object and send it to the target.
The following sections demonstrate how you can communicate with Kafka and AWS SQS (Simple Queue Service).
Interacting with Kafka
Let's start with a Kafka example. First we create a class which leverages the new %External.Messaging API to create a topic and to send and receive messages to and from Kafka.
It first creates a Kafka settings object
set tSettings = ##class(%External.Messaging.KafkaSettings).%New()
set tSettings.servers = $$$KAFKASERVER
set tSettings.groupId = "iris-consumer"
After setting the Kafka server address, it sets a Kafka group id. With these settings, a Kafka client object is created:
set tClient = ##class(%External.Messaging.Client).CreateKafkaClient(tSettings.ToJSON(),.tSc)
It then creates a topic by invoking the CreateTopic() method of the Kafka client:
Set tSC = tClient.CreateTopic(pTopicName,tNumberOfPartitions,tReplicationFactor)
Below is the full code sample:
Include Kafka.Settings
Class Kafka.api [ Abstract ]
{
ClassMethod CreateTopic(pTopicName As %String) As %Status
{
#dim tSc as %Status = $$$OK
try {
set tSettings = ##class(%External.Messaging.KafkaSettings).%New()
set tSettings.servers = $$$KAFKASERVER
set tSettings.groupId = "iris-consumer"
set tClient = ##class(%External.Messaging.Client).CreateKafkaClient(tSettings.ToJSON(),.tSc)
$$$ThrowOnError(tSc)
Set tNumberOfPartitions = 1
Set tReplicationFactor = 1
Set tSC = tClient.CreateTopic(pTopicName,tNumberOfPartitions,tReplicationFactor)
$$$ThrowOnError(tSC)
$$$ThrowOnError(tClient.Close())
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
}
After creating a topic we can send and receive messages from Kafka. The code is similar to the code above:
ClassMethod SendMessage(pMessage As %String, pTopic As %String) As %Status
{
#dim tSettings as %External.Messaging.KafkaSettings
#dim tClient as %External.Messaging.KafkaClient
#dim tMessage as %External.Messaging.KafkaMessage
#dim tSc as %Status = $$$OK
try {
set tSettings = ##class(%External.Messaging.KafkaSettings).%New()
set tSettings.servers = $$$KAFKASERVER
set tSettings.groupId = "iris-consumer"
set tMessage = ##class(%External.Messaging.KafkaMessage).%New()
set tMessage.topic = pTopic
set tMessage.value = pMessage
set tClient = ##class(%External.Messaging.Client).CreateKafkaClient(tSettings.ToJSON(),.tSc)
$$$ThrowOnError(tSc)
Set producerSettings = "{""key.serializer"":""org.apache.kafka.common.serialization.StringSerializer""}"
$$$ThrowOnError(tClient.UpdateProducerConfig(producerSettings))
$$$ThrowOnError(tClient.SendMessage(tMessage))
$$$ThrowOnError(tClient.Close())
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
ClassMethod ReceiveMessage(pTopicName As %String, pGroupId As %String = "iris-consumer", Output pMessages) As %Status
{
#dim tSettings as %External.Messaging.KafkaSettings
#dim tClient as %External.Messaging.KafkaClient
#dim tMessage as %External.Messaging.KafkaMessage
#dim tSc as %Status = $$$OK
try {
set tSettings = ##class(%External.Messaging.KafkaSettings).%New()
set tSettings.servers = $$$KAFKASERVER
set tSettings.groupId = pGroupId
set tClient = ##class(%External.Messaging.Client).CreateKafkaClient(tSettings.ToJSON(),.tSc)
$$$ThrowOnError(tSc)
Set producerSettings = "{""key.serializer"":""org.apache.kafka.common.serialization.StringSerializer""}"
$$$ThrowOnError(tClient.UpdateProducerConfig(producerSettings))
$$$ThrowOnError(tClient.ReceiveMessage(pTopicName, .pMessages))
$$$ThrowOnError(tClient.Close())
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
Let's try it. I have a Kafka instance running; first we create a topic named community with the CreateTopic method above:
Please ignore the log4j warnings here. The method returns an OK status code, so the topic was created. Next, let's send a message to this topic. To verify that the message arrives, I have a generic Kafka consumer running that listens to the topic community:
Now let's send the message. I'll send a JSON string, but basically you can send any message format to a topic.
Let's check if the consumer received the message:
The message was successfully received by the consumer.
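For readers without the screenshots, the terminal session might look roughly like this (a sketch; the topic name and JSON payload are just examples):
USER>Write ##class(Kafka.api).CreateTopic("community")
1
USER>Write ##class(Kafka.api).SendMessage("{""text"":""Hello Kafka!""}","community")
1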
Receiving messages and deleting topics is similar to the sample above. Below is the full sample implementation. The include file Kafka.Settings only contains a macro definition: #define KAFKASERVER <Kafka server location and port>.
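For example, the include file might contain nothing more than this (the host and port are placeholders):
#define KAFKASERVER "my-kafka-host:9092"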
Include Kafka.Settings
Class Kafka.api [ Abstract ]
{
ClassMethod CreateTopic(pTopicName As %String) As %Status
{
#dim tSc as %Status = $$$OK
try {
set tSettings = ##class(%External.Messaging.KafkaSettings).%New()
set tSettings.servers = $$$KAFKASERVER
set tSettings.groupId = "iris-consumer"
set tClient = ##class(%External.Messaging.Client).CreateKafkaClient(tSettings.ToJSON(),.tSc)
$$$ThrowOnError(tSc)
Set tNumberOfPartitions = 1
Set tReplicationFactor = 1
Set tSC = tClient.CreateTopic(pTopicName,tNumberOfPartitions,tReplicationFactor)
$$$ThrowOnError(tSC)
$$$ThrowOnError(tClient.Close())
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
ClassMethod DeleteTopic(pTopicName As %String) As %Status
{
#dim tSc as %Status = $$$OK
try {
set tSettings = ##class(%External.Messaging.KafkaSettings).%New()
set tSettings.servers = $$$KAFKASERVER
set tSettings.groupId = "iris-consumer"
set tClient = ##class(%External.Messaging.Client).CreateKafkaClient(tSettings.ToJSON(),.tSc)
$$$ThrowOnError(tSc)
Set tNumberOfPartitions = 1
Set tReplicationFactor = 1
Set tSC = tClient.DeleteTopic(pTopicName)
$$$ThrowOnError(tSC)
$$$ThrowOnError(tClient.Close())
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
ClassMethod SendMessage(pMessage As %String, pTopic As %String) As %Status
{
#dim tSettings as %External.Messaging.KafkaSettings
#dim tClient as %External.Messaging.KafkaClient
#dim tMessage as %External.Messaging.KafkaMessage
#dim tSc as %Status = $$$OK
try {
set tSettings = ##class(%External.Messaging.KafkaSettings).%New()
set tSettings.servers = $$$KAFKASERVER
set tSettings.groupId = "iris-consumer"
set tMessage = ##class(%External.Messaging.KafkaMessage).%New()
set tMessage.topic = pTopic
set tMessage.value = pMessage
set tClient = ##class(%External.Messaging.Client).CreateKafkaClient(tSettings.ToJSON(),.tSc)
$$$ThrowOnError(tSc)
Set producerSettings = "{""key.serializer"":""org.apache.kafka.common.serialization.StringSerializer""}"
$$$ThrowOnError(tClient.UpdateProducerConfig(producerSettings))
$$$ThrowOnError(tClient.SendMessage(tMessage))
$$$ThrowOnError(tClient.Close())
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
ClassMethod ReceiveMessage(pTopicName As %String, pGroupId As %String = "iris-consumer", Output pMessages) As %Status
{
#dim tSettings as %External.Messaging.KafkaSettings
#dim tClient as %External.Messaging.KafkaClient
#dim tMessage as %External.Messaging.KafkaMessage
#dim tSc as %Status = $$$OK
try {
set tSettings = ##class(%External.Messaging.KafkaSettings).%New()
set tSettings.servers = $$$KAFKASERVER
set tSettings.groupId = pGroupId
set tClient = ##class(%External.Messaging.Client).CreateKafkaClient(tSettings.ToJSON(),.tSc)
$$$ThrowOnError(tSc)
Set producerSettings = "{""key.serializer"":""org.apache.kafka.common.serialization.StringSerializer""}"
$$$ThrowOnError(tClient.UpdateProducerConfig(producerSettings))
$$$ThrowOnError(tClient.ReceiveMessage(pTopicName, .pMessages))
$$$ThrowOnError(tClient.Close())
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
}
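As a hedged usage sketch for consuming messages (assuming the pMessages output argument comes back as a %ListOfObjects collection of KafkaMessage objects, each exposing the value property used above):
USER>Set sc = ##class(Kafka.api).ReceiveMessage("community","iris-consumer",.msgs)
USER>For i=1:1:msgs.Count() { Write msgs.GetAt(i).value,! }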
Interacting with AWS SQS
How would you communicate with AWS SQS (Simple Queue Service)? The basic procedure is pretty similar, but AWS requires authentication, and AWS doesn't use the term topic; it talks about queues. You can send a message to a queue, and consumers can receive messages from one or more queues.
Similar to my API class above, I've created one for AWS SQS.
Class AWS.SQS.api [ Abstract ]
{
ClassMethod SendMessage(pMessage As %String, pQueue As %String) As %Status
{
#dim tSettings as %External.Messaging.SQSSettings
#dim tMessage as %External.Messaging.SQSMessage
#dim tClient as %External.Messaging.SQSClient
#dim tSc as %Status = $$$OK
try {
$$$ThrowOnError(##class(AWS.Utils).GetCredentials(.tCredentials))
set tSettings = ##class(%External.Messaging.SQSSettings).%New()
set tSettings.accessKey = tCredentials("aws_access_key_id")
set tSettings.secretKey = tCredentials("aws_secret_access_key")
set tSettings.sessionToken = tCredentials("aws_session_token")
set tSettings.region = "eu-central-1"
set tMessage = ##class(%External.Messaging.SQSMessage).%New()
set tMessage.body = pMessage
set tMessage.queue = pQueue
set tClient = ##class(%External.Messaging.Client).CreateSQSClient(tSettings.ToJSON(),.tSc)
$$$ThrowOnError(tSc)
$$$ThrowOnError(tClient.SendMessage(tMessage))
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
ClassMethod ReceiveMessage(pQueueName As %String, Output pMessages) As %Status
{
#dim tSettings as %External.Messaging.SQSSettings
#dim tClient as %External.Messaging.SQSClient
#dim tSc as %Status = $$$OK
try {
$$$ThrowOnError(##class(AWS.Utils).GetCredentials(.tCredentials))
set tSettings = ##class(%External.Messaging.SQSSettings).%New()
set tSettings.accessKey = tCredentials("aws_access_key_id")
set tSettings.secretKey = tCredentials("aws_secret_access_key")
set tSettings.sessionToken = tCredentials("aws_session_token")
set tSettings.region = "eu-central-1"
set tClient = ##class(%External.Messaging.Client).CreateSQSClient(tSettings.ToJSON(),.tSc)
$$$ThrowOnError(tSc)
$$$ThrowOnError(tClient.ReceiveMessage(pQueueName, .pMessages))
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
ClassMethod DeleteMessage(pQueueName As %String, pReceiptHandle As %String) As %Status
{
#dim tSettings as %External.Messaging.SQSSettings
#dim tClient as %External.Messaging.SQSClient
#dim tSc as %Status = $$$OK
try {
$$$ThrowOnError(##class(AWS.Utils).GetCredentials(.tCredentials))
set tSettings = ##class(%External.Messaging.SQSSettings).%New()
set tSettings.accessKey = tCredentials("aws_access_key_id")
set tSettings.secretKey = tCredentials("aws_secret_access_key")
set tSettings.sessionToken = tCredentials("aws_session_token")
set tSettings.region = "eu-central-1"
set tClient = ##class(%External.Messaging.Client).CreateSQSClient(tSettings.ToJSON(),.tSc)
$$$ThrowOnError(tSc)
$$$ThrowOnError(tClient.DeleteMessage(pQueueName, pReceiptHandle))
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
ClassMethod CreateQueue(pQueueName As %String) As %Status
{
#dim tSettings as %External.Messaging.SQSSettings
#dim tClient as %External.Messaging.SQSClient
#dim tSQSSettings as %External.Messaging.SQSQueueSettings
#dim tSc as %Status = $$$OK
try {
$$$ThrowOnError(##class(AWS.Utils).GetCredentials(.tCredentials))
set tSettings = ##class(%External.Messaging.SQSSettings).%New()
set tSettings.accessKey = tCredentials("aws_access_key_id")
set tSettings.secretKey = tCredentials("aws_secret_access_key")
set tSettings.sessionToken = tCredentials("aws_session_token")
set tSettings.region = "eu-central-1"
set tClient = ##class(%External.Messaging.Client).CreateSQSClient(tSettings.ToJSON(),.tSc)
$$$ThrowOnError(tSc)
set tSQSSettings = ##class(%External.Messaging.SQSQueueSettings).%New()
$$$ThrowOnError(tClient.CreateQueue(pQueueName,tSQSSettings))
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
ClassMethod DeleteQueue(pQueueName As %String) As %Status
{
#dim tSettings as %External.Messaging.SQSSettings
#dim tClient as %External.Messaging.SQSClient
#dim tSQSSettings as %External.Messaging.SQSQueueSettings
#dim tSc as %Status = $$$OK
try {
$$$ThrowOnError(##class(AWS.Utils).GetCredentials(.tCredentials))
set tSettings = ##class(%External.Messaging.SQSSettings).%New()
set tSettings.accessKey = tCredentials("aws_access_key_id")
set tSettings.secretKey = tCredentials("aws_secret_access_key")
set tSettings.sessionToken = tCredentials("aws_session_token")
set tSettings.region = "eu-central-1"
set tClient = ##class(%External.Messaging.Client).CreateSQSClient(tSettings.ToJSON(),.tSc)
$$$ThrowOnError(tSc)
$$$ThrowOnError(tClient.DeleteQueue(pQueueName))
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
}
It contains methods for creating and deleting queues and sending and receiving messages to and from a queue.
One of the key points here is how to authenticate. I didn't want to have my credentials in my code, so I created a little helper method that retrieves the credentials from my local credentials file and returns them as a subscripted array for use in my API methods:
ClassMethod GetCredentials(Output pCredentials) As %Status
{
#dim tSc as %Status = $$$OK
set tFilename = "/dur/.aws/credentials"
try {
set tCredentialsFile = ##class(%Stream.FileCharacter).%New()
$$$ThrowOnError(tCredentialsFile.LinkToFile(tFilename))
// first read the header
set tBuffer = tCredentialsFile.ReadLine()
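// then read the three "key = value" credential lines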
for i=1:1:3 {
set tBuffer = tCredentialsFile.ReadLine()
set pCredentials($piece(tBuffer," =",1)) = $tr($piece(tBuffer,"= ",2),$c(13))
}
}
catch tEx {
set tSc = tEx.AsStatus()
}
return tSc
}
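The method assumes the standard AWS credentials file layout, a profile header followed by "key = value" lines, for example (the values shown are placeholders):
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
aws_session_token = xxxxxxxxxxxxxxxx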
To complete this article, let's create a queue named community in the AWS region eu-central-1 (Frankfurt, Germany).
The queue has been successfully created and is visible in the AWS console for my account:
Next, let's send a message to this queue:
The method call returns 1. So the message has been successfully sent.
Finally let's poll the queue from the AWS console:
The message has been successfully delivered to the queue.
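Without the screenshots, the corresponding terminal calls might look roughly like this (a sketch; the queue name and message text are examples):
USER>Write ##class(AWS.SQS.api).CreateQueue("community")
1
USER>Write ##class(AWS.SQS.api).SendMessage("Hello from IRIS","community")
1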
Conclusion
The external messaging API in InterSystems IRIS 2022.1 makes it really simple to communicate with event streaming platforms. Hope you find this useful.
Announcement
Anastasia Dyubaylo · Sep 22, 2022
Hi Developers,
Enjoy watching the new video on InterSystems Developers YouTube:
⏯ Analytics with InterSystems New and Next 2022 @ Global Summit 2022
In this video, you'll see an overview of new and improved capabilities for analytics and artificial intelligence on data managed by InterSystems. We'll take a look at analytics across InterSystems IRIS and InterSystems IRIS for Health platforms, HealthShare, and TrakCare. This session will help you determine which additional sessions about analytics and artificial intelligence are relevant to you.
Presenters:🗣 @Carmen.Logue, Product Manager, Analytics & AI, InterSystems🗣 @Thomas.Dyar, Product Specialist, Machine Learning, InterSystems🗣 @Benjamin.DeBoe, Product Manager, InterSystems
Enjoy watching and stay tuned! 👍🏼
Announcement
Anastasia Dyubaylo · Dec 13, 2022
Hey Developers,
Watch this video to learn how to use InterSystems IRIS Cloud IntegratedML:
⏯ InterSystems IRIS Cloud IntegratedML @ Global Summit 2022
🗣 Presenter: @Kinshuk.Rakshith, Sales Engineer, InterSystems
Subscribe to InterSystems Developers YouTube to stay tuned!
Announcement
Anastasia Dyubaylo · Dec 9, 2022
Hi Developers,
Enjoy watching the new video on InterSystems Developers YouTube:
⏯ Using Variables in InterSystems ObjectScript
Learn about different variable types in InterSystems ObjectScript and what they are used for, and how to assign values to the most common variable types. You will also see how operator precedence works.
Enjoy it and stay tuned! 👍
Announcement
Anastasia Dyubaylo · Dec 12, 2022
Hello and welcome to the Developer Ecosystem Fall News!
This fall we've had so much fun and so many activities, online and offline, in the InterSystems Developer Ecosystem.
In case you missed something, we've prepared for you a selection of the hottest news and topics to catch up on!
News
🫂 InterSystems Developer Ecosystem Team
📊 Online Analytics Dashboard for Community Members
🎃 Halloween season on Global Masters
🔗 Developer Connections on GM
💡 InterSystems Ideas News
🔥 Back to school on FHIR with DC FR
🎉 InterSystems a Leader in Latest Forrester Wave Report: Translytical Data Platforms Q4 2022
📝 Updated Learning Path "Getting Started with InterSystems ObjectScript"
✅ InterSystems IRIS System Administration Specialist Certification Exam is now LIVE
📦 ZPM is now InterSystems Package Manager (IPM)
Contests & Events
InterSystems Interoperability Contest: Building Sustainable Solutions
Contest Announcement
Kick-off Webinar
Technology Bonuses
Time to Vote
Technical Bonuses Results
Winners Announcement
Meetup with Winners
Your Feedback
InterSystems IRIS for Health Contest: FHIR for Women's Health
Contest Announcement
Kick-off Webinar
Technology Bonuses
Time to Vote
Technical Bonuses Results
Winners Announcement
Community Roundtables
1. VSCode vs Studio
2. Best Source Control
3. Developing with Python
InterSystems Idea-A-Thon
Contest Announcement
Winners Announcement
📄 [DC Contest] 1st Tech Article Contest on Chinese Developer Community
⏯ [Webinar] What’s New in InterSystems IRIS 2022.2
⏯ [Webinar] Building and Enabling Healthcare Applications with HL7 FHIR
⏯ [Webinar] Deployments in Kubernetes with High Availability
👥 [Conference] InterSystems Iberia Summit 2022
👥 [Conference] InterSystems UK & Ireland Summit 2022
👥 [Conference] InterSystems at Big Data Minds DACH 2022 in Berlin
👥 [Conference] InterSystems Partnertage DACH 2022
👥 [Conference] InterSystems at data2day
👥 [Conference] InterSystems at Global DevSlam in Dubai
👾 [Hackathon] InterSystems at HackMIT
👾 [Hackathon] InterSystems at CalHacks hackathon
👾 [Hackathon] InterSystems at TechCrunch Disrupt
👾 [Hackathon] InterSystems at the European Healthcare Hackathon in Prague
☕️ [Meetup] InterSystems Developer Meetup in San Francisco
☕️ [Meetup] The 1st Spanish Developer Community Meetup in Valencia
☕️ [Meetup] InterSystems <> Mirantis Developer Meetup on Kubernetes in Boston
👋 InterSystems FHIR Healthtech Incubator Caelestinus Final Demo Day
Latest Releases
⬇️ Developer Community Release, September 2022
⬇️ Open Exchange - ObjectScript Quality status
⬇️ New Multi-Channel View on Global Masters
⬇️ InterSystems IRIS, IRIS for Health, HealthShare Health Connect, & InterSystems IRIS Studio 2022.2
⬇️ InterSystems IRIS, IRIS for Health, & HealthShare Health Connect 2022.1.1
⬇️ InterSystems IRIS, IRIS for Health, & HealthShare Health Connect 2022.3 developer previews
Preview 1
Preview 2
⬇️ InterSystems IRIS, IRIS for Health, & HealthShare Health Connect 2022.2 developer previews
Preview 7
Preview 8
Preview 9
Preview 10
⬇️ InterSystems Package Manager
Release 0.5.0
Release 0.4.0
⬇️ InterSystems extension for Docker Desktop
⬇️ VS Code Server Manager 3.2.1
⬇️ SAM (System Alerting & Monitoring) 2.0
⬇️ InterSystems Container Registry web user interface
Best Practices & Key Questions
🔥 Best Practices of Autumn 2022
GitHub Codespaces with IRIS
Using Grafana directly from IRIS
Uploading and downloading files via http
Adding VSCode into your IRIS Container
Reading AWS S3 data on COVID as SQL table in IRIS
IRIS Embedded Python with Azure Service Bus (ASB) use case
BILLIONS - Monetizing the InterSystems FHIR® with Google Cloud's Apigee Edge
Let's fight against the machines!
Top 10 InterSystems IRIS Features
The way to launch Jupyter Notebook + Apache Spark + InterSystems IRIS
Apache Web Gateway with Docker
❓ Key Questions of Autumn 2022: September, October, November
People and Companies to Know About
👋 Muhammad Waseem - New Developer Community Moderator
👋 Tete Zhang - New Developer Community Moderator
👋 New Partner - PainChek® Ltd
🌟 Global Masters of Autumn 2022: September, October, November
Job Opportunities
💼 InterSystems HealthShare Architect (Remote Opportunity)
💼 InterSystems HealthShare Practice Lead (Remote Opportunity)
💼 REMOTE InterSystems Object Developer with Docker Experience
💼 Integration Developer OR Business Analyst with IRIS/Ensemble
💼 Looking for InterSystems Developer
So...
Here is our take on the most interesting and important things!
What were your highlights from this past season? Share them in the comments section and let's remember the fun we've had!
@Anastasia.Dyubaylo, I am always amazed at what you and your team accomplish ... great work! As usual, lots of things are going on in the Community! Thanks, Ben! Glad to hear!
Your feedback is always greatly appreciated :) Even more ahead ;)