Article
Eduard Lebedyuk · Feb 5, 2016

Class Queries in InterSystems IRIS

Class Queries in InterSystems IRIS (and Caché, Ensemble, HealthShare) are a useful tool that separates SQL queries from ObjectScript code. Basically, it works like this: suppose that you want to use the same SQL query with different arguments in several different places. In this case you can avoid code duplication by declaring the query body as a class query and then calling this query by name. This approach is also convenient for custom queries, in which the task of obtaining the next row is defined by the developer. Sounds interesting? Then read on!

Basic class queries

Simply put, basic class queries allow you to represent SQL SELECT queries. The SQL optimizer and compiler handle them just as they would standard SQL queries, but they are more convenient when it comes to executing them from a Caché ObjectScript context. They are declared as Query items in class definitions (similar to Method or Property) in the following way:

- Type: %SQLQuery
- All arguments of your SQL query must be listed in the argument list
- Query type: SELECT
- Use the colon to access each argument (similar to static SQL)
- Define the ROWSPEC parameter, which contains information about the names and data types of the output results along with the order of fields
- (Optional) Define the CONTAINID parameter, which corresponds to the numeric order of the field containing the ID. If you don't need to return an ID, don't assign any value to CONTAINID
- (Optional) Define the COMPILEMODE parameter, which corresponds to the similar parameter in static SQL and specifies when the SQL expression must be compiled. When this parameter is set to IMMEDIATE (the default), the query is compiled together with the class. When this parameter is set to DYNAMIC, the query is compiled before its first execution (similar to dynamic SQL)
- (Optional) Define the SELECTMODE parameter, which specifies the format of the query results
- Add the SqlProc keyword if you want to call this query as an SQL procedure
- Set the SqlName keyword if you want to rename the query. The default name of a query in SQL context is: PackageName.ClassName_QueryName
- Caché Studio provides a built-in wizard for creating class queries

Sample definition of the Sample.Person class with the ByName query, which returns all user names that begin with a specified letter:

Class Sample.Person Extends %Persistent
{
Property Name As %String;
Property DOB As %Date;
Property SSN As %String;

Query ByName(name As %String = "") As %SQLQuery(ROWSPEC="ID:%Integer,Name:%String,DOB:%Date,SSN:%String", CONTAINID = 1, SELECTMODE = "RUNTIME", COMPILEMODE = "IMMEDIATE") [ SqlName = SP_Sample_By_Name, SqlProc ]
{
    SELECT ID, Name, DOB, SSN
    FROM Sample.Person
    WHERE (Name %STARTSWITH :name)
    ORDER BY Name
}
}

You can call this query from Caché ObjectScript in the following way:

Set statement=##class(%SQL.Statement).%New()
Set status=statement.%PrepareClassQuery("Sample.Person","ByName")
If $$$ISERR(status) { Do $system.OBJ.DisplayError(status) }
Set resultset=statement.%Execute("A")
While resultset.%Next() {
    Write !, resultset.%Get("Name")
}

Alternatively, you can obtain a resultset using the automatically generated method queryNameFunc:

Set resultset = ##class(Sample.Person).ByNameFunc("A")
While resultset.%Next() {
    Write !, resultset.%Get("Name")
}

This query can also be called from SQL context in these two ways:

Call Sample.SP_Sample_By_Name('A')
Select * from Sample.SP_Sample_By_Name('A')

This class can be found in the SAMPLES default Caché namespace. And that's all about simple queries.
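One more way to reach the same query, not shown above: because it is projected as an SQL procedure, you can also call it through dynamic SQL from ObjectScript. The following is a minimal sketch (it assumes the Sample.Person class above is compiled in the current namespace; the parameter is passed as a "?" placeholder):

// Call the projected procedure via dynamic SQL; %ExecDirect prepares and executes in one step
Set resultset = ##class(%SQL.Statement).%ExecDirect(, "CALL Sample.SP_Sample_By_Name(?)", "A")
If resultset.%SQLCODE < 0 {
    // Negative SQLCODE means the statement failed; %Message holds the reason
    Write "SQLCODE: ", resultset.%SQLCODE, " ", resultset.%Message
}
While resultset.%Next() {
    Write !, resultset.%Get("Name")
}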
Now let's proceed to custom ones.

Custom class queries

Though basic class queries work fine in most cases, sometimes it is necessary to exercise full control over the query behavior, e.g.:

- Sophisticated selection criteria. Since in custom queries you implement a Caché ObjectScript method that returns the next row on your own, these criteria can be as sophisticated as you require
- If data is accessible only via an API in a format that you don't want to use
- If data is stored in globals (without classes)
- If you need to escalate rights in order to access data
- If you need to call an external API in order to access data
- If you need to gain access to the file system in order to access data
- If you need to perform additional operations before running the query (e.g. establish a connection, check permissions, etc.)

So, how do you create custom class queries? First of all, you should define 4 methods which implement the entire workflow for your query, from initialization to destruction:

- queryName - provides information about a query (similar to basic class queries)
- queryNameExecute - constructs a query
- queryNameFetch - obtains the next row of the query result
- queryNameClose - destructs a query

Now let's analyze these methods in more detail.

The queryName method

The queryName method represents information about a query:

- Type: %Query
- Leave the body blank
- Define the ROWSPEC parameter, which contains the information about the names and data types of the output results along with the field order
- (Optional) Define the CONTAINID parameter, which corresponds to the numeric order of the field containing the ID. If you don't return an ID, don't assign any value to CONTAINID

For example, let's create the AllRecords query (queryName = AllRecords, and the method is simply called AllRecords), which will output all instances of the new persistent class Utils.CustomQuery, one by one. First, let's create a new persistent class Utils.CustomQuery:

Class Utils.CustomQuery Extends (%Persistent, %Populate)
{
Property Prop1 As %String;
Property Prop2 As %Integer;
}

Now let's write the AllRecords query:

Query AllRecords() As %Query(CONTAINID = 1, ROWSPEC = "Id:%String,Prop1:%String,Prop2:%Integer") [ SqlName = AllRecords, SqlProc ]
{
}

The queryNameExecute method

The queryNameExecute method fully initializes a query. The signature of this method is as follows:

ClassMethod queryNameExecute(ByRef qHandle As %Binary, args) As %Status

where:

- qHandle is used for communication with the other methods of the query implementation
- This method must set qHandle into the state which will then be passed to the queryNameFetch method
- qHandle can be set to an OREF, a variable or a multi-dimensional variable
- args are additional parameters passed to the query. You can add as many args as you need (or don't use them at all)
- The method must return the query initialization status

But let's get back to our example. You are free to iterate through the extent in multiple ways (I will describe the basic working approaches for custom queries below), but for this example let's iterate through the global using the $Order function. In this case, qHandle will store the current ID, and since we don't need any additional arguments, the args argument is not required. The result looks like this:

ClassMethod AllRecordsExecute(ByRef qHandle As %Binary) As %Status
{
    Set qHandle = ""
    Quit $$$OK
}

The queryNameFetch method

The queryNameFetch method returns a single result in $List form. The signature of this method is as follows:

ClassMethod queryNameFetch(ByRef qHandle As %Binary, ByRef Row As %List, ByRef AtEnd As %Integer = 0) As %Status [ PlaceAfter = queryNameExecute ]

where:

- qHandle is used for communication with the other methods of the query implementation. When the query is executed, qHandle is assigned the values specified by queryNameExecute or by the previous call of queryNameFetch
- Row will be set either to a %List value or to an empty string, if all data has been processed
- AtEnd must be set to 1 once the end of data is reached
- The PlaceAfter keyword identifies the method's position in the int code. The "Fetch" method must be positioned after the "Execute" method, but this is important only for static SQL, i.e. cursors inside queries

In general, the following operations are performed within this method:

- Check whether we've reached the end of data
- If there is still some data left: create a new %List and assign it to the Row variable
- Otherwise, set AtEnd to 1
- Prepare qHandle for the next fetch
- Return the status

This is how it looks in our example:

ClassMethod AllRecordsFetch(ByRef qHandle As %Binary, ByRef Row As %List, ByRef AtEnd As %Integer = 0) As %Status
{
    #; Iterating through ^Utils.CustomQueryD
    #; Writing the next id to qHandle and writing the global's value with the new id into val
    Set qHandle = $Order(^Utils.CustomQueryD(qHandle),1,val)
    #; Checking whether there is any more data left
    If qHandle = "" {
        Set AtEnd = 1
        Set Row = ""
        Quit $$$OK
    }
    #; If not, create %List
    #; val = $Lb("", Prop1, Prop2) - see Storage definition
    #; Row = $Lb(Id, Prop1, Prop2) - see ROWSPEC for the AllRecords query
    Set Row = $Lb(qHandle, $Lg(val,2), $Lg(val,3))
    Quit $$$OK
}

The queryNameClose method

The queryNameClose method terminates the query once all the data is obtained. The signature of this method is as follows:

ClassMethod queryNameClose(ByRef qHandle As %Binary) As %Status [ PlaceAfter = queryNameFetch ]

where:

- Caché executes this method after the final call to the queryNameFetch method
- In other words, this is a query destructor
- Therefore you should dispose of all SQL cursors, queries and local variables in its implementation
- The method returns the current status

In our example, we have to delete the local variable qHandle:

ClassMethod AllRecordsClose(ByRef qHandle As %Binary) As %Status
{
    Kill qHandle
    Quit $$$OK
}

And here we are! Once you compile the class, you will be able to use the AllRecords query from %SQL.Statement - just as with the basic class queries.

Iteration logic approaches for custom queries

So, what approaches can be used for custom queries? In general, there are 3 basic ones:

- Iteration through a global
- Static SQL
- Dynamic SQL

Iteration through a global

This approach is based on using $Order and similar functions for iteration through a global. It can be used in the following cases:

- Data is stored in globals (without classes)
- You want to reduce the number of glorefs in the code
- The results must be/can be sorted by the global's subscript

Static SQL

This approach is based on cursors and static SQL. It is used for:

- Making the int code more readable
- Making the work with cursors easier
- Speeding up the compilation process (static SQL is included into the class query and is therefore compiled only once)

Note:

- Cursors generated from queries of the %SQLQuery type are named automatically, e.g. Q14. All cursors used within a class must have different names
- Error messages refer to the internal names of cursors, which have additional characters at the end of their names. For example, an error in cursor Q140 is actually caused by cursor Q14
- Use PlaceAfter and make sure that cursors are used in the same int routine where they have been declared
- INTO must be used in conjunction with FETCH, not DECLARE

Example of static SQL for Utils.CustomQuery:

Query AllStatic() As %Query(CONTAINID = 1, ROWSPEC = "Id:%String,Prop1:%String,Prop2:%Integer") [ SqlName = AllStatic, SqlProc ]
{
}

ClassMethod AllStaticExecute(ByRef qHandle As %Binary) As %Status
{
    &sql(DECLARE C CURSOR FOR
        SELECT Id, Prop1, Prop2
        FROM Utils.CustomQuery
    )
    &sql(OPEN C)
    Quit $$$OK
}

ClassMethod AllStaticFetch(ByRef qHandle As %Binary, ByRef Row As %List, ByRef AtEnd As %Integer = 0) As %Status [ PlaceAfter = AllStaticExecute ]
{
    #; INTO must be with FETCH
    &sql(FETCH C INTO :Id, :Prop1, :Prop2)
    #; Check if we reached the end of data
    If (SQLCODE'=0) {
        Set AtEnd = 1
        Set Row = ""
        Quit $$$OK
    }
    Set Row = $Lb(Id, Prop1, Prop2)
    Quit $$$OK
}

ClassMethod AllStaticClose(ByRef qHandle As %Binary) As %Status [ PlaceAfter = AllStaticFetch ]
{
    &sql(CLOSE C)
    Quit $$$OK
}

Dynamic SQL

This approach is based on other class queries and dynamic SQL. It is reasonable when, in addition to the SQL query itself, you also need to perform some additional operations, e.g. execute an SQL query in several namespaces or escalate permissions before running the query. Example of dynamic SQL for Utils.CustomQuery:

Query AllDynamic() As %Query(CONTAINID = 1, ROWSPEC = "Id:%String,Prop1:%String,Prop2:%Integer") [ SqlName = AllDynamic, SqlProc ]
{
}

ClassMethod AllDynamicExecute(ByRef qHandle As %Binary) As %Status
{
    Set qHandle = ##class(%SQL.Statement).%ExecDirect(,"SELECT * FROM Utils.CustomQuery")
    Quit $$$OK
}

ClassMethod AllDynamicFetch(ByRef qHandle As %Binary, ByRef Row As %List, ByRef AtEnd As %Integer = 0) As %Status
{
    If qHandle.%Next()=0 {
        Set AtEnd = 1
        Set Row = ""
        Quit $$$OK
    }
    Set Row = $Lb(qHandle.%Get("Id"), qHandle.%Get("Prop1"), qHandle.%Get("Prop2"))
    Quit $$$OK
}

ClassMethod AllDynamicClose(ByRef qHandle As %Binary) As %Status
{
    Kill qHandle
    Quit $$$OK
}

Alternative approach: %SQL.CustomResultSet

Alternatively, you can create a query by subclassing the %SQL.CustomResultSet class. The benefits of this approach are:

- Slight increase in speed
- ROWSPEC is unnecessary, since all metadata is obtained from the class definition
- Compliance with object-oriented design principles

To create a query from a subclass of %SQL.CustomResultSet, perform the following steps:

- Define the properties corresponding to the resulting fields
- Define the private properties where the query context will be stored
- Override the %OpenCursor method (similar to queryNameExecute), which initiates the context. In case of any errors, set %SQLCODE and %Message as well
- Override the %Next method (similar to queryNameFetch), which obtains the next result and fills in the properties. The method returns 0 if all the data has been processed and 1 if some data is still remaining
- Override the %CloseCursor method (similar to queryNameClose) if necessary

Example of %SQL.CustomResultSet for Utils.CustomQuery:

Class Utils.CustomQueryRS Extends %SQL.CustomResultSet
{
Property Id As %String;
Property Prop1 As %String;
Property Prop2 As %Integer;

Method %OpenCursor() As %Library.Status
{
    Set ..Id = ""
    Quit $$$OK
}

Method %Next(ByRef sc As %Library.Status) As %Library.Integer [ PlaceAfter = %Execute ]
{
    Set sc = $$$OK
    Set ..Id = $Order(^Utils.CustomQueryD(..Id),1,val)
    Quit:..Id="" 0
    Set ..Prop1 = $Lg(val,2)
    Set ..Prop2 = $Lg(val,3)
    Quit $$$OK
}
}

You can call it from Caché ObjectScript code in the following way:

Set resultset = ##class(Utils.CustomQueryRS).%New()
While resultset.%Next() {
    Write resultset.Id,!
}

Another example is available in the SAMPLES namespace - the Sample.CustomResultSet class, which implements a query for Sample.Person.

Summary

Custom queries will help you to separate SQL expressions from Caché ObjectScript code and implement sophisticated behavior which can be too difficult for pure SQL.

References

Class queries
Iteration through a global
Static SQL
Dynamic SQL
%SQL.CustomResultSet
The Utils.CustomQuery class
The Utils.CustomQueryRS class

The author would like to thank Alexander Koblov for his assistance in writing this article.

In regards to the first part of your write-up: as a developer, if you want a quick and easy way to test the query from your terminal window (i.e. Cache Terminal, PowerTerm, Putty, Reflections, etc.), then you can run the query as follows:

Do ##class(%ResultSet).RunQuery({className},{queryName})

If you have input parameters, then you would pass those values as subscripts 3 - n. Example of running a query with two input parameters:

Do ##class(%ResultSet).RunQuery({className},{queryName},{inputValue1},{inputValue2})

So an example of running the first query in your write-up from a terminal window might be:

Do ##class(%ResultSet).RunQuery("Sample.Person","ByName","A")

The output from the query might be:

ID:Name:DOB:SSN:
53:Adam,Ralph O.:41730:657-43-6149:
33:Adam,Zoe X.:56117:982-36-6928:
80:Ahmed,Edgar Q.:33719:546-61-2688:
110:Alton,Umberto B.:30116:877-79-1057:
146:Avery,Valery P.:39515:310-11-8847:

Hope this helps and have a great day!!! Happy coding/testing those queries!

Quick followup note - so folks don't freak out about the output from the query example - the names, birthdates and SSNs displayed in the example were from a SAMPLES testing environment where data was auto-generated at random - so the data is fake and not for real people :)

Thanks! I often need to run queries from a terminal, so I extended Caché ObjectScript with a zsql command to run queries and display the results. Here's how it works:

zsql "SELECT TOP 2 Name FROM Sample.Person"

Would output:

Name
Adams,Chris Z.
Adams,Danielle P

To achieve it I created a %ZLANGC00 mac routine with the following code:

 ; %ZLANGC00 ; custom commands for ObjectScript
 ; http://docs.intersystems.com/cache20141/csp/docbook/DocBook.UI.Page.cls?KEY=GSTU_customize
 Quit
/// Execute Query and display the results
/// Call like this:
/// zsql "SELECT TOP 10 Name FROM Sample.Person"
ZSQL(Query)
 #Dim ResultSet As %SQL.StatementResult
 Set ResultSet = ##class(%SQL.Statement).%ExecDirect(, Query)
 Do ResultSet.%Display()
 Quit

Save and compile it, and then you can execute SQL (class queries too) in a terminal with the zsql command:

zsql "SELECT * FROM Sample.SP_Sample_By_Name('Z')"

That said, I myself prefer executing SQL queries in SMP because you don't need to type them there (drag&drop from the left panel or copy&paste from the code) - it's very convenient.

I typically use the SQL Shell in the terminal when doing this. It is very powerful.

SAMPLES>do $system.SQL.Shell()
SQL Command Line Shell
----------------------------------------------------
The command prefix is currently set to: <<nothing>>.
Enter q to quit, ? for help.
SAMPLES>>?
Interactive SQL Shell (c) InterSystems Corporation 1999-2009 All Rights Reserved
----------------------------------------------------
To execute an SQL statement, type it in and press ENTER.
To execute a multiline SQL statement, press <enter> to enter
multiline statement mode, enter the each statement line and
enter GO to exit multiline statement mode and execute the statement.
etc...

This uses %SQL.Statement to do all the work and has a number of options including saving of queries, etc.

I would also recommend trying Caché Web Terminal's SQL mode. This mode appeared with the 4.0 version.

For me personally, the greatest benefit of custom SQL queries is that you don't need any (persistent) data; you can just generate it on the fly, either computing it or bringing it from any arbitrary external system. Dan

Fantastic post @Eduard.Lebedyuk. The alternative approach, %SQL.CustomResultSet, is icing on the cake which is not covered by InterSystems documentation. It really should be. This is exactly what I needed for my current project. Thanks.

💡 The article is considered as an InterSystems Data Platform Best Practice.

For 2017.1CE and later as well as all IRIS versions, %SQL.CustomResultSet should not be used. Instead, use %SQL.CustomQuery. There are several good reasons for this. There is good class documentation available. I am happy to post examples if anyone is interested.

Hi @Daniel.Pasco, it would be great if you could share some examples with us. Best Regards, Henrique

First, keep in mind that all implementations (faithful implementations, that is) of %SQL.CustomQuery are also projected as table-valued functions. That means you can include the function in the FROM clause of a SELECT statement. The process of implementing a custom query is simple. These steps are described in the %SQL.CustomQuery class documentation, so I'll just summarize here:

1. Define a new class that extends %SQL.CustomQuery;
2. Override the SQLNAME parameter, assigning a valid SQL identifier that is to be the name of the TVF;
3. Define properties, in order, that are the columns of each row returned by this query. Let's call these "result columns". Each result column is defined as a non-private property;
4. Define properties that you will need to maintain the source data, pointers, etc. that you will use to manage the data used to produce rows. These properties are defined as "private";
5. Override %OpenCursor. Add parameters to this method override that correspond to the input parameters that will be passed when instantiating the custom query;
6. Override %FetchCursor. In this method, check for end of data. If not at the end, then populate all of the result properties with data and return 1 (true). Otherwise, clear all result properties and return 0;
7. Override %CloseCursor. In this override, release any resources acquired during instantiation and perform any necessary cleanup.

I won't post the version of the class that produces this output since the version of %Net.Http in current versions of CE/IRIS does not have a working GetJSON() method. The version of the class I'm posting simply passes in the raw JSON data as an argument. The query:

SELECT top 5 stateCode,name,population FROM example_custom.sample_custom_query('https://api.census.gov/data/2014/pep/natstprc?get=STNAME,POP&for=state:*&DATE_=7','Default') ORDER BY population DESC

and the results:

stateCode name population
06 California 38802500
48 Texas 26956958
12 Florida 19893297
36 New York 19746227
17 Illinois 12880580

5 row(s) affected

Using this version of a custom query class:

Class example.custom.JsonQuery Extends %SQL.CustomQuery
{

Parameter SQLNAME As String = "sample_custom_json_query";

Property data As %Library.DynamicAbstractObject [ Private ];
Property iterator As %Iterator.AbstractIterator [ Private ];
Property atEnd As %Integer [ InitialExpression = 0, Private ];
Property stateCode As %String;
Property name As %String;
Property population As %Integer;

Method %OpenCursor(data As %String(MAXLEN="")) [ Private ]
{
    try {
        if $isobject(data) {
            set ..data = data
        } else {
            set ..data = [].%FromJSON(data)
        }
        set ..iterator = ..data.%GetIterator()
        if '..iterator.%GetNext(.key,.value) {
            set ..atEnd = 0
            set ..iterator = ""
        }
    } catch exception {
        // this is just a place holder, this method reports errors by throwing an exception
        // but a catch can allow the user to log errors or perform some self-healing action
        throw exception
    }
}

Method %FetchCursor() As %Library.Integer
{
    set response = 0
    if ($isObject(..iterator)) && ('..atEnd) {
        if ..iterator.%GetNext(.key,.value) {
            set ..name = value.%Get(0)
            set ..population = value.%Get(1)
            set ..stateCode = value.%Get(3)
            set response = 1
        } else {
            set ..atEnd = 1
            set ..iterator = ""
        }
    } else {
        set ..name = ""
        set ..population = ""
        set ..stateCode = ""
    }
    return response
}

Method %CloseCursor() [ PlaceAfter = %Next, Private ]
{
    // not really necessary as %OnClose will automatically close the cursor during destruction
    // but users can place code here to clean up other resources allocated for this query instance
    // that are external to the query instance. Like a temporary global.
    set ..iterator = ""
    set ..data = ""
}

}

and this query

SELECT top 5 stateCode,name,population FROM example_custom.sample_custom_json_query('[["STNAME","POP","DATE_","state"],["Alabama","4849377","7","01"],["Alaska","736732","7","02"],["Arizona","6731484","7","04"],["Arkansas","2966369","7","05"],["California","38802500","7","06"],["Colorado","5355866","7","08"],["Connecticut","3596677","7","09"],["Delaware","935614","7","10"],["District of Columbia","658893","7","11"],["Florida","19893297","7","12"],["Georgia","10097343","7","13"],["Hawaii","1419561","7","15"],["Idaho","1634464","7","16"],["Illinois","12880580","7","17"],["Indiana","6596855","7","18"],["Iowa","3107126","7","19"],["Kansas","2904021","7","20"],["Kentucky","4413457","7","21"],["Louisiana","4649676","7","22"],["Maine","1330089","7","23"],["Maryland","5976407","7","24"],["Massachusetts","6745408","7","25"],["Michigan","9909877","7","26"],["Minnesota","5457173","7","27"],["Mississippi","2994079","7","28"],["Missouri","6063589","7","29"],["Montana","1023579","7","30"],["Nebraska","1881503","7","31"],["Nevada","2839099","7","32"],["New Hampshire","1326813","7","33"],["New Jersey","8938175","7","34"],["New Mexico","2085572","7","35"],["New York","19746227","7","36"],["North Carolina","9943964","7","37"],["North Dakota","739482","7","38"],["Ohio","11594163","7","39"],["Oklahoma","3878051","7","40"],["Oregon","3970239","7","41"],["Pennsylvania","12787209","7","42"],["Rhode Island","1055173","7","44"],["South Carolina","4832482","7","45"],["South Dakota","853175","7","46"],["Tennessee","6549352","7","47"],["Texas","26956958","7","48"],["Utah","2942902","7","49"],["Vermont","626562","7","50"],["Virginia","8326289","7","51"],["Washington","7061530","7","53"],["West Virginia","1850326","7","54"],["Wisconsin","5757564","7","55"],["Wyoming","584153","7","56"],["Puerto Rico Commonwealth","3548397","7","72"]]') ORDER BY population DESC

produces the same result:

stateCode name population
06 California 38802500
48 Texas 26956958
12 Florida 19893297
36 New York 19746227
17 Illinois 12880580

5 row(s) affected

I am happy to post other examples if you wish. Dan

Custom queries can also be instantiated without using SQL. Simply call %New and pass in the arguments that are defined by the %OpenCursor method. There is one difference here - the first argument of %New is the SELECTMODE and subsequent arguments correspond to the %OpenCursor arguments. Once instantiated, the interface is like any other %SQL.IResultSet.

USER>set result = ##class(example.custom.Query).%New(,"https://api.census.gov/data/2014/pep/natstprc?get=STNAME,POP&for=state:*&DATE_=7","Default")

USER>write result.%Next()
1
USER>write result.name
Alabama
USER>write result.population
4849377
USER>while result.%Next() { write !,result.name,": ",result.population }
Alaska: 736732
Arizona: 6731484
Arkansas: 2966369
California: 38802500

Another unknown feature that isn't mentioned anywhere else. I am a fan Robert...but... https://community.intersystems.com/post/new-video-sql-%E2%80%93-things-you-should-know Somewhere around the 36 minute mark of this video. Buried perhaps. Still, %SQL.CustomQuery has a lot of interesting capabilities.

Incredible! I really missed something important! @Daniel.Pasco Thanks for the pointer. I never stop learning.

This is a great feature Dan! I'm using it to write SQL queries to Pandas dataframes in Python (code).
Some of the reasons why I focus on utilizing class queries include:

- Studio and other editors are much better at providing coloring/syntax checking vs a strategy of setting a tSQL string as some arbitrary SQL statement.
- As mentioned in the original post, it provides the separation that allows for easy re-use and testing.
- If I have a class query I can decide to allow it to be reflected as a stored procedure, I can expose it to ZEN reports, ODBC/JDBC, JSON projection, %XML.DataSet, and I can use it to satisfy a control in ZEN that allows for binding to a class query, as well as others. Basically, it provides for great separation.

I also like the idea of first writing the statement using SQL as %SQLQuery. If for some reason I run into performance issues that I cannot overcome, I can change the implementation to %Query and implement COS code; the calling application would see the improved performance but does not have to understand the internal implementation. However, I've only done this on extremely rare occasions.

I think one of the most important aspects of SQL is to always read the query plan; it's there that you can understand what the optimizer is going to do. I care a lot about the query plan and almost always validate what it reports. Having a class query allows for easy Show Plan examination, whereas it's generally hard to do if you take the approach of setting a tSQL string (sometimes with concatenation over several lines). Class Queries are really worth investing in IMHO.
Announcement
Evgeny Shvarov · Dec 7, 2016

InterSystems Developer Community Birthday!

Hi, Community! Today, the 7th of December, is a very good day - today is the InterSystems Developer Community's birthday! One year ago on this day we sent tons of email invitations to you, dear members, to join DC. And you joined, and we hope you like the site! Please feel free to provide your feedback! Please post only positive feedback below in the comments; for the negative ones we have this page. So, what do we have after a year? How large are we? Here is the picture: we have 2,200 registered members from all over the world, a set of really helpful articles and a working Q&A forum. We have a site which gives us knowledge, experience and sometimes fun! Thank you for your contribution and participation! We can make the place better, so please share your ideas too. We have great plans for the next year in the Developer Community, stay tuned!

Congratulations! Congratz Gratz! Congrats!! The articles and discussions are very useful for all levels of Caché developers.

I find it interesting: when I think of Dec 7, I don't think of the Developer Community, I think of the sneak attack on Pearl Harbor, but maybe I am just getting old. You asked for ideas; one thing I asked for a long time ago and still think is needed is a Help function on how all of this works. It is not obvious, to be sure.

I must be getting old also... Actually that is an event that should always be remembered.

Good point, Mike! Thank you! We'll do that soon.

A day that shall live in infamy. You're not old, just an American.

I actually baked a cake to commemorate this occasion, and totally not because I'm fat and wanted cake.

Hard to believe it's already been a year
Question
Sébastien Demoustiez · Dec 15, 2016

GROUP_CONCAT() With InterSystems Caché

Hello, is there some way to reproduce the SQL GROUP_CONCAT (http://sql.sh/fonctions/group_concat) with the Caché DB? Thanks, Sébastien

Exactly that :) Thanks a lot. Strange, but with this function the performance is really bad :( 0.04s without it and 13s with it...

If you want to investigate the performance of the query you can always open a WRC case (wrc.intersystems.com :-D). Otherwise, send out your class definition and query plan and I can take a cursory look at it.

I tried with another table with just 2 properties (but more than 1 million records too) and it's the same problem. See: http://imgbox.com/Df2WlAPo http://imgbox.com/tEFMR5V8

Looks like it works exactly like the LIST function in Caché.

Looking at that documentation, one difference between LIST and GROUP_CONCAT is that GROUP_CONCAT lets you specify the separator, while LIST always uses a comma. If you wanted to use a different separator, and your data will never contain commas, then it's as easy as (for example):

select home_city as "City", count(*) as "Count", REPLACE(LIST(Name),',',' ') as "Names" from sample.person group by home_city

If "your data will never contain commas" is a bad assumption (as it is in the case of Name in Sample.Person), the solution is to use %DLIST and $ListToString:

select home_city as "City", count(*) as "Count", $ListToString(%DLIST(Name),' ') as "Names" from sample.person group by home_city

%DLIST builds a $ListBuild list, and $ListToString joins the list elements with the specified separator. %DLIST is useful in other cases too - for example, if your data might contain commas and you want to iterate over the aggregated data after running dynamic SQL.
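A small addendum on the ObjectScript side: since %DLIST returns a $List, the aggregated values can be consumed directly in code after running dynamic SQL. A minimal sketch along those lines (assuming the Sample.Person table from the SAMPLES namespace):

Set resultset = ##class(%SQL.Statement).%ExecDirect(, "SELECT home_city AS City, %DLIST(Name) AS Names FROM sample.person GROUP BY home_city")
While resultset.%Next() {
    // Names is a $List, so it can be joined with any separator or iterated with $ListNext
    Write !, resultset.%Get("City"), ": ", $ListToString(resultset.%Get("Names"), "; ")
}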
Question
Joaquin Montero · Mar 2, 2020

InterSystems Kubernetes Operator documentation

Hi Everyone, I've been working on deploying an IRIS for Health environment in EKS. There is a video session in the InterSystems learning portal about this feature, but I have not succeeded in finding the proper documentation and resources to use this in my Kubernetes cluster. Has this been deprecated/discontinued? Any idea where I can find the resources? Should I stick to StatefulSets instead of using the IrisCluster resource type provided by this operator? Tagging @Luca.Ravazzolo and @Steven.LeBlanc

I cannot find any real information about the Kubernetes operator either. They said in the video that it's ready and working, but it does not seem that way.

Hi guys, thank you for your interest in the InterSystems Kubernetes Operator. We are working hard at preparing it to be available in the 2020.2 timeframe. Stay tuned.
Discussion
Eduard Lebedyuk · Mar 6, 2020

Temporary Tables in InterSystems IRIS

Temporary tables are tables available to the current process only (and destroyed when the process ends). What are your approaches to creating temporary tables? Here are the two I know:

1. Process-private globals storage can be used as the data global in a storage definition. That way, each process can have its own objects for the class with PPG storage. Here's how. Here's how 2.
2. InterSystems TSQL supports #tablename temporary tables. A #tablename temporary table is visible to the current procedure of the current process. It is also visible to any procedure called from the current procedure. #tablename syntax is only supported in TSQL procedures (class methods projected as procedures with language tsql).

Any other ways to achieve similar functionality?

CREATE GLOBAL TEMPORARY TABLE <mytable> {} has similar effects as process-private globals...

Indeed, that's the recommended SQL way of achieving what Eduard described about PPGs. A drawback is that queries on these tables cannot be parallelized (as that implies multiple processes, of course). Our TSQL support is meant for Sybase customers wishing to redeploy their TSQL applications on IRIS (especially now that SAP/Sybase is terminating support for those platforms). Just temporary table support by itself wouldn't be a reason to start building TSQL applications and abandon IRIS SQL/ObjectScript, of course :-). However, for a recent TSQL migration we did some work on our TSQL temp table support and were considering rolling that out to regular IRIS SQL, so this thread is a good place to share your experiences and requirements so we can make sure to do that properly, as needed.

Hi, I recently needed a temporary class (not just a table) to store data while it was manipulated (imported, queried, modified, etc. and eventually exported). I eventually set up all the storage locations with PPG refs as noted above, which works, but did wonder if there was a class parameter I had missed that just indicated the extent was temporary. Or maybe an alternative to extending %Persistent? Mike

I'm not really sure what you mean by "if there was a class parameter I had missed that just indicated the extent was temporary" - what do you want to achieve with this?

Hi, I don't want to achieve anything else; it's just that, "cache being cache", there's often another way to do the same thing and it might be easier. :-) Mike
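For completeness, here is a minimal sketch of the CREATE GLOBAL TEMPORARY TABLE approach mentioned above, driven from ObjectScript through dynamic SQL (the table and column names are made up for illustration):

// The table definition is visible to all processes, but the data is private to the
// process that inserts it and goes away when that process ends
Do ##class(%SQL.Statement).%ExecDirect(, "CREATE GLOBAL TEMPORARY TABLE TempWork (ID INT, Val VARCHAR(50))")
Do ##class(%SQL.Statement).%ExecDirect(, "INSERT INTO TempWork (ID, Val) VALUES (1, 'abc')")
Set resultset = ##class(%SQL.Statement).%ExecDirect(, "SELECT ID, Val FROM TempWork")
While resultset.%Next() {
    Write !, resultset.%Get("ID"), " ", resultset.%Get("Val")
}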
Announcement
Derek Robinson · Mar 26, 2020

New Podcast: InterSystems Certification

In Episode 6 of Data Points, certification manager Jamie Kantor joins the podcast to tell us all about the InterSystems Certification program, what exams are currently being offered (and what ones are in development), and why it matters for developers and enterprises. Check it out! For more information and to subscribe on your favorite podcast app, head over to https://datapoints.intersystems.com. There, you can also find links to the previous episodes. Make sure to rate the podcast if you're using Apple Podcasts, as well!
Announcement
RB Omo · Mar 12, 2020

InterSystems COVID-19 Update

In light of the COVID-19 situation worldwide, InterSystems is working to ensure continuous support for our customers, promote the health and well-being of our employees, and protect the communities where we live and work. Please view our statement and frequently asked questions. In addition, we have also established a special email address, COVID-19@InterSystems.com, should you have any questions or concerns. As we have for 41 years, we stand by our commitment to customers and are prepared to support you during this challenging time. Sincerely, John Paladino Vice President, Client Services
Announcement
Anastasia Dyubaylo · Jan 13, 2020

InterSystems Benelux Symposium 2020

Dear Community, we're pleased to invite you to the InterSystems Benelux Symposium 2020, which will take place from February 11th to 12th in Antwerp, Belgium! At the Symposium, both InterSystems experts and external thought leaders will discuss what it takes to make your IT innovation work. You're more than welcome to join us in the Radisson Blu Astrid Hotel in Antwerp.

Fastest Path to Possible - With Digital Innovation

"The future lies in the hands of those who use data the best", said InterSystems CEO & Founder Terry Ragon at the InterSystems Global Summit 2019. "However, this data is spread across an enormous set of databases and hardware devices, meaning we need to find ways to connect these systems together." There is, of course, a way: interoperability. But how do you approach interoperability? And most importantly: how can you use it to create better applications and be a better partner to your customers? Be part of the discussions! Don't miss out and register for the event by clicking right here. Can't wait to see you in Antwerp! So, remember:

⏱ Time: February 11-12, 2020
📍 Venue: Radisson Blu Astrid Hotel, Antwerp, Belgium
✅ Registration: SAVE YOUR SEAT TODAY

I'm finally going to Antwerp this year. So you have a chance to meet me there, to see VSCode-ObjectScript in action, get quick help with the migration process, and give your feedback. See you there.

I will have two sessions on Feb 12th during the InterSystems Benelux Symposium on behalf of the CUG meetup. See you there!

So, today I'm presenting a demo of VSCode-ObjectScript at the CUG meetup here in Antwerp. The ObjectScript Package Manager presentation is attached.

Please welcome a brief video overview of the InterSystems Benelux Symposium 2020. Enjoy! 👍🏼
Announcement
Vadim Aniskin · Oct 2, 2023

InterSystems Ideas News #8

Hey Community! Welcome to the 8th edition of the InterSystems Ideas news bulletin! Here's what you can expect from it:

✓ New video describing all pages of the Ideas Portal
✓ Ideas implemented since the previous news bulletin

⏯ Watch the video: Join the Hall of Fame

In addition to the "Ideas" page, the Ideas Portal contains 4 new pages: Portal Guide, News, Polls, and Hall of Fame. Check them out to always be up to date about what's going on on the Ideas Portal!

Since the launch of the Ideas Portal, 24 of the ideas have already been implemented by Community members, and 6 of them were realized during the last 3 months:

- Include support for GraphQL (idea by @Jose-Tomas.Salvador), implemented as GraphQL by @Gevorg.Arutiunian9096
- REST API for Security Package (idea by @Evgeny.Shvarov), implemented as Config-API by @Lorenzo.Scalese
- Jupyter Notebook (idea by @Alex.MacLeod), implemented as Jupyter Server Proxy for VS Code by @John.Murray
- Java Hibernate support for IRIS (idea by @Dmitry.Maslennikov), implemented as IRIS Hibernate Dialect by @Yuri.Gomes
- Example of Flask application with SQLAlchemy IRIS (idea by @Dmitry.Maslennikov), implemented as IRIS-FlaskBlog by @Muhammad.Waseem
- Support for Liquibase (idea by @Yuri.Gomes), implemented as liquibase-iris by @Dmitry.Maslennikov

👏 Thank you for implementing and posting these ideas 👏

Stay tuned to read our next announcements! In the meantime post your brilliant ideas, vote for existing ideas, and comment on them on our InterSystems Ideas Portal!

Hi Developers! Based on portal users' requests, we recently added a new page to the Ideas Portal: Idea-A-Thon Winners. Idea-a-Thon winners: @Andre.LarsenBarbosa, @Yuri.Gomes, @Heloisa.Paiva, @Scott.Roth, @Sharafat.Hussain, @Danny.Wijnschenk. I hope you like this page, and thanks again for your ideas!

Vote for ideas to make them real!
Announcement
Brenna Quirk · Nov 14, 2023

Learn to code in InterSystems IRIS!

New to coding in InterSystems IRIS® data platform? Try this brand-new learning program to get the basics, and see how to develop an application with InterSystems ObjectScript alongside your language of choice: Java, .NET, Python, or Node.js.
Announcement
Vadim Aniskin · Dec 27, 2023

InterSystems Ideas News #10

Hi Developers! Welcome to Issue 10 of the InterSystems Ideas newsletter! Here's what you can expect from it:

✓ Month of ideas about "FHIR and Digital Health Interoperability"
✓ Reached 300 ideas on the Ideas Portal 🎉
✓ Recently implemented ideas

Since the new programming contest is dedicated to FHIR and Digital Health Interoperability, we've decided to gather all the related ideas in one place (see the list below). To show your interest and support for your fellow members, we invite everyone to vote for existing ideas or post new ones before the start of the contest. Create new ideas devoted to "FHIR and Digital Health Interoperability" and promote your ideas on the Community by creating discussions. The more votes an idea receives, the greater the chances are of its implementation.

Vote and comment on existing ideas related to FHIR and Digital Health Interoperability:

- RPMShare - Database solution for remote patient | InterSystems Ideas (by @Dmitry.Alexeev)
- HL7 test message generator (by @Vadim.Aniskin)
- Introduce an Interoperability adapter to import apple health data to IRIS (by @Evgeny.Shvarov)
- Intellisense for FHIR resources (by @Jose-Tomas.Salvador)
- Interoperability Production: "Add a 'Type-to-Filter' ability in dropdown selections" (by @Victoria.Castillo2990)
- Data Analyzer (by @Stella.Tickler)
- HealthShare Patient Unmerge Ooopses (by @Sawyer.Butterfield)
- Iterative build of TrakCare configuration/code tables utilising FHIR and HL7 Messaging (by Linda McKay)
- Editable HL7 Editor in the Message Viewer (by @Evgeny.Shvarov)
- FHIR step by step setup (by @Scott.Roth)
- Introduce an Interoperability Adapter for POCT1A and POCT1A-2 standards (by @Evgeny.Shvarov)
- Speech Recognition to Prepopulate Incoming Patient Data (by Jayesh Gupta)

When we launched the portal last year, we had very few ideas. A year and a half later, the number of ideas on the portal exceeded 300. 🌟 Thank you so much @Armin.Gayl5931 for posting the 300th idea, Optimization of the HL7 Schema documentation, to the Ideas Portal. 👏 Thanks to all Community members who publish, comment, vote, and implement ideas. Looking forward to your new bright ideas in 2024. 👏

To round up this newsletter, here are 4 ideas implemented during the last 2 months:

- Credly badge for VIP into GM (idea by @Yuri.Gomes), implemented as Global Masters Credly Badges - new way to share your accomplishment! by InterSystems
- Run Jasperreports from ObjectScript and IRIS Adapter (idea by @Yuri.Gomes), implemented as JIRISReport by @Yuri.Gomes
- Journal file analysis/visualization (idea by @Timothy.Leavitt), implemented as journal-file-indexer by @Lorenzo.Scalese
- Implement samples of $system.external interface (aka "InterSystems External Servers") (idea by @Enrico.Parisi), implemented as samples-dynamicgateway-dotnet and samples-dynamicgateway-java by @Daniel.Pasco and @Summer.Gerry

👏 Thank you for implementing and posting these ideas 👏

Happy New Year! 🎄 In 2024 post your brilliant ideas, vote for existing ideas, and comment on them on our InterSystems Ideas Portal! 💡

Thank you for all of the ideas which make all of our products and online offerings better :-)

Hi Developers! @Muhammad.Waseem with the idea Web Interface to convert HL7 V2 to FHIR and @Maksym.Shcherban with the idea Implement support for FHIRPath Patch resources on InterSystems FHIR Server: thank you for creating your ideas related to the "FHIR and Digital Health Interoperability" topic.

300 ideas! Wow! Congratulations @Vadim.Aniskin and the community itself!

300 ideas! Another milestone! 🎊🎊🎊 Community! Thank you all for sharing your ideas, suggestions and feedback on the Ideas Portal 👏
Announcement
Raj Singh · Nov 13, 2023

InterSystems Studio deprecation news

As first announced in May, and after many presentations and robust discussions at Global Summit 2023, we are continuing with the plans to deprecate InterSystems Studio (deprecated designates a feature or technology that InterSystems no longer actively develops, and for which better options exist). InterSystems Studio was deprecated in 2023.2 but will continue to ship with versions 2023.3 and 2024.1. The latter will be the last release that includes Studio as part of the Windows kit, and it will be the only version that allows connections to future versions of InterSystems IRIS. This version that ships with 2024.1 will continue to be available as a standalone tool on the WRC component download page as it is today. As always, VS Code with the InterSystems ObjectScript Extension Pack is recommended for all InterSystems' development. Check out the Studio migration guide for help getting started. And if you haven't tried VS Code recently, note a new feature, XML import/export, makes adoption even easier. Thank you deeply for your in-person feedback this past June. I'm always more than happy to hear your experiences and challenges, and look forward to you joining us in enhancing our new flagship IDE.
Article
Yuri Marx · Nov 20, 2023

Parquet files and InterSystems IRIS

In the world of Big Data, selecting the right file format is crucial for efficient data storage, processing, and analysis. With the massive amount of data generated every day, choosing the appropriate format can greatly impact the speed, cost, and accuracy of data processing tasks. There are several file formats available, each with its own set of advantages and disadvantages, making the decision of which one to use complex. Some of the popular Big Data file formats include CSV, JSON, Avro, ORC, and Parquet.

The last one, Parquet, is a columnar storage format that is highly compressed and splittable, making it ideal for Big Data problems. It is optimized for the Write Once Read Many (WORM) paradigm and is good for heavy workloads that read portions of the data. The format provides predicate pushdown and projection pushdown, reducing disk I/O and overall processing time. Parquet is self-describing, has built-in support in Spark, and can be read and written using the Avro API and Avro Schema. (source: https://www.linkedin.com/pulse/big-data-file-formats-which-one-right-fit-amandeep-modgil)

Have you ever used pd.read_csv() in pandas? Well, that command could have run ~50x faster if you had used Parquet instead of CSV. (source: https://towardsdatascience.com/demystifying-the-parquet-file-format-13adb0206705)

The reasons for and benefits of using Parquet are (source: https://www.linkedin.com/pulse/big-data-file-formats-which-one-right-fit-amandeep-modgil):

- Optimized for Big Data, but readability and writing speed are quite poor.
- Parquet is the best choice for performance when choosing a data storage format in Hadoop, considering factors like integration with third-party applications, schema evolution, and support for specific data types.
- Compression algorithms play a significant role in reducing the amount of data and improving performance.
- CSV is typically the fastest to write, JSON the easiest to understand for humans, and Parquet the fastest to read a subset of columns.
- Columnar formats like Parquet are suitable for fast data ingestion, fast random data retrieval, and scalable data analytics.
- Parquet is used for further analytics after preprocessing, where not all fields are required.

Currently, IRIS does not support the Parquet format, even though it is a very important standard for Data Fabric, Big Data, Data Ingestion and Data Interoperability projects. But now, through the iris-parquet application (https://openexchange.intersystems.com/package/iris-parquet), it is possible to write or read IRIS data in Parquet files. Just follow the steps below:

1. Install iris-parquet using Docker or ZPM.

If Docker:

docker-compose build
docker-compose up -d

If ZPM:

USER> zpm install iris-parquet

Install the hadoop files and set the HADOOP_HOME environment variable (for ZPM installation only):

wget https://dlcdn.apache.org/hadoop/common/hadoop-3.3.6/hadoop-3.3.6.tar.gz && \
tar -xzf hadoop-3.3.6.tar.gz && \
echo "export HADOOP_HOME=//hadoop-3.3.6"

2. Open http://localhost:<port>/swagger-ui/index.html and explore the parquet-api _spec endpoint:
3. Run the method /generate-persons one or more times to generate sample fake person data:
4. Run the method /sql2parquet with this query in the body: select * from dc_irisparquet.SamplePerson:
5. Download the parquet file via the Download file link:
6. Open the Parquet file in VSCode (install the parquet-viewer extension to see the parquet content from VSCode - https://marketplace.visualstudio.com/items?itemName=dvirtz.parquet-viewer):

Excellent!
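If you prefer to call the same endpoints from ObjectScript instead of the Swagger UI, a rough sketch with %Net.HttpRequest could look like the one below. The web server port and the endpoint path are assumptions based on the steps above (check the generated swagger spec for the exact base URL), and the SQL text is simply posted as the request body:

Set request = ##class(%Net.HttpRequest).%New()
Set request.Server = "localhost"
// Assumed web server port; use the port of your own instance
Set request.Port = 52773
Set request.ContentType = "text/plain"
Do request.EntityBody.Write("select * from dc_irisparquet.SamplePerson")
// Assumed path for the sql2parquet endpoint described in step 4
Set status = request.Post("/parquet/sql2parquet")
If $$$ISERR(status) {
    Do $system.OBJ.DisplayError(status)
} Else {
    Write "HTTP status: ", request.HttpResponse.StatusCode
}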
Can you explain your inspiration for this project? Were you inspired 😉? Have you considered a project to migrate a Parquet lake to Delta Lake?

Thanks Mike. My inspiration comes from looking for current IRIS gaps in important areas and helping to close those gaps with new DC projects. IRIS with sharding enabled is a great option for lake projects.
Article
Yuri Marx · Feb 23, 2023

IoT with InterSystems IRIS

IoT (Internet of Things) is a network of interconnected things, including vehicles, machines, buildings, domestic devices or any other thing with embedded TCP/IP remote connection available, allowing it to receive and send execution instructions and data. Each thing provides one or more services to the IoT network. For instance, smart light bulbs provide services of turning off and turning on the lights; smart air conditioners maintain the environment temperature; smart cameras send notifications when capturing movement. Yet, all those devices can operate together to provide building security, for example.When the camera detects movement, the lights turn on. The orchestration of those processes that can deliver business and integrated services, such as building security, is done by a server software controller. It can receive and send data and instructions from a dozen smart devices to a TCP/IP network (like the Internet) to operate these things as major services. There are two options of protocols used by things and server software to interoperate data and instructions: MQTT (Message Queuing Telemetry Transport) and REST (Representational State Transfer) API. The first one is more common because it has a simpler structure. Some devices have limited memory and processor features and don’t have enough resources to use REST. Thus, this article will detail how to implement the server-side software to interoperate the IoT using MQTT. About MQTT MQTT is a lightweight, publish-subscribe, machine to machine network protocol for message queue/message queuing service. It is designed for connections with remote locations that have devices with resource constraints or limited network bandwidth. It must run over a transport protocol that provides ordered, lossless, bi-directional connections—typically, TCP/IP. It is an open OASIS standard and an ISO recommendation (ISO/IEC 20922). See more on https://en.wikipedia.org/wiki/MQTT. The MQTT protocol has two actors to interoperate data between things: the message broker and the clients. The first one is a server software used to receive and publish messages to be produced or consumed by one or more clients. The last ones are the smart devices and server softwares used to provide major services using the thing’s services, such as a server software to provide a building automation service, or a software to control the traffic in the city using smart semaphores and public cameras services. MQTT Broker There are various MQTT message brokers on the market, but the most popular one is the Eclipse Mosquitto (https://mosquitto.org/), so we will use it in our example.The MQTT Broker is similar to JMS or MQ brokers. It is an intermediate employed to receive messages and post them into topics. Thus, the connected clients can subscribe to one or more topics to produce or consume messages from those topics. The topics allow the clients (software and devices) to do their job in an uncoupled and asynchronous way. It means that each client observes a message and takes action if necessary. They can also post new messages themselves. For instance, if a thermostat sends a message indicating high temperature in the topic “building A”, the air conditioning installed in that building and listed in the topic “building A” can take this message from there and turn itself on to reduce the temperature. However, it is common to see server-side software with business rules automated to turn on the air conditioning when the temperature reaches a specific value. 
The MQTT broker can also guarantee message delivery if it is configured to retain the messages when down and restore them when on again. Other advantages of the MQTT Broker include (from Wikipedia): Eliminating vulnerable and insecure client connections, if configured to do so. Being able to easily scale from a single device to thousands of gadgets. Managing and tracking all client connection states, including security credentials and certificates, if configured to do so. Reducing network strain without compromising the security, if configured to do so (when working with a cellular or satellite network). MQTT messages There are 3 types of messages: Connect: establish a connection between the broker and clients in a TCP/IP session. Disconnect: disconnect the server and the client from a TCP/IP session. Publish: put a message into a topic and send a copy of that message to each client that subscribed to the topic. Check out an example of messages in action: InterSystems IRIS support to IoT and MQTT protocol According to the InterSystems IRIS documentation, the currently supported version is MQTT 3.1. This MQTT specification is defined as an OASIS standard, MQTT Version 3.1.1. The Interoperability module from IRIS defines an Inbound (to consume MQTT messages) and an Outbound (to produce MQTT messages) MQTT Adapter. It is possible to develop custom business services and operations using these adapters, or you can use the built-in business service and operation, EnsLib.MQTT.Service.Passthrough and EnsLib.MQTT.Operation.Passthrough, respectively. If you want to use the MQTT protocol outside of an interoperability production, you can use the lower-level %Net.MQTT classes. The MQTT classes use the Eclipse Paho MQTT C Client Library. To create a Business Service to consume MQTT messages Class packagename.NewService1 Extends Ens.BusinessService { Parameter ADAPTER = "EnsLib.MQTT.Adapter.Inbound"; Method OnProcessInput(pInput As EnsLib.MQTT.Message, pOutput As %RegisteredObject) As %Status { set tsc=$$$OK set messageContent = pInput.StringValue …. Quit tsc } } To create a Business Operation to produce/send MQTT messages Class packagename.NewOperation1 Extends Ens.BusinessOperation { Parameter ADAPTER = "EnsLib.MQTT.Adapter.Outbound"; Parameter SETTINGS = "-SendSuperSession"; Method OnMessage(pRequest As packagename.Request, Output pResponse As packagename.Response) As %Status { #dim tSC As %Status = $$$OK #dim e As %Exception.AbstractException Try { Set message = ##class(EnsLib.MQTT.Message).%New() Set message.Topic = ..Adapter.Topic Set jsonValue = {} Set jsonValue.message = "Response message” Set message.StringValue = jsonValue.%ToJSON() Set tSC=..Adapter.Send(message.Topic,message.StringValue) Set pResponse = ##class(packagename.Response).%New() Set pResponse.message = “Message response” } Catch e { Set tSC=e.AsStatus() } Quit tSC } Settings for the Inbound and Outbound MQTT Adapter Both Business Service and Business Operation can be configured with the following parameters (source: https://docs.intersystems.com/irisforhealthlatest/csp/docbook/DocBook.UI.Page.cls?KEY=EMQTT_reference): Client ID: This is the string that identifies this client to the broker. It must be ASCII-encoded and contain between 1 and 23 characters. Connect Timeout: This is the connection timeout. Connecting to a busy server may take some time, so this timeout can be used to avoid a premature connection failure. It specifies the number of seconds to wait before a connection attempt fails. 
Credentials Name: This is the ID name of the set of credentials values used to access the MQTT broker. The username and password defined in your Credentials item must be ASCII-encoded. It is not needed if the broker does not require login credentials. Keep Alive: Specifies the maximum number of seconds to pass between messages sent by the client to the broker. QOS: This determines the quality of service required. It can have either of these two values: 0 - QOSFireAndForget: Do not wait for a response from the broker and 1 - QOSWaitForDelivery: Wait for a response from the broker and issue an error if the broker does not respond. Retained: This is the flag that indicates to the broker whether the message should be retained. Topic: This is the name of the topic to which you wish to publish or subscribe. The topic must be ASCII-encoded. The topic is typically a hierarchical string with levels of subtopics separated by a / (forward slash). In a subscription, a topic can have wildcards as a topic level. URL: This is the URL of the broker with which you wish to communicate. The scheme is either “tcp” or “ssl” followed by the domain name and the port delimited by a “:”, for example, “tcp://BIGBADAPPLE.local:1883”. Typically TLS-enabled endpoints are configured with a port of 8883, but this is not mandatory. Download and Install the Sample application The IRIS IoT Sample is a simple application to show you how to consume (receive) and produce (send) MQTT messages using an Interoperability Production. To get it, go to https://openexchange.intersystems.com/package/IoT-Sample. Now, execute the next steps. 1. Clone/git pull the repo into any local directory $ git clone https://github.com/yurimarx/iris-iot-sample.git 2. Open the terminal in this directory and run: $ docker-compose build 3. Run the IRIS container with your project: $ docker-compose up -d If you want to install it using ZPM, follow the next steps: 1. Open IRIS Namespace with Interoperability Enabled.2. Open Terminal and call: USER>zpm "install iris-iot-sample" Run the Sample Application. 1. Open the production and start it. It will begin observing the MQTT topic /DeviceStatusInputTopic and produce responses to the MQTT topic /DeviceStatusOutputTopic. Check out: 2. Use an MQTT client to send a message and test the production. To do it, go to https://chrome.google.com/webstore/detail/mqttbox/kaajoficamnjijhkeomgfljpicifbkaf on your Google Chrome browser. It is the Chrome Plugin MQTTBox. Just click Add to Chrome and then Add App. 3. In your Google Chrome browser go to chrome://apps/ and select MQTTBox (if required, click Open Anyway). 4. Click Create MQTT Client button. 5. Configure the MQTT connection with these settings: Client name: Local Protocol: mqtt / tcp Host: localhost:1883 Username: admin Password: admin All other settings stay the default values. 6. Configure the MQTT topics to send and receive MQTT messages: Topic to publish: /DeviceStatusInputTopic Topic to subscribe: /DeviceStatusOutputTopic Payload: { "deviceId":"Air Conditioner Level 1", "statusDate":"2023-01-07 14:03:00", "status": 0 } 7. Click the button Subscribe to check the messages on the topic /DeviceStatusOutputTopic. 8. Click the button Publish to send a message to /DeviceStatusInputTopic and see the results produced by IRIS production on /DeviceStatusOutputTopic. 9. 
The sample application source code

The Dockerfile

ARG IMAGE=intersystemsdc/irishealth-community
ARG IMAGE=intersystemsdc/iris-community
FROM $IMAGE
WORKDIR /home/irisowner/irisbuild
ARG TESTS=0
ARG MODULE="iris-iot-sample"
ARG NAMESPACE="USER"
RUN --mount=type=bind,src=.,dst=. \
    iris start IRIS && \
    iris session IRIS < iris.script && \
    ([ $TESTS -eq 0 ] || iris session iris -U $NAMESPACE "##class(%ZPM.PackageManager).Shell(\"test $MODULE -v -only\",1,1)") && \
    iris stop IRIS quietly

The latest version of the InterSystems IRIS Community image is used to create a Docker instance of InterSystems IRIS with a namespace called USER.

The docker-compose file

version: '3.6'
services:
  mosquitto:
    image: eclipse-mosquitto:2
    container_name: mosquitto
    user: root
    volumes:
      - ./mosquitto/config/:/mosquitto/config/
      - ./mosquitto/log/:/mosquitto/log/
      - ./mosquitto/data/:/mosquitto/data/
    ports:
      - 1883:1883
      - 9001:9001
  iris:
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    command: --check-caps false
    ports:
      - 1972
      - 52795:52773
      - 53773
    volumes:
      - ./:/irisdev/app

Two Docker containers are created and brought up. The first one is the MQTT broker (the mosquitto service), based on the Eclipse Mosquitto product; the second one is the MQTT server-side software, responsible for consuming and producing MQTT messages in order to provide the device monitoring services. The Mosquitto instance is configured in the file /mosquitto/config/mosquitto.conf, which defines the port, message persistence, security, and logging options (a sketch of a typical configuration is shown after the module.xml section below). The file /mosquitto/config/password.txt defines the user admin with the password admin; it is hashed by the command mosquitto_passwd -U password.txt (you can read more about this at https://mosquitto.org/man/mosquitto_passwd-1.html).

The iris.script file

zn "%SYS"
Do ##class(Security.Users).UnExpireUserPasswords("*")
zn "USER"
Do ##class(EnsPortal.Credentials).SaveItem(0, "mosquitto_cred","mosquitto_cred","admin","admin","")
zpm "load /home/irisowner/irisbuild/ -v":1:1
halt

This script unexpires the default user passwords, creates the credentials that the business service and business operation use to log in to the MQTT broker, and loads the application with ZPM using module.xml.

The module.xml

This file is used to compile the source code on the server and to install the sample application when using ZPM.

<?xml version="1.0" encoding="UTF-8"?>
<Export generator="Cache" version="25">
  <Document name="iris-iot-sample.ZPM">
    <Module>
      <Name>iris-iot-sample</Name>
      <Description>A simple IRIS interoperability application - for IoT using MQTT.</Description>
      <Version>1.0.8</Version>
      <Packaging>module</Packaging>
      <Dependencies>
        <ModuleReference>
          <Name>sslclient</Name>
          <Version>1.0.1</Version>
        </ModuleReference>
      </Dependencies>
      <SourcesRoot>src</SourcesRoot>
      <Resource Name="dc.irisiotsample.PKG"/>
      <SystemRequirements Version=">=2020.1" Interoperability="enabled" />
    </Module>
  </Document>
</Export>

The SourcesRoot tag points to the folder that holds the sources, and the Resource tag lists the packages to be compiled.
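For reference, the mosquitto.conf mentioned above typically contains directives along the following lines. This is only a sketch of a plausible configuration for this setup, not necessarily the exact file shipped in the repository:

# Listen for MQTT connections on the default port
listener 1883

# Persist retained messages and subscriptions across restarts
persistence true
persistence_location /mosquitto/data/

# Write the broker log to the mounted log volume
log_dest file /mosquitto/log/mosquitto.log

# Require authentication against the hashed password file
allow_anonymous false
password_file /mosquitto/config/password.txt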
The DeviceStatus persistent class

Class dc.irisiotsample.DeviceStatus Extends %Persistent
{

Property deviceId As %String;

Property statusDate As %TimeStamp;

Property status As %Boolean;

Storage Default
{
<Data name="DeviceStatusDefaultData">
<Value name="1">
<Value>%%CLASSNAME</Value>
</Value>
<Value name="2">
<Value>deviceId</Value>
</Value>
<Value name="3">
<Value>statusDate</Value>
</Value>
<Value name="4">
<Value>status</Value>
</Value>
</Data>
<DataLocation>^dc.irisiotsample.DeviceStatusD</DataLocation>
<DefaultData>DeviceStatusDefaultData</DefaultData>
<IdLocation>^dc.irisiotsample.DeviceStatusD</IdLocation>
<IndexLocation>^dc.irisiotsample.DeviceStatusI</IndexLocation>
<StreamLocation>^dc.irisiotsample.DeviceStatusS</StreamLocation>
<Type>%Storage.Persistent</Type>
}

}

This class persists the device status records in an SQL table.

The DeviceStatusRequest class

Class dc.irisiotsample.DeviceStatusRequest Extends Ens.Request
{

Property deviceId As %String;

Property statusDate As %TimeStamp;

Property status As %Boolean;

Storage Default
{
<Data name="DeviceStatusRequestDefaultData">
<Subscript>"DeviceStatusRequest"</Subscript>
<Value name="1">
<Value>deviceId</Value>
</Value>
<Value name="2">
<Value>statusDate</Value>
</Value>
<Value name="3">
<Value>status</Value>
</Value>
</Data>
<DefaultData>DeviceStatusRequestDefaultData</DefaultData>
<Type>%Storage.Persistent</Type>
}

}

This class is used to exchange data between the interoperability components.

The DeviceStatusService

Class dc.irisiotsample.DeviceStatusService Extends Ens.BusinessService
{

Parameter ADAPTER = "EnsLib.MQTT.Adapter.Inbound";

Method OnProcessInput(pInput As EnsLib.MQTT.Message, pOutput As Ens.StringResponse) As %Status
{
    set tsc = $$$OK
    set DeviceStatusValue = ##class(%DynamicAbstractObject).%FromJSON(pInput.StringValue)

    set DeviceStatus = ##class(dc.irisiotsample.DeviceStatusRequest).%New()
    set DeviceStatus.deviceId = DeviceStatusValue.deviceId
    set DeviceStatus.statusDate = DeviceStatusValue.statusDate
    set DeviceStatus.status = DeviceStatusValue.status

    set tsc = ..SendRequestSync("DeviceStatusProcess", DeviceStatus, .Response, -1, "Device Status Process")
    set pOutput = Response

    quit tsc
}

}

The ADAPTER parameter indicates that the inbound MQTT adapter is used to receive MQTT messages. The OnProcessInput method receives the MQTT message in pInput and forwards it to the business process by calling SendRequestSync as a synchronous message.

The DeviceStatusProcess

Class dc.irisiotsample.DeviceStatusProcess Extends Ens.BusinessProcess
{

Method OnRequest(request As dc.irisiotsample.DeviceStatusRequest, Output response As Ens.StringResponse) As %Status
{
    Set tsc = 1
    Set response = ##class(Ens.StringResponse).%New()

    Set DeviceStatus = ##class(dc.irisiotsample.DeviceStatus).%New()
    Set DeviceStatus.deviceId = request.deviceId
    Set DeviceStatus.status = request.status
    Set DeviceStatus.statusDate = request.statusDate
    Set tsc = DeviceStatus.%Save()

    If $$$ISOK(tsc) {
        Set tsc = ..SendRequestSync("DeviceStatusOperation", request, .pResponse, -1, "Device Status Operation")
        Set response.StringValue = "Device id "_pResponse.deviceId_" has the status "_pResponse.status
    } Else {
        Set response.StringValue = "Error on save the device status"
        Set SuspendMessage = 1
    }
    quit tsc
}

Storage Default
{
<Type>%Storage.Persistent</Type>
}

}

This class receives the message from the business service, saves it to the database, and calls DeviceStatusOperation to send (produce) a message with the results.
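Because the business process persists every reading, you can inspect the stored rows with plain SQL. A minimal sketch, assuming the default projected table name dc_irisiotsample.DeviceStatus (the class defines no custom SqlTableName):

// List the persisted device status records from a terminal session
Set rs = ##class(%SQL.Statement).%ExecDirect(,
    "SELECT deviceId, statusDate, status FROM dc_irisiotsample.DeviceStatus ORDER BY statusDate DESC")
While rs.%Next() {
    Write !, rs.deviceId, " | ", rs.statusDate, " | ", rs.status
}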
The DeviceStatusOperation

Class dc.irisiotsample.DeviceStatusOperation Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.MQTT.Adapter.Outbound";

Parameter SETTINGS = "-SendSuperSession";

Method NotifyDeviceStatus(pRequest As dc.irisiotsample.DeviceStatusRequest, Output pResponse As dc.irisiotsample.DeviceStatusResponse) As %Status
{
    #dim tSC As %Status = $$$OK
    #dim e As %Exception.AbstractException

    Try {
        Set message = ##class(EnsLib.MQTT.Message).%New()
        Set message.Topic = ..Adapter.Topic
        Set jsonValue = {}
        Set jsonValue.message = "Device "_pRequest.deviceId_" has status "_pRequest.status
        Set message.StringValue = jsonValue.%ToJSON()
        Set tSC = ..Adapter.Send(message.Topic, message.StringValue)

        Set pResponse = ##class(dc.irisiotsample.DeviceStatusResponse).%New()
        Set pResponse.deviceId = pRequest.deviceId
        Set pResponse.status = pRequest.status
    } Catch e {
        Set tSC = e.AsStatus()
    }
    Quit tSC
}

XData MessageMap
{
<MapItems>
    <MapItem MessageType="dc.irisiotsample.DeviceStatusRequest">
        <Method>NotifyDeviceStatus</Method>
    </MapItem>
</MapItems>
}

}

This class receives a message from the business process and produces an MQTT message with the process response on the output topic.

The DeviceStatusProduction

Class dc.irisiotsample.DeviceStatusProduction Extends Ens.Production
{

XData ProductionDefinition
{
<Production Name="dc.irisiotsample.DeviceStatusProduction" LogGeneralTraceEvents="false">
  <Description></Description>
  <ActorPoolSize>2</ActorPoolSize>
  <Item Name="DeviceStatusService" Category="" ClassName="dc.irisiotsample.DeviceStatusService" PoolSize="1" Enabled="true" Foreground="false" Comment="" LogTraceEvents="false" Schedule="">
    <Setting Target="Adapter" Name="ClientID">InterSystemsIRIS</Setting>
    <Setting Target="Adapter" Name="Topic">/DeviceStatusInputTopic</Setting>
    <Setting Target="Adapter" Name="Url">tcp://mosquitto:1883</Setting>
    <Setting Target="Adapter" Name="CredentialsName">mosquitto_cred</Setting>
  </Item>
  <Item Name="DeviceStatusProcess" Category="" ClassName="dc.irisiotsample.DeviceStatusProcess" PoolSize="1" Enabled="true" Foreground="false" Comment="" LogTraceEvents="false" Schedule="">
  </Item>
  <Item Name="DeviceStatusOperation" Category="" ClassName="dc.irisiotsample.DeviceStatusOperation" PoolSize="1" Enabled="true" Foreground="false" Comment="" LogTraceEvents="false" Schedule="">
    <Setting Target="Adapter" Name="ClientID">InterSystemsIRIS</Setting>
    <Setting Target="Adapter" Name="Topic">/DeviceStatusOutputTopic</Setting>
    <Setting Target="Adapter" Name="Url">tcp://mosquitto:1883</Setting>
    <Setting Target="Adapter" Name="CredentialsName">mosquitto_cred</Setting>
  </Item>
</Production>
}

}

This class configures the business service, the business process, and the business operation with the required parameters and puts these components to work together. The main configured parameters are:

- URL: the MQTT broker address.
- Topic: the topic used to receive and send MQTT messages - /DeviceStatusInputTopic for the business service and /DeviceStatusOutputTopic for the business operation.
- CredentialsName: the credentials holding the username and password used to connect to the MQTT broker.
- ClientID: a logical name assigned to the InterSystems IRIS instance, required by the broker to identify MQTT clients.
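One class referenced by the operation above but not listed in the article is dc.irisiotsample.DeviceStatusResponse. Judging from how NotifyDeviceStatus populates it, it is presumably a small Ens.Response subclass along these lines (a sketch, not the actual source from the repository):

/// Sketch of the response message presumably returned by DeviceStatusOperation
Class dc.irisiotsample.DeviceStatusResponse Extends Ens.Response
{

/// Identifier of the device the notification refers to
Property deviceId As %String;

/// Reported status of the device
Property status As %Boolean;

}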
As you can see, the IRIS interoperability adapter for MQTT is an easy and powerful tool for consuming and producing MQTT messages and for automating the business flow around IoT devices. So, enjoy it.

Very good article, Yuri. Wonder when InterSystems will support the AMQP protocol, to integrate seamlessly with ASB (Azure Service Bus).

Thanks! Yaron, I'd appreciate it if you submitted it as an idea.

This is cool. Thank you very much. This gives you the opportunity to build a "Smart Home" server in COS :-)

- funny. Many thanks

Thanks Yuri for this article and app. As MQTTBox doesn't work for me (no longer compatible), I'm using the MQTT Explorer app to send messages: https://apps.apple.com/app/apple-store/id1455214828

Thanks! Of course, MQTT Explorer also works on Windows, Ubuntu, and other Linux platforms. http://mqtt-explorer.com/ https://github.com/thomasnordquist/MQTT-Explorer/releases
Announcement
Vadim Aniskin · Feb 27, 2023

InterSystems Ideas News #4

Hey Community! Welcome to the 4th edition of the InterSystems Ideas News bulletin! Here's what you can expect from it:

- Support your teammates with their ideas and votes.
- Users implemented ideas and gained tech bonuses for a contest.
- Dark version for InterSystems Developer Community: yay or nay?
- Recently added ideas.

The most important piece of news is that in the last programming contest, 5 participants got extra points for implementing ideas from the InterSystems Ideas portal. Here they are:

- @Lorenzo.Scalese - OpenAPI-Suite: Add a wizard similar to the SOAP wizard to generate a REST client from an OpenAPI specification.
- @José.Pereira, @Henrique, @henry - iris-tripleslash: Add a project that helps to generate unit tests for an ObjectScript class.
- @MikhailenkoSergey - gateway-sql, apptools-admin: Create a UI for convenient and easy transfer of projects (classes, globals, applications, users, roles, privileges, grants, namespace mappings, SQL gateways, libraries, etc.) to other system instances for fast deployment.

We added a new filter, "My organization". From now on, you can filter the ideas and votes of people from your own organization.

The other day, @Guillaume.Rongier7183 posted an idea about adding a dark theme for the Developer Community. We would appreciate your feedback via the poll on the Ideas Portal home page, or by voting and commenting on the idea itself.

Recently added ideas:

- Dark version of InterSystems Community
- Add job location for job opportunities section on DC
- Searchable Index of New Features
- Set password through environment variable
- Schedule the article publication on DC site
- Support JSON on DC text editor
- Support linux bash language on DC text editor
- Change data capture from IRIS to kafka using SQL commands
- Cross-production Interoperability Messages, Service and Operation
- Additional Data Types for ISC Products
- Automatic XML Formatting of XData in DTL, BPL and Rulesets
- Assert CodeAssist prompting IRIS classes for OpenAI API
- Display UserValues contents on message property tab
- IRIS as a service
- Allow graphical editing of Interoperability components BPL, DTL and Business Rules in VS Code
- Add Source control for all the IRIS interoperability components
- Audit Lookuptables
- Add a "watch list" to WRC tickets (problems, issues)
- Connect InterSystems Documentation and Developer Community
- Publish the InterSystems IRIS Native SDK for Node.js on npm

As usual, post your ideas on InterSystems Ideas, vote and comment on existing ideas, and implement Community Opportunity ideas. And stay tuned for our next News bulletin!

![awesome](https://media0.giphy.com/media/v1.Y2lkPTc5MGI3NjExNTZjMGNhYmQ5NmY2Y2NlOWNhYTkwYjQwN2QyZTc5Nzg0MGQyZGQ3NSZjdD1n/mXnO9IiWWarkI/giphy.gif)