OK. Let's just do this!

This example uses Properties and Settings to expose the Request and Response Types so they are viewable from the Management Portal:

This is achieved with the following code:

Class Test.Operation Extends Ens.BusinessOperation
{
Parameter ADAPTER = "EnsLib.HTTP.OutboundAdapter";
Property Adapter As EnsLib.HTTP.OutboundAdapter;
Parameter INVOCATION = "Queue";
Property RequestTypes As %String(MAXLEN = 4000) [ InitialExpression = {..GetRequestTypes()}, Transient ];
Property ResponseTypes As %String(MAXLEN = 4000) [ InitialExpression = {..GetResponseTypes()}, Transient ];
Parameter SETTINGS = "RequestTypes:Info,ResponseTypes:Info";

ClassMethod GetRequestTypes() As %String
{
  set ret=""
  set messageTypes=..GetMessageList()
  quit:messageTypes="" ret
  for i=1:1:$LL(messageTypes) {
    set ret=ret_$C(13,10)_$LI(messageTypes,i)
  }
  quit ret
}

ClassMethod GetResponseTypes() As %String
{
  set ret=""
  set messageTypes=..GetResponseClassList()
  quit:messageTypes="" ret
  for i=1:1:$LL(messageTypes) {
    set ret=ret_$C(13,10)_$LI(messageTypes,i)
  }
  quit ret
}

XData MessageMap
{
<MapItems>
  <MapItem MessageType="Ens.StringRequest"> 
    <Method>DoOne</Method>
  </MapItem>
  <MapItem MessageType="EnsLib.HL7.Message"> 
    <Method>DoTwo</Method>
  </MapItem>
</MapItems>
}

Method DoOne(request As Ens.StringRequest, response As Ens.StringResponse)
{
}

Method DoTwo(request As EnsLib.HL7.Message, response As EnsLib.HL7.Message)
{
}
}

How it works:

The getter methods populate the properties (RequestTypes, ResponseTypes) via InitialExpression.

These properties are exposed to the user interface via the SETTINGS parameter.

Note also that I have set the MAXLEN of the properties to over 2000 characters. This forces the Management Portal to render a TextArea instead of a single-line Text input control.

The reuse pattern is to keep this code in one class.

Subclasses can then inherit this behavior via Extends:

Class Test.NewOperation2 Extends (Ens.BusinessOperation, Test.Operation)
{
Parameter INVOCATION = "Queue";
}

IRIS is so flexible :)

Yes. You need to use an SSL Configuration for HTTPS; otherwise the connection will close because the handshake fails.

I tend to isolate and test a connection from the server context if possible:

set request=##class(%Net.HttpRequest).%New()
set request.Server="www.intersystems.com"
set request.Port=443
set request.SSLConfiguration="TEST"
set request.Https=1
set tSC=request.Get("/",2)
do $SYSTEM.Status.DisplayError(tSC)

Switching the SSL off causes the error:

set request.Https=0
set tSC=request.Get("/",2)
do $SYSTEM.Status.DisplayError(tSC)

ERROR #6097: Error '<READ>Read+28^%Net.HttpRequest.1' while using TCP/IP device '9999'

Hi Tommy,

Something that might serve as a starting point for this facility could be: https://openexchange.intersystems.com/package/UnitTest-RuleSet

The idea is a self-contained UnitTest class containing a vanilla starting HL7 message defined in an XData block.

Each test method constructs a clone of the starting HL7 message and makes the required modifications before sending to a target via a named Service.

I suppose it is simpler without the routing rule, agreed. I am interested in what assertions are needed.

For example:

* Checking a MessageHeader was created for the same SessionId with a particular Target Name

* Checking a response was received from an end-point

From your question: is there a particular reason to send over TCP to a Service? Are the message generation and sending being used to stress end-to-end flow, volumes, or message types?

Is this about message content variation (optional values) as well as the full message with a value in every field?

Hi Doug,

Have you considered creating a new Rule FunctionSet?

Once compiled, this would appear as a new available function in the Rule Editor.

For example:

Class Test.Fun Extends Ens.Rule.FunctionSet
{
ClassMethod SegmentFieldContains(pDocument As EnsLib.HL7.Message, segmentName As %String = "", fieldNumber As %Integer = 0, value As %String = "") As %Boolean
{
  // validation
  quit:segmentName'?3AN 0
  quit:fieldNumber<1 0
  quit:value="" 0
  set isFound=0
  // get count of segments
  set segCount=pDocument.GetValueAt("*")
  // loop through all of the segments
  for i=1:1:segCount {
    set seg=pDocument.GetSegmentAt(i)
    continue:'$IsObject(seg)
    // skip wrong segment
    continue:seg.GetValueAt(0)'=segmentName
    // skip if field does not contain value
    continue:seg.GetValueAt(fieldNumber)'[value
    set isFound=1
    quit
  }
  quit isFound
}

}
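For readers less familiar with the guards and loop above, the scanning logic itself is simple. Here is the same contains-scan sketched in plain Python over a list-of-lists stand-in for parsed segments (the `segments` structure and sample message are illustrative, not the HL7 API; the 3-character name check is a simplification of the `?3AN` pattern match):

```python
def segment_field_contains(segments, segment_name, field_number, value):
    """Return True if any segment named segment_name has value
    contained in its field_number-th field (1-based), mirroring
    the ObjectScript FunctionSet method above."""
    # validation, mirroring the quit: guards
    if len(segment_name) != 3 or field_number < 1 or value == "":
        return False
    for seg in segments:
        # skip wrong segment
        if seg[0] != segment_name:
            continue
        # skip if field is absent or does not contain value
        if field_number >= len(seg) or value not in seg[field_number]:
            continue
        return True
    return False

# illustrative parsed message: segment name first, then fields
msg = [["MSH", "|", "SENDER"], ["PID", "123", "DOE^JOHN"]]
print(segment_field_contains(msg, "PID", 2, "DOE"))  # True
```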

I think it would be useful for an IPM repo to provide different "visibility" of "available" deployed modules, depending on the account used to access the IPM server.

This avoids having a separate repo per customer: one common repo service, where account permissions to access different modules (and versions) are configured.

Can this work for a subscription business model, i.e., once installed, the software runs forever? Maybe this is limited to some extent by byte-code being published per IRIS version. So a "subscription" would be the IPM repo's security configuration continuing to enable access to module updates.

In that case the listing / search of modules might give hints on whether a subscription is enabled.

Speculating further: would a "10 seat / process" license translate to a specific module byte-code build? Would upgrading the "license capacity" mean installing a different module version? This has the potential to cause service disruption if you are only interested in a change in capacity behavior.

Maybe a product would be split into two parts: an IPM "license module" that changes, and an IPM "main software module" that remains unaffected.

Hi Mark,

Some thoughts on this:

I anticipate the online backup (CBK) may be tied to the operating system and Caché version needed to run the restore.

It may be better to have "at rest" CACHE.DAT files.

They can be:

  • Renamed to IRIS.DAT
  • Mounted and "upgraded"
  • Endian converted if needed

Having an md5 checksum of the CBK / DAT files can help identify transfer issues to / from offline media.

An integrity check can also be a useful confirmation tool to trace back whether the original backup had an issue.

Having undeployed SQL table definitions may be useful future-proofing.

For install media for older versions of Caché, contact the WRC (wrc.intersystems.com).

I anticipate IRIS will still be here in another 45 years, so it will easily meet the 10-year requirement.

Hi Robbie,

I created an OpenExchange package, PyHelper: https://openexchange.intersystems.com/package/PyHelper

// Create an IRIS list and turn it into a Python list
set pyList=##class(alwo.PyHelper).toPyListOrString($LB(1,2,3,4,5,6))
// Write to terminal to check content
zw pyList
pyList=3@%SYS.Python  ; [1, 2, 3, 4, 5, 6]  ; <OREF>
// Convert the Python list back to an IRIS list
set myIRISList=##class(alwo.PyHelper).ListFrompyList(.pyList)
// Output to terminal to confirm
zw myIRISList
myIRISList=$lb(1,2,3,4,5,6)

Any feedback welcome.

Hi Evgeny,

Not saying this is the best way, but this indirection can be achieved with Python eval. For example:

$Classmethod equivalent

classname="%SYSTEM.SYS"
methodname="ProcessID"
eval(f"iris.cls(\"{classname}\").{methodname}()")

$Property equivalent

Instantiate a Python exception, then iterate over its properties, printing out each one:

myerror=iris.cls("%Exception.PythonException")._New("MyOops",123,"def+123^XYZ","SomeData")

for propertyname in ["Name","Code","Data","Location"]:
    propvalue=eval(f"myerror.{propertyname}")

    print(f"Property Name {propertyname} has value {propvalue}\n")

output was:

Property Name Name has value MyOops
 
Property Name Code has value 123
 
Property Name Data has value SomeData
 
Property Name Location has value def+123^XYZ
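The eval pattern is plain Python, so it can be tried without an iris proxy object at all. A minimal standalone sketch (the Demo class is hypothetical, standing in for the exception object above):

```python
class Demo:
    """Hypothetical stand-in for the iris object proxy above."""
    def __init__(self):
        self.Name = "MyOops"
        self.Code = 123

myerror = Demo()
for propertyname in ["Name", "Code"]:
    # build the attribute access dynamically, as with the iris proxy
    propvalue = eval(f"myerror.{propertyname}")
    print(f"Property Name {propertyname} has value {propvalue}")
```

Note that for simple attribute access, Python's built-in getattr(myerror, propertyname) achieves the same result without eval.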


$Extract ($E) and $Translate ($TR) can provide the string manipulation:

> set inDate="1997-08-09 10:38:39.700000000"
> set outDate=$TR($E(inDate,1,19)," ","T")_"Z"
> write !,outDate
1997-08-09T10:38:39Z

So steps are:

  • Grab the first 19 characters
  • Convert space " " to "T"
  • Append a "Z" to the end
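The same three steps translate directly to (embedded) Python string handling, if you prefer that route — a sketch:

```python
in_date = "1997-08-09 10:38:39.700000000"
# grab the first 19 characters, swap the space for "T", append "Z"
out_date = in_date[:19].replace(" ", "T") + "Z"
print(out_date)  # 1997-08-09T10:38:39Z
```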

Ignore this if it is already the case, but I would also recommend familiarity with $Piece, $Select, and "[" (the Contains operator). $Find can also be efficient for string searches.

Hi Evgeny,

Use the function $Parameter, in conjunction with $This, to access these.

For example:

value='$Parameter($This, "[Parameter Name]")'

For example, adding to the Stream example from yesterday.

Vanilla source class:

Class Test.Str2Stream.ExampleIn Extends (%Persistent, %XML.Adaptor)
{
Property Filename As %String;
Property Content As %String;
}

DTL class, where the magic happens:

Class Test.Str2Stream.Trans Extends Ens.DataTransformDTL [ DependsOn = (Test.Str2Stream.ExampleIn, Ens.StreamContainer) ]
{
Parameter Salutation = "YeaBuddy";
Parameter Reaction = "LightWeight";
Parameter IGNOREMISSINGSOURCE = 1;
Parameter REPORTERRORS = 1;
Parameter TREATEMPTYREPEATINGFIELDASNULL = 0;
XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='Test.Str2Stream.ExampleIn' targetClass='Ens.StreamContainer' create='new' language='objectscript' >
<assign value='##class(Test.Str2Stream).StringToStream(source.Content,source.Filename)' property='target.Stream' action='set' />
<assign value='target.Stream.WriteLine($Parameter($This,"Salutation"))' property='x' action='set' />
<assign value='target.Stream.WriteLine("These plates are "_$Parameter($This,"Reaction"))' property='x' action='set' />
</transform>
}
}

Testing input:

<ExampleIn>
  <Filename>ABC</Filename>
  <Content>The session begins...
</Content>
</ExampleIn>

Testing output:

After a bit of digging I came up with the following equivalents.

$Horolog

>>> iris.cls("%SYSTEM.SYS").Horolog()
'66647,85547'

Equivalent access:

>>> var=iris.cls("%Library.UTC").NowLocal()
>>> var
'2023-06-22 23:50:04.386'
>>> iris.cls("%Library.UTC").ConvertTimeStampToHorolog(var)
'66647,85804.386'
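As an aside, a $Horolog value can also be decoded in plain Python without calling back into IRIS: day 0 of $Horolog is 31 December 1840, and the second piece is seconds past midnight. A sketch, using the value above:

```python
from datetime import datetime, timedelta

horolog = "66647,85804.386"
days, seconds = horolog.split(",")
base = datetime(1840, 12, 31)  # $Horolog day 0
ts = base + timedelta(days=int(days), seconds=float(seconds))
print(ts)  # 2023-06-22 23:50:04.386000
```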

$NAMESPACE ($ZNSPACE)

>>> iris.cls("%SYSTEM.SYS").NameSpace()
'USER'

ZN [Namespace] - aka change namespace

Keep your object fingers in the car at all times!!

Any created ObjectScript object references need to be cleared BEFORE changing back.

>>> iris.cls("%SYSTEM.Process").SetNamespace("%SYS")
'%SYS'

$JOB

>>> iris.cls("%SYSTEM.SYS").ProcessID()
'1548'

$SYSTEM - Instance name

>>> iris.cls("%SYS.System").GetInstanceName()
'IRIS123'

But you might have the same name on a different operating system / container, so:

>>> iris.cls("%SYS.System").GetUniqueInstanceName()
'THEMACHINE.YOURDOMAIN.COM:IRIS123'

$ZTIMESTAMP

>>> iris.cls("%SYSTEM.SYS").TimeStamp()
'66647,81615.3832864'

$ZTIMEZONE – Contains the time zone offset from the Greenwich meridian

>>> iris.cls("%SYSTEM.SYS").TimeZone()
0

$ZVERSION – Contains a string describing the current version of InterSystems IRIS

>>> iris.cls("%SYSTEM.Version").Format(0)
'IRIS for Windows (x86-64) 202x.x.0 (Build xxU) Thu xx 2023 06:22:16 EDT'

Thanks @Evgeny Shvarov, this is the optimal approach. Reading the documentation, it suggests you can target different IRIS builds (with the help of Docker code-compiling hosts), and the IPM repository can host the same module version for different $ZVersion IRIS targets transparently.

Is there an intention to support code signing, with pre-install validation, to ensure code cannot be modified at rest between being uploaded to a repo and then downloaded to a third-party instance?

Wondering practically about an IRIS upgrade process. Should the IPM client have ability to:

  • List installed modules that have a previous IPM bytecode install
  • List installed modules that would be affected by an IRIS upgrade. Is the newer source code even available in registered repos? (Postpone the IRIS upgrade)
  • Provide a batch upgrade option, to upgrade installed modules to the SAME module version but with the newer version of IRIS $ZVersion Byte code

Restating advice from above.

A mapping rule is required per database with data.

For example.

------------ Mapping one -----------------

Global Database Location = 201606_HIPAA

Global Name: HISTORY

Global Subscripts to be Mapped: (201606)

------------ Mapping Two -----------------

Global Database Location = 201607_HIPAA

Global Name: HISTORY

Global Subscripts to be Mapped: (201607)

-------------------------------------

I can see the global "HISTORY" is stated in the question.

I suggest a review to double-check where indexes and row-ID counters are stored.

Adding some code collateral to help explore the challenge.

At the top of the class definition:

Include (%occInclude, %syConfig)

Some code to list the global mappings configured for a namespace:

ZN "%SYS"
set pNamespace="HSCUSTOM"
set cns="Map."_pNamespace
set map=""
for {
  set map=$O(^CONFIG(cns,map),+1,db)
  quit:map=""
  set len=$L($P(map,"("),"_")
  set globalMap=$P(map,"_",len,999)
  write !,"global match:""",globalMap,""" to Database:",db
}

Example output:

...
global match:"IRIS.MsgNames("EnsSearchTable")" to Database:ENSLIB
global match:"IRIS.MsgNames("EnsWf")" to Database:ENSLIB
global match:"IRIS.MsgNames("EnsXPATH")" to Database:ENSLIB
global match:"IRIS.MsgNames("EnsebXML")" to Database:ENSLIB
global match:"IRIS.MsgNames("Ensemble")" to Database:ENSLIB
global match:"IRIS.MsgNames("ITK")" to Database:ENSLIB
global match:"IRIS.MsgNames("RuleEditor")" to Database:ENSLIB
....

Delete a map

set tSC=##Class(Config.MapGlobals).Delete(pNamespace,globalMatch,,$$$CPFSave)

Create a map

set global="IRIS.MsgNames(""EnsSearchTable"")"  // example, like variable "globalMap" above

kill params
set params("Database")="CUSTOMLIB"  // The database you want to use
set params("Collation")=""
set tSC = ##Class(Config.MapGlobals).Create(pNamespace,global,.params,,$$$CPFSave)

// Always apply any pending changes
// Always confirm in testing that the configuration "sticks" after a system restart

do ##class(Config.CPF).Activate()

Alternatives to the programmatic approach

Extended global reference syntax can be useful to copy data between the previously mapped and the currently mapped database.

Hi Evgeny,

The following is a tool that System Management specialists may use.

You can achieve jobbing, and also run arbitrary ObjectScript code, by employing RunLegacyTask.

For example, running arbitrary code:

USER>zw ^ABC

USER>D $SYSTEM.Python.Shell()
 
Python 3.9.5 (default, Mar 14 2023, 06:58:44) [MSC v.1927 64 bit (AMD64)] on win32
Type quit() or Ctrl-D to exit this shell.
>>> myjob=iris.cls("%SYS.Task.RunLegacyTask")._New()

>>> myjob.ExecuteCode="Set ^ABC=""123"""

>>> res=myjob.OnTask()

>>> res
1
>>> quit()

USER>zw ^ABC
^ABC=123

For example, jobbing off a routine.

Routine (Test.mac) source code:

 Test
  Set ^ABC=345
  Quit

Use the legacy task to job it off:

USER>zw ^ABC
^ABC=123
 
USER>D $SYSTEM.Python.Shell()
 
Python 3.9.5 (default, Mar 14 2023, 06:58:44) [MSC v.1927 64 bit (AMD64)] on win32
Type quit() or Ctrl-D to exit this shell.

>>> myjob=iris.cls("%SYS.Task.RunLegacyTask")._New()
>>> myjob.ExecuteCode="Job Test^Test"
>>> res=myjob.OnTask()
>>> res
1
>>> quit()
 
USER>zw ^ABC
^ABC=345

The status result can be useful for understanding problems.

For example, here the routine line label "Test" was mistyped as "TEST":

>>> myjob=iris.cls("%SYS.Task.RunLegacyTask")._New()
>>> myjob.ExecuteCode="Job TEST^Test"
>>> res=myjob.OnTask()
>>> iris.cls("%SYSTEM.Status").DisplayError(res)
 
ERROR #5001: <NOLINE>zexecuteCode+3^%SYS.Task.RunLegacyTask.11

Enjoy IRIS. No limits :)