For maintenance releases, we use the document you linked to for a few key purposes:

  • Are there any items reflected that potentially impact us, even unknowingly (e.g., something causing a data integrity issue or unclean data in an Interoperability message)?
  • Are there any items that brought to light features/capabilities we may not have even known about, but that we ultimately decided to go after because of the release notes?
  • If yes to either of the above...
    • Are there any fixes that will require extensive retesting of our integrations in PROD/TEST(STAGE)?
      • If so, does that outweigh the benefits of bullets 1 and 2, meaning we should wait for a major EM release?

For example, I recently upgraded our IRIS for Health 2021.1 instances to 2021.1.2 to take advantage of the enhanced debugging in the VS Code IDE, after determining the other items wouldn't 'break' anything else we were doing and/or happy with. As a bonus, a few other issues we had noted (though they weren't showstoppers for us) were also addressed!

So yes, I very much like the format and, in fact, would love to see MORE like this for the EM releases (i.e., 2021.1.0), as some of us nerdy folks like to see the low-level changes/updates. :-)

Thanks for all you do!

I may be misunderstanding your complete use case, but we are also doing something similar for 21st Century Cures, and the easiest way to handle this (that I identified, anyway) is through dynamic terminology mapping.

So DS can go in for the DocumentType.Code per usual, but if you look at the FHIR Annotations/Mappings, you'll note DocumentType is a CodeTableDetail item - HS.SDA3.CodeTableDetail.DocumentType - so you can set up a simple CodeTable to map from DS to the appropriate LOINC without needing to mess with extensions and custom pairs. Just ensure you have an appropriate Translation Profile set up for the access gateway that serves out FHIR (likely ODS) so that it picks up the map to get from your HL7v2 DocumentType code table to LOINC.
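For illustration, this is roughly what the SDA side carries before translation kicks in - a sketch only, where tDocument stands in for an HS.SDA3.Document you'd already have in hand and the LOINC value is just an example:

    // The raw HL7v2 code lands in the SDA as a CodeTableDetail item
    Set tType = ##class(HS.SDA3.CodeTableDetail.DocumentType).%New()
    Set tType.Code = "DS"                       // source system's code table value
    Set tType.Description = "Discharge Summary"
    Set tDocument.DocumentType = tType
    // With the CodeTable map and Translation Profile in place, the FHIR-serving
    // gateway returns the mapped LOINC (e.g., 18842-5) instead of the raw "DS".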

EDIT to add - thus far, I have not once had to modify a built-in transform to support US CDI requirements. I suspect that with the UCR release coming out in the next month or two, most of the US CDI v2/v3 mappings will also be handled better, with standard CodeTable/Terminology Maps filling in any gaps.

Unclear what you mean exactly, but if you're wondering whether IRIS will install on something like a Synology DiskStation - yes - it works just fine using either Docker with containers or the full VM server experience, where you first install an OS like Red Hat or Ubuntu into a VM and then install the appropriate kit.
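For the Docker route, a minimal sketch looks something like this (the Community image name and tag here are assumptions - substitute whatever image you're licensed to pull):

    # Expose the SuperServer (1972) and web (52773) ports from the container
    docker run -d --name iris \
      -p 1972:1972 -p 52773:52773 \
      intersystemsdc/iris-community:latest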

Obviously this would only be done for development purposes and not a production-like scenario. Most NAS servers can't push the required IOPS to properly support a production IRIS server.

I have tackled a challenge like this - with additional complex wrinkles, where the related records were all in separate files within one big zip file pulled down via SFTP by my production Service, and I needed to generate a combined file for the Complex Record Map to process, with the appropriate prefixes in place.

I hope this will be useful to you. After I got it working, the vendor I had to write it for went belly up, so I didn't finish cleaning up my traces or comments, but this was fully working. The general flow is a daily SFTP pickup of a zip file that contains 4 files: a PAT (patient) file plus data element 1, data element 2, and data element 3 files - all comma separated. The Patient Identifier in the first file links to the other 3 files via one of their columns.

At the end, I ended up with a single file I pushed into the CRM that looks like this (where the PAT ID is 123456):

PAT|123456,SMITH,JOHN,M,moredata,etc
DATA1|49492,123456,data1data,data1moredata,data1etc
DATA2|577545,123456,data2data,data2moredata,data2etc
DATA3|454543,123456,data3data,data3moredata,data3etc

I hope it's useful to you, at least to get an idea of how to start on your particular use case. Happy to try and clarify anything if needed.

/// Custom business service to handle ingesting multiple related delimited flat files (contained in a single ZIP!) and 
/// combining them into a single message per Patient that is then fed into a Complex RecordMap.
Class MyPkg.Services.CRM.BatchZip Extends EnsLib.RecordMap.Service.ComplexBatchStandard [ Final ]
{

Parameter ADAPTER = "EnsLib.FTP.InboundAdapter";

Parameter SETTINGS = "ZipUtility:Basic";

Parameter Slash = {$Case($System.Version.GetOS(),"Windows":"\",:"/")};

/// Operating System Utility, with parameters, that will be executed to extract zip file from vendor.
/// Use {filename} as placeholder for the dynamic ZIP file name within the parameters.
/// 
/// Note that any utility used must write the filenames of the contents to stdout for interrogation
/// by this service.
/// 
/// Default: unzip (GNU) linux utility for unix/linux based operating systems
Property ZipUtility As %String [ InitialExpression = "unzip -o {filename}" ];

Method OnProcessInput(
    pInput As %FileBinaryStream,
    pOutput As %RegisteredObject,
    ByRef pHint As %String) As %Status
{
    Set tSC = $$$OK
    Set instanceID = $System.Util.CreateDecimalGUID()
    $$$TRACE("Unique InstanceID: "_instanceID)
    Set ^MyPkg.Temp(instanceID) = "Creating CRM-compatible masterfile for this batch file input: "_pInput.Filename
    $$$TRACE("Starting to process "_##class(%File).GetFilename(pInput.Filename))
    $$$TRACE("Executing GetZipContents")
    Set tSC = ..GetZipContents(pInput.Filename, .files)
    If $$$ISERR(tSC) Quit tSC

    // Process each sub-file into a temporary global so we can add our CRM fixed leading data and join the records together
    $$$TRACE("Processing each file into a temporary global: ^MyPkg.Temp("_instanceID_")")
    Set ptr=0
    While $ListNext(files,ptr,file)
    {
        $$$TRACE("Processing file "_file)
        If ..startsWith($P(file,..#Slash,*,*),"patients_")
        {
            Set tSC = ..ProcessFile(file, ",", instanceID)
        }
        ElseIf ..startsWith($P(file,..#Slash,*,*),"dataElement1_")
        {
            Set tSC = ..ProcessFile(file, ",", instanceID)
        }
        ElseIf ..startsWith($P(file,..#Slash,*,*),"dataElement2_")
        {
            Set tSC = ..ProcessFile(file, ",", instanceID)
        }
        ElseIf ..startsWith($P(file,..#Slash,*,*),"dataElement3_")
        {
            Set tSC = ..ProcessFile(file, ",", instanceID)
        }
        Else
        {
            Do ##class(%File).Delete(file)
        }
    }
    
    $$$TRACE("Creating MasterInputFile that we'll feed into a Complex Record Map.")
    Set tSC = ..CreateMasterInputFile(pInput.Filename, instanceID, .masterInputFile)
    
    $$$TRACE("MasterInputFile: "_masterInputFile)
    $$$TRACE("Now processing MasterInputFile into Complex RecordMap.")
    Try {
        Set masterInputFileStream = ##class(%FileBinaryStream).%New()
        Set masterInputFileStream.Filename = masterInputFile
        Set tLookAhead = ""
        Set tIOStream = ##class(EnsLib.RecordMap.Service.FileServiceStream).%New(masterInputFileStream)
        Set tIOStream.Name = ..GetFileName(masterInputFileStream)
        
        While 'tIOStream.AtEnd {
            Set tPosition = tIOStream.Position
            Set tSC = ..GetBatch(tIOStream, .tBatch,,.tLookAhead)
            If $$$ISERR(tSC) || (tPosition=tIOStream.Position) Quit
            
            Set ..%SessionId = ""
            Set tStatus = ..ForceSessionId()
            If $$$ISERR(tStatus) Quit
            
            Set tSC = ..SendRequest(tBatch,'..SynchronousSend)
            If $$$ISERR(tSC) Quit
        }
        If $$$ISERR(tSC) Quit
        
        If 'tIOStream.AtEnd {
            $$$LOGWARNING($$$FormatText($$$Text("Failed to advance record stream. Stopped reading file '%1' at position %2, not at end.","Ensemble"),tIOStream.Name,tIOStream.Position))
        }
    }
    Catch ex {
        Set tSC = $$$EnsSystemError
    }
    If $get(tLookAhead) '= "" {
        $$$LOGINFO("Discarding trailing characters: '"_tLookAhead_"'")
    }
    
    $$$TRACE("Cleaning up the temporary global we created.")
    Set tSC = ..CleanUp(instanceID)
    $$$TRACE("Completed "_##class(%File).GetFilename(pInput.Filename))
    Quit tSC
}

Method ProcessFile(
    pFilename As %String,
    pDelimiter As %String = ",",
    pInstanceID As %String) As %Status [ Private ]
{
    Set tSC = $$$OK
    Set skipHeader = 1
    // Determine the file type once from the filename rather than on every line
    Set fname = $P(pFilename,..#Slash,*)
    
    Set file=##class(%File).%New(pFilename)
    Set tSC = file.Open("RU")
    While 'file.AtEnd
    {
        Set line = file.ReadLine()
        If skipHeader
        {
            Set skipHeader = 0
            Continue    
        }
        
        If line '[ pDelimiter Continue
        
        If ..startsWith(fname,"patients_")
        {
            // How do we identify the 'key' value to link up the other pieces? Grab pieces of the row and store them as part of the global key!
            Set key = $Piece(line,pDelimiter,1)_","_$Piece(line,pDelimiter,2)
            // Let's give ourselves a prefix! PAT|
            Set ^MyPkg.Temp(pInstanceID,"PAT|",key) = "PAT|"_line
        }
        ElseIf ..startsWith(fname,"dataElement1_")
        {
            // Each dataElement has a key for itself but also a linking key to the PAT
            Set ^MyPkg.Temp(pInstanceID,"DATA1|",$Piece(line,pDelimiter,1),$Piece(line,pDelimiter,2)) = "DATA1|"_line
        }
        ElseIf ..startsWith(fname,"dataElement2_")
        {
            // Each dataElement has a key for itself but also a linking key to the PAT
            Set ^MyPkg.Temp(pInstanceID,"DATA2|",$Piece(line,pDelimiter,2),$Piece(line,pDelimiter,1)) = "DATA2|"_line
        }
        ElseIf ..startsWith(fname,"dataElement3_")
        {
            // Each dataElement has a key for itself but also a linking key to the PAT
            Set ^MyPkg.Temp(pInstanceID,"DATA3|",$Piece(line,pDelimiter,3),$Piece(line,pDelimiter,5)) = "DATA3|"_line
        }
    }
    }
    
    Do file.Close()
    Do ##class(%File).Delete(pFilename)
    Quit tSC
}

/// Let's start putting everything together into one big file that CRM will process!
Method CreateMasterInputFile(
    pSourceFilename As %String,
    pInstanceID As %String,
    Output MasterInputFilename) As %Status [ Private ]
{
    Set tSC = $$$OK
    Set MasterInputFilename = $Replace(pSourceFilename,".zip",".txt")
    
    Set fileObj = ##class(%File).%New(MasterInputFilename)
    Set tSC = fileObj.Open("WSN")
    If ($SYSTEM.Status.IsError(tSC)) {
        Do $System.Status.DisplayError(tSC)
        Quit tSC  // return the error %Status; this method returns %Status, not an oref
    }
    
    Set key=$Order(^MyPkg.Temp(pInstanceID,"PAT|",""))
    While key'=""
    {
        Set patID = $Piece(key,",",2,2)
        
        // Write out PAT| 
        Do fileObj.WriteLine(^MyPkg.Temp(pInstanceID,"PAT|",key))
        
        // Get dataElement1 for that PAT next... patID Key 1, dataElement1 Key 2
        Set data1Key = $Order(^MyPkg.Temp(pInstanceID,"DATA1|",""))
        While data1Key'=""
        {
            If data1Key = patID
            {
                Set data1Key2 = $Order(^MyPkg.Temp(pInstanceID,"DATA1|",data1Key,""))
                While data1Key2'=""
                {
                    Do fileObj.WriteLine(^MyPkg.Temp(pInstanceID,"DATA1|",data1Key,data1Key2))    
                    Set data1Key2 = $Order(^MyPkg.Temp(pInstanceID,"DATA1|",data1Key,data1Key2))
                }
            }
            Set data1Key = $Order(^MyPkg.Temp(pInstanceID,"DATA1|",data1Key))
        }
        
        // Get dataElement2 for that PAT next... patID Key 1, dataElement2 Key 2
        Set data2Key = $Order(^MyPkg.Temp(pInstanceID,"DATA2|",""))
        While data2Key'=""
        {
            If data2Key = patID
            {
                Set data2Key2 = $Order(^MyPkg.Temp(pInstanceID,"DATA2|",data2Key,""))
                While data2Key2'=""
                {
                    Do fileObj.WriteLine(^MyPkg.Temp(pInstanceID,"DATA2|",data2Key,data2Key2))    
                    Set data2Key2 = $Order(^MyPkg.Temp(pInstanceID,"DATA2|",data2Key,data2Key2))
                }
            }
            Set data2Key = $Order(^MyPkg.Temp(pInstanceID,"DATA2|",data2Key))
        }

        // Get dataElement3 for that PAT next... patID Key 1, dataElement3 Key 2
        Set data3Key = $Order(^MyPkg.Temp(pInstanceID,"DATA3|",""))
        While data3Key'=""
        {
            If data3Key = patID
            {
                Set data3Key2 = $Order(^MyPkg.Temp(pInstanceID,"DATA3|",data2Key,""))
                While data3Key2'=""
                {
                    Do fileObj.WriteLine(^MyPkg.Temp(pInstanceID,"DATA3|",data3Key,data3Key2))    
                    Set data3Key2 = $Order(^MyPkg.Temp(pInstanceID,"DATA3|",data3Key,data3Key2))
                }
            }
            Set data3Key = $Order(^MyPkg.Temp(pInstanceID,"DATA3|",data3Key))
        }
        
        Set key = $Order(^MyPkg.Temp(pInstanceID,"PAT|",key))
    }
    
    Do fileObj.Close()
    
    Quit tSC
}

/// Using full path, will extract Zip file using $ZF(-100) - OS-level execution - and read in the filenames
/// of what was extracted for further processing, returning as a list to OnProcessInput
Method GetZipContents(
    pFilename As %String,
    Output pContentFilenames As %List) As %Status [ Private ]
{
    Set tSC = $$$OK, tempFilenames = ""
    Set stdoutFilename = ##class(%File).TempFilename("myTempCRMBatch")
    
    Set unzipCmd = $Replace(..ZipUtility,"{filename}",pFilename)
    $$$TRACE("Executing OS command: "_unzipCmd)
    Set workingDirectory = $Piece(pFilename,..#Slash,1,*-1)
    Set unzipCmd = "cd "_workingDirectory_";"_unzipCmd
    Set sc = $ZF(-100,"/SHELL /NOQUOTE /STDOUT+="""_stdoutFilename_""" /STDERR+="""_stdoutFilename_"""",unzipCmd)
    
    Set stdout=##class(%File).%New(stdoutFilename)
    Set stdout.LineTerminator = $char(10)
    Set tSC = stdout.Open("RU")
    While 'stdout.AtEnd
    {
        Set stdoutLine = stdout.ReadLine()
        If stdoutLine [ ".csv"
        {
            Set temp = $LFS(stdoutLine," ")
            Set ptr = 0
            While $ListNext(temp,ptr,piece)
            {
                If $ZStrip(piece,"*W") [ ".csv"
                {
                    //$$$TRACE("Found file in zip: "_$ZStrip(piece,"*W"))
                    If tempFilenames '= "" Set tempFilenames = tempFilenames_","
                    Set tempFilenames = tempFilenames_workingDirectory_..#Slash_$ZStrip(piece,"*W")
                }
            }
            
        }
    }
    
    Do stdout.Close()
    Set tSC = ##class(%File).Delete(stdoutFilename)
    Set pContentFilenames = $LFS(tempFilenames,",")
    Quit tSC
}

Method CleanUp(pInstanceID As %String) As %Status [ Private ]
{
    Kill ^MyPkg.Temp(pInstanceID)
    Quit $$$OK
}

Method startsWith(
    value As %String,
    string As %String) As %Boolean [ CodeMode = expression, Internal, Private ]
{
($E($g(value),1,$L($g(string)))=$g(string))
}

}

Hey Scott - Your questions require a bit of clarification to best answer, but I can help a bit, as I just went through this for both an internally served and secured IRIS Management Portal and externally served and secured IRIS-hosted web services.

There are two layers of securing to consider, and that's where I would need clarification on which part your questions are after:

  • Mutual TLS 1.2 encryption to/from the Web Gateway module installed on Apache, which acts as a reverse proxy of sorts between the web server and the IRIS server's SuperServer port. (Actual users don't use this port directly in a web browser.)
  • HTTPS/SSL Encryption on the Apache Web Server that encrypts the traffic between the client browser and web server itself.

For a production-quality/secure setup, you want to achieve both of these, in my opinion.

For the first bullet, if you control both sides of the equation (the IRIS server and the web server), you could easily use a self-signed cert via your Red Hat server's CA, as you can specify the CA chain of authority that validates the signed cert on both sides.

For the second bullet, you really want to use a certificate authority that your users' web browsers will natively trust. E.g., if you're just serving internally and all your users are joined to an internal domain, that domain's CA could generate a web server cert you could install for port 443 on Apache httpd, and your users' browsers will likely be A-OK with it, as domain CAs generally update their domain members' keystores on login (keyword being generally). That CA could also be used to generate the appropriate server/client profile certs for the mutual TLS of the first bullet.

But the easiest approach for the second bullet is using an external trusted CA (think Thawte, VeriSign, and many others), as browsers will generally trust these "out of the box." External CAs can also be used for the mutual TLS piece, but that's generally overkill if the Web Gateway and IRIS server are all on the same internal network (again, in my opinion). Proper securing of private keys is especially important when using an internal CA for mutual TLS, but you should really be doing that anyway.
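To make the second bullet concrete, here's a minimal Apache mod_ssl sketch (the hostname and paths are assumptions; your CA determines the actual files):

    <VirtualHost *:443>
        ServerName iris.internal.example.org
        SSLEngine on
        SSLProtocol -all +TLSv1.2
        SSLCertificateFile    /etc/pki/tls/certs/iris.internal.example.org.crt
        SSLCertificateKeyFile /etc/pki/tls/private/iris.internal.example.org.key
    </VirtualHost>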

Reference for the mutual tls: https://docs.intersystems.com/irisforhealthlatest/csp/docbook/DocBook.UI...

Yes, branches are still possible, and yes, users should log in as their own users (this extension allows users to enter their own git user.name and git user.email to track commits properly). There is stash support to a degree and the ability to branch, but given the nature of IRIS being "trunk-based development" at its core, those branches of course still compile against the same IRIS namespace.

But this extension will continue to improve and expand! I've been helping out where I can and providing suggestions from the standpoint of organizations that are traditional integration teams belonging to patient-care-centric healthcare orgs, and Tim and his team are doing an amazing job.

Please be sure to join the discussions occurring on GitHub, try out the extension to get a feel for it, and add your voice to the continuing expansion of the extension! We've made some tremendous strides via this process and I'm excited to see the evolution!

From the OS side in AIX, I can see it in parameters.isc (example from a QA environment I'm playing with):

security_settings.iris_user: irisusr
security_settings.iris_group: irisusr
security_settings.manager_user: irisusr
security_settings.manager_group: irisusr

I do not recall how to see it in IRIS itself (or whether it's even possible), but I remember wanting to figure out how to change the values after installation (due to someone goofing up an entry on a dev environment), and it is pretty difficult to do without a lot of effort.
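If you just need to read the values back on another box, a plain grep of the install directory does the trick (the install path below is a placeholder):

    grep security_settings <iris-install-dir>/parameters.isc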

I posted this on a similar question a couple weeks back - this worked for me on my MBP M1:

If you prefer or need to use a kit (like myself), give Multipass a look. It's super easy to spin up an Ubuntu VM in seconds.

Note that you need to pull down the arm64 version of Ubuntu AND the arm64 IRIS kits, not the traditional x86_64 architecture.
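The whole flow is only a few commands (the VM name here is arbitrary):

    multipass launch --name iris-dev    # boots a fresh Ubuntu LTS VM
    multipass shell iris-dev            # drop into a shell inside the VM
    # then copy in the arm64 IRIS kit and run its install script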

https://github.com/canonical/multipass
https://9to5linux.com/canonical-makes-it-easier-to-run-ubuntu-vms-on-app...

I'm surprised you got an answer, as I was unable to get one over the weekend - I don't expect one until ISC makes an official statement. However, re: the 1.x comment:

2031667 – (CVE-2021-4104) CVE-2021-4104 log4j: Remote code execution in Log4j 1.x when application is configured to use JMSAppender (redhat.com)

The only usage of log4j I could find within an ISC platform was in Clinical Viewer. Curious if you could share where else you've seen it used? Maybe it's compiled into one of their own libraries and not directly exposed, however.

Agree with this 100% - the link @Timothy Leavitt shared is my post on this topic, and I would love to have your voice added to the discussion.

@Eduard Lebedyuk I generally agree your approach would be a good one as well, but the goal - having talked to Tim a bit - from my perspective is to make this as seamless as possible for our low-code/no-code integration analysts, and even for engineers to a degree, who manage our shared-code libraries; those are mapped into our integration namespaces from within the same repo, as they are tightly intertwined.

Hmm, my next course of action if I were running into this would be to review the System Audits and system logs (the Application Log, for example).

If you have the audits all enabled, you should be able to see whether it's hitting a security issue.
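If auditing isn't enabled yet, the classic route is the ^SECURITY routine from the %SYS namespace - a sketch from memory, so treat the exact menu options as approximate:

    ZN "%SYS"
    Do ^SECURITY   ; then choose the Auditing options (e.g., enable %System/%Login/LoginFailure)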
 

Also, please look at the last bullet on this page (re: seeing the login screen) and try its suggestion:

https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls...

If you prefer or need to use a kit (like myself), give Multipass a look. It's super easy to spin up an Ubuntu VM in seconds:

(Edit: I should note that it would be using the arm64 version of Ubuntu and the IRIS kits that support arm64, not the traditional x86_64 architecture like it sounds you were used to with Intel-based Macs.)

https://github.com/canonical/multipass
https://9to5linux.com/canonical-makes-it-easier-to-run-ubuntu-vms-on-app...

Here is an approach I used to solve this issue when I was first learning ObjectScript. The outcome, I think, is the same as what the WRC helped you with, but maybe a little less verbose. This method lived in a Utils class that extends Ens.Rule.FunctionSet and was called at the end of a DTL like:

set target = ..StripMessage(target, $CHAR(126))

Note I had it doing all segments in reverse order and never touching the MSH segment (tCount > 1) so we didn't accidentally remove an encoding character like & or \. ($CHAR(126) is the tilde; ObjectScript doesn't interpret "\x7e"-style escapes, so pass the actual character.)

/// Engineer: Craig Regester <br/>
/// Date: 12/09/2016<br/>
/// This function will loop through each segment in the supplied HL7 message EXCEPT MSH <br/>
/// and strip out each instance of the character passed in parameter InvalidChar. <br/>
/// It then returns the transformed message. Call from a DTL passing in the target and setting the target.

ClassMethod StripMessage(pHL7 As EnsLib.HL7.Message, pInvalidChar As %String) As EnsLib.HL7.Message
{
    Set tCount = pHL7.SegCount
    While tCount > 1 {
        Set tSegment = pHL7.GetValueAt(tCount)
        Set tSegment = $ZSTRIP(tSegment,"*",pInvalidChar)
        Do pHL7.SetValueAt(tSegment,tCount)
        Set tCount = tCount-1
    }
    Quit pHL7
}

Anyway, it's interesting to see alternative approaches when learning, so I thought I'd share. Good luck!

Yeah, understood on the projection (. -> _), but I see I reflected it wrong in my response - I was typing too quickly this morning when I got an auto-email bugging me to evaluate the reply as an answer. :)

So I actually tried this numerous ways, and here are the errors I get (they vary based on the approach):

GRANT SELECT ON MyPkg_Messages.* TO myRole

ERROR #5540: SQLCODE: -1 Message:  IDENTIFIER expected, * found^GRANT SELECT ON MyPkg_Messages.*

GRANT SELECT ON MyPkg_Messages_* TO myRole

ERROR #5540: SQLCODE: -1 Message:  TO expected, * found^GRANT SELECT ON MyPkg_Messages_*

So then I thought... well, maybe if I just grant to MyPkg_Messages and leave off the * it'll cascade?

GRANT SELECT ON MyPkg_Messages TO myRole

ERROR #5475: Error compiling routine: %sqlcq.MYNS.cls41.  Errors:  %sqlcq.MYNS.cls41.cls
ERROR:%sqlcq.MYNS.cls41.1(14) : SQLCODE=-30 : Table 'SQLUSER.MYPKG_MESSAGES' not found

:( This leads me to why I think I need to 'roll up the packaging' of the classes to a higher level, as really the message class itself might be:

MyPkg_Messages_REST.AddPatientRequest or something like that.

Appreciate the responses though - I haven't had time to play with this more, and the first approach you called out will work for most of my cases anyway.

Thanks, this does work... though it's not quite as granular as I'd like it to be. For example, I can do the following:

GRANT SELECT ON * TO MyAnalystRole

Within the appropriate namespace, and that indeed grants SELECT on all tables in that namespace. But if I have packages like MyPkg.Messages.VendorA and MyPkg.Messages.VendorB, I can't do:

GRANT SELECT ON MyPkg.Messages.* TO MyAnalystRole

I think I would have to update the packaging of VendorA and VendorB to roll up to the MyPkg.Messages schema.
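For what it's worth (syntax from memory, so verify against the docs), IRIS SQL does support schema-level grants - one schema at a time, no wildcards - which pairs well with the roll-up idea:

    GRANT SELECT ON SCHEMA MyPkg_Messages_VendorA TO MyAnalystRole
    GRANT SELECT ON SCHEMA MyPkg_Messages_VendorB TO MyAnalystRole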