Hi Vandrei.

Have a look at the learning site: https://learning.intersystems.com

Here you will find learning paths for what you want, such as "Building Your First Application with Caché" and "Learn Caché for Developers".

There is also classroom training that can be arranged with InterSystems, but it might be good to get some exposure with the above before considering such courses.

It might be unrelated, but I recently had a similar issue when installing Health Connect on a Windows 10 machine with Kaspersky Endpoint Security.

Kaspersky was being triggered by some temp files created by the installer and was placing them in quarantine. This left the installer showing it had completed, but with the progress bar still on screen. I never left it running as long as you did, so I never got as far as the error that you didn't save.

The weird thing was that Kaspersky only reported a virus with the installer on specific versions of Windows 10.

I ended up submitting the installer to Kaspersky so that they could whitelist it.

Long story short: check the reports section within Kaspersky and see if it interfered with your install.

I have a task that deletes .txt files older than x days:

Class PROD.Schedule.PurgeTxtFiles Extends %SYS.Task.Definition
{

Parameter TaskName = "Purge TXT Files";

Property Directory As %String;

Property Daystokeep As %Integer(VALUELIST = ",5,10,15,20,25,30,35,40,45,50,55,60") [ InitialExpression = "30" ];

Method OnTask() As %Status
{
    Set tsc = ..PurgeSentFolder(..Directory,..Daystokeep,"txt")
    Quit tsc
}

Method PurgeSentFolder(Directory As %String, DaysToKeep As %Integer, Extension As %String) As %Status
{
    Set tSC = $$$OK

    // Calculate the cutoff date; files modified before this date get deleted
    Set BeforeThisDate = $ZDT($H-DaysToKeep_",0",3)

    // Gather the list of files in the specified directory
    Set rs = ##class(%ResultSet).%New("%File:FileSet")
    Set ext = "*."_Extension
    Do rs.Execute(Directory,ext,"DateModified")

    // Step through the files in DateModified order. The "]" (follows) operator
    // is a string comparison, which is safe here because both dates are in
    // ODBC format (YYYY-MM-DD hh:mm:ss)
    While rs.Next() {
        Set DateModified = rs.Get("DateModified")
        If BeforeThisDate]DateModified {
            // Delete the file
            Set Name = rs.Get("Name")
            Do ##class(%File).Delete(Name)
        }
        // Stop once we reach files modified on or after the cutoff date
        If DateModified]BeforeThisDate Quit
    }
    Quit tSC
}

}
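If you want to sanity-check the class from a terminal session before scheduling it, a quick sketch like this should do (the directory and day count are just illustrative values):

Set task = ##class(PROD.Schedule.PurgeTxtFiles).%New()
Set task.Directory = "C:\Temp\Sent"
Set task.Daystokeep = 30
Write task.OnTask()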

Hopefully you can make this work for your needs.

Assuming you're talking about the online backup function within Ensemble/HealthShare/etc., I use a task that first runs a purge based on the age of the file, and then runs the backup. The order of the two methods is important if you set your retention period to 0, as you'd otherwise end up deleting the backup you just made (I'm neither confirming nor denying that this happened to me).

Class Live.Schedule.BackupPurge Extends %SYS.Task.BackupAllDatabases
{

Parameter TaskName = "Backup With Purge";

Property Daystokeep As %Integer(VALUELIST = ",0,1,2,3,4,5") [ InitialExpression = "1" ];

Method OnTask() As %Status
{
    // Purge first, then back up - reversing these with a retention of 0
    // would delete the backup that was just taken
    Set tsc = ..PurgeBackups(..Device,..Daystokeep)
    If $$$ISERR(tsc) Quit tsc
    Set tsc = ..RunBackup()
    Quit tsc
}

Method PurgeBackups(Directory As %String, DaysToKeep As %Integer) As %Status
{
    Set tSC = $$$OK

    // Calculate the cutoff date; files modified before this date get deleted
    Set BeforeThisDate = $ZDT($H-DaysToKeep_",0",3)

    // Gather the list of backup files in the specified directory
    Set rs = ##class(%ResultSet).%New("%File:FileSet")
    Do rs.Execute(Directory,"*.cbk","DateModified")

    // Step through the files in DateModified order
    While rs.Next() {
        Set DateModified = rs.Get("DateModified")
        If BeforeThisDate]DateModified {
            // Delete the file
            Set Name = rs.Get("Name")
            Do ##class(%File).Delete(Name)
        }
        // Stop once we reach files modified on or after the cutoff date
        If DateModified]BeforeThisDate Quit
    }
    Quit tSC
}

Method RunBackup() As %Status
{
    // Backups run from the %SYS namespace; New-ing $Namespace restores
    // the original namespace when the method exits
    New $Namespace
    Set $Namespace = "%SYS"
    Set jobbackup = 0
    Set quietflag = 1
    Set Device = ..Device
    Set tSC = ##class(Backup.General).StartTask("FullAllDatabases", jobbackup, quietflag, Device, "0")
    Quit tSC
}

}

The downside to this is that you'll end up with an extra backup file in your backup location if you ever run the backup manually, as the purge is based purely on file age. Not a massive problem unless you're storing the backups somewhere with limited disk space.

After working with WRC, I now have an answer.

If the DataSet property points to the MutableDataSet property, then GetValueAt will return a stream when the value is larger than 32k.

If (as in my situation) the DataSet property points to the FixedDataSet property, then GetValueAt will return a string truncated at 32648 characters.

The workaround provided by WRC did the trick for me:

Try {
    Set setStatus = $$$OK, getStatus = $$$OK
    If 'pInput.Modified {
        // Touching a value forces the document over to the mutable data set
        Set setStatus = pInput.SetValueAt(pInput.GetValueAt("DataSet.DocumentTitle",,.getStatus),"DataSet.DocumentTitle")
    }
}
Catch e {
    Set setStatus = e.AsStatus()
}
If setStatus && getStatus {
    Set X = pInput.GetValueAt("DataSet.EncapsulatedDocument",,.tSC)
}

However, there was an alternative: using the CreateFromDataSetFileStream method of the class EnsLib.DICOM.Document:

Set tSC = ##class(EnsLib.DICOM.Document).CreateFromDataSetFileStream(pInput.DataSet.FileStream,pInput.DataSet.TransferSyntax,.dicomDocFromStream)

If tSC Set X = dicomDocFromStream.GetValueAt("DataSet.EncapsulatedDocument",,.tSC)

In both of these options, the next step is to check tSC, then check whether X is a stream or a string, and work from there.
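For instance, a minimal sketch of that final check, reusing the variable names from the snippets above:

If $$$ISOK(tSC) {
    If $IsObject(X) {
        // X is a stream, so read it out in chunks
        While 'X.AtEnd {
            Set chunk = X.Read(32000)
            // ... process the chunk ...
        }
    } Else {
        // X is a plain string and can be used directly
        Set document = X
    }
}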

Within the Production under Production Settings, you should currently find a Document button which will produce a report of everything within your production. However, depending on the production size, this could easily be overkill.

2019 brings a new option called "Interface Maps", where you can get a graphical view of message flows along with the processes, routers, and rules for the individual sections of your production. It's a lot cleaner than using the documentation generator, but if you need to show multiple routes, you're likely to want to go through each one and take a screenshot. I also found that where I have a router with lots of rules/transforms, I need to scroll the screen to see the bottom of the display.

Information on this can be found here.

I haven't come across anything built in that would do this in itself, but I guess it's something you could create within your environment.

I have something a bit similar, but the index is populated by a CSV received daily from another organisation, and the process then compares the HL7 messages against that index, only sending on the ones where the patient is present in the table.
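As a very rough sketch of what that check could look like inside a custom business process (MyApp.PatientIndex, its MRN column, and the target name are all made-up names for illustration, and the PID path will depend on your feed):

Set mrn = pRequest.GetValueAt("PID:3(1).1")
&sql(SELECT COUNT(*) INTO :found FROM MyApp.PatientIndex WHERE MRN = :mrn)
If found {
    // Patient is in the index, so forward the message on
    Set tSC = ..SendRequestAsync("Downstream.Operation", pRequest)
}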

I had this exact issue last week, and this is how I got around it. For clarity, I wanted to pass the Dynamic Object from a process to an operation.

I created my dynamic object within the process, and then used the %ToJSON method to write the JSON into a GlobalBinaryStream (which can be passed through the request).

In the operation, I then use the %FromJSON method of %DynamicAbstractObject to rebuild the dynamic object within the operation.
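For illustration, the two halves might look something like this, assuming a request class (here called MyApp.Msg.JSONRequest, a made-up name) with a Payload property of type %Stream.GlobalBinary:

// In the process: build the dynamic object and serialise it into the stream
Set obj = {"PatientId": "123", "Ward": "A1"}
Set request = ##class(MyApp.Msg.JSONRequest).%New()
Do obj.%ToJSON(request.Payload)
Set tSC = ..SendRequestAsync("My.Target.Operation", request)

// In the operation: rebuild the dynamic object from the stream
Set obj = ##class(%DynamicAbstractObject).%FromJSON(pRequest.Payload)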

Hi Andrew.

I don't think the operation to a downstream system would be the appropriate place for adjusting the HL7 content.

Using transforms within a router will be the best approach for this, and while it might seem a little cumbersome creating a transform per message type, you should be able to remove some of the legwork by using sub-transforms.

For example: if you were looking to adjust the datestamp used in an admission date within a PV1, rather than repeating the desired transform work in every message transform that contains a PV1, you create a sub-transform for the PV1 segment once and then reference it in your transform for each message type. If you then have additional changes to the PV1 that are message-specific (say, a change for an A01 but not an A02), you can make that change in the message-specific transform.

As for transforming date/times in HL7, I have faced my share of faff when it comes to this with suppliers. The best approach from within the transforms is to use the built-in Ensemble utility function ConvertDateTime, listed here.
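For illustration, an assign inside a transform might look like the line below; the format strings are only examples (check the documentation for the codes your feeds actually use), and PV1:44 is the admit date/time field:

set target.{PV1:44} = ..ConvertDateTime(source.{PV1:44}, "%Y%m%d%H%M%S", "%d/%m/%Y %H:%M:%S")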

What type of business service are you using? If you are using a single job on the inbound, I guess you're hitting a limit on how fast the adapter can handle each message (in your case, around 15ms per message).

You could look at increasing the pool size and jobs per connection if you're not worried about the order in which the messages are received into your process.

Hi Alexandr.

If you are looking to run a task at specific times, you could create a new task which extends %SYS.Task.Definition to then be selectable as an option from the task manager.

For example, I have a folder from which I need to periodically delete files older than x days.

To achieve this, I have a class that looks like this:

Class DEV.Schedule.Purge Extends %SYS.Task.Definition
{

Parameter TaskName = "Purge Sent Folder";

Property Directory As %String;

Property Daystokeep As %Integer(VALUELIST = ",5,10,15,20,25,30") [ InitialExpression = "30" ];

Method OnTask() As %Status
{
    Set tsc = ..PurgeSentFolder(..Directory,..Daystokeep,"txt")
    Quit tsc
}

Method PurgeSentFolder(Directory As %String, DaysToKeep As %Integer, Extension As %String) As %Status
{
    Set tSC = $$$OK

    // Calculate the cutoff date; files modified before this date get deleted
    Set BeforeThisDate = $ZDT($H-DaysToKeep_",0",3)

    // Gather the list of files in the specified directory
    Set rs = ##class(%ResultSet).%New("%File:FileSet")
    Set ext = "*."_Extension
    Do rs.Execute(Directory,ext,"DateModified")

    // Step through the files in DateModified order
    While rs.Next() {
        Set DateModified = rs.Get("DateModified")
        If BeforeThisDate]DateModified {
            // Delete the file
            Set Name = rs.Get("Name")
            Do ##class(%File).Delete(Name)
        }
        // Stop once we reach files modified on or after the cutoff date
        If DateModified]BeforeThisDate Quit
    }
    Quit tSC
}

}

Then I created a new task in the scheduler, selected the namespace where the new class exists, and then filled in the properties and times I want the task to run.

So I found that it is possible to save single messages using the "HL7 V2.x Message Viewer", which might not be suitable for you if you're looking to export loads of messages.

One option could be to add a new HL7 file-out operation, search for your desired messages from the router you wish to "export" from, and then resend them to the new target, which can be selected from the Resend Messages page.