From the documentation...

The %Collection.ArrayOfDataTypes class represents an array of literal (i.e., data type) elements, each of which is associated with a unique key value. Keys can have any value, string or numeric. These %Collection classes can only be used when you have a collection property of another object, as they rely on storing the data inside the parent object. They cannot be used as 'stand-alone' collections; for that, use %ArrayOfDataTypes.

Hi Sebastian,

> The rest service won't see a whole lot of usage still I wonder whether it is a good idea. Or let me rephrase this, it certainly isn't a good idea but is it a viable one due to lack of alternatives?

You don't have to use REST; you can use a standard CSP page (particularly for anyone on a pre-REST version of Caché).

Class Foo.FileServer Extends %CSP.Page
{

ClassMethod OnPreHTTP() As %Boolean [ ServerOnly = 1 ]
{
    set filename=%request.Get("filename",1)
    set %response.ContentType=..GetFileContentType($p(filename,".",*))
    do %response.SetHeader("Content-Disposition","attachment; filename="""_$p(filename,"\",*)_"""")
    quit 1
}

ClassMethod GetFileContentType(pFileType) As %String
{
    if pFileType="pdf" quit "application/pdf"
    if pFileType="docx" quit "application/vnd.openxmlformats-officedocument.wordprocessingml.document"
    if pFileType="txt" quit "text/plain"
    //TODO, add more MIME types...
    quit ""
}

ClassMethod OnPage() As %Status [ ServerOnly = 1 ]
{
    set filename=%request.Get("filename",1)
    set file=##class(%File).%New(filename)
    // in production code, check the %Status returned by Open()
    do file.Open("R")
    do file.OutputToDevice()
    quit $$$OK
}

}


There are three things to point out.

1. You need to set the ContentType on the %response object.
2. If the user is going to want to use the URL directly and download the file to a local disk, then set the content disposition; otherwise the file name will end in .CLS.
3. Simply use the OutputToDevice() method on the %File class to stream the file contents to the client.

This can be easily applied to REST, but there is a caveat that you need to look out for.

The initial REST path might look like this...

/file/:fileref


However, if you allow a full path name in the file name (including folders), then you will hit two problems:

1. Security: file paths will need validating; otherwise any file could be accessed.
2. Caché REST just does not work well with \ characters (or their %5C escapes) in the URL.

If you want to get around the second problem, use a different folder delimiter.

Personally, I would limit the solution to a few nicknamed folders, such that the URL match would be

/file/:folder/:file


So

/file/Test/Hello.txt

would map to, say, C:\REST\Test\Hello.txt, or to an alternative base folder that you would provide. Putting that together, the solution would look something like...

Class Foo.RestRouter Extends %CSP.REST
{

XData UrlMap
{
<Routes>
  <Route Url="/file/:folder/:file" Method="GET" Call="GetFile" />
</Routes>
}

ClassMethod GetFile(folder, file)
{
    set filename="C:\REST\"_folder_"\"_file
    set %response.ContentType=..GetFileContentType($p(filename,".",*))
    do %response.SetHeader("Content-Disposition","attachment; filename="""_$p(filename,"\",*)_"""")
    // use a new variable so the file argument is not clobbered
    set fileObj=##class(%File).%New(filename)
    do fileObj.Open("R")
    do fileObj.OutputToDevice()
    Quit $$$OK
}

ClassMethod GetFileContentType(pFileType) As %String
{
    if pFileType="pdf" quit "application/pdf"
    if pFileType="docx" quit "application/vnd.openxmlformats-officedocument.wordprocessingml.document"
    if pFileType="txt" quit "text/plain"
    //TODO, add more MIME types...
    quit ""
}

}

Note, if you look at Dmitry's solution, the file name is added to the CGI variables, which works around some of the issues I have mentioned; but of course the REST path would no longer keep its state if bookmarked, etc.

In general, REST is fine for this type of use case.

Sean

There are a few approaches.

The schedule setting on a service can be hijacked to trigger some kind of start-job message to an operation. It's not a real scheduler and, IMHO, a bit of a fudge.

A slightly non-Ensemble solution is to use the Caché Task Manager to trigger an Ensemble service at specific times. The service would be adapterless and would only need to send a simple start message (Ens.StringContainer) to its job target. A custom task class (extending %SYS.Task.Definition) would use the CreateBusinessService() method on Ens.Director to create an instance of this service and call its ProcessInput() method.
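A minimal sketch of such a task class. The class name, setting name, and target service name are all assumptions here; check Ens.Director in your version for the exact CreateBusinessService() signature.

```objectscript
/// Hypothetical task class; "From Task" is an assumed service config name
Class Foo.Task.TriggerService Extends %SYS.Task.Definition
{

/// Config name of the adapterless service to trigger
Property ServiceName As %String [ InitialExpression = "From Task" ];

Method OnTask() As %Status
{
    // spin up an instance of the service and poke it with a start message
    set sc=##class(Ens.Director).CreateBusinessService(..ServiceName,.service)
    quit:$$$ISERR(sc) sc
    quit service.ProcessInput(##class(Ens.StringContainer).%New("start"))
}

}
```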

The only downside to this is those scheduled configuration settings are now living outside of the production settings. If you can live with that then this would be an ok approach.

Alternatively, you could write your own custom schedule adapter that uses custom settings for target names and start times. The adapter's OnTask() would get called every n seconds via its call interval setting and would check to see if it's time to trigger a process input message for one of the targets. The service would then send a simple start message to that target.
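A rough sketch of that adapter, assuming a single comma-delimited StartTimes setting in hh:mm format; the class name, setting names, and time parsing are illustrative only.

```objectscript
/// Hypothetical schedule adapter; settings and time handling are illustrative
Class Foo.Adapter.Schedule Extends Ens.InboundAdapter
{

/// Comma-delimited hh:mm start times, e.g. "02:00,14:30"
Property StartTimes As %String;

/// Remembers the last slot fired so we only trigger once per minute
Property LastFired As %String [ Transient ];

Parameter SETTINGS = "StartTimes:Basic";

Method OnTask() As %Status
{
    set now=$e($ztime($p($h,",",2)),1,5)  // current time as hh:mm
    if (","_..StartTimes_",")[(","_now_","),..LastFired'=now {
        set ..LastFired=now
        quit ..BusinessHost.ProcessInput(##class(Ens.StringContainer).%New(now))
    }
    quit $$$OK
}

}
```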

I prefer this last approach because it's more transparent to an Ensemble developer new to the production; also, the settings stay with the production and are automatically mirrored to failover members.

Hi Scott,

It does sound like you have duplicate or previously processed records. The SQL inbound adapter will skip these.

One of the things to note about the inbound adapter is that it will call the OnProcessInput() of your service for every row in the result set. The potential problem with this is that the service could be killed before it has finished off all of the rows in the result set (forced shutdown etc.). For this reason, the adapter has to keep track of the last record it processed, such that it can continue from where it left off.

In your instance, it sounds like this behavior does not fit your data.

If you want to implement your own ground-up service solution then you would have to build your own adapter and make the query execution from that adapter's OnTask() method. Your adapter should probably extend EnsLib.SQL.Common and implement ExecuteQuery* as a member method of the adapter.

If you are going to go down this route then be mindful of building in resilience to handle forced shutdowns, so that it can continue from where it left off.

Also, it's good behavior for adapters not to block the production for long periods. Any adapter such as this will be looping around a set of data, calling out to the ProcessInput method of its business host. If there are many rows then this loop could run for minutes. It's only when an adapter drops out of its OnTask() method that the Ensemble Director can cleanly shut down a production; this is why you sometimes see a production struggling to shut down. To avoid blocking the production, the adapter will need to chunk its work down, for instance limiting the query size and continuing from that point on the next OnTask().
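As an illustration only (the table, column names, chunk size, and the ExecuteQuery call are all assumptions), a chunked OnTask() might look along these lines:

```objectscript
/// Hypothetical fragment of a custom SQL adapter's OnTask()
Method OnTask() As %Status
{
    // fetch at most 100 rows beyond the last processed key, then return,
    // so the Director gets regular chances to shut the production down
    set sql="SELECT TOP 100 ID,Data FROM My.Table WHERE ID > ? ORDER BY ID"
    set sc=..ExecuteQuery(.rs,sql,..LastKey)
    quit:$$$ISERR(sc) sc
    while rs.Next() {
        set sc=..BusinessHost.ProcessInput(rs)
        quit:$$$ISERR(sc)
        set ..LastKey=rs.Get("ID")  // persist the high-water mark
    }
    quit sc
}
```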

Alternatively, you could look at using the SQL outbound adapter, which avoids all of the high-water-mark functionality. Your query will always return everything you expect. I often do it this way and have never had any problems with skipped rows (I run separate audit jobs that also double-check this).

Sean.

Jeffrey has the right answer.

Murillo, here are the comments for the %SYS.Task.PurgeErrorsAndLogs task that you are currently using; the ^ERRORS global is for Caché-wide errors, not Ensemble errors...

/// This Task will purge errors (in the ^ERRORS global) that are older than the configured value.<br>
/// It also renames the cconsole.log file if it is larger than the configured maximum size.<br>
/// On a MultiValue system it also renames the mv.log file if it grows too large.<br>
/// This Task is normally run nightly.<br>

Hi Simcha,

Your production class has a parent method called OnConfigChange() which you can override.

The method receives two objects, the updated production config object (Ens.Config.Production) and the production item config object that changed (Ens.Config.Item).

You will need to write your own diff solution around this.
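A bare-bones override might just record each change for a later diff. The audit global below is made up, and you should check the exact OnConfigChange() signature on Ens.Production in your version.

```objectscript
/// Sketch only; ^MyConfigLog is a hypothetical audit global
ClassMethod OnConfigChange(pProduction As Ens.Config.Production, pItem As Ens.Config.Item) As %Status
{
    // record when the change happened, who made it, and which item it touched
    set user=$s($isobject($g(%session)):%session.Username,1:"unknown")
    set ^MyConfigLog($i(^MyConfigLog))=$lb($h,user,pItem.Name)
    quit $$$OK
}
```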

Note, this method only gets called for changes made via the management portal, it will not record changes made to the production class directly.

Alternatively, you could implement an abstract projection class to trigger a diff method on compilation. This will work for both cases. Take a look at this post... https://community.intersystems.com/post/class-projections-and-projection... on how to implement this alternative.

If you want to know who made the change via the management portal then you can get the users login name using this value...

%session.Username

Sean.

One option could be to generate a sibling class that implements the generated code.

You could use the abstract create projection event to create the sibling implementation class and then tie the two together using the base class methods.

Alternatively, if you wanted to trigger code on a save event then create a class that extends %Studio.Extension.Base and override its OnAfterSave method. You will then need to enable that class in management portal > system admin > config > additional settings > source control.
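A skeleton of that hook, with a made-up logging global standing in for your diff logic; see %Studio.Extension.Base for the full set of callbacks.

```objectscript
/// Hypothetical source control hook class
Class Foo.SourceControl Extends %Studio.Extension.Base
{

Method OnAfterSave(InternalName As %String, Object As %RegisteredObject = {$$$NULLOREF}) As %Status
{
    // InternalName arrives as e.g. "Foo.MyProduction.CLS"
    if $zconvert($p(InternalName,".",*),"U")="CLS" {
        set ^MySaveLog($i(^MySaveLog))=$lb($h,InternalName)  // hook your diff logic here
    }
    quit $$$OK
}

}
```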

Hi Russel,

I would solve the problem along the following lines...

set k2=$s($d(^G("ABC","A")):"A",1:$o(^G("ABC","A")))
while $e(k2,1)="A" {
   set k3=$o(^G("ABC",k2,""))
   while k3'="" {
     write !,k2," ",k3
     set k3=$o(^G("ABC",k2,k3))
   }
   set k2=$o(^G("ABC",k2))
}

Explanation...

//set k2 to either "A" if that key exists in the data, or the next key following "A".
set k2=$s($d(^G("ABC","A")):"A",1:$o(^G("ABC","A")))

//only process k2 when it starts with an "A", this is the wildcard functionality you are looking for
//when k2 does not start with an "A" the logic will drop through
while $e(k2,1)="A" {

    //get first child key of k2
    set k3=$o(^G("ABC",k2,""))

    //loop on all child keys found
    while k3'="" {

        write !,k2," ",k3

        //get next child key of k2
        set k3=$o(^G("ABC",k2,k3))
    }

    //get the next k2 key
    set k2=$o(^G("ABC",k2))

}

I don't think there is a function that will do it in one step for you.

There is the $zhex function, which converts a decimal value to hex.

If you combine it with $ascii you can do one character at a time...

>w $zhex($ascii("a"))
61


Use a for loop and you can get the result you want...

>set hex="" for i=1:1:$l("abc") set hex=hex_$zhex($ascii($e("abc",i)))
>w hex
616263


There is also a utility which may be of use to you when working on the command line...

>zzdump "abc"
0000: 61 62 63


Sean

I've knocked up a quick example using dual ACKs.

I've put the source code in a gist here...

https://gist.github.com/SeanConnelly/19b79c790daad530a754461923f9f2f1

Save the code to a file and import into a test namespace.

Either create the "in" and "archive" folders as per the inbound test file feeder, or change them to suit your environment.

Drop an HL7 message into the "in" folder and this is what you will see in the trace...

The ACK message is sent into the service and is automatically forwarded to the sending operation, where it is returned to the calling process as if it were the original message's ACK.

Make sure to follow the instructions here when configuring the service and operation; they specifically need to be implemented using EnsLib.HL7.Operation.TCPAckOutOperation and EnsLib.HL7.Service.TCPAckInService...

http://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=...

Also note that you need to set the reply code actions so that the AE ACK is returned to the process; otherwise it will stop at the operation. I have set the actions to...

:?R=RF,:?E=W,:~=S,:?A=C,:*=S,:T?=C

Where a match on ?E will just warn and continue as if the message was ok.

Take a look at the custom class Examples.DeferredHL7.CustomProcess, which contains the following OnResponse method...

Method OnResponse(request As EnsLib.HL7.Message, ByRef response As EnsLib.HL7.Message, callrequest As EnsLib.HL7.Message, callresponse As EnsLib.HL7.Message, pCompletionKey As %String) As %Status
{
    $$$TRACE("request contains the inbound request "_request.RawContent)
    $$$TRACE("callrequest contains the sent request "_callrequest.RawContent)
    $$$TRACE("callresponse contains the deferred ACK "_callresponse.RawContent)
    quit $$$OK
}

This is where you will have both the original request messages and the ACK in the same scope. From here you can construct a new message from the data in both messages.

This is just one approach but should fit your needs.

Sean.

Hi Joao,

I'm assuming you are sending an HL7 v2.x message from an operation, and its ACK is coming back via your service.

If this is the case then you might want to look at the deferred response functionality...

http://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=...

This allows you to automagically send the ACK back as a response to the sending process.

Depending on how you implement the process, you will have the original message in scope, or it will be passed as an on response argument with the ACK response.

Sean.

FOO>set msg=##class(EnsLib.HL7.Message).%OpenId(15)
 
FOO>w msg.RawContent
PID|2|2161348462|20809880170|1614614|20809880170^TESTPAT||19760924|M|||^^^^00000
OBR|1|8642753100012^LIS|20809880170^LCS|008342^UPPER RESPIRATORYCULTURE^L|||19980727175800||||||SS#634748641 CH14885 SRC:THROASRC:PENI|19980727000000||||||20809
OBX|1|ST|008342^UPPER RESPIRATORY||POSITIVE~~~~~~~|

FOO>w !,msg.SetValueAt("Positive","PIDgrpgrp(1).ORCgrp(1).OBXgrp(1).OBX:5")
 
0 0<Ens>ErrGeneralObject is immutable
 
FOO>set msg2=msg.%ConstructClone()
 
FOO>w !,msg2.SetValueAt(msg.GetValueAt("PIDgrpgrp(1).ORCgrp(1).OBXgrp(1).OBX:5.1"),"PIDgrpgrp(1).ORCgrp(1).OBXgrp(1).OBX:5")
 

1
 
FOO>w msg2.RawContent                                                           

PID|2|2161348462|20809880170|1614614|20809880170^TESTPAT||19760924|M|||^^^^00000
OBR|1|8642753100012^LIS|20809880170^LCS|008342^UPPER RESPIRATORYCULTURE^L|||19980727175800||||||SS#634748641 CH14885 SRC:THROASRC:PENI|19980727000000||||||20809
OBX|1|ST|008342^UPPER RESPIRATORY||POSITIVE|

Hi Gigi,

As you've not supplied any code, it's hard to know where you are going wrong.

Firstly, just in case you are trying to change the original message object, you can't (it's immutable).

If you are doing this from code then you will need to make a clone of the original object and then set the OBX:5 from the OBX:5.1 value.

If you are using a DTL then map from the OBX:5.1 value to the OBX:5.

Sean.