I would create my "custom" datatype extending %Library.DateTime:

Class Community.dt.CustomDateTime Extends %Library.DateTime
{

/// Converts the logical %TimeStamp value to the SOAP (XSD) encoded value,
/// then replaces the "T" date/time separator with a space and drops the trailing "Z".
ClassMethod LogicalToXSD(%val As %TimeStamp) As %String [ ServerOnly = 1 ]
{
	Set %val=##class(%Library.TimeStamp).LogicalToXSD(%val)
	; $translate maps "T" to a space and removes "Z" (no corresponding replacement character)
	Quit $translate(%val,"TZ"," ")
}

}
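
A quick way to see the effect, as a minimal sketch (assuming the class is compiled in your namespace and that the parent method returns the canonical 2024-05-01T10:20:30Z form):

; run in a terminal session, for illustration only
Write ##class(Community.dt.CustomDateTime).LogicalToXSD("2024-05-01 10:20:30")
; expected output: 2024-05-01 10:20:30  (no "T" separator, no trailing "Z")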

Then in your class define your property as:

Property OPDT As Community.dt.CustomDateTime;

Are you sure you really need %Library.DateTime and not %Library.TimeStamp?
The difference is the JDBC/ODBC format.

If you prefer using %Library.TimeStamp, then change the superclass in my sample code.
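
In that case the only change is the superclass, something like:

Class Community.dt.CustomDateTime Extends %Library.TimeStamp
{
// same LogicalToXSD() method as above
}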

Enrico

Hi Nicki,

that's EXACTLY the point of the two different calls, Sync/Async (the second option commented out), in my sample.

If you need to wait for the task to finish (however long it takes, maybe longer than the call interval), then use the SendRequestSync() call. With SendRequestSync(), if the task takes longer than the call interval, the next call is performed immediately when the task finishes, because the interval has already expired.

If you need to call the task on every call interval, regardless of whether the previous call has finished, then use the SendRequestAsync() call.
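
For reference, a minimal sketch of the two options inside the service's OnProcessInput() ("TargetProcess" and the generic request class are placeholders, not the names from the original sample):

Set tRequest=##class(Ens.Request).%New()
// Option 1: wait for the target to finish before returning (the next timed call waits)
Set tSC=..SendRequestSync("TargetProcess",tRequest,.tResponse)
// Option 2: fire-and-forget, a new call is made on every call interval
//Set tSC=..SendRequestAsync("TargetProcess",tRequest)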

Enrico

Hi Michael,

in order for the %JSONImport() method to work properly, the class of "pResponse" (inheriting from %JSON.Adaptor) MUST match the structure of the JSON being imported.

From the error it seems this is not the case for your class and JSON.

Another option is to load/import the JSON stream into a dynamic object {} (%Library.DynamicObject) and then "manually" set your pResponse properties from the dynamic object properties.
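
A minimal sketch of that approach (the property and JSON field names here are placeholders, not taken from your actual class):

// jsonStream is the stream (or string) containing the JSON payload
Set json = {}.%FromJSON(jsonStream)
Set pResponse = ##class(My.ResponseClass).%New()   // hypothetical response class
// copy each property "manually" from the dynamic object
Set pResponse.Name = json.name
Set pResponse.Code = json.%Get("code")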

Enrico

Hi Summer,

thank you for the information. Since then I have solved my issue and used the %setall()/getall() "magic" (secret? 😉 ) methods in a couple of cases (like streams), although I'm not sure I have discovered all the magic!

The real issue is the lack of documentation and samples, for me as well as for all the community.

In addition, InterSystems used to say (and maybe still does) that what is not documented is considered not (officially) supported....

Enrico

Method OnRequest(pRequest As Ens.StreamContainer, Output pResponse As Ens.Response) As %Status
{ $$$LOGINFO("Inne i XmlFilterProcess")
    set filename = pRequest.OutputFilename
    set stream = pRequest.Stream
    $$$LOGINFO(stream.Read())

 set status=##class(%XML.XPATH.Document).CreateFromStream(stream,.mydoc)
 set status=mydoc.EvaluateExpression("/staff/doc/name","text()",.myresults)
 set count = myresults.Count()
 $$$LOGINFO(count)
 
  for i =1:1:count
 {
 set result = myresults.GetAt(i).Value
 $$$LOGINFO(result)
 }
    Quit status
}

You need to change these two lines (the "text()" expression returns the element text, and each result's Value property holds that text), from:

set status=mydoc.EvaluateExpression("/staff/doc/name","1",.myresults)
set result = myresults.GetAt(i)

to:

set status=mydoc.EvaluateExpression("/staff/doc/name","text()",.myresults)
set result = myresults.GetAt(i).Value

Enrico

Simply replace the line:
//call here Do ##class(Ens.Util.XML.Reader).ChangeXMLStreamEncoding...

With:
Set outputBinaryStream=##class(%Stream.FileBinary).%New()
Do ##class(Ens.Util.XML.Reader).ChangeXMLStreamEncoding(tStream,"ISO-8859-1",outputBinaryStream,.tSC)
//you may want to check tSC, just in case....
//from now on, use outputBinaryStream (with the changed encoding header) instead of tStream
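
For example, a minimal status check could look like this (a sketch, assuming the code runs in a method that returns a %Status):

If $$$ISERR(tSC) {
	// stop here and report the conversion error
	Quit tSC
}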

Enrico

Set inputBinaryStream=##class(%Stream.FileBinary).%New()
Set inputBinaryStream.Filename="\\server\your\share\file.xml"
Set outputBinaryStream=##class(%Stream.FileBinary).%New()
Set outputBinaryStream = ##class(Ens.Util.XML.Reader).ChangeXMLStreamEncoding(inputBinaryStream, "ISO-8859-1",outputBinaryStream, .tSC)

; Since the output stream is passed as an argument (there is no need to capture a return value), you can just replace the last line with:
Do ##class(Ens.Util.XML.Reader).ChangeXMLStreamEncoding(inputBinaryStream,"ISO-8859-1",outputBinaryStream,.tSC)

Enrico

EVERY interoperability session starts from a Business Service, whether it is a message/call received from an external system or it is triggered by a timed event, like in this case.
Your problem/question is:

"I need to trigger production process/operation every minute"

That's EXACTLY what my BS sample does; all you need is to call your "process/operation" that "exchange data with external system".

This is the way to implement it.
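
For completeness, a minimal sketch of such a timed Business Service (class and target names are placeholders; set the adapter's CallInterval to 60 seconds in the production configuration to run every minute):

Class Demo.TimerService Extends Ens.BusinessService
{

/// The basic inbound adapter simply invokes OnProcessInput() every CallInterval seconds
Parameter ADAPTER = "Ens.InboundAdapter";

Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
{
	Set tRequest=##class(Ens.Request).%New()
	// call the process/operation that exchanges data with the external system
	Quit ..SendRequestSync("Demo.ExchangeProcess",tRequest,.tResponse)
}

}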
Enrico

Purging every N days should keep the database size almost constant, assuming a constant number of messages.

Unless you have some other database classes that keep growing.

Are you purging message bodies too?

What kind of messages does your production use? HL7 only? Other messages?

You may have orphaned messages in your database.
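
To check whether bodies are included, here is a hedged sketch of a manual purge that also removes message bodies (assuming the standard Ens.Purge API; verify the exact argument order in your version):

// keep 30 days, maintain session integrity, purge message bodies too
Set tSC = ##class(Ens.Purge).PurgeMessagesByDate(30, .tDeletedCount, 1, 1)
Write "Purged: ", tDeletedCount, !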

Regarding moving a CACHE.DAT, can you stop the system for the time it takes to copy the file to a different filesystem/drive?

Enrico