Hi-

Here's something I did on a Windows system at one point to purge files in a directory that were older than a given time frame:

ClassMethod PurgeFiles(Path As %String, OlderThan As %Integer)
{
    // $ZD with the default format gives MM/DD/YYYY, which forfiles expects
    set Date=$zd($h-OlderThan)
    // forfiles /D -<date> selects files last modified on or before that date
    set cmd="forfiles /P "_Path_" /D -"_Date_" /C ""cmd /c del @path"""
    set sc=$zf(-1,cmd)
}
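
For example, to purge everything in C:\Temp older than 30 days (the class name here is just whichever class you put the method in):

do ##class(Demo.Utils).PurgeFiles("C:\Temp",30)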

I'm pretty sure there are also ways to do this using %File or one of those classes. I will poke around for an example there as well, but this should get you started.

Jen

Hi Jimmy

The short answer is that EnsLib.File.PassthroughOperation is a special operation that can take a stream and write it out to a file. The operation expects you to pass in an instance of Ens.StreamContainer, and you need to populate its Stream property, which is what the PassthroughOperation looks for.

An example might be:

set sc=##class(Ens.StreamContainer).%New()
set stream=##class(%Stream.GlobalCharacter).%New()
do stream.Write("This is my text to go into the file")
set sc.Stream=stream

Once you have done this, you can send sc as the input to your operation using ..SendRequestAsync or ..SendRequestSync, or from a BP if that is where the message is coming from.
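
For example, from a business service or process method (the target name "MyFilePassthrough" is just a placeholder for whatever you named the operation in your production):

set tSC=..SendRequestSync("MyFilePassthrough",sc,.response)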

My question is: given the request message you created, what are you hoping the output in the file will look like?

Thanks

Jenna

So a customer asked me a question about this and I decided to actually implement a mechanism to encode a global as JSON. I haven't done the opposite, which would be to take the encoded JSON and turn it back into a global, but that is pretty simple (a sketch is at the end of this post).

Here's the way I encode the global as JSON. There is a top-level object with two properties, globalName and nodes. globalName is a string and represents the actual global name; nodes is an array containing an object for each node of the global that holds data. To ensure that non-printable characters are handled properly, I am using HTML escaping on both the data and the subscript values.

{
    "globalName":"^ISCSOAP",
    "nodes":[
        {
            "data":"io",
            "subscripts":["&quot;Log&quot;"]
        },
        {
            "data":"c:\\temp\\SOAP.log",
            "subscripts":["&quot;Log&quot;","&quot;LogFile&quot;"]
        }
    ]
}

Here is an example of the class method that generates this output:

Class ISC.JSONGlobalProcessor [ Not ProcedureBlock ]
{

ClassMethod Export(GlobalRoot As %String, Output JSON As %DynamicObject) As %Status [ ProcedureBlock = 0 ]
{
    new root,node
    if '$d(@GlobalRoot) quit $System.Status.Error(5001, "Nothing to export; "_GlobalRoot_" <undefined>")
    set root=$p(GlobalRoot,")",1,$l(GlobalRoot,")")-1),node=GlobalRoot s:root="" root=GlobalRoot
    set JSON=##class(%DynamicObject).%New()
    set JSON.globalName=$p(GlobalRoot,"(",1)
    set JSON.nodes=##class(%DynamicArray).%New()
    // Walk every node under the root; only nodes that hold data are exported
    while $e(node,1,$l(root))=root {
        if $d(@node)#10 do ..addNode(node,.JSON)
        set node=$q(@node)
    }
    quit $$$OK
}

ClassMethod addNode(node As %String, ByRef JSON As %DynamicObject) As %Status
{
    // The class is Not ProcedureBlock, so new the locals to keep
    // subArray (and friends) from leaking between calls
    new nodeJSON,data,subscripts,cp,subscript,subArray,i
    set nodeJSON=##class(%DynamicObject).%New()
    set data=@node,nodeJSON.data=##class(%CSP.Page).EscapeHTML(data)
    set subscripts=$p(node,"(",2,999),subscripts=$p(subscripts,")",1,$l(subscripts,")")-1)
    if ""'=subscripts {
        set nodeJSON.subscripts=##class(%DynamicArray).%New()
        set cp=1
        for {
            quit:cp>$l(subscripts,",")
            set subscript=$p(subscripts,",",cp)
            // A quoted subscript containing a comma was split too early;
            // keep appending pieces until the quotes balance again
            for {
                quit:$l(subscript,"""")#2
                set cp=cp+1,subscript=subscript_","_$p(subscripts,",",cp)
            }
            set subArray=$i(subArray),subArray(subArray)=subscript
            set cp=cp+1
        }
        for i=1:1:subArray do nodeJSON.subscripts.%Push(##class(%CSP.Page).EscapeHTML(subArray(i)))
    }
    do JSON.nodes.%Push(nodeJSON)
    quit $$$OK
}

}

To call this code you can do the following:

set sc=##class(ISC.JSONGlobalProcessor).Export($na(^SAMPLE.PersonD),.json)

Once you have the global encoded in a JSON object, you can output that JSON by calling:

do json.%ToJSON()
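
And as promised, the reverse is pretty simple. Here is a rough, untested sketch of what an import might look like (the Import method name and the %IsDefined check are my own choices; the exported subscripts already carry their quote characters, so they can be joined verbatim into a global reference):

ClassMethod Import(JSON As %DynamicObject) As %Status
{
    set global=JSON.globalName
    set iter=JSON.nodes.%GetIterator()
    while iter.%GetNext(.key,.node) {
        set ref=global
        // Rebuild the subscripted reference, e.g. ^ISCSOAP("Log","LogFile")
        if node.%IsDefined("subscripts") {
            set subs=""
            set subIter=node.subscripts.%GetIterator()
            while subIter.%GetNext(.i,.sub) {
                set subs=subs_$s(subs="":"",1:",")_##class(%CSP.Page).UnescapeHTML(sub)
            }
            set ref=ref_"("_subs_")"
        }
        // Restore the node data via indirection
        set @ref=##class(%CSP.Page).UnescapeHTML(node.data)
    }
    quit $$$OK
}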

Hello-

First, admittedly I'm not sure of the cause of your error, as I do not have access to your generated SOAP client classes, your SSL configuration, etc.

That said, I was able to implement the UPS Tracking API and execute it without issue. To get it to work you will need to do the following:

1. Create an SSL client configuration (test it against the UPS server) to be used for SSL encryption. It does not appear that UPS provides an unencrypted connection for testing.

2. Obtain all of the WSDL files (including the additional XSD documents that define the different types the UPS API supports).

3. Use the SOAP wizard to create a client. I used the package UPS. In the wizard, on the Step 3 page, select the option "Use unwrapped message format for document style web methods". This is critical because the SOAP wizard will not create the correct client without it checked.

Once created, the following class method can be used to test your service. I just added this to my SOAP client class UPS.TrackPort:

ClassMethod Test(InquiryNumber As %String) As UPS.common.ResponseType
{
    ; Set up the web service client and security headers
    set ws=##class(UPS.TrackPort).%New()
    set ws.SSLConfiguration="SSLClient"
    set sechdr=##class(UPS.upss.UPSSecurity).%New()
    set usertoken=##class(UPS.upss.UsernameToken).%New()
    set usertoken.Username="myusername"
    set usertoken.Password="mypassword"
    set sechdr.UsernameToken=usertoken
    set acctoken=##class(UPS.upss.ServiceAccessToken).%New()
    set acctoken.AccessLicenseNumber="myaccessLicenseNumber"
    set sechdr.ServiceAccessToken=acctoken
    do ws.HeadersOut.SetAt(sechdr,"UPSSecurity")
    ;
    ; Set up the request
    set trakRequest=##class(UPS.trk.TrackRequest).%New()
    set trakRequest.Request=##class(UPS.common.RequestType).%New()
    do trakRequest.Request.RequestOption.Insert(1)
    set transactionReference=##class(UPS.common.TransactionReferenceType).%New()
    set transactionReference.CustomerContext="My Ensemble Process "_$j
    set trakRequest.Request.TransactionReference=transactionReference
    set trakRequest.InquiryNumber=InquiryNumber
    ;
    quit ws.ProcessTrack(trakRequest)
}

Once this is done, you can test as follows:

USER>s resp=##class(UPS.TrackPort).Test("1Z12345E0205271688")
 
USER>w resp
20@UPS.trk.TrackResponse
USER>zw resp
resp=<OBJECT REFERENCE>[20@UPS.trk.TrackResponse]
+----------------- general information ---------------
|      oref value: 20
|      class name: UPS.trk.TrackResponse
| reference count: 2
+----------------- attribute values ------------------
|           (none)
+----------------- swizzled references ---------------
|       i%Disclaimer = ""
|    i%Disclaimer(1) = "You are using UPS tracking service on customer integration test environment, please switch to UPS production environment once you finish the test. The URL is https://onlinetools.ups.com/webservices/Track"
|       r%Disclaimer = "49@%Collection.ListOfDT"  <Set>
|         i%Response = ""
|         r%Response = "21@UPS.common.ResponseType"
|         i%Shipment = ""
|      i%Shipment(1) = ""
|         r%Shipment = "48@%Collection.ListOfObj"
|      r%Shipment(1) = "24@UPS.trk.ShipmentType"
+-----------------------------------------------------

Hope this helps

Thanks Robert-

I don't think I have a choice here. The All method of %SYSTEM.OBJ.FM2Class expects the array to be passed by reference. I did come up with a solution though...

/// Run FM2Class in the background for a newly created namespace
ClassMethod ConfigFM2Class(Namespace As %String, LocalPath As %String) As %String
{
    new $namespace set curnsp=$namespace,$namespace=Namespace
    write !,"Starting FM2Class for Namespace ",Namespace," in background"
    ; Build up parameters
    set params("childTableNameFormat")="SUB_<FILENAME>,<FILENUMBER>"
    set params("compile")=1
    set params("compileQSpec")="/display=all/lock=0"
    set params("dateType")="%Library.FilemanDate"
    set params("datetimeType")="%Library.FilemanTimeStamp"
    set params("deleteQSpec")="/display=all"
    set params("display")=0
    set params("expandPointers")=0
    set params("expandSetOfCodes")=0
    set params("extendedMapping")=""
    set params("fieldNameFormat")="Exact"
    set params("ienFieldName")="IEN"
    set params("logFile")=LocalPath_"fm2class_"_Namespace_".log"
    set params("nameLength")=180
    set params("owner")="_SYSTEM"
    set params("package")="VISTA"
    set params("readonly")=0
    set params("recursion")=2
    set params("requiredType")=0
    set params("retainClass")=1
    set params("setOfCodesEnum")=1
    set params("strictData")=0
    set params("superClasses")=""
    set params("tableNameFormat")="<FILENAME>,<FILENUMBER>"
    set params("variablePointerValueField")=0
    set params("wpIsList")=0
    ; Stash the parameters where the background job will look for them
    ; (kill the same node we merge into, so stale data is cleared first)
    kill ^UTILITY($j,"ISC.HealthConnect.Installer") merge ^UTILITY($j,"ISC.HealthConnect.Installer")=params
    set $namespace=curnsp
    job ##class(ISC.HealthConnect.Installer).jobfm(Namespace,$j)::5
    set zSC=1
    if '$t set zSC=0
    quit zSC
}
ClassMethod jobfm(Namespace, job)
{
    ; Start FM2Class in the target namespace using the stashed parameters
    new $namespace set $namespace=Namespace
    merge params=^UTILITY(job,"ISC.HealthConnect.Installer")
    kill ^UTILITY(job,"ISC.HealthConnect.Installer")
    do ##class(%SYSTEM.OBJ.FM2Class).All(.params)
    quit
}
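
To kick it off for a new namespace, the call looks like this (the namespace and path are just examples):

set sc=##class(ISC.HealthConnect.Installer).ConfigFM2Class("VISTA","C:\Temp\")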

I would imagine that the same code you wrote to convert XML to JSON in Studio could be used from an Ensemble production to do the conversion within the context of a production. You could implement this code as a business operation method: pass a message containing the XML as the request message, and the response message would contain the JSON-converted version. A sketch of that shape follows.
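
Purely as an illustration of the shape (the class name is invented, and ConvertXMLToJSON is a stub standing in for your existing conversion code):

Class Community.Operation.XmlToJson Extends Ens.BusinessOperation
{

Method Convert(pRequest As Ens.StringContainer, Output pResponse As Ens.StringContainer) As %Status
{
    set pResponse=##class(Ens.StringContainer).%New()
    // ConvertXMLToJSON is a placeholder for your Terminal conversion code
    set pResponse.StringValue=..ConvertXMLToJSON(pRequest.StringValue)
    quit $$$OK
}

Method ConvertXMLToJSON(xml As %String) As %String
{
    // Placeholder: your existing XML-to-JSON conversion goes here
    quit xml
}

XData MessageMap
{
<MapItems>
  <MapItem MessageType="Ens.StringContainer">
    <Method>Convert</Method>
  </MapItem>
</MapItems>
}

}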

Can you share your code that converts XML to JSON from Terminal?

Perhaps you are referring to the Content-Type response header, which is used by many applications to know what format the response is in. In one of my RESTful services that returns a JSON response, I use the following code in my REST handler.

    Set %response.ContentType="application/json"

You can find a list of all of the various MIME types that could be used as the %response.ContentType here:

https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/MIME_types/Complete_list_of_MIME_types
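
In context, a minimal %CSP.REST dispatch class might look something like this (the class name and route are invented for illustration):

Class Community.REST.Handler Extends %CSP.REST
{

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
  <Route Url="/status" Method="GET" Call="GetStatus"/>
</Routes>
}

ClassMethod GetStatus() As %Status
{
    // Tell the caller the payload is JSON
    set %response.ContentType="application/json"
    set json=##class(%DynamicObject).%New()
    set json.status="ok"
    do json.%ToJSON()
    quit $$$OK
}

}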

Hi Kishan

I think it would help to have a little bit more information on your particular use case for FHIR.  

FHIR is the latest standard to be developed under the HL7 organization. Pronounced 'fire', FHIR stands for Fast Healthcare Interoperability Resources. FHIR is a standard for exchanging healthcare information electronically. The FHIR standard is just that, a standard, and as is the case for all standards, it requires implementation. Complete details about the FHIR standard are publicly available at https://www.hl7.org/fhir/

There is no specific functionality built into the InterSystems Caché product to support the FHIR standard, although InterSystems Caché could be used to develop an implementation of it. Such an implementation, done with Caché, would require a significant amount of development.

On the other hand, InterSystems HealthShare does have specific libraries included that can make working with the FHIR standard much easier, but this obviously depends on your exact use case for FHIR. If you could provide additional information as to what you are wanting to do with FHIR, it would make answering your question much easier.

Mack

It seems to me that your real issue is that your database is taking up too much space on disk and you want to shrink it. To do this you really don't need to create a whole new database and namespace. Even on non-VMS systems, before we had the compact and truncate functions, I used to compact databases using GBLOCKCOPY, which is pretty simple:

1. Create a new database to hold the compacted globals.

2. Use GBLOCKCOPY to copy all globals from the large database to the new database you created in step 1. Because routines are stored in the database, they will be copied as well.

3. Shut down (or dismount) the original database and the new compacted database.

4. Replace the huge CACHE.DAT file with the new compacted one.

5. Remount the new compacted database.

Global and routine mappings are stored in the cache.cpf file and are tied to the namespace configuration, not the database configuration. After completing this process, the database will be compacted and global/routine mappings will be preserved.

And of course, you shouldn't do any of this without a good backup of your original database.

I found what I was looking for. I was searching EnsLib.HL7.Message for a way to correlate it to Ens.MessageHeader, when I should have been looking in the reverse direction.

In Ens.MessageHeader there is a field, MessageBodyClassName, which contains the name of the class for the message body; in this case, EnsLib.HL7.Message.

There is also a field in Ens.MessageHeader called MessageBodyId which contains the ID of the corresponding message body.
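
So, for example, you can tie headers to their HL7 message bodies with SQL along these lines (the column choices are just an illustration):

SELECT h.ID, h.MessageBodyId, h.SourceConfigName, h.TargetConfigName
FROM Ens.MessageHeader h
WHERE h.MessageBodyClassName = 'EnsLib.HL7.Message'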

Here's an example of how one might use the FileSet query in the %File class and the Delete class method in %File to purge backup files in a given directory before a given date.

/// Purge backups older than <var>DaysToKeep</var>
/// <var>Directory</var> points to the directory path containing the backups.
/// Only *.cbk files will be purged.
/// Returns the number of files deleted.
ClassMethod PurgeBackups(Directory As %String, DaysToKeep As %Integer = 14) As %Integer
{
	// Calculate the cutoff date; files modified before this date are deleted
	set BeforeThisDate = $zdt($h-DaysToKeep_",0",3)
	set Deleted = 0

	// Gather the list of *.cbk files in the specified directory
	set rs=##class(%ResultSet).%New("%File:FileSet")
	do rs.Execute(Directory,"*.cbk","DateModified")

	// Step through the files in DateModified order
	while rs.Next() {
		set DateModified=rs.Get("DateModified")
		if BeforeThisDate]DateModified {
			// Delete the file
			set Name=rs.Get("Name")
			if ##class(%File).Delete(Name) set Deleted=Deleted+1
		}
		// Stop when we reach files modified on or after the cutoff date
		if DateModified]BeforeThisDate quit
	}
	quit Deleted
}
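
To try it from Terminal (the directory is an example, and the class name is whichever class holds the method):

write ##class(Demo.Utils).PurgeBackups("C:\Backups\",14)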

You could create an SQL stored procedure to return ##class(%Library.Function).HostName(), such as:

Class Utils.Procedures Extends %RegisteredObject
{

ClassMethod hostname() As %String [ SqlProc ]
{
    Quit ##class(%Library.Function).HostName()
}

}

And once that is done, you can use that stored procedure from an SQL query, such as:

SELECT Utils.Procedures_HostName()

which on my system returns

poindextwin10vm

which is the hostname of my Windows system.

Interesting. I took your code and just added some code to the OnProcessInput method so that it takes a string in the body of the HTTP request and echoes that string back as the response. Here is the code, which works fine when tested with my HTTP test tool; that would indicate the problem might be elsewhere.

What settings are you using when adding your service to the production, and what version of Ensemble are you using?

Here's a copy of the working code I tested on 2016.1:

Class Community.Services.MyService Extends Ens.BusinessService
{

/// Set Adapter
Parameter ADAPTER = "EnsLib.HTTP.InboundAdapter";

/// Set this to 0 to prevent normalizing of HTTP header variable names to lowercase
Parameter TOLOWERHEADERVARS = 1;

/// Set this to make page parse form variables from the form body in case of a form POST
Parameter PARSEBODYFORMVARS = 0;

/// Copied from EnsLib.HTTP.Service
Method OnInit() As %Status
{
    If $IsObject(..Adapter) {
        Set ..Adapter.%ToLowerHeaderVars=..#TOLOWERHEADERVARS
        Set ..Adapter.ParseBodyFormVars=..#PARSEBODYFORMVARS
    }
    Quit ##super()
}

/// Same method signature as EnsLib.REST.Service
Method OnProcessInput(pInput As %Library.AbstractStream, Output pOutput As %Stream.Object = {$$$NULLOREF}) As %Status
{
    set pOutput=##class(%GlobalCharacterStream).%New()
    set string=pInput.Read()
    do pOutput.Write(string)
    quit $$$OK
}

}