Apparently this is exactly what I am looking for: intersystems help. The problem is that I would prefer to use the standard file operation rather than create a new custom one.

Writes a string to the file and appends to the string the characters specified in the LineTerminator property. By default, the LineTerminator is a carriage return followed by a line feed (ASCII 13, ASCII 10).

If your operating system requires a different value for the LineTerminator property, set the value in the OnInit() method of the business operation. For example:

 Method OnInit() As %Status
  {
      ; line feed only (UNIX-style line ending)
      Set ..Adapter.LineTerminator=$C(10)
      Quit $$$OK
  }

You can also make the property value dependent on the operating system:

 Set ..Adapter.LineTerminator=$Select($$$isUNIX:$C(10),1:$C(13,10))

Not sure if this will help, but I was having problems connecting to a database in Ensemble. I needed to call a stored procedure, and one of the problems was that one of the input variables was a VARCHAR(MAX).

Have a look in case it can help... Post

Thanks Sean

When creating the class linked to the global, a package name is required... but anyway, I think it worked. So thank you very much for your help.

1) My question is because when I create the global I have to specify the package as well:

Package: XXX

Name: XXX

I am not sure if CacheTemp has to be included in the package name or just the class name, so: XXX.CacheTempGlobal or CacheTempGlobal.

2) About how the global is structured:

    ^XXX.Stored.GlobalD = 3
    ^XXX.Stored.GlobalD(1) = $lb("","Code3","TESTING3","","","FIRCROFT, LONDON ROAD","GREEN","EGHAM","",$c(0)," 0BS","1",$c(0),"A","P","H1","19740401","19910401",$c(0),$c(0),$c(0),$c(0),"0",$c(0),$c(0),$c(0),$c(0),$c(0))
    ^XXX.Stored.GlobalD(2) = $lb("","Code2","TESTING2","","","FIRCROFT, LONDON ROAD","GREEN","EGHAM","",$c(0)," 0BS","1",$c(0),"A","P","H2","19740401","19910401",$c(0),$c(0),$c(0),$c(0),"0",$c(0),$c(0),$c(0),$c(0),$c(0))
    ^XXX.Stored.GlobalD(3) = $lb("","Code10","TESTING10","","","FIRCROFT, LONDON ROAD","GREEN","EGHAM","",$c(0)," 0BS","1",$c(0),"A","P","H3","19740401","19910401",$c(0),$c(0),$c(0),$c(0),"0",$c(0),$c(0),$c(0),$c(0),$c(0))

That's how the global looks. Basically, the search the interfaces run just checks whether a specific field in the message is in global field 14 (H1, H2, H3...) and, if so, translates that code to global field 1 (Code2, Code3, Code10...). But when importing the data any of the other fields can change... so that's why it is better to remove the global completely and upload everything from the file.
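For what it's worth, a minimal sketch of that lookup (assuming the layout in the dump above, where the H-code sits at $List position 16 and the translated code at position 2; tMessageCode is a placeholder for the value taken from the message):

 Set tResult = "", tKey = ""
 For {
     Set tKey = $Order(^XXX.Stored.GlobalD(tKey))
     Quit:tKey=""
     ; field 14 of the record (H1, H2, H3...) is $List position 16
     If $List(^XXX.Stored.GlobalD(tKey),16) = tMessageCode {
         ; field 1 (Code2, Code3, Code10...) is $List position 2
         Set tResult = $List(^XXX.Stored.GlobalD(tKey),2)
         Quit
     }
 }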

The problem with the logs arises when many duplicates are found in the file and the last one needs to be kept in the global... that made every single duplicate upload a new code and remove the existing one... which I think generates a log entry for each action...

WOW...that was quick...thanks Sean

1) Does CacheTemp need to be the name of the class or the name of the global? I mean:

^XXX.Global.CacheTempGlobal OR ^CacheTempGlobal

2) The process I designed reads the file with the updated data and uploads it line by line unless it is a duplicate... so basically every line requires a query against the temp global to check whether that entry already exists or not. Sometimes the new entry needs to replace the value in the temp global, so for that I delete the entry in the temp global and then add the new one. (That part is quite complex and I do not think it is relevant to the global problem.) Once the temp global contains all the data from the file, I start the process to export the global to a file, modify the name of the global in the file to point to the LIVE one, and then update the LIVE global with that file.
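For the delete-then-add step, I mean something like this (a sketch only; ^CacheTempGlobal, tCode and tRecord are placeholder names):

 ; check whether this code is already in the temp global
 If $Data(^CacheTempGlobal(tCode)) {
     Kill ^CacheTempGlobal(tCode)    ; remove the older duplicate
 }
 Set ^CacheTempGlobal(tCode) = tRecord

Strictly speaking the Kill is redundant, since a Set on the same subscript overwrites the stored value anyway.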

I cannot use the merge command, as the file with the updated data includes all the data, not just the new entries... so I would rather delete the LIVE global completely and then import the new global file.

3) In order to avoid so many deletions from the temp global, I was thinking maybe to upload the content of the file into an array or something, remove all the duplicates there, and then upload the final data to the temp global. But I am not sure if that would be a real improvement over uploading/deleting directly in the temp global. See the sketch below.
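Something like this is what I have in mind (assumptions: a comma-delimited file, the dedupe key in the first piece, and placeholder names tStream and ^CacheTempGlobal):

 ; pass 1: dedupe in a local array, where later duplicates simply overwrite
 Kill tDedupe
 While 'tStream.AtEnd {
     Set tLine = tStream.ReadLine()
     Set tCode = $Piece(tLine,",",1)    ; adjust the piece to the real key field
     Set:tCode'="" tDedupe(tCode) = tLine
 }
 ; pass 2: one Set per final entry, no delete/re-add churn
 Set tCode = ""
 For {
     Set tCode = $Order(tDedupe(tCode))
     Quit:tCode=""
     Set ^CacheTempGlobal(tCode) = tDedupe(tCode)
 }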

Thanks

Thanks Robert...very helpful

I noticed that I need to have an ADAPTER configured for that service to work... otherwise the OnInit method will not run. As soon as I had an adapter configured... $$$LOGINFO and the OnInit method worked...
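In case it helps anyone else, this is the kind of declaration I mean; a minimal sketch using the generic Ens.InboundAdapter (the class name is just an example):

 Class XXX.Service.Scheduled Extends Ens.BusinessService
 {

 Parameter ADAPTER = "Ens.InboundAdapter";

 Method OnInit() As %Status
 {
     $$$LOGINFO("OnInit called")
     Quit $$$OK
 }

 /// Called once per CallInterval while the schedule keeps the service running
 Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
 {
     Quit $$$OK
 }

 }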

Thanks

It worked. What I did is the following:

Start & stop with a few seconds' difference, so the service starts but stops immediately:

START:WEEK-*-03T10:45:00,STOP:WEEK-*-03T10:45:10

and use "Call Interval=999999999" so the following run during start/stop times is longer than that time so it will only run once.

Thanks