Hello Rafael,

   I have not tested this extensively, nor do I recall it being done.  However, it does sound very reasonable, and a quick test leads me to believe it would work.  You should test the first few backups by restoring them and running ^VALIDATE on them.  If you need further assistance, please enter a support request at http://support.intersystems.com.
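
   Once a backup is restored, the check itself is just a matter of running the utility from a programmer prompt; a minimal sketch (the utility is interactive and its prompts vary by version, so simply answer them for the restored database):

    DO ^VALIDATE    ; answer the prompts for the restored database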

Sincerely,
   Clark
----------------
Clark C. Matthews                clarkm@intersystems.com
Technical Support Specialist
InterSystems Worldwide Response Center    Voice: +1-617-621-0700
http://www.intersystems.com/ * ftp://ftp.intersystems.com/  * support@intersystems.com

Magnus,

   I am no Ensemble expert; however, I understand that logging what it does is part of what Ensemble does, so I would expect it to log what it is transferring.

   You could write your own class outside of Ensemble using the %Net.FtpSession class.  Open up two connections and copy the stream you read with .Retrieve() to the stream you write with .Store().  Would this work?
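
   Something along these lines might do it; a minimal sketch, where the host names, credentials, and file names are placeholders and real error handling is left out:

    Set src=##class(%Net.FtpSession).%New()
    Set dst=##class(%Net.FtpSession).%New()
    If 'src.Connect("source.example.com","srcuser","srcpass") Write "source connect failed",! Quit
    If 'dst.Connect("target.example.com","dstuser","dstpass") Write "target connect failed",! Quit
    Do src.Binary(),dst.Binary()                       ; transfer in binary mode
    If 'src.Retrieve("inbound.dat",.stream) Write "retrieve failed",! Quit
    Do stream.Rewind()                                 ; copy from the start of the stream
    If 'dst.Store("outbound.dat",stream) Write "store failed",!
    Do src.Logout(),dst.Logout()

   Retrieve() reads the remote file into a stream and Store() writes that stream back out, so you never have to land the file on local disk yourself.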

HTH,
   Clark

Sean,

   Mapping was just the first example that came to my mind, and probably not the best one.  From two different namespaces with different mappings you could end up using different versions of some routines or classes.  I'm not saying it's good practice, just that it can happen.

   In addition to mappings there are a number of other things to consider, like the fact that the target can be changed after compilation.  There are object-code-only routines, and routines where, because something has not been recompiled, the source and object do not match, so which does the compiler use?  What if the target is changed after the caller was compiled and the call is now incorrect?  What if the target does not exist on the development or build machine, and will only work when combined with object code from other modules of a product in a testing environment?

   One likely case where the caller and function are not both available is where hooks are implemented (like ^ZJRNFILT or ^%ZSTART in the kernel): a product calls a method or routine at a point it specifies, if that routine exists, and someone else is then free to write the function so long as it conforms to what is expected.  The module making the call would either be getting compile errors, or it would be compiling against some sort of dummy to eliminate the errors.  In either case the check is useless for the developer of the caller, and the developer of the target is unlikely to recompile after each upgrade unless it is specifically documented that the call has changed.
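
   To make the hook case concrete, here is a sketch of that pattern (the routine name ^%ZMYHOOK is made up):

    ; call the hook only if someone has actually written it;
    ; $TEXT(+0^routine) returns "" when the routine does not exist
    If $TEXT(+0^%ZMYHOOK)'="" Do ^%ZMYHOOK

   The caller compiles cleanly whether or not ^%ZMYHOOK exists, so a compile-time check of that reference would only ever produce noise.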

   At best, in any of these cases you would get numerous "false" errors that are going to make it easier to miss "real" compilation errors.  I am not sure you really want to validate the external calls in the same step as compilation.  Wouldn't you want to do this once the module is complete and connected with any other modules required for the entire package to work?  This way you can be sure that all the inter-object references are correct in the environment being tested.

Clark

Hi,

   The first thought I have is: how, for any reference outside of the routine or class being compiled, could the compiler be sure of what the routine and package mappings are going to be at run time, most likely in a different environment?

   For example, even in a single instance of Cache, it is possible that for one user an argument type, argument, or return type would be correct and for another it would be incorrect.  This is because they are each using different mappings at run time, perhaps running in different namespaces.  You will notice that if you DO a non-existent routine or label in another routine it will compile, however if you call a non-existent label within the current routine you get a <NOLINE> at compile time.
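
   A quick illustration of that difference (routine and label names here are made up):

TESTCHK ; external references compile, local ones are checked
    Do ^NOSUCHRTN    ; compiles cleanly; fails only at run time with <NOROUTINE>
    Do MISSING       ; no such label in this routine: caught when you compile
    Quit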

   Having this only work within the current item seems to me to be of limited value, so I am thinking that this would be a show-stopper for doing it at compile time.  You would need something like a 'lint' that could be run in a full test environment, or even in production.

Clark

Rafael,

   The only times I have ever seen this happen is when there was an error returned from the OS while MSM was trying to access the file.  I would recommend that you simply delete and recreate the file.  If it happens again check all the log files including the OS logs and 'msmlog' for any errors or other indications of an I/O problem.  Beyond that you should log a support call with us at 'support@intersystems.com' if it is a chronic problem, and I can help you get to the bottom of it.

Thank you,
   Clark

Thomas,

   You should get a hang if you try $ZF(-1) from Studio, or from any other connection method (JOB command, Telnet, CSP, etc.) where the process is not directly associated with your Windows session.  The hang occurs because a 'cmd' session is in fact started, but not on your desktop; you can see it in Task Manager, although it is difficult to identify there.  You should be able to execute a command that does not interact with the Windows session user, however.  For example, 'w $ZF(-1,"dir > C:\Temp\Dir.txt")' should return 0 and result in creating the listing in the specified file, assuming you have privileges to do so.
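
   A minimal sketch of that kind of test (the temp path is just a placeholder), checking both the return code and that the output file was actually written:

    Set rc=$ZF(-1,"dir > C:\Temp\Dir.txt")             ; redirected, so nothing needs the desktop
    Write "return code: ",rc,!                         ; 0 indicates cmd.exe ran the command
    Write ##class(%File).Exists("C:\Temp\Dir.txt"),!   ; 1 if the listing was created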

   The fact that you get a -1 back from Terminal (a $IO="|TRM|*" process) is a bit odd; I just tested it and it works for me.  Does your user have the Windows rights required to run 'cmd'?  What happens if you go to '[Windows Start] > [Run]' and enter the command 'cmd'; does that work?  You may need to contact Support to investigate this further.

Clark

Wendy,

   Is the file on a mapped device?  If so, does the session that is issuing the '.RemoveFile()' see the mapping?  You can check this by having the session issue a command like 's X=$ZF(-1,"NET USE > C:\Temp\UseResult-"_$J_".txt")'.  Note that Cache Terminal processes ($I="|TRM|*") get the logged-in user's mappings, while JOBbed processes, those coming in via Telnet, and other TCP/IP-based connections get the mappings of the Cache service.  For example, from Cache Terminal ($I="|TRM|:|13248") I get:

New connections will be remembered.


Status       Local     Remote                    Network

-------------------------------------------------------------------------------
Unavailable  H:        \\cambnfs1\nethome\clarkm Microsoft Windows Network
OK           L:        \\192.168.10.32\Library   Microsoft Windows Network
OK           S:        \\192.168.10.32\Scratch   Microsoft Windows Network
OK           W:        \\supnfs1\supscratch1\clarkm
                                                Microsoft Windows Network
OK           X:        \\refiles\scratch1\clarkm Microsoft Windows Network
Unavailable  Y:        \\refiles\scratch2\clarkm Microsoft Windows Network
Unavailable  Z:        \\refiles\scratch3\clarkm Microsoft Windows Network
Unavailable  LPT2:     \\192.168.10.36\Canon MP530 Series Printer
                                                Microsoft Windows Network
The command completed successfully.

   However from a Telnet session ($I="|TNT|CMATTHEWS6440.iscinternal.com:11992|12936") I see:

New connections will be remembered.

There are no entries in the list.

   This is because the Telnet session does not see the mappings that the logged-in user has.  If you want the service to see them, you will need to add the mappings somewhere using a NET USE command.  In Cache you can put it in the %ZSTART routine, or you can put it in your application code.
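
   A minimal sketch of the %ZSTART approach (the drive letter and share are placeholders; the routine is saved in the %SYS namespace, and depending on your version you may also need to enable it in the instance's startup settings):

%ZSTART ; system start-up hooks
SYSTEM  ; runs once at instance start-up, under the Cache service account
    ; map the share so background, Telnet, and other service processes can see it
    Do $ZF(-1,"NET USE S: \\fileserver\share /PERSISTENT:NO")
    Quit
LOGIN   Quit
JOB     Quit
CALLIN  Quit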

Jonathan,

   I am not sure I understand the question, but I will try.

   The "Generic" driver comes from Microsoft, many other drivers come from the printer manufacturer.  After Caché put the job into the spooler, Windows uses the driver to communicate the job to the printer.  I have simply found through experience that the "Generic" driver works for Caché when the more advanced drivers do not because they do not support text output.  If this is a driver question perhaps it should be directed at the author of the driver, I do not have sources for the many printer drivers out there.

Clark