I'm not 100% sure of the requirement, but if your Caché code writes its output as if it were going to the terminal, you can redirect that output to a file by appending >output.file to the command. I don't think cterm is the way to do that, though, as cterm traps all of the output itself; you would need to use csession for that.

Alternatively, you could just write to a file directly in your Caché code, or run an external command/script from inside the Caché code.
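
A minimal sketch of the first option, writing the output straight to a file from Caché ObjectScript (the path is just a placeholder):

    Set file = "C:\temp\output.txt"              // placeholder path
    Open file:("WNS"):5                          // W=write, N=new file, S=stream mode, 5-second timeout
    If '$TEST { Write "Could not open ",file,!  Quit }
    Use file
    Write "whatever you were writing to the terminal",!
    Close file                                   // output reverts to the principal device

If you prefer an object-based approach, %Stream.FileCharacter (LinkToFile/WriteLine/%Save) does the same job.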

Microsoft used to have an Excel Viewer but it has been retired, though it may still work:
Download the latest online Excel Viewer - Office | Microsoft Docs

If you don't have an MS Office license, I'd suggest LibreOffice:
Home | LibreOffice - Free Office Suite - Based on OpenOffice - Compatible with Microsoft

However, you mention CSV files, and these are not actually Excel spreadsheets. They can be viewed in a text editor if need be, and any spreadsheet software will open a formatted view. There are also CSV-specific viewers, such as NirSoft's:
CSV / Tab delimited file viewer and converter for Windows (nirsoft.net)
 

As far as I recall, this isn't logged at the point the status becomes "dead", only when the process is cleaned up.

We once wrote a scheduled task that looks for and logs these processes using IsGhost(), as they can sit there for a while waiting to be cleaned up.
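
For what it's worth, here is a very rough sketch of that idea, written from memory and unverified; it assumes IsGhost() is exposed on %SYS.ProcessQuery (check the class reference for your version for the exact class and signature) and simply walks the job table via the ^$JOB SSVN:

    Set pid = ""
    For {
        Set pid = $ORDER(^$JOB(pid))  Quit:pid=""
        Set proc = ##class(%SYS.ProcessQuery).%OpenId(pid)
        Continue:proc=""
        If proc.IsGhost() {                      // assumption: method name/location as recalled above
            // log however suits you; WriteToConsoleLog puts a line in messages.log
            Do ##class(%SYS.System).WriteToConsoleLog("Ghost process found, pid "_pid)
        }
    }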

The IRIS documentation mentions that these are logged to the Event Log, but I am not sure what happens if you aren't using Interoperability or Productions in IRIS. I suspect there is no logging in that case, as it is part of Production Monitoring.

As Eduard suggests, you need to know the class that the pInput object is derived from, as it doesn't seem to be a valid stream if .Size fails.

It also looks like you might be expecting a file, since you retrieve the name, so why not use the %File class if the file already exists on disk?

In fact, noting how you get the Filename using the Attributes method, this looks like the pInput object is a response with content, which means the actual stream will be in pInput.Content.
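
A small diagnostic sketch along those lines; the Content property is only a guess based on the Attributes usage, so treat this as a way of finding out what pInput really is rather than as a fix (the file path is a placeholder):

    Write "pInput is an instance of: ", $CLASSNAME(pInput), !

    If pInput.%IsA("%Stream.Object") {
        // it really is a stream, so Size should work directly
        Write "Stream size: ", pInput.Size, !
    }
    ElseIf ##class(%Dictionary.CompiledProperty).%ExistsId($CLASSNAME(pInput)_"||Content") {
        // a wrapper object holding the real stream in a Content property, as suspected above
        Set content = $PROPERTY(pInput, "Content")
        Write "Content stream size: ", content.Size, !
    }

    // and if the file already exists on disk, %File will give you its size directly
    Write "File size on disk: ", ##class(%File).GetFileSize("C:\temp\input.dat"), !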

It is far too long since I worked on hospital systems to give an authoritative answer, given the extra restrictions you may have, but for HTTPS the connection and data are already encrypted in transit.

For outbound traffic, your infrastructure can ensure it is routed out appropriately, and you could even limit the traffic to specific external endpoints.

For inbound traffic, a load balancer or reverse proxy in the DMZ can keep your system away from the outside world, limit the traffic between the two, and allow only specific external endpoints to have access.

To get the remote file size over SFTP, you would use the FileInfo method in %Net.SSH.SFTP.

To get the stream size, you would use the Size property (which calls the SizeGet method) on the stream.
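
A hedged sketch of both, assuming password authentication; the host, credentials and paths are placeholders, and the exact shape of the output argument FileInfo() fills in should be checked against the %Net.SSH.SFTP class reference for your version:

    Set ssh = ##class(%Net.SSH.Session).%New()
    Set sc = ssh.Connect("sftp.example.com")                       // placeholder host
    If sc Set sc = ssh.AuthenticateWithUsername("user", "pass")    // placeholder credentials
    If sc Set sc = ssh.OpenSFTP(.sftp)                             // sftp is a %Net.SSH.SFTP instance
    If sc Set sc = sftp.FileInfo("/remote/data.csv", .info)        // placeholder remote path
    If 'sc { Do $SYSTEM.Status.DisplayError(sc)  Quit }
    ZWrite info                                                    // inspect what FileInfo reports, including the size

    // for a local stream, the Size property gives its current size
    Set stream = ##class(%Stream.FileCharacter).%New()
    Do stream.LinkToFile("C:\temp\data.csv")                       // placeholder local file
    Write "Stream size: ", stream.Size, !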

However, be aware that during the transfer, depending on the OS and the methods used, there may be translation of line terminators, which would affect the file size (often called ASCII mode). This is not very common in SFTP, unlike standard FTP, but it can still occur. Also, I have found in the past that on a non-file stream the size reported is not always the same as the final file size when saved, but that might just be my experience in special cases.

A valid point, but it can depend on how the command string is formed in the first place, unless you write a parser to break a command string down into a command plus arguments. I agree that you may as well use $zf(-1), but as the documentation will point you to $zf(-100), it can be valid.

It's also useful to know, for general knowledge, that you can use brackets in this way.
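
For completeness, a hedged comparison of the two forms; the command, flags and paths are just placeholders, and the $ZF(-100) flag names should be checked against the documentation for your version:

    // single-string form: the whole command line is handed to the OS shell as-is
    // ($ZF(-1) is deprecated in IRIS in favour of $ZF(-100))
    Set rc = $ZF(-1, "dir C:\Temp > C:\Temp\list.txt")

    // $ZF(-100): flags first, then the program, then each argument separately,
    // so there is no combined command string to parse
    // ("/SHELL" because dir is a shell built-in, "/STDOUT=" redirects the output to a file)
    Set rc = $ZF(-100, "/SHELL /STDOUT=C:\Temp\list.txt", "dir", "C:\Temp")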