Hi Randy,

I've been around the block on this problem.

There is no simple native solution. I have tried to create a native COS-to-PDF generator, but it's a much bigger problem than time affords.

I know you are asking for a solution that does not involve creating a web page, BUT, with the exception of going via some other document type (e.g. DOC to PDF) you really have no other easy choice.

However, there is a better way than Zen reports to implement this (IMHO). I use wkhtmltopdf...

https://wkhtmltopdf.org/

It's an executable that takes the name of an HTML file and produces a PDF file. You can also control headers, page numbers etc with it. It's rock solid and really easy to use.

Essentially...

1. Create a simple HTML string with an img tag and save it to a file.

2. Call out to wkhtmltopdf using $ZF, passing in the names of the files

Done.
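
In code, a minimal sketch of those two steps (assuming wkhtmltopdf is on the system path; the method and file handling here are just placeholders):

ClassMethod ImageToPdf(pImageFile As %String, pPdfFile As %String) As %Status
{
    // 1. wrap the image in a minimal HTML page and save it to a temp file
    set htmlFile=##class(%File).TempFilename("html")
    set html=##class(%Stream.FileCharacter).%New()
    do html.LinkToFile(htmlFile)
    do html.Write("<html><body><img src="""_pImageFile_"""></body></html>")
    do html.%Save()
    // 2. shell out to wkhtmltopdf, which returns 0 on success
    set rc=$zf(-1,"wkhtmltopdf "_htmlFile_" "_pPdfFile)
    if rc=0 quit $$$OK
    quit $$$ERROR($$$GeneralError,"wkhtmltopdf failed, return code: "_rc)
}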

If you want to be clever and not save the image to a linked file then you can take its base64 directly from the HL7 message and embed it into the img tag, e.g.

<img src="data:image/png;base64,iVBORw0KGgo...">

I have wkhtmltopdf running at numerous hospitals turning out upwards of 10,000 letters a day, rock solid, no problems.

Hope that helps.

Sean.

You can roll your own export function in a few lines of code and tailor it to your specific needs, something like...

ClassMethod Export(pGlobal, pFile) As %Status
{
    set file=##class(%File).%New(pFile)
    // "WN" = open for writing, creating a new file
    set sc=file.Open("WN")
    quit:$$$ISERR(sc) sc
    // walk the first-level subscripts and write one line per node
    set key=$order(@pGlobal@(""))
    while key'="" {
        do file.WriteLine(@pGlobal@(key))
        set key=$order(@pGlobal@(key))
    }
    do file.Close()
    quit $$$OK
}

and call it like so...

do ##class(Foo.CSVUtil).Export("^foo","C:\Temp\foo.csv")

(it's a 30-second hack so it might need some tweaking; I've also assumed no CSV escaping is needed, since commas are already used in the data.)

Hi Murali,

You're perfectly right.

You can have multiple namespaces on the same Caché instance for different purposes. These should have a naming convention to identify their purpose. That convention is really down to you. I normally postfix the name with -DEV and -TEST, e.g. FOO-DEV & FOO-TEST.

These namespaces will share commonly mapped code from the main library, but unless configured otherwise they are completely independent from each other. You can dev and test in them respectively without fear of polluting one another.

Tip

You can mark the mode of an instance via the management portal > system > configuration > memory and startup. On that configuration page you will see a drop down for "system mode" with the options...

Live System
Test System
Development System
Failover System

The options are mostly inert, but what they will do is paint a box on the top of your management portal screens. If you mark it as live then you will get a red box that you can't miss.

Sean

Hi Richard,

It will be because you are trying to send %Net.MailMessage as an Ensemble message. It does not extend %Persistent, which is the minimum requirement for an Ensemble message.

You will need to create your own custom class that extends Ens.Request and has corresponding properties for each of the MailMessage properties that you need at the other end. The operation will then need to create a new MailMessage from the message class.
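
A rough sketch of the shape (the class and property names here are just examples):

Class My.EmailRequest Extends Ens.Request
{

Property To As %String(MAXLEN = "");

Property Subject As %String(MAXLEN = "");

Property Body As %String(MAXLEN = "");

}

and then in the operation, something like...

Method OnMessage(pRequest As My.EmailRequest, Output pResponse As Ens.Response) As %Status
{
    // rebuild the MailMessage from the persisted request
    set mail=##class(%Net.MailMessage).%New()
    do mail.To.Insert(pRequest.To)
    set mail.Subject=pRequest.Subject
    do mail.TextData.Write(pRequest.Body)
    // assuming the operation uses EnsLib.EMail.OutboundAdapter
    quit ..Adapter.SendMail(mail)
}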

Sean.

Ahhh OK, the Page() method has been overridden so we lose all of the %CSP.Page event methods.

Since there are no event methods, the only thing you can do is override either the Page() method or the DispatchRequest() method. At least the headers are not written to until after the method call.

I guess what you are doing will be as good as it gets. My only worry is if the implementation changes in a later release.

Ideally the class should have an OnBeforeRequest() and an OnAfterRequest() set of methods.

No problem. I know you said you didn't want to write your own custom code, so this is for anyone else landing on the question.

If you use the DTL editor (which I advise even for ardent COS developers), then you will most likely use the helper methods in Ens.Util.FunctionSet that your DTL extends from, e.g. ToUpper, ToLower etc.

Inevitably there will be other common functions that are needed time and time again. A good example would be selecting a patient ID from a repeating field of no known order. This can be achieved with a small amount of DTL logic, but many developers will break out and write some custom code for it. The problem, however, is that I see each developer doing their own thing, and a system ends up with custom DTL logic all over the place, often repeating the same code over and over again.

The answer is to have one class for all of these functions and make that class extend Ens.Rule.FunctionSet. By extending this class, all of its ClassMethods will magically appear in the drop down list of the DTL function wizard. This way all developers across the team, past and future, will visibly see what methods are available to them.

To see this in action, create your own class, something like this...

Class Foo.FunctionSet Extends Ens.Rule.FunctionSet
{

  ClassMethod SegmentExists(pSegment) As %Boolean
  {
      Quit pSegment'=""
  }

}


Then create a new HL7 DTL. Click on any segment on the source and add an "If" action to it. Now in the action tab, click on the condition wizard and select the function drop down box. The SegmentExists function will magically appear in the list. Select it and the wizard will inject the segment value into this function.

Whilst developers may feel the need to type these things out by hand, they will never beat the precision of using these types of building blocks and tools. It also means that you can have data mappers with strong business knowledge, but not so broad programming skills, bashing out these transforms.

I've found Veeam can cause problems in a mirrored environment for Caché.

We invested a great deal of energy trying to get to the bottom of this (with ISC) but never truly fathomed it out.

As you can imagine we tried all sorts of freeze thaw scripts and QOS settings. Whilst we did reduce the frequency we could not completely eliminate the problem.

In the end we came up with this strategy.

1. Run Veeam on the backup member nightly
2. Run Veeam on the primary member only once a week during the day
3. Only use Veeam as a way to restore the OS
4. This would exclude all DAT, Journal and WIJ files
5. Take a nightly dat backup of both members and stash several days of Journal files
6. Ensure the mirror pair are on completely different sets of hardware and ideally locations
7. Configure the DAT and Journal files to be on different LUNs on the VM setup
8. Understand how to physically recover these files should the server crash and not recover

Conclusion for me was that Veeam is a nice-to-have tool in a VM set-up, but it should not be a replacement for the battle-tested Caché backup solution.
 

Hi Andre,

1. Description might be empty whilst NTE might actually exist, so it would be better to do...

request.GetValueAt("NTE(1)")'=""


2. If you are looking to shave off a few characters then take a look at $zdh...

https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY...

$zdh(source.{G62:Date},8)


3. For time there is $zth, but it expects a : in the time (e.g. HH:MM).

https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY...

It doesn't look like you have a colon, so you could use $tr to reformat the time first (there's a worked example after this list). This will save you 10 or so characters...

$zth($tr("Hh:Mm","HhMm",source.{G62:Time}))


4. You can have more than one PID group, but you should not see more than one PID segment in a PID group. Not all message types define (or allow) a repeating PID group. You might see multiple PID groups in an ADT merge message. You might also see bulk records in an ORU message, but in the real world probably not. If you know the implementation only ever sends one PID group then you will see many developers just hard coding its ordinal key to 1, e.g.

request.GetValueAt("PIDgrpgrp(1).ORCgrp(notHardcodedKey)")


Developing with HL7 can feel verbose to begin with. Tbh, what you are currently doing is all perfectly fine and acceptable.

Sean.

You could buffer it up, but you will still have a period of writing to disk where the other collection process could grab it mid-write.

I normally write the file to a temp folder, or use a temp file name, and then rename the file once it's been fully written to, making sure the collection process ignores the temp file extension or the temp folder location.
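
Something along these lines (the file names and data variable are just placeholders):

// write to a temp name first...
set tmpFile=targetFile_".tmp"
set stream=##class(%Stream.FileCharacter).%New()
do stream.LinkToFile(tmpFile)
do stream.Write(data)
do stream.%Save()
// ...then rename in one step once the write is complete
if '##class(%File).Rename(tmpFile,targetFile) {
    // handle the failed rename here
}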

Just a couple of glancing comments.

You are trying to set a parameter. I'm no ZEN expert, but I am pretty sure parameters are immutable in all classes.

The other thing: if I was doing this in CSP, setting the content type in the OnPage method would be too late, as the headers would already have been written. It would have to be set before then. Not sure if ZEN is similar, but I would override OnPreHTTP (or its equivalent) and set %response.ContentType=myCONTENTTYPE in that method.
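
In CSP terms the override would look something like this (the content type is just an example value):

ClassMethod OnPreHTTP() As %Boolean
{
    // the headers have not been written yet at this point
    set %response.ContentType="application/pdf"
    quit 1
}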

This is what I would do.

Create a custom process and extract the value using GetValueAt, then put it into a string container. String containers are handy Ens.Request messages that you can use to move strings around without needing to create a custom Ens.Request class. Then just send it async to an operation that will decode the base64 and write it to a file. Two lines of code, nice and simple...

Class My.DocExtractor Extends Ens.BusinessProcess [ ClassType = persistent ]
{

Method OnRequest(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
{
    Set msg=##class(Ens.StringContainer).%New(pRequest.GetValueAt("OBX(1):5.5"))
    Quit ..SendRequestAsync("FILE OUT",msg,,,"Send DOC as Base64 to a file writer")
}

}


To decode the base64 use this method inside your operation.

set decodedString=##class(%SYSTEM.Encryption).Base64Decode(pRequest.StringValue)
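
For the operation end, a minimal sketch using the file outbound adapter (the class name and output file name are just examples):

Class My.DocFileWriter Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.File.OutboundAdapter";

Method OnMessage(pRequest As Ens.StringContainer, Output pResponse As Ens.Response) As %Status
{
    // decode the base64 payload back into raw bytes
    set decoded=##class(%SYSTEM.Encryption).Base64Decode(pRequest.StringValue)
    // wrap the bytes in a stream and hand them to the file adapter
    set stream=##class(%Stream.GlobalBinary).%New()
    do stream.Write(decoded)
    quit ..Adapter.PutStream("document.pdf",stream)
}

}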


Things to consider...

1. You mention the message is 2.3, but the MSH has a 2.4
2. If you set your inbound service to use either of the default schemas for these two then you will have a problem with required EVN and PV1 segments
3. Therefore you will need to create a custom schema and make these optional.
4. The Base64Decode method is limited to a string, so your PDF documents cannot be greater than ~3.6MB (assuming large string support is on, which it is by default).
5. You probably don't want to decode the document into another message too soon, do this just before writing to a file

T02 to ITK should just be a matter of creating a new transform and dragging the OBX(1):5.5 field onto the reflective target field.

The error message is heavily escaped, it would look like this...

{"Info":{"Error":"ErrorCode":"5001","ErrorMessage":"ERROR #5001: Cannot find Subject Area: 'SampleCube'"} } }

This error is only raised in the %ParseStatement method of the %DeepSee.Query.Parser class.

I'm at the limits of what I know on DeepSee, but if I read this as it looks, there is a missing cube called SampleCube?

Hi Everardo,

There are a couple of extra compilation steps required for web methods.

Each web method requires its own separate message descriptor class. This class contains the arguments of your method as properties of the class, e.g.
 

Property file As %Library.String(MAXLEN = "", XMLIO = "IN");
Property sql As %Library.String(MAXLEN = "", XMLIO = "IN");


This extra class is required to provide a concrete API to your web method. The web service description will project this class as a complex type that the calling service needs to adhere to.

What I think is happening is that when you have an argument called args..., the compiler is trying to compile
 

Property args... As %Library.String(MAXLEN = "", XMLIO = "IN");


Which would fail with an invalid member name error (which correlates with the 5130/5030 error code you have).

I think the main issue here is that there is nothing (to the best of my knowledge) in the SOAP specification that allows for variadic types.

Instead what you want is an argument type that can be projected as a list or an array, e.g.
 

ClassMethod GenerateFileFromSQL(file As %String, sql As %String, delimiter As %String = "", args As %ListOfDataTypes) As %String [ WebMethod ]


That will then be projected in the WSDL as a complex type with an unbounded max occurs, allowing the client to send any number of repeating XML elements for the property args.
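
On the calling side you would then build the list and pass it in, something like this (the generated client class name here is hypothetical):

set args=##class(%Library.ListOfDataTypes).%New()
do args.Insert("firstValue")
do args.Insert("secondValue")
set result=##class(My.SoapClient.WebServiceSoap).GenerateFileFromSQL("C:\Temp\out.txt","SELECT 1","",args)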

If you pass args as %ListOfDataTypes to your non-web method then you will need to decide if that method should have the same formal spec, or overload it, something like...
 

if $IsObject(args(1)),args(1).%IsA("%Library.ListOfDataTypes") {
  set list=args(1)
  for i=1:1:list.Count() {
    write !,list.GetAt(i)
  }
} else {
  for i=1:1:args {
      write !,args(i)
  }
}


Sean.

Hi Scott,

The %Stream package superseded the stream classes in the %Library package. If you look at the class documentation you will see in the descriptions that the %Library stream classes have been deprecated in favour of the %Stream variants. The only reason they still exist would be for legacy implementations.

The other difference is that one is a character stream and the other is a binary stream. As a general rule you should only write text to the character stream and non-text (e.g. images) to the binary stream. The main reason for this is to do with Unicode characters. You may not have seen issues writing text to %FileBinaryStream, but that might well be because your text didn't have any Unicode conversions going on.
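
As a quick illustration of the split (the file names and data variable are just examples):

// character stream for text - character set translation is handled for you
set txt=##class(%Stream.FileCharacter).%New()
do txt.LinkToFile("C:\Temp\notes.txt")
do txt.Write("Some text, Unicode and all: é")
do txt.%Save()

// binary stream for raw bytes - no translation applied
set bin=##class(%Stream.FileBinary).%New()
do bin.LinkToFile("C:\Temp\image.png")
do bin.Write(rawBytes)  // rawBytes is a placeholder for your binary data
do bin.%Save()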

Performance wise I'm not sure there would be much in it between the two. You can access the source code of both and they both use the same underlying raw code for reading and writing to files. If you benchmarked them then I guess you would see a marginal difference, but not enough to question which one to use for best performance.

I wonder how you determined that the logIt code was the reason for messages slowing down? On the surface it should only have a small impact on message throughput. If messages are queueing up then it almost feels like this is just the first symptom of a wider performance issue. I guess you have monitored overall IO performance; if it's already under strain then this could be the straw that breaks the camel's back.

On a curious note, whilst you might have needed to log messages in eGate, I wonder why this would be necessary in Ensemble. Unless you are using in-memory messaging, all of your messages will be automatically logged internally, as well as being written to the transaction (journal) logs. By adding your own logging you are effectively writing the same message to disk not twice but three times. If you also have IO logging enabled on your operation then it will be four times, not to mention how many times the message was logged before reaching the operation. On top of that, if you have log trace events enabled in production then the IO overhead for just one message is going to thrash the disks more than it needs to. Multiply that across your production(s) and how well IO is (or is not) spread over disks, and it is easy to see how a peak flow of messages can start to queue.

Another reason I see for messages queuing (due to IO thrashing) is poor indexes elsewhere in the production. A data store that worked fast in development will now be so large that even simple lookups will hog the disks and flush out the memory cache, putting an exponential strain on everything else. Suddenly a simple bespoke logger feels like it's writing at the speed of a ZX Spectrum to a tape recorder.

Of course you may well have a highly tuned system and production, and all of this is rambling spam from me. In which case, nine times out of ten, if I see messages queuing it's just because the downstream system can't process messages as quickly as Ensemble can send them.

Sean.

Hi Greg,

The only zip utility that I have come across is in Healthshare (core 10+).

If you have Healthshare then take a look at...

HS.Util.Zip.Adapter


If you don't have Healthshare then it's still easy enough to do via the command line with $zf...

https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=RCOS_fzf-1

First, if you are on Windows then there is no built-in command-line zip tool outside of PowerShell, so you will need to install 7zip (btw, Healthshare defaults to 7zip on Windows as well). If you are on Linux then there is a built-in zip command, but you might also choose to install 7zip.

Couple of trip hazards.

If you are building the command line on Windows then 7zip will be installed in "Program Files", with a space, so you will need to wrap quotes around the exe path, which will need double quoting in a Caché string.

If you are unzipping to a directory, the directory needs to exist first. Take a look at CreateDirectoryChain on the %File class to make this easier to do.

A simple untested example...

ClassMethod ZipFile(pSourceFile As %String, pTargetFile As %String) As %Status
{
    // the doubled quotes keep the space in "Program Files" intact
    set cmd="""C:\Program Files\7-Zip\7z.exe"" a "_pTargetFile_" "_pSourceFile
    // $zf(-1) returns 0 on success
    set status=$zf(-1,cmd)
    if status=0 quit $$$OK
    quit $$$ERROR($$$GeneralError,"Failed to zip, reason code: "_status)
}
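
And if you need the unzip direction (same caveats, equally untested)...

ClassMethod UnzipFile(pSourceFile As %String, pTargetDir As %String) As %Status
{
    // the target directory must exist before 7zip will extract into it
    if '##class(%File).CreateDirectoryChain(pTargetDir) {
        quit $$$ERROR($$$GeneralError,"Could not create directory: "_pTargetDir)
    }
    set cmd="""C:\Program Files\7-Zip\7z.exe"" x "_pSourceFile_" -o"_pTargetDir_" -y"
    set status=$zf(-1,cmd)
    if status=0 quit $$$OK
    quit $$$ERROR($$$GeneralError,"Failed to unzip, reason code: "_status)
}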


For anyone landing here who is happy just to use gzip, there was a recent discussion here...

https://community.intersystems.com/post/there-option-export-globals-archive

Hope that helps.

Sean.