Mike.W · Apr 23, 2021

The reason that there is no sort "function" is that sorting is part of the basic storage structure that the whole system is built on. Arrays, both global and local, are automatically sorted as you set them up (effectively by an insertion sort). There is no need for a separate program or request: just set up the data in an array as you go along and it will be sorted when you need it. This has always been a basic idea in the language, right from the original MUMPS, so the documentation may well skip over it a bit. However, once you get used to it, it works well.
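For example (a minimal sketch with made-up data): set the entries in any order, then read them back with $ORDER and they come out sorted:

s list("Smith")="",list("Jones")="",list("Adams")=""
s name="" f  s name=$O(list(name)) q:name=""  w name,!

This prints Adams, Jones, Smith with no explicit sort step.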

Mike.W · Nov 19, 2020

We still have green-screen parts of the app, so we're interested in the answer. And is there a way to tie the Windows command app into a running Caché session? (I know very little about Windows.) Thanks.

Mike.W · Mar 13, 2020

Hi,

I don't want to achieve anything else; it's just that, "Caché being Caché", there's often another way to do the same thing, and it might be easier. :-)

Mike

Mike.W · Mar 12, 2020

Hi,

I recently needed a temporary class (not just a table) to store data while it was manipulated (imported, queried, modified, etc., and eventually exported). I eventually set up all the storage locations with PPG (process-private global) refs as noted above, which works, but did wonder if there was a class parameter I had missed that just indicated the extent was temporary. Or maybe an alternative to extending %Persistent?
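For reference, a minimal sketch of the kind of setup meant here (the class and global names are made up, and the Caché-era default storage is assumed): a %Persistent class whose storage locations all point at process-private globals, so the extent vanishes when the process ends:

Class Temp.WorkData Extends %Persistent
{
Property Name As %String;

Storage Default
{
<Data name="WorkDataDefaultData">
<Value name="1"><Value>%%CLASSNAME</Value></Value>
<Value name="2"><Value>Name</Value></Value>
</Data>
<DataLocation>^||Temp.WorkDataD</DataLocation>
<DefaultData>WorkDataDefaultData</DefaultData>
<IdLocation>^||Temp.WorkDataD</IdLocation>
<IndexLocation>^||Temp.WorkDataI</IndexLocation>
<StreamLocation>^||Temp.WorkDataS</StreamLocation>
<Type>%Library.CacheStorage</Type>
</Storage>
}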

Mike

Mike.W · Mar 12, 2020

Hi Stuart,

As others have said, it's best practice to build a new variable rather than amending what you have (the fancy name for the rule is "immutability", or maybe just "functional programming").

Looking at the string, I assume it is HealthShare HL7 message format, so the contents are pretty limited. In which case maybe a shortcut could be used:

s out=$Replace($Replace(list,"~]",""),"~[","")  ; remove the "~]" and "~[" markers

Here's another alternative:

w $ZSTRIP($ZSTRIP(list,"*","[]"),"=>P")  ; remove all "[" and "]", then compress and trim trailing punctuation

I admit it could go horribly wrong with multiple nesting (I don't remember all the possible formats), so it needs some testing.

Hope you're keeping well,

Mike

Mike.W · Oct 25, 2019

Sadly, in my team we've all been writing MUMPS for so long that the abbreviated style comes naturally and is a hard habit to break. Yes, expanded is better for new starters in the language.

However... Playing devil's advocate you could say that abbreviated commands are:

1. Faster to type (as you said).

2. More compact, allowing the reader to "see" more of the structure in one go.

You see, you can expand things out too much, in my opinion. Also, it only takes a few minutes for a (reasonable) programmer to get that "S" means "set", "I" means "if", etc. Commands always appear in the same part of the code (unlike some languages), there are not that many to learn, and once you know them, you can read them! So why bother with extra letters? After all "set" is itself only a token for "put the value on the right of the = into the variable on the left" or something like that. It could be "make" or "update" or "<-" (look up the programming language "APL" on Wikipedia if you want a real scare).

I think the main problem with "old fashioned" code is usually poor label/variable names, squeezing too much on one line, and lack of indenting. It's hard to read mainly because of the other parts of the code, not because the commands are single characters. Some things should be longer to better convey what they are for (though not as long as COBOL), and more lines can help convey program structure.

While I'm here, I'm not that keen on spurious spaces in "set x = 1", as opposed to "set x=1". It just spreads out the important stuff - spaces are there to split out the commands. :-)

Mike

Mike.W · Oct 4, 2019

Hi. It depends on what you mean by "certain criteria". If it's a special file name then you could amend the FileSpec property to skip the ones you don't want yet. If it's in the content, then maybe you should be reading in the file (creating a copy or allowing archive so the original continues to exist) and sending it as a message into Ensemble that can then be held up in a business process until it is ready to send out to an Operation that creates an output file. That is the way Ensemble is supposed to work, so you get a full record of what happened, etc.

(Otherwise, I'm pretty certain that there are actions that reset the list of processed files - maybe resetting that file path or restarting the job - but I cannot find the documentation about it at the moment.)

Mike.W · Jun 28, 2019

Hi.

Recently, well yesterday, I needed to do exactly the same, and on a class property as well! I found an answer more by accident than design:

s data=$LB(1,2,3)
s data=$LI(data,1,*-1)  ; keep everything except the last element
zw data
data=$lb(1,2)

When I saw the other answer, I worried that this might not work when the result is only one item, but it does, as confirmed by the documentation for the $LIST function. If you supply all three parameters - list, position, end - then it always returns another list. I was pleasantly surprised!
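For example, trimming a two-element list still returns a list rather than a bare value:

s data=$LB(1,2)
s data=$LI(data,1,*-1)
zw data
data=$lb(1)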

Mike

Mike.W · May 3, 2019

Hi. It sounds like a good idea! I can think of a number of interfaces I've seen where the target application - a small local system - struggled to keep up with the flow of updates from a large PAS.

The only thing I've done like it was complicated, and had to use a proper Business Process. In that case the "department" was neonatal, so we were only interested in patients admitted to a particular ward. The solution looked for HL7 admissions and transfers to that ward, and when found, used the data to create a local record in Caché. Then all other types of message could be checked against those records to see if they needed further processing and passing on (to "BadgerNet" eventually, when a full episode was built up). Of course this only works if you can define a clear "starting point" that can be spotted in the message stream. / Mike

Mike.W · May 3, 2019

Hi. The "clean code" people would recommend just one parameter max, and better would be none! But I think that's going a bit far, and agree that 3, or maybe 4, maximum should be aimed for to keep things easy to understand when reading, though there may be exceptions.

The array passing is a good idea to reduce the number for normal routines, but I think the ideal for classes is using objects. If you are truly embracing objects, and self-documenting code, then new classes are usually needed and what used to be parameters become the setting of properties, like this:

s table=##class(CMT.UI.Table).%New()
s table.TopLine=10,table.BottomLine=21
s table.HeaderFormat="Underline"
d table.DefineQuery("CMT.UI.PatchSite:MyList")
d table.AddColumn(3,,"BOLD")
...etc.

d table.Display()

It works well in some cases, but I have to admit that there is a tendency for the number of classes to get a bit silly if you take it to the extreme. Sometimes simple code is best. :-) / Mike

Mike.W · Oct 25, 2018

I think that is my preferred method as well, but it depends to some extent on what you are going to do with the result, and what you want to happen if the input number is too big. This $J solution will always return all the characters input, which may be the safest thing. (Though any space characters inside the input will get converted to zeros.)
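(For reference, the $J idiom under discussion is presumably along these lines, padding to a hypothetical width of 4:)

w $TR($J(number,4)," ","0")  ; right-justify to width 4, then turn the pad spaces (and any internal ones) into zeros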

When the fail mode needs to still return the same length string, e.g. to avoid messing up some fixed length message format, it might be best to use $E(), e.g.

W $E("0000"_number,$L(a)+1,*)

I've also seen the following used, but I'm not sure I recommend it. So many interesting ways it could go wrong!

w $E(number+10000,2,999)  ; adding 10000 forces a leading "1" that $E then drops - goes wrong for values of 10000 or more, negatives, or decimals

Mike.W · Oct 18, 2018

Hi. I was actually looking up some information about pattern matching, but came across this warning:

If a call attempts to use indirection to get or set the value of object properties, it may result in an error. Do not use calls of this kind, as they attempt to bypass property accessor methods (<PropertyName>Get and <PropertyName>Set). Instead, use the $CLASSMETHOD, $METHOD, and $PROPERTY functions, which are designed for this purpose.

This was from https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=GCOS_operators#GCOS_operators_pattern

So it looks like it may well work now, but there's no guarantee it will always work.

Surely there is a $method() call you can use to get the next item in the array?
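For reference, a minimal sketch of those dynamic-access functions (the object and names here are made up):

s value=$PROPERTY(obj,propName)         ; instead of indirection like @("obj."_propName)
d $METHOD(obj,methodName)               ; call an instance method by name
s new=$CLASSMETHOD(className,"%New")    ; call a class method by name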

Mike.W · Jul 27, 2018

Hi, I also help support an NHS trust using Ensemble, and it also has ever-growing PDF files in messages. We have our incoming PDFs as external file streams and it helps, though you have to bear in mind that the files are not going to be part of the Caché backup for Disaster Recovery, etc. (Not sure about mirroring. I'd assume they don't get mirrored either, as the contents are not in the journal.)

As yet, we don't have as big a problem as you - fewer messages, and we only keep 92 days - but that is just as well, as the PDF files are converted to base64 and embedded in HL7 v2 messages, so they then do take up space in the database, and the journal, and the backup, which has resulted in the need to expand the disk space recently. I can recommend keeping Ensemble on a virtual server with disk expansion on demand.

I tend to think the problem is not going to go away whatever you do. I assume, like us, the PDFs come from 3rd party applications, and they are always going to be producing ever more and prettier documents as time goes by. So I recommend looking at more disk. :-) / Mike

Mike.W · Jun 15, 2018

Hi,

I won't claim this is an answer, because it's not quite the same and people may object to the structure, but here is one solution that is used quite a lot in code I look after. Basically, a subroutine is called and then tests are done and a Quit is used to drop out when a match is found. Often used for validation, something like this that returns a result in the zER variable:

V1 ; Validate ORGC
   S ORG=zORG D WC2^hZUTV I zER'="" Q
   I IPACC<9,'$D(^hIW(WAID)) S zER="No details set up for ward" Q
   D WARDON^hILO1 I LOCK S zER="Ward in use" Q

VQ2 Q

Apologies for the old-fashioned code! However, you can see each test can be quite complex and use lots of variables, but it is easy to understand as long as you expect the structure to work that way.

This is very similar to the "clean code" solution of making the whole thing into a function that returns a value:

ClassMethod Main(val1 As %String, val2 As %String)
{
  write ..MyOutput(val1,val2)
}

ClassMethod MyOutput(val1 As %String, val2 As %String) As %String
{
  if val1 = 1 return "case 1"
  if val1 = 2, val2="*" return "case 2"
  return "default match"
}

Of course there are probably as many answers as there are Cache programmers!  :-)

Mike.W · Apr 20, 2018

Yes, we have something like that, except we use the letter "q" as the prefix, and we follow it with the programmer's initials so that it becomes a "personal" set that is left alone in all namespaces, dev-test and live. We also extend this rule to globals and things inside the application like functions, screens, tasks, etc. The in-house configuration management system we use ignores them, so they are left untouched. It's a useful convention.

(We might have used "z" like you, but it was already taken for "utility/library" stuff.)

Mike.W · Apr 19, 2018

Neither - I think it best to just remove all the code and leave a stub with just a comment (usually with the change request id and reason).

That ensures the unused code is removed from all downstream libraries, so does not pop up in searches and testing, and yet keeps a record of its previous existence. I've always been against leaving unused code in source files, even if it's "commented out".

Mike.W · Feb 21, 2018

Anyone know why we ended up with this strange behaviour? Why doesn't COS store 2 and "2" in the same way in lists? The rest of the programming environment is based around them being the same - (2="2") - as everything is a string until used otherwise. It may use "an optimized binary representation", but surely that's not really an excuse. Just curious.
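To illustrate the behaviour in question:

s a=$LB(2),b=$LB("2")
w a=b,!           ; 0 - the encoded lists differ
w $LI(a)=$LI(b)   ; 1 - but the extracted values compare equal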

Mike.W · Feb 21, 2018

An alternative solution that works for us is to use the "Schedule" setting to run it for 30 minutes (to allow some leeway, as the job takes a while), and then set the "Call Interval" setting to something very large like "999999". This is for an inbound SQL adapter. (If something goes wrong with this overnight run then we manually remove the "Schedule" setting and restart the Service. Once complete, we put back the setting ready for the next night.)
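As a sketch (assuming a hypothetical 01:00 start time), the two settings would look something like:

Schedule = START:*-*-*T01:00:00,STOP:*-*-*T01:30:00
Call Interval = 999999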

Mike.W · Jan 12, 2018

Hi.

I'm not sure what your "persistent objects" contain, but if the repeating data is in the form of strings or streams then perhaps you could put them into an XPATH document object (%XML.XPATH.Document) and use the evaluator in that? I support a system that has a message with a transient property to hold the document created, so it's only done once (per processing), and a method that builds that property if needed and then calls the EvaluateExpression method in that property for an expression supplied as a parameter. This is used in transformations to extract data to post into HL7 v2 messages, so the same calls should work in rules as well.

Of course the XPATH expressions may be no easier to define than your exporting and importing methods. I certainly struggled with them and in the end had to build lots of special inputs that added the huge long list of nested items that usually went at the front of each expression, resulting in calls like source.Pull("Baby","ep2:id/@extension") where "Baby" added a prefix of over 160 characters to find the baby section of the Mother's full document before tunneling down to the hospital number. If you already have a sub-section of the full document then you may not need as much.
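For reference, a minimal sketch of that kind of call (the names and the expression here are made up):

s tSC=##class(%XML.XPATH.Document).CreateFromString(xmlString,.tDoc)
i $$$ISERR(tSC) q tSC  ; error handling inside a method body
s tSC=tDoc.EvaluateExpression("/Patient","id/@extension",.tResults)

tResults comes back as a list of %XML.XPATH.Result objects from which the matched values can be read.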

Mike

Mike.W · Jan 4, 2018

Hi. We had a site upgrade from 2012.2.5 to 2017.1.0 last year, and it included a mirror. We had very few code changes needed - just an issue with it failing to save objects inside a Business Process where we had used some "unusual" structures. The upgrade itself went smoothly. The only issue was afterwards, when the next backup was a "full" one instead of the scheduled "partial", using more space than expected. Our Production was much smaller than yours, with only about 120 items, and it is hard to say how much effort went into pre-release testing as it was "fitted in" around other work by a team of people. Maybe a couple of man-months?

To be honest, it all depends on how much custom or unusual code you have, and how much testing the customer wants. We upgraded a development namespace and re-ran test messages through all the important paths, comparing the results before and after the upgrade. Plus some connection testing to cover all the "types" we used: FTP, web service, HL7, etc. In our case the testers included people from the user side, so they could decide when they were happy with it.

InterSystems were very helpful. We raised a call a few months before and they gave advice on testing and desk checked our detailed plan of the upgrade itself, including how to do the mirror.

Good luck.

Mike.W · Dec 20, 2017

Hi - we may be overcomplicating the solution here. Rather than comparing the age for every record, all you need is to work out a single cut-off date to compare against the DOB. This was given in Jill's answer above, but another variation that might be clearer is:

WHERE DOB < TO_DATE((TO_CHAR(CURRENT_DATE,'YYYY')-12)||'0101','YYYYMMDD')

Also, the usefulness of an index will depend on the ratio of under-13 to over-13 records. If the vast majority are to be included, then use of an index may slow access down as the system flips back and forth over the main global, whereas a straight run without an index could be quicker (hopefully the compiler would work this out for you).

Regards,

Mike.W · Dec 8, 2017

I won't embarrass myself by listing the MUMPS code from 1991 that does this in our application, but I will comment that you need to work out how many birthdays have gone by, so it must compare month and day values once the basic year subtraction has been done. It gets quite complicated. (You might also like to look at whether you need more than just "years old", and whether you also need months or days for very low values.)
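A minimal sketch of that logic, assuming dates held as YYYYMMDD integers (the $ZDATE($H,8) form):

AGE(dob,now) ; whole years old, both dates as YYYYMMDD
 N age
 S age=$E(now,1,4)-$E(dob,1,4)
 I $E(now,5,8)<$E(dob,5,8) S age=age-1  ; this year's birthday not yet reached
 Q age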

Mike

Mike.W · Dec 8, 2017

I have used the %XML.Writer class to create a document, but only for a fairly simple one that was destined for a SOAP outbound call. The SDA is tricky, so I would imagine using HealthShare (Ensemble) would be much easier. (I have used Ensemble classes like EnsLib.EDI.XML.Document that can be added to a message, etc. and used as the target for transformations once you have an appropriate document definition loaded in. That reduces the coding required, though not entirely, as repeating groups are an issue.)
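For what it's worth, a minimal %XML.Writer sketch for a simple document (the element names are made up):

s writer=##class(%XML.Writer).%New()
s tSC=writer.OutputToString()
s tSC=writer.RootElement("Patient")
s tSC=writer.Element("Id")
s tSC=writer.WriteChars("123")
s tSC=writer.EndElement()
s tSC=writer.EndRootElement()
s xml=writer.GetXMLString()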

Mike

Mike.W · Dec 8, 2017

I'm just thinking that maybe we need more information as to why this is needed before recommending anything. If the network connection is good enough for mirroring, then why not just map the classes to a central repository? Perhaps all that is needed are security settings to prevent updates from the "slave" systems. Perhaps there is no network connection, in which case mirroring or shadowing is not possible, and what is needed is a good way to automate export/import to OS files.

Mike

Mike.W · Nov 23, 2017

I may have misunderstood your requirement, but you may not need a Business Rule at all. To pull data from the request into the context I've actually used Transforms.

To get this to work, I created a new class that Extends (%SerialObject, %XML.Adaptor), and defined in it the properties that I need to store. I could then define a Transformation from the incoming message type to this new one, pulling everything I needed. In the BPL I then added a property called "TempStore" of that type to the Context object. To pull the data I added a Transform Activity with a Source of "request" and a Target of "context.TempStore" using the Transform.

Later Activity boxes could then use the fields with references like "context.TempStore.priorMRN" to do tests, etc. I've also used the same trick to update outgoing messages with data from the context (using Create = existing in the Transform).
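A minimal sketch of that serial class (the class name and properties, apart from priorMRN, are made up):

Class App.Msg.TempStore Extends (%SerialObject, %XML.Adaptor)
{
Property priorMRN As %String;
Property admitWard As %String;
}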

I hope this is useful.

Regards,

Mike

Mike.W · Nov 17, 2017

If you open the class in Studio and view the "Inspector" pane, then pick "Property" from the first drop-down and your property in the other (or double-click it), it shows a very long list of possible things you can add. It may even show all of them, I don't know. You can then click on the values to enter them, sometimes getting drop-down lists of possible values.

I've often used the inspector as a way of finding out what might be available, and then searching any keywords in the Caché documentation to confirm how to use it.

Regards,

Mike

Mike.W · Nov 1, 2017

Amir's answer with option 2 is what we did. The XML we sent had to be converted to allow it to be sent, so our code looked a bit like this:

Method ImportEpisode(pRequest As EnsLib.EDI.XML.Document, Output pResponse As Ens.Response) As %Status
{
  ; Use a fixed output format regardless of the Caché default (else Base64Encode gives an ILLEGAL VALUE error)
  Set sendingXML = pRequest.OutputToString("C(utf-8)",.tSC)
  If $$$ISERR(tSC) Quit tSC
  $$$TRACE("Sending: "_sendingXML)
  Set sending = $system.Encryption.Base64Encode(sendingXML)
  Set tSC = ..Adapter.InvokeMethod("ImportEpisode",.result,sending,{plus some other id parameters})
  If $$$ISERR(tSC) Quit tSC
  Set resultXML = $system.Encryption.Base64Decode(result)

...etc.

I hope this is useful to you.

Mike

Mike.W · Aug 8, 2017

Many guides to "good programming" (in any language) would advise that the return from a function/method should be used for "real" data only, and any "exception" situations should be flagged as an error. While I'm not convinced this is always the best way, I can see the advantages. Code with repeated tests of returned status values can be messy and hard to read, and if the only thing it can do when the status is a fail is to quit out again with a status of "failed", then there is not a lot to be gained.
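The messiness in question looks something like this (a hypothetical sketch with made-up method names):

s tSC=..StepOne() q:$$$ISERR(tSC) tSC
s tSC=..StepTwo() q:$$$ISERR(tSC) tSC
s tSC=..StepThree() q:$$$ISERR(tSC) tSC

Each call can only pass the failure straight back up, which adds little over letting an exception propagate.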

Mike

Mike.W · Jul 26, 2017

I had a similar situation and ended up with an Ensemble Service reading in the metadata file (like your xml), and composing an Ensemble message with that information, including a file reference for the data file (your pdf). This meant that the metadata file could be automatically archived by Ensemble, but now I had to archive the data file instead, using calls to the OS like you have done for your xml file above.

In my case this did make some sense, as I wanted to convert the data file using an OS call to an "exe", and at least the messages in Ensemble had all the meta information, file name, etc. But I also think it was a bit clumsy, so I would be interested in any better ideas.

Regards,

Mike