The two-argument form of the $LENGTH function gives you the number of delimited pieces in a string, and the four-argument form of $PIECE can grab a range of pieces, so for your example:

S DELIM="D"
S STRING="aaDtext1Dtext2"
W $PIECE(STRING,DELIM,1,($LENGTH(STRING,DELIM)-1))

Edit: Links to documentation for the $PIECE and $LENGTH functions:

$PIECE: https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=RCOS_fpiece

$LENGTH: https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=RCOS_flength

Hope this helps!

If you're asking what to do to only have one double-quote in the result, you could use this:

s b="""Cat"
w b

Basically, wherever you want a double-quote in the output string, put two double-quotes together in the string. So, if you wanted the string to say: Cat"so"only"one you would use this:

s b="Cat""so""only""one"

When I write code that needs to output comma separated value files, I often create variables to hold both the double-quote and comma characters, to me the resulting code is much easier to read. For example:

S Q="""",C=",",QCQ=Q_C_Q ; Sets Q to " char, C to comma, and QCQ to ","
; so when I need to output some strings, I just use the variables:
W Q_"string1"_QCQ_"string2"_QCQ_"string3"_Q,!

To me, it's easier to read than a whole bunch of embedded double-quotes all strung together.

Hope this helps!

There are provisions in the $ZDATEH function to cover the "two-digit-year" century issue. If you need all two-digit years to equate to 20xx, try this:

$ZDATEH("01/01/23",,,3,58074,94598)

As in, set the 'yearopt' parameter to 3, then set the startwin & endwin dates (in $H format) to the beginning and end of the window used to interpret two-digit years.

If you need a specific 100-year range - for this example, to evaluate two-digit years between 01/01/1950 and 12/31/2049 - you'll need the $H values of those dates:

W $ZDATEH("1/1/1950")
39812
W $ZDATEH("12/31/2049")
76336

Then use those $H values in the window of the $ZDATEH command:

$ZDATEH("01/01/23",,,3,39812,76336)

Here's more documentation on the $ZDATEH command:

https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=RCOS_fzdateh

Hope this helps!

One option would just be a straight telnet session (swap 10.10.10.10 with the IP address of your system):

telnet 10.10.10.10 1972

If the port is closed, you should get the error "telnet: Unable to connect to remote host: Connection refused" - but if it succeeds you'll see "Connected to 10.10.10.10", so you'll know it's open. To exit, press <CTRL>] for a telnet prompt, then type 'quit'.

Hope this helps!

Robert,

I'm not sure if this helps your particular situation, but if it's "OK" to manually tell your container what the local hostname of the host is, then you could try this:

set ip=$SYSTEM.INetInfo.HostNameToAddr("**local_name_NotFQDN**")

with just the local hostname (not the FQDN) of the host machine in quotes... and that should give you the IP address of the active ethernet/wireless adapter. I tried this in a container running IRIS on a Raspberry Pi and the local hostname is "Iris2" (so, it is running Linux, I don't have any container systems running on Windows... sorry!) and this is what I got:

USER> Set ip=$SYSTEM.INetInfo.HostNameToAddr("Iris2")
USER> zw ip
ip="192.168.1.236"

On my network, 172.17.0.x is the internal Docker bridge, 192.168.1.x is my wireless network, and 10.1.2.x is my desktop wired network. (I have many servers, printers & whatnot, so I have multiple VLANs on my home network.)

Now... I'm not sure if this is good or bad for your situation, but in my example, if I were to shut down the container, disable the wireless, hook up an ethernet cable to the network and restart everything, the listed IP from this command would change from a 192.168.1.x to a 10.1.2.x address. This could be good if you wanted to know how the main machine was externally connected; or it could be bad if you're using awk/grep/findstr on logs looking for a particular IP. As I said, I'm unsure of your actual use case, so if this has to be portable across several containers and several machines unchanged, this may not help you, as you'll have to change the machine name in your code manually.

Hope this helps!

If you have multiple subscript levels, this may help:

SET I=0,G="^test("""")" FOR {SET G=$QUERY(@G) Q:G=""  SET I=I+1 WRITE G,!} WRITE "Total: ",I,!

Here's the data:

set ^test(1)="aa"
set ^test(1,1,1)="aa"
set ^test(2)="aa"
set ^test(2,1)="aa"
set ^test(2,2,1)="aa"
set ^test(3,1)="aa"
set ^test(4,1)="aa"

And here's the output:

^test(1)
^test(1,1,1)
^test(2)
^test(2,1)
^test(2,2,1)
^test(3,1)
^test(4,1)
Total: 7

If you only want the total (especially if the global is much larger), omit the 'WRITE G,!' portion of the line of code above.
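For example, a count-only version of that same line might look like this (same $QUERY loop, just without the per-node WRITE - a sketch, untested against your data):

```
SET I=0,G="^test("""")" FOR {SET G=$QUERY(@G) Q:G=""  SET I=I+1} WRITE "Total: ",I,!
```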

Hope this helps!

I'm not sure I understand the class examples you have listed, but there's a possibility that you may not need to do this in a class.

Ensemble / HealthShare has a couple different ways that you can send a process output to multiple operations. The first is if you're not using any form of data translation, you can send the output right from the BP configuration screen:

[screenshot: BP configuration screen with two Response Target Config Names selected]

Under the Response Target Config Names, I've selected two different targets here, and you can see that the connectivity verifies this - but I'm sending the output to more than two targets! How can that be? Simple: you can also select different targets in the Ensemble Rule Editor - this can be handy if you want to apply two different DTL transformations to two different targets. My example is super-simple (as in I'm not applying different rules or DTLs per target), but with multiple Send directives we can specify multiple targets:

[screenshot: Ensemble Rule Editor ruleset with multiple Send directives]

You can have different rules with different constraints going to different operations - but I just added a send directive to a sample ruleset to show how to configure multiple targets - and as you can see between the two screenshots how Ensemble is sending the data to 4 different targets in two different ways.

Hope this helps!

Well, everyone has coding styles and ObjectScript offers several different styles - I could have made this prettier (to me, anyway) as I'm more accustomed to the single-letter-command and dot-loop styles... but I tried to keep this in your coding style.

My code isn't pretty - I focused more on making it (barely) functional and demonstrative of the $DATA function, which tells you whether there are any further subscripts available under a global node - documentation page is here:

https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=RCOS_FDATA

Anyway, here's my code - I didn't have a chance to create a class method (again, I prefer the older styles) but just copy-paste the code center into your method and it should function. Again, it's not pretty, but it will demonstrate what you need.

If you wanted to make this more efficient, recoding this to handle subscripts recursively would be much shorter and could handle any number of subscripts, not just 3.
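For what it's worth, here's a rough sketch of that recursive version - the WALK label and the 'ref' parameter are my own naming, assuming the same ^Book global:

```
WALK(ref) ; print every node under @ref that holds a value, at any depth
 ; e.g. do WALK^ZSUBFINDER($name(^Book))
 new sub,d
 set sub=""
 for {
     set sub=$order(@ref@(sub))
     quit:sub=""
     set d=$data(@ref@(sub))
     write:d#10 !,$name(@ref@(sub)),"=",@ref@(sub)   ; 1 or 11: node has a value
     do:d\10 WALK($name(@ref@(sub)))                 ; 10 or 11: descend further
 }
 quit
```

Because it recurses on $name of each node, it handles any number of subscript levels, not just 3.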

 ZSUBFINDER ;
 ;
    set subscript = ""
    for {
        set subscript = $order(^Book(subscript))
        quit:(subscript = "")
        ; $data returns 1 or 11 when a node holds a value, 10 or 11 when it has descendants;
        ; note this demo descends through an 11 node without printing its value
        set moresub = $data(^Book(subscript))
        if (moresub = 10) {
            set sub2 = ""
            for {
                set sub2 = $order(^Book(subscript,sub2))
                quit:(sub2 = "")
                set moresub2 = $data(^Book(subscript,sub2))
                if (moresub2 = 10) {
                    set sub3 = ""
                    for {
                        set sub3 = $order(^Book(subscript,sub2,sub3))
                        quit:(sub3 = "")
                        set moresub3 = $data(^Book(subscript,sub2,sub3))
                        if (moresub3 = 1) {
                            write !, "subscript=", subscript, ", sub2=", sub2, ", sub3=", sub3, ", value=", ^Book(subscript,sub2,sub3)
                        }
                    }
                } elseif (moresub2 = 1) {
                    write !, "subscript=", subscript, ", sub2=", sub2, ", value=", ^Book(subscript,sub2)
                }
            }
        } elseif (moresub = 1) {
            write !, "subscript=", subscript, ", value=", ^Book(subscript)
        }
    }
    quit

Hope this helps!

The 31-character limitation is there in 2018 (I'm using 2017 for this demonstration) - although anything longer doesn't technically error out, only the first 31 characters are recognized.

A quick demo I pulled from a test server:

NAME>s ^HH.LookupLabResultsToPhysiciansD(0)="fluffy"
 
NAME>zw ^HH.LookupLabResultsToPhysiciansD
^HH.LookupLabResultsToPhysicians(0)="fluffy"
 
NAME>s ^HH.LookupLabResultsToPhysiciansDoTryToDemonstrateLongGlobalNames(0)="More Fluffy"
 
NAME>zw ^HH.LookupLabResultsToPhysiciansDoNoFluffy
^HH.LookupLabResultsToPhysicians(0)="More Fluffy"

Everything after the 31st character is 'ignored' - you can see on the ZWRITE command that the last 'D' (or anything after it) isn't displayed, and you can type all sorts of characters after that final 'D' and it still changes the 'base' 31-character global.

InterSystems probably put that check in because folks were using longer global names thinking all of the characters were significant, but some data was getting changed inadvertently.

Does that help?

Give the ^rINDEX global a look.

I made a QTEST routine in Studio, and saved it but did not compile it.

QTEST ; JUST A TEST.
 Q
EN ; JUST A TEST.
 Q
 ;

and then I executed this at the programmer prompt:

ZW ^rINDEX("QTEST")
^rINDEX("QTEST","INT")=$lb("2021-08-06 13:21:58.061867",49)

I changed the routine a bit:

QTEST ; JUST A BIGGER TEST.
 Q
EN ; JUST A BIGGER TEST.
 Q
 ;

and I ZW'd the global again:

ZW ^rINDEX("QTEST")
^rINDEX("QTEST","INT")=$lb("2021-08-06 13:24:50.38743",63)

It may be safe to assume that the second $LISTBUILD element (49, then 63 above) is the length or number of bytes of storage required.

Now once I compile the routine in Studio, and ZW the global again, this is the output:

ZW ^rINDEX("QTEST")
^rINDEX("QTEST","INT")=$lb("2021-08-06 13:24:50.38743",63)
^rINDEX("QTEST","OBJ")=$lb("2021-08-06 13:26:30",152)

Hope this helps!

Mr. Petrole,

Per the documentation here: https://docs.intersystems.com/latest/csp/documatic/%25CSP.Documatic.cls?&LIBRARY=%25SYS&PRIVATE=1&CLASSNAME=%25Library.DynamicAbstractObject#%25FromJSON

is it possible that at some point your data stream is not encoded as UTF-8? That seems to be a requirement of the %FromJSON method, and if your data string isn't UTF-8 you may need to employ a call to $ZCONVERT, or for a "non-compliant" stream you may need to set the TranslateTable attribute of the stream to "UTF8".
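As a hedged sketch (the variable names are mine, and 'stream' stands in for however you're actually receiving the data), the two approaches might look like this:

```
 ; if you have the raw JSON bytes in a string:
 set json = $ZCONVERT(json, "I", "UTF8")   ; decode the UTF-8 bytes to internal form
 set obj = {}.%FromJSON(json)

 ; or, if you're handing %FromJSON a stream, tell it how to decode the stream:
 set stream.TranslateTable = "UTF8"
 set obj = {}.%FromJSON(stream)
```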

Hope this helps!

@Jack Huser ,

Instead of using a regular pipe, have you tried a Command Pipe ( |CPIPE| )?

OPEN "|CPIPE|321":"/trak/FRXX/config/shells/transportIDE.sh ""login|password|NOMUSUEL|PRENOM|NOMDENAISSANCE|1234567891320|199999999999999999999999|09%2099%2099%2099%2099|31%2F12%2F1999|242%20IMPASSE%20DES%20MACHINCHOOOOOOOOSE%20||99999|SAINT%20BIDULETRUC%20DE%20MACHINCHOSE|isc%24jhu|123456|123456798"" 2>&1":100

Per this documentation: https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=GIOD_interproccomm a CPIPE doesn't suffer from the 256(ish) character limit because the command called is the 2nd parameter.

Hope this helps!

There is a way to "re-route" telnet over a secure SSL/TLS tunnel with a product called 'stunnel.' There are free stunnel implementations for Windows, Linux and AIX, and I have tested it across architectures (Windows client to an AIX server, Linux client to Windows server, etc.), so it's basically architecture agnostic.

You will have to install stunnel on the server and every client so there is some work, but it can be done "over time" so there shouldn't be much downtime.

First, install stunnel on your server and configure it for a high port - for this example, let's use 6707. Configure stunnel to accept encrypted TLS traffic on port 6707 and 're-route' the decrypted traffic to port 23, but don't disable port 23 just yet; you'll be able to use both ports temporarily until you get all the workstations switched. On each workstation, install stunnel to intercept unencrypted port 23 traffic, encrypt it, and send it out on port 6707. Once you get all of the workstations converted (which means you're no longer sending unencrypted traffic), reconfigure the server to disable any input on port 23 on your network card(s) and rely solely on the TLS traffic on port 6707. Keep in mind, you can't disable port 23 on 127.0.0.1 - stunnel will need that, but as that's wholly internal to the OS it should satisfy your network scanner.
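To make that concrete, here's a minimal sketch of what the two stunnel.conf files might look like - the service names, certificate path, and server hostname are all assumptions you'd replace with your own:

```
; server-side stunnel.conf: terminate TLS on 6707, hand cleartext to telnet on 23
cert = /etc/stunnel/stunnel.pem
[telnet-in]
accept  = 6707
connect = 127.0.0.1:23

; client-side stunnel.conf: local telnet to 127.0.0.1:23 goes out encrypted on 6707
client = yes
[telnet-out]
accept  = 127.0.0.1:23
connect = your.server.hostname:6707
```

With that in place, pointing the telnet client at 127.0.0.1 port 23 on the workstation behaves just like a direct telnet to the server.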

If stunnel & your firewall are set up correctly, it works invisibly with Caché / HealthShare / IRIS and every telnet client I've tested (NetTerm, PuTTY, a couple others that don't immediately come to mind) because they still talk on port 23 locally, but stunnel does all the encryption and rerouting automatically. The only issue I've seen is if you _already_ have network issues and timeouts, stunnel can experience disconnects more often than straight telnet due to the increased overhead of the encryption.

Hope this helps!

ED Coder,

Your post is a bit confusing (to me, anyway), as I don't work much with HL7 on a daily basis, and I'm not sure what you mean by $GLOBAL, so I am going to try to clarify -- but I'll warn you, my crystal ball is usually pretty cloudy so I might be way off. If I am, I apologize in advance. But I hope this helps anyway.

Now, I'm going to assume that instead of $GLOBAL you really meant ^GLOBAL for where your info is stored -- as in, you've set up this info beforehand:

Set ^GLOBAL("123","bone issue")="Spur"   ; Sorry, but I added some data. You'll see why later.
Set ^GLOBAL("234","joint issue")="Ache"  ; Also on this one.

I'm going to assume one other thing for this post: that the local variable name that contains your data is called HLSEG. We can simulate this with this command:

Set HLSEG="$GLOBAL(""123"")"

Now, I realize that you may not be able to use carets (where I work we call 'em "uphats" -- don't ask me why...) in HL7 data, so that's why you may have substituted the '$' symbol to indicate a global.

So, with those assumptions in place, here's a possible process on how you could get the data from a global into your segment.

First, if we assume that if the data doesn't start with a '$' symbol (indicating the need to reference a global) then just exit the process:

If $Extract(HLSEG,1)'="$" Quit

$Extract (in this case) gets just the first character of the HL Segment and if it isn't a '$' character then just quit. Here's documentation on $Extract: https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=RCOS_FEXTRACT

If it does start with a '$' then we need to turn that into a carat. If your HL7 data can't contain '$' symbols beyond the first one, we can use the Translate function (documentation here: https://docs.intersystems.com/irislatest/csp/docbook/Doc.View.cls?KEY=RCOS_ftranslate )

 Set HLSEG=$Translate(HLSEG,"$","^")

That turns *all* '$' symbols to '^' in your variable, so if your HL7 data _could_ contain more '$' characters, this won't work for you. In that case, we'll just strip off the first character (no matter what it is), save the rest and prefix that with a '^':

 Set HLSEG="^"_$Extract(HLSEG,2,$Length(HLSEG))

Now that we have the actual global name in HLSEG, we can use the indirection operator to access the global through HLSEG, and use the $Data function (documentation here: https://docs.intersystems.com/irislatest/csp/docbook/Doc.View.cls?KEY=RCOS_fdata ) to see if that global exists:

 If $Data(@HLSEG)=0 Quit ; no data here.

$Data will return a '10' or '11' if there's a subnode (for ^GLOBAL("123") the subnode is "bone issue") so if there is, let's put that in the local variable SUBNODE - for that we need the $Order (documentation here: https://docs.intersystems.com/irislatest/csp/docbook/Doc.View.cls?KEY=RCOS_forder ) function:

 If $Data(@HLSEG)>9 Set SUBNODE=$Order(@HLSEG@(""))

Now SUBNODE will contain "bone issue". We can use the $Data function to see if that subnode actually contains data [[ this is why I changed your example a bit, to make it easier to demonstrate this ability ]], and if it does we'll put that in the ISSUE variable:

 If $Data(@HLSEG@(SUBNODE))#2=1 Set ISSUE=@HLSEG@(SUBNODE)

Now the variable ISSUE will contain "Spur".

Now that you have the info for the SUBNODE and ISSUE local variables populated, you should be able to  use them to assign what you need in your segments.

Again, it's possible I'm way off base on this and this didn't help you - if that's the case then I apologize, and you might want to provide a more descriptive example of what you're trying to accomplish. I also apologize in advance if any of my code examples contain bugs. But I really do hope it helps you!

Not sure if this will help, but Ensemble does (or at least did) support SNMP, so if your organization already runs a Network Monitoring platform like SolarWinds, OpenNPM or the like, you may be able to import a custom OID into that tool and let it do the monitoring for you.

That said, I do _not_ know if Ensemble's SNMP support is granular enough to be able to report on HL7 statistics, but at least it might be a place to start.

Full disclosure: I've never worked with Ensemble's SNMP support before, although I've read a bit about it. I was "more than passable" working with SolarWinds about a decade ago, enough that I could create custom OIDs and monitor specific things through custom Python scripts.

Hope this helps!

Eduard,

I'm not 100% sure I understand _exactly_ what you're trying to accomplish -- do you want to execute code when the Production starts, or do you want to execute code when the individual Business Process is fired? Either way, I'll explain both in my example below...


Did you associate your class with a particular production or business process? In Studio, I created a New >> Production >> Business Process, and it will ask you for a Package Name (I used TEST) and Class Name (I used StartItUp) and a description (Testorama) - and if you click Finish it will take you right to the graphical Business Process editor. You then need to add an activity to the process, I used "code" and entered

  set ^gfxdbg(+$o(^gfxdbg("z"),-1)+1)=$h

under the 'Code' section. Then I connected the <start> to the <code> and the <code> to the <end> nodes (graphically). I then saved & compiled the .bpl.  After that's saved & compiled, you can press <SHIFT><CTRL><V> to view the code that Studio created, and you'll see this:

/// Testorama
Class TEST.StartItUp Extends Ens.BusinessProcessBPL
{

/// BPL Definition
XData BPL [ XMLNamespace = "http://www.intersystems.com/bpl" ]
{
<process language='objectscript' request='Ens.Request' response='Ens.Response' height='2000' width='2000' >
<sequence xend='324' yend='440' >
<code xpos='204' ypos='244' >
<![CDATA[ set ^gfxdbg(+$o(^gfxdbg("z"),-1)+1)=$h]]>
</code>
</sequence>
</process>
}
/// except this comment - I added this manually. Insert extra methods here.
}

Except... one line near the bottom; I added this manually to show you where to add the next segment of code. In that spot, I added this:

Method OnInit() As %Status
{
    set ^initdbg(+$o(^initdbg("z"),-1)+1)=$h
    quit $$$OK
}

ClassMethod OnProductionStart() As %Status
{
    set ^proddbg(+$o(^proddbg("z"),-1)+1)=$h
    quit $$$OK
}

I used different globals to highlight the different functions of the methods. Every time that the OnInit() is run, it'll add a new node to ^initdbg, and every time the OnProductionStart() is run, it'll add a new node to ^proddbg.

If you've made it this far... you're still not quite done. :-) You still need to go back to the Management Portal in your production, and click the (+) next to Processes to add a new Business Process.

Under the Business Process Class, click the dropdown and you should now see TEST.StartItUp (scroll to the bottom to see it):

Select that for the BPClass. Give the Process a Name (Below, I used "Fizzle2_Stuff" -- yea, it's a dumb name, but I didn't want to reconfigure _everything_ again just to give it something different. Sorry 'bout that.), and click "Enable Now."  If the process is created and the dot stays green, you're.... still not quite there. But almost! In your command prompt, if you type this:

zw ^proddbg

you _should_ see an entry. This is set every time the production starts -- or more accurately, when the production starts the Business Process. If you type this:

zw ^initdbg

you will not get any output. Why? Because the OnInit() isn't executed on production start - it's executed on "subjob" start. When the Business Process gets some input from a Business Service that uses an InboundAdapter, then it'll populate. I have a test Service that scans a directory on my hard drive (based on the EnsLib.File.InboundAdapter) and can send it to a Business Process. That's set under the "Target Config Names" setting. My example has a _lot_ of testing things that are named weird... so be warned. :-)

Adding Target Config Name to Service.

Hopefully that part's not too confusing... Anyway, once you have the Service pointing to the Process, every time new input goes through the queue on the Service and is sent to the Process, then the OnInit() method fires, and you'll see a new entry in:

zw ^initdbg

Anyway, If this doesn't quite make sense, I apologize and maybe I'll rework the tutorial to make it clearer and highlight every single step it took to get this tested fully... please respond with feedback on how this could be better understood.

If you have any questions, please feel free to ask!

[[ Edited initial paragraph for clarity. ]]

Is it possible to add a bit more information? I'm guessing (but it's just a guess) that this is something set up on your server itself -- the #5001 is a user-defined error and can say "anything" the programmer wanted.

If what you're editing is a class, I wonder if the code in that class is defined as [ ReadOnly ] and when modification is attempted that it's checking for changes and throwing the error...

I did try editing a straight .INT routine mapped to a new database: I made a change without compiling, then dismounted and remounted the database in Read-Only mode and compiled -- you get quite a few <PROTECT> errors, and a "#5883: Item 'QQQ' is mapped from a database that you do not have permission on" showed up, but nothing like what you described above.

I know this isn't much, but I hope it helps!