Hi

The reason that you are getting these errors is that there is a mismatch between the following attributes of the client and server adapters. Jeffrey mentions Framing, which is essentially how Ensemble detects the beginning and end of each HL7 message. Within the message itself your message consists of one or more segments, and in order to determine the end of a segment Ensemble needs to know what terminator to expect, which can be <LF> or <CR,LF>. In order to detect those characters Ensemble needs to know what character set the 3rd party adapter is using, so you need to confirm the Character Set and Encoding attributes on both sides; the default values for these are 'Latin-1' and 'UTF-8' respectively.

On the subject of Framing, your choice depends on whether you are using the HL7 TCP Inbound Adapter (in a Business Service) or the HL7 TCP Outbound Adapter (in a Business Operation). If you are the server then you can ensure that your Framing matches the 3rd party client's Framing, but you also have the option of 'Flexible', which tells the Ensemble adapter to interpret the characters being streamed to the server and, based on pattern-matching rules, 'auto-detect' the framing. When you are working with the Ensemble HL7 TCP Outbound Adapter, however, you don't have the 'Flexible' option: you either need to confirm that you and the 3rd party are using compatible framing formats, or, if they support flexible framing on their side, you can set your client adapter's Framing to a value of your choice.
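If it helps, both of these are ordinary settings on the production item, so you can check or set them either on the Settings tab in the portal or in the production class itself. The snippet below is only a sketch of an inbound item: the item name and port are placeholders, and 'Framing' and 'DefCharEncoding' are the setting names as I recall them, so verify them against your version.

<Item Name="HL7.In.FromVendor" ClassName="EnsLib.HL7.Service.TCPService" PoolSize="1" Enabled="true">
  <Setting Target="Host" Name="Framing">MLLP</Setting>
  <Setting Target="Host" Name="DefCharEncoding">latin1</Setting>
  <Setting Target="Adapter" Name="Port">5100</Setting>
</Item>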

Nigel

The best way of doing this is to create a class with a single binary stream property, for example:

Class MyData Extends %Persistent
{

Property Data As %Stream.GlobalBinary;

}

Then, I assume you have a class that inherits from %CSP.REST (assume it is called MyRestClass). In its XData UrlMap block define a route definition:

<Route Url="/myapp" Method="POST" Call="SaveData" />

"/myapp" is a Web Application defined in Management Portal -> System Administration -> Security -> Applications -> Web Applications, and "/save" is the path the route matches beneath it, so the full request URL is "/myapp/save". When you define the application, specify the application URL ("/myapp") and the namespace where your data is going to be saved, enter "MyRestClass" in the "Dispatch Class" field, and then save the Web Application definition.

In your REST class (MyRestClass) define a class method:

ClassMethod SaveData() As %Status
{
    set tSC=$$$OK
    try {
        set obj=##class(MyData).%New()
        set tSC=obj.Data.CopyFrom(%request.Content) if 'tSC quit
        set tSC=obj.%Save() if 'tSC quit
    }
    catch ex {set tSC=ex.AsStatus()}
    quit tSC
}
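To test the endpoint from ObjectScript you could use %Net.HttpRequest. This is only a sketch: the server, port and file name are placeholders you would need to change.

set req=##class(%Net.HttpRequest).%New()
set req.Server="localhost",req.Port=57772
set req.ContentType="application/octet-stream"
set file=##class(%Stream.FileBinary).%New()
set tSC=file.LinkToFile("c:\temp\picture.jpg")
do req.EntityBody.CopyFrom(file)
set tSC=req.Post("/myapp/save")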

This is just one approach. As Robert suggests below there are a number of ways of manipulating images in CSP, but my example is one that I have used before.

Hi

In one of my Zen applications I built a custom login screen, and in the Web Application definition I specify that CSP page as the login page. In my case I maintain my own User/Role tables, and when I add or modify a user I use the Ensemble security method calls to create or update the user in Ensemble.

The Zen Custom Login forms are designed to validate the user against the Ensemble Users Table.

Given that the User and Password fields in your login page can be accessed through the DOM, you can read the values entered by the user from an "onchange" (or equivalent) JavaScript event and write them into your own "Login Audit Table". Likewise, you can detect when the Zen session has closed, so you could trap that and update your "Login Audit Table" to indicate when the user logged out.

In the Zen documentation they give an example of a simple custom login form:

<page xmlns="http://www.intersystems.com/zen" title="">
  <loginForm id="loginForm" >
    <text name="CacheUserName" label="User:" />
    <password name="CachePassword" label="Password:" />
    <submit caption="Login" />
  </loginForm>
</page>

So you could trap the user name and password values (the password value will have been encrypted, so if you needed to see the actual password you would have to decrypt it).

The only problem I foresee is that once you have submitted the form you won't know whether the user has logged in successfully until your home page displays, so you would have to store those values in a temporary global; if you then reach your home screen you know that the login succeeded and you can create your login audit record.

Given the way that the Cache/Ensemble/IRIS documentation is structured, you may well find a section elsewhere in the Zen documentation that tells you how to trap the outcome of the 'submit', but I have not attempted to see whether my theory returns any more useful information.

Nigel

Hi

The way that I have dealt with this in the past is as follows:

1) Create a DTL that accepts the incoming HL7 message as its source, with a target class of EnsLib.HL7.Message and a message structure of 2.5.1:ACK. I am using HL7 2.5.1 but this will work with any version of HL7 from 2.3 upwards (probably earlier versions as well, but I have not worked with versions earlier than 2.3).

2) When you invoke the Transform() method of a DTL you will notice that there are three parameters:

  1. pRequest (which is your source message)
  2. pResponse (which is the generated target message)
  3. aux

According to the documentation, if the transform is invoked from a Business Rule then aux is an object containing information about the rule that invoked the transform and a couple of other properties. However, if you are invoking the transform from Caché ObjectScript then aux can be an instance of a class you create.

The way that I use 'aux' is as a mechanism for getting information into the DTL that is not present in either the source or target objects. In this example I want to pass in the ACKCode and the ACKMessage.

So my DTL looks like this (I apologise for having to paste in the class code, but I have never found a way to attach classes to a community reply), so here goes.

My DTL Class reads as follows:

Class Example.Transformations.CreateNACKDTL Extends Ens.DataTransformDTL [ DependsOn = EnsLib.HL7.Message ]
{

Parameter IGNOREMISSINGSOURCE = 1;

Parameter REPORTERRORS = 1;

Parameter TREATEMPTYREPEATINGFIELDASNULL = 0;

XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='EnsLib.HL7.Message' targetClass='EnsLib.HL7.Message' sourceDocType='2.5.1:ADT_A01' targetDocType='2.5.1:ACK' create='new' language='objectscript' >
<assign value='source.{MSH:FieldSeparator}' property='target.{MSH:FieldSeparator}' action='set' />
<assign value='source.{MSH:EncodingCharacters}' property='target.{MSH:EncodingCharacters}' action='set' />
<assign value='source.{MSH:SendingApplication.NamespaceID}' property='target.{MSH:SendingApplication.NamespaceID}' action='set' />
<assign value='source.{MSH:SendingApplication.UniversalID}' property='target.{MSH:SendingApplication.UniversalID}' action='set' />
<assign value='source.{MSH:SendingApplication.UniversalIDType}' property='target.{MSH:SendingApplication.UniversalIDType}' action='set' />
<assign value='source.{MSH:SendingFacility.NamespaceID}' property='target.{MSH:SendingFacility.NamespaceID}' action='set' />
<assign value='source.{MSH:SendingFacility.UniversalID}' property='target.{MSH:SendingFacility.UniversalID}' action='set' />
<assign value='source.{MSH:SendingFacility.UniversalIDType}' property='target.{MSH:SendingFacility.UniversalIDType}' action='set' />
<assign value='source.{MSH:ReceivingApplication.NamespaceID}' property='target.{MSH:ReceivingApplication.NamespaceID}' action='set' />
<assign value='source.{MSH:ReceivingApplication.UniversalID}' property='target.{MSH:ReceivingApplication.UniversalID}' action='set' />
<assign value='source.{MSH:ReceivingApplication.UniversalIDType}' property='target.{MSH:ReceivingApplication.UniversalIDType}' action='set' />
<assign value='source.{MSH:ReceivingFacility.NamespaceID}' property='target.{MSH:ReceivingFacility.NamespaceID}' action='set' />
<assign value='source.{MSH:ReceivingFacility.UniversalID}' property='target.{MSH:ReceivingFacility.UniversalID}' action='set' />
<assign value='source.{MSH:ReceivingFacility.UniversalIDType}' property='target.{MSH:ReceivingFacility.UniversalIDType}' action='set' />
<assign value='$tr($zdt($h,3),"-,: ","")' property='target.{MSH:DateTimeOfMessage}' action='set' />
<assign value='source.{MSH:Security}' property='target.{MSH:Security}' action='set' />
<assign value='source.{MSH:MessageControlID}' property='target.{MSH:MessageControlID}' action='set' />
<assign value='"ACK"' property='target.{MSH:MessageType.MessageCode}' action='set' />
<assign value='source.{MSH:MessageType.TriggerEvent}' property='target.{MSH:MessageType.TriggerEvent}' action='set' />
<assign value='"ACK"' property='target.{MSH:MessageType.MessageStructure}' action='set' />
<assign value='source.{MSH:ProcessingID}' property='target.{MSH:ProcessingID}' action='set' />
<assign value='source.{MSH:VersionID}' property='target.{MSH:VersionID}' action='set' />
<assign value='source.{MSH:SequenceNumber}' property='target.{MSH:SequenceNumber}' action='set' />
<assign value='source.{MSH:ContinuationPointer}' property='target.{MSH:ContinuationPointer}' action='set' />
<assign value='source.{MSH:AcceptAcknowledgmentType}' property='target.{MSH:AcceptAcknowledgmentType}' action='set' />
<assign value='source.{MSH:ApplicationAcknowledgmentTyp}' property='target.{MSH:ApplicationAcknowledgmentTyp}' action='set' />
<assign value='source.{MSH:CountryCode}' property='target.{MSH:CountryCode}' action='set' />
<assign value='source.{MSH:PrincipalLanguageOfMessage}' property='target.{MSH:PrincipalLanguageOfMessage}' action='set' />
<assign value='source.{MSH:AltCharsetHandlingScheme}' property='target.{MSH:AltCharsetHandlingScheme}' action='set' />
<assign value='aux.ACKCode' property='target.{MSA:AcknowledgmentCode}' action='set' />
<assign value='aux.ACKMessage' property='target.{MSA:TextMessage}' action='set' />
</transform>
}

}

My AUX class definition looks like this:

Class Example.Transformations.CreateNACKDTL.AUX Extends %Persistent
{

Property ACKCode As %String;

Property ACKMessage As %String(MAXLEN = 200);

}

To generate the HL7 ACK my code reads:

ClassMethod GenerateACKMessage(pRequest As EnsLib.HL7.Message, ByRef pResponse As EnsLib.HL7.Message, pACKCode As %String(VALUELIST=",CA,CE,CR,AA,AE,AR") = "AA", pACKMessage As %String(MAXLEN=500) = "") As %Status
{
    set tSC=$$$OK
    try {
        set aux=##class(Example.Transformations.CreateNACKDTL.AUX).%New()
        set aux.ACKCode=pACKCode,aux.ACKMessage=pACKMessage
        set tSC=##class(Example.Transformations.CreateNACKDTL).Transform(pRequest,.pResponse,.aux) if 'tSC quit
    }
    catch ex {set tSC=ex.AsStatus()}
    quit tSC
}
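A call to this method (assuming it lives in the class you are calling from, and with illustrative variable names) would then look like:

set tSC=..GenerateACKMessage(pHL7Request,.tAckResponse,"AE","Required PID segment missing")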

Notice that the DTL takes the SendingFacility, SendingApplication, ReceivingFacility and ReceivingApplication from the source and swaps them around in the target.

You should keep the MessageControlID the same as the incoming HL7 Message MessageControlId so that the ACK can be linked to the original HL7 request. 

The message timestamp (MSH:DateTimeOfMessage) can be updated to the current date/time, as the DTL above does.

The 'aux' mechanism is very useful. Unfortunately the documentation has just one line on it, which says "if the transform is called from ObjectScript aux can contain anything you want", or words to that effect.

So I tested it in a simple example like the one above and it does indeed work, and I now use it in the 30+ DTLs I am working with at the moment.

Nigel

Hi

Even though LabTrak is not really covered in this group, let me give you a little insight into how LabTrak works:

The LabTrak data is stored in a number of globals. There are a couple of key globals that you need to be aware of:

^TEPI

^TDEB

^THOS

^TEPI contains all of the LabTrak Episodes and within each episode there is a sub-array of Test Sets and within that a sub array of Test Items.

The global structure was designed before Caché Objects existed, so the globals use delimiters to separate one field from another. Once you can navigate the global structure you will find that the fields either contain data (String, Integer, Boolean etc.) or contain codes that point to one of about 50 code tables.
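Purely as a hypothetical sketch (the real subscript levels and the field delimiter are site and version specific, so check them on your own system), navigating a delimited global of this kind looks something like this:

    set episode=""
    for {
        set episode=$order(^TEPI(episode)) quit:episode=""
        // each node is a single delimited string; "\" is an assumed delimiter
        set record=$get(^TEPI(episode))
        set field1=$piece(record,"\",1),field2=$piece(record,"\",2)
        write episode," ",field1," ",field2,!
    }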

All of the logic of LabTrak is written in Caché ObjectScript routines.

Most of the routines are hidden in the sense that the source code is not installed, just the compiled code. However there are some callable entry points into the key areas of the application.  Depending on what you want to do there are appropriate labels that can be called that will retrieve data, insert, update or delete data. You really don't want to play with these routines unless you have been given training by Trak or the InterSystems Trak Sales Engineers.

The data structures (globals) have been mapped to classes, so there are classes for all of the different logical components of the database. You can run SQL queries against these tables, but you absolutely do not want to use INSERT, UPDATE or DELETE statements; all updates to the database are controlled through the entry points I mentioned before.

For basic retrieval of data your best bet is to use the table definitions. You need to create an InterSystems ODBC DSN to connect to the LabTrak database; once connected you can view the various schemas and the tables within them, and write SQL SELECT statements to get the data you are looking for. I would recommend that if you want to access LabTrak data you use the Disaster Recovery mirror of the LabTrak database. The DR database(s) are real-time copies of the production database, and most reports are run against the DR data rather than production. LabTrak is a very extensive and complicated application, finely tuned by the Trak experts to run at optimal efficiency and with proper data integrity, and you don't want to affect the performance of the production system by running large SQL queries against it; it is far better to run those queries against the DR servers. The system operators would need to give you access to the appropriate servers, and bear in mind that you are working with sensitive, confidential patient data that must be respected at all times.

Every LabTrak site has customer-specific routines that can be edited. These routines contain 'insert', 'update' and 'delete' entry points, and you can write code at these labels to pull data from the database and use it, for example, to create HL7 messages or to create entries in a queue that can be processed by an Ensemble production.

The application supports HL7 interoperability that can be used to pass HL7 data into LabTrak and to generate HL7 result messages to send out from LabTrak, but again you would need the appropriate training to understand how that functionality works.

As has been mentioned in one of the other replies, your best bet is to connect with your LabTrak Project Manager or Sales Engineer to find out more, and to find out what training material and documentation exists.

Good Luck

Nigel

Hi

The type of content most appreciated by members of the community is an article on some aspect of the technology (Caché, Ensemble, IRIS), or on a specific use of the technology that you have worked with, understand well, and have learnt a few tips and tricks about that are not covered by the core product documentation. This is especially true when you have used one of the more obscure features of the technology in a development project, battled to get it to work, found the documentation lacking, found few if any past community posts on the subject, and found no examples in the samples supplied in the Caché/Ensemble/IRIS "Samples" namespace or the "Dev" directory under ../intersystems/..../dev.

To give you an example, some years back I was working on a project where we were building a prototype robot. I was writing the Ensemble component that would interface with the outside world and translate those instructions into calls to a Java code base that controlled the motors, sensors and other mechanical components of the robot. The developer I was working with knew all of the Java stuff and I knew all the Ensemble stuff, and to make the two technologies talk to each other we had to make use of the Java Gateway. We read the documentation. It seemed straightforward enough. I had had a lot of experience working with most of the Ensemble adapters, so I was expecting things to go smoothly.

But they didn't. We re-read the documentation, we looked at the examples, we asked questions in the Developer Community, we contacted WRC, but still we could not get it to work. Eventually my colleague found a combination of bits and pieces of the Java Gateway that he merged together, and at last we got the interface working.

To this day I still don't understand why the gateway did not work the way the documentation said it should, and I don't exactly understand how the solution we eventually put in place worked either.

At the time we were still experimenting with the Java Gateway and realised that the documentation only took us so far. It would have been great to find an article in the Developer Community written by someone who had used the Gateway, had worked out the specific things that needed to be set up correctly for it to work, had included some code snippets, and so on. If I had found such an article and it had helped us get our Gateway working (we struggled with it for 2 months; it should have taken 2 days), I would have sent that person a bottle of our famous South African artisan gin and a large packet of the South African delicacy "biltong" (dried beef, kudu, springbok or ostrich meat) as a thank you.

These days the focus is on IRIS and IRIS for Health. There is huge interest in FHIR and the various interoperability options for IHE, PIX, PDQ, HL7, DICOM, CDA and so on.

I have been quite active on the DC for many years, and since the Covid19 lockdown I have had more time to answer questions. I too am thinking of articles, code solutions and the like that I can write up or package for the Open Exchange applications page. I have even got to the point where I have invested in some lighting gear and a tool called Doodly, which allows you to create animated videos to explain concepts or processes for achieving a desired solution or outcome. I hope to start publishing some of these articles in the near future.

So I hope that these observations will encourage you to find good subject material to write up and publish

Nigel

Hi

The way that I have done this in the past is as follows:

1) Export the classes/DTL's/CSP Pages etc into an Export XML file.

2) Create an array of the strings you want to identify

 e.g.

set array({string1})="",array({string2})="", ... ,array({stringN})=""

ClassMethod FindStrings(pFile As %String = "", ByRef pList As %String) As %Status
{
    set tSC=$$$OK
    try {
        open pFile:("R"):0
        if '$t set tSC=$$$ERROR(5001,"Unable to open file: "_pFile) quit
        for {
            use pFile read line
            // check each line against every search string; the strings are the subscripts of pList
            set x="" for {
                set x=$o(pList(x)) quit:x=""
                if line[x {
                    // do whatever you want to do when you find a line that
                    // contains one of the string values you are searching for
                }
            }
        }
    }
    catch ex {
        // reading past the end of the file raises <ENDOFFILE>, which simply ends the scan
        if $ZE["<ENDOFFILE>" {set tSC=$$$OK}
        else {set tSC=ex.AsStatus()}
    }
    close pFile
    quit tSC
}

Call the method as follows:

set sc=##class(MyClass).FindStrings({File_Name},.array)

Yours

Nigel

Hi

There are system utilities that allow you to retrieve a list of globals based on a wildcard.

Here is some code that gets a list of Globals in a namespace. You can modify it to suit your needs:

Method GetGlobalList(ByRef globallist As %String(MAXLEN=3000) = "", globalnameprefix As %String) As %Status
{
    set $ztrap="Error",tSC=$$$OK,globallist=""
    set gbl=""
    for {
        set gbl=$o(^$GLOBAL(gbl)) q:gbl=""
        if globalnameprefix[$p(gbl,".",1) set globallist=globallist_gbl_","
    }
    set globallist=$e(globallist,1,$l(globallist)-1)
End ;
    quit tSC
Error ;
    set $ztrap="",tSC=$$$ERROR(5001,"Code Error: "_$ze) goto End
}

This uses old-style $ztrap error handling and would be better written with TRY/CATCH.
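For what it's worth, a TRY/CATCH version of the same idea might look like this (a sketch only, written as a class method and keeping the same matching logic as above):

ClassMethod GetGlobalList(ByRef globallist As %String = "", globalnameprefix As %String = "") As %Status
{
    set tSC=$$$OK,globallist=""
    try {
        set gbl=""
        for {
            set gbl=$order(^$GLOBAL(gbl)) quit:gbl=""
            if globalnameprefix[$piece(gbl,".",1) set globallist=globallist_gbl_","
        }
        // strip the trailing comma
        set globallist=$extract(globallist,1,*-1)
    }
    catch ex {
        set tSC=ex.AsStatus()
    }
    quit tSC
}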

I hope this helps

Yours

Nigel

Hi

No, the OnInit() method is called when the Business Service starts up and OnTearDown() is invoked when the Business Service stops; OnInit() is therefore not aware of any request messages at that point. ProcessInput(), and more specifically the OnProcessInput() method, is the first place you get to see the incoming request message, and it is in OnProcessInput() that you decide what to do with the request HL7 message: whether you route it to a Business Process based on the Ensemble HL7 pass-through architecture, or pass it to a custom Business Process. However, I made the assumption that your Business Service is a conventional HL7 TCP Inbound Adapter based service.
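As a minimal sketch (the target name "MyHL7Router" is made up), OnProcessInput() in a custom HL7 service might do no more than forward the message on:

Method OnProcessInput(pInput As EnsLib.HL7.Message, Output pOutput As %RegisteredObject) As %Status
{
    // forward the inbound HL7 message asynchronously to a router process
    quit ..SendRequestAsync("MyHL7Router",pInput)
}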

If, however, it is an HTTP REST service then that is another matter altogether: by default the request will be handled by the %CSP.REST dispatch class. In my setup the job of that class is to route HTTP requests through to a Business Service, and you can inherit %CSP.REST into your own REST dispatcher class to do this.

I have a REST Dispatcher class in Ensemble that emulates the IRIS FHIR End Point functionality.

I have 4 csp applications defined:

/csp/endpoint

/csp/endpoint/endpointA

/csp/endpoint/endpointB

/csp/endpoint/EndPointC

All 4 csp applications invoke my custom Rest.Dispatch class (which inherits from %CSP.REST) 

I have a Business Service Class named BusinessService.MyEndPointService

I create 4 Business Services in my production

End Point General (/csp/endpoint)

End Point A (/csp/endpoint/endpointA)

and so on

In the Rest.Dispatch class I look at the request URL, and based on the URL I invoke the OnRequest() method of the appropriate Business Service, using the service's configured name in the production.
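A minimal sketch of that hand-off (the method, class and item names here are placeholders; Ens.Director.CreateBusinessService() is the documented way to get hold of a service by its configured name):

ClassMethod DispatchToService(pUrl As %String) As %Status
{
    set tSC=$$$OK
    try {
        // map the request URL onto the configured name of a Business Service
        set tServiceName=$case(pUrl,"/csp/endpoint/endpointA":"End Point A","/csp/endpoint/endpointB":"End Point B",:"End Point General")
        set tSC=##class(Ens.Director).CreateBusinessService(tServiceName,.tService) quit:'tSC
        // hand the CSP request over to the service for processing
        set tSC=tService.ProcessInput(%request,.tResponse)
    }
    catch ex {set tSC=ex.AsStatus()}
    quit tSC
}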

However, as I am writing this, there is something in the back of my mind telling me that if you are using the EnsLib.HL7 TCP adapter you can reroute an incoming message to another service, but I would have to go back to the documentation to remind myself of exactly what you can do and how it works.

The most common approach developers use is the EnsLib.HL7.MsgRouter architecture, where you create routing rules that test the MSH message structure and route the message to a specific Business Process to handle that specific message type. This is all handled through the Management Portal -> Ensemble -> Build set of functions, which allow you to create Business Processes, Business Rules, Transformations and so on.

If you are using HTTP REST and want more information on how I have implemented that, I can send you a far more detailed description of my implementation.

Nigel

Historically Caché and Ensemble did not support WebSockets, so you could not have two processes using the same (incoming) port, but if I remember correctly IRIS does support WebSockets, and something in the depths of my mind tells me that WebSockets were aimed at this specific requirement.

Check out the IRIS documentation on WebSockets

Nigel