Julian Matthews · Apr 16, 2024 go to post

Hey Scott.

I saw this post when you originally made it, and I was curious to know which direction you ended up going in?

Julian Matthews · Apr 9, 2024 go to post

Thanks for sharing this Neil - I hadn't considered the benefits of mapping the credentials and the subsequent SecondaryData, and it's an approach I will be looking at going forward.

Julian Matthews · Apr 5, 2024 go to post

As a short term approach, you may want to look into using Stunnel in client mode to encrypt the traffic and then set something up similar to:

This would mean that the traffic between your 2016 instance and stunnel is unencrypted but all on the same machine, and then stunnel handles the encryption between your machine and the external site using TLS1.3.
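
A minimal stunnel client-mode config sketch to illustrate the idea (the addresses and ports here are placeholders you'd swap for your own):

```
; stunnel.conf - client mode sketch
client = yes

[tls-out]
; local, unencrypted side that your 2016 instance points at
accept  = 127.0.0.1:8443
; the external site that stunnel wraps in TLS
connect = external-site.example.com:443
```

Your 2016 instance would then send its outbound traffic to 127.0.0.1:8443 instead of the external site directly.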

However, even if you go this route, I would still recommend getting the process started for upgrading to a newer version.

Julian Matthews · Feb 2, 2024 go to post

Increasing the pool value will have some effect on the RAM and CPU usage, but no different than having other jobs running through the production. If you move all the components over to using the actor pool (by setting the individual pool settings to 0) it should be easy enough to raise the value bit by bit while keeping an eye on the performance of the production and the cpu/ram usage of the machine to find a sweet spot.

If the API just needs a bit of extra resource when there's a small spike in the inbound requests, then this should not be of too much concern as it will just calm down once it's processed what has been requested.

If, however, there's a chance that it could be overloaded by inbound requests and the worry is that the server won't cope, then maybe look at using InterSystems API Manager to sit in front of the environment and make use of features like rate limiting.

Or you could go even further and begin caching responses to return when the API is being queried for data that isn't changing much, so that less processing is done per request if it's going to be called for the same information by multiple requests in quick succession. You could make your own solution with Caché/IRIS, or look at something like Redis.

Julian Matthews · Feb 2, 2024 go to post

"I'm thinking to increase the pool parameter, but I'm not sure if it's a good idea."

If you are not concerned about the order in which you are processing the inbound requests, then upping the pool size to the number of parallel jobs you're looking to run with should do what you need.

However, you may need to then also apply this logic to related components that the Process interacts with, otherwise you will end up just moving the bottleneck to another component.

Alternatively, if it fits your use case, you could use the Actor Pool for your production components and then increase it to a point where you see the bottleneck drop off.

Paolo has provided the link to the documentation on Pools, which has some info on considerations for the use of the two different types of Pool.

Julian Matthews · Dec 22, 2023 go to post

Hi Nimisha.

I see what you mean now.

It does seem like the code within a code block doesn't have access to the methods from Ens.BusinessProcess.

I suspect your only option is to use the "Call" activity set to synchronous with a "Sync" after it.

Julian Matthews · Dec 22, 2023 go to post

Thanks Luis.

The issue I'd have is that the clock starts on the poll interval at the point the service is started, so a restart of the server/production would then shift the time of day it tries to run, which would not be ideal if I needed a single run at a specific time of day. I might try a combination of the large poll interval and defining a schedule (based on the other responses) and see if that has the desired effect, but I may need to just concede and continue using the task manager. 🙂

Julian Matthews · Dec 22, 2023 go to post

Are you able to share the full error you're seeing?

I have just tested this, and I'm getting no such errors when compiling.

Julian Matthews · Dec 21, 2023 go to post

Hi Mary - thank you for your reply.

Unfortunately the Schedule option isn't suitable, as we need the job to run only once at a set time per day. Our current solution is very similar to the accepted answer from Ashok Kumar, which is what I was hoping to simplify in some way.

Julian Matthews · Nov 8, 2023 go to post

This may cause some regions to opt for FHIR as a response while others opt for other solutions such as OpenEHR.

There's no reason for these to be seen as competing standards. A model where the data storage is OpenEHR and the data transfer is FHIR is seen by some as the best of both worlds 🙂

Julian Matthews · Oct 11, 2023 go to post

I'm not sure there's a way to select it when creating a production export.

The data is stored within the Global "Ens.Config.BusinessPartnerD", which you could export separately and then import into your new environment?
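
As a sketch, one way to move it between environments (the file path is a placeholder):

```objectscript
 // On the source system: export the global to XML
 Set sc = $SYSTEM.OBJ.Export("Ens.Config.BusinessPartnerD.GBL","/tmp/BusinessPartners.xml")

 // On the target system: load the exported file
 Set sc = $SYSTEM.OBJ.Load("/tmp/BusinessPartners.xml")
```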

Julian Matthews · Sep 22, 2023 go to post

Hey Christine.

If I'm reading your question and subsequent replies correctly, you're trying to take the value of PV1:7.1, and then use that in a SQL query. The answer has been given by Ashok when you put their replies together, but hopefully putting it all into a single response will make things easier to follow.

If this is the case, then you will want to do the following:

Step 1: Set a variable to the value of PV1:7.1:

Step 2: Add a code block, and use this to run your sql query:

Step 3: Do what you need to with the value of ID - for the sake of this response, I'm just setting the value of PV1:7.2 to the ID returned from the query that inserted into the variable "Output":
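
Put together, the three steps inside the DTL could look something like this (the table and column names in step 2 are placeholders for your own):

```objectscript
 // Step 1: a "set" action copying PV1:7.1 into a variable
 set AttendingID = source.{PV1:7.1}

 // Step 2: a "code" action running the query via embedded SQL
 &SQL(SELECT ID INTO :Output FROM Hypothetical.DoctorTable WHERE DoctorCode = :AttendingID)

 // Step 3: a "set" action writing the returned value into PV1:7.2
 set target.{PV1:7.2} = Output
```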


It's worth knowing that, when working with Embedded SQL, prefixing a variable with a colon is how you can pass variables in and out of the Embedded SQL section of code. However it's a bit clearer when working directly with ObjectScript vs a DTL.

For example, if we had the following table:

ID | COL_A | COL_B
---|-------|------
1  | ABC   | 123
2  | DEF   | 234

We could have the following in ObjectScript:

    Set X = "" // X is null
    Set Y = "ABC"
    &SQL(
        SELECT COL_B
        into :X
        From TestTable
        WHERE COL_A = :Y
    )
    WRITE X //X is 123
Julian Matthews · Sep 21, 2023 go to post

I don't believe there is a way of increasing the system limit on string lengths. Even if there is, it's best to approach this by working with the data as a stream.

Otherwise you could end up in a cat and mouse game of needing to increase the length the next time you get a larger document.

Julian Matthews · Sep 20, 2023 go to post

The input is a string, so the max length will be your system max (which should be 3,641,144).

Assuming you're trying to retrieve the stream from an HL7 message, you will probably want to use the built-in method GetFieldStreamBase64.

So you could try something like:

Set tStream = ##class(%Stream.TmpBinary).%New()
Set tSC = pHL7.GetFieldStreamBase64(.tStream,"OBX:5")

And then your decoded file would be in the temp stream.

(You may need to tweak this slightly depending on how you intend to then use the stream, and the correct property path of the Base64 within the HL7 message)

Julian Matthews · Sep 14, 2023 go to post

Stupidly, no.

As the default was set to "TEST" in my function, it worked fine throughout testing. Once the upgrade to Prod occurred, the issue was spotted, and the simple solution was to just reset the values. As I was moving from an ad hoc build to an official release, I chalked it up to that.

Next time, WRC will be getting a call 🙂

Julian Matthews · Sep 14, 2023 go to post

Word of warning for using this - I have had this value be reset during an upgrade on HealthConnect.

I was using this value in a function to control the output of a Transform, and I hadn't accounted for the chance of this returning null.

My code looked something like this:

ClassMethod WhatEnvAmI() As %String
{
	Set Env = $SYSTEM.Version.SystemMode()
	
	If (Env = "LIVE")||(Env = "FAILOVER") {
		Quit "LIVE"
	}
	Quit "TEST"
}

So, post upgrade, the transform suddenly began outputting values specific to the test system from the live environment.

Julian Matthews · Sep 6, 2023 go to post

This is rather subjective, based on the skill level of the intended audience.

You could add a comment to the ClassMethod to provide context to what is being done and why. For example:

/// This ClassMethod takes a delimited String from System-X that consists of sets of Questions and Answers. 
/// The Sets are delimited by a pipe "|" and then the questions and answers are delimited by a colon ":"
/// The response from this ClassMethod is a %Library.DynamicArray object containing the questions and answers
ClassMethod createResponse(data As %String(MAXLEN="")) As %Library.DynamicArray
{
    ;1.- Questions split by "|"
    Set listQuestions 			= $LISTFROMSTRING(data, "|")
    Set items 			= []
    Set questionNumber 	= 0
    ;2.- Iterate
    For i=1:1:$LISTLENGTH(listQuestions) {
        Set questionAnswer = $LISTGET(listQuestions, i)
        ;3.- Update variables
        Set questionNumber 	= questionNumber + 1
        Set question 		= $PIECE(questionAnswer, ":", 1)
        Set answer 		    = $ZSTRIP($PIECE(questionAnswer, ":", 2), "<W") //Get rid of initial whitespace
        ;4.- Generate item
        Set item 			= 									
        {
        "definition": ("question "_(questionNumber)),
        "text": (question),
        "answer": 
        [
            {
                "valueString": (answer)
            }
        ]
        }
        Do items.%Push(item)
    }	
    Quit items
}

Or you could go one step further and be more descriptive with your comment at each action within your code. So, instead of:

;2.- Iterate

You could write something like:

;2.- Iterate through the list of Questions and Answers

If your intended audience is not familiar with ObjectScript, then you may want to introduce them to features in stages. For example, you could use $ZSTRIP on both the question and answer in your For loop, but only nest it for the answer and use comments to describe it all. Something like:

// Retrieve the question from the delimited entry
Set tQuestion = $PIECE(questionAnswer, ":", 1)

// Strip any whitespace from the start of the question
Set question = $ZSTRIP(tQuestion, "<W")

// It is also possible to nest functions, so below we will retrieve the answer and remove the whitespace in a single line.
Set answer = $ZSTRIP($PIECE(questionAnswer, ":", 2), "<W")
Julian Matthews · Jul 1, 2023 go to post

If you were to use the Parenthesis () Syntax in your routing rule, you could simply make your rule something like:

The reason this works is that using the parenthesis syntax to access repeating values from a message will return all of the entries in the repeating segment as a *delimited string, and you can then check the string contains the - character using the contains function.

*Do make sure that the delimiter isn't the character you're looking for. Maybe throw in a trace when you first test this and check how it's returned.
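
As a sketch, assuming the repeating field being checked is PID:3 and the character being looked for is a hyphen, the rule condition could look like:

```objectscript
 // Empty parentheses return every repeat of PID:3.1 as a single delimited string
 HL7.{PID:3().1} Contains "-"
```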

Julian Matthews · Jun 26, 2023 go to post

Hey Guillaume.

Funnily enough - it's one of your GitHub repos where I located the demo I'm trying to use as a jumping off point (but from https://github.com/grongierisc/InstallEnsDemoHealth/blob/master/src/CLS/Demo/DICOM/Process/WorkList.cls)

Basically, I'm stuck trying to work out if I should scrap the wakeup calls etc, and just call the external data when I get a C-FIND-RQ message and then call "CreateIntermediateFindResponse" for each result set entry, or if it's necessary to use the wakeup calls and somehow hold the result set in context and move to the next result set entry on each Ens.AlarmResponse received.

ETA: The approach taken was to use the initial message as a trigger to call off to an external db, and write the results into a local table, and then use the Ens.AlarmResponse as the trigger to grab the top entry from the local table and return this to the calling system. This then allows for a cancel to come in and interrupt the process (the cancel will trigger a deletion of the appropriate rows in the local table)

Julian Matthews · Jun 6, 2023 go to post

Hey Michael.

A good use of a lookup table would be when working with an integration between two systems that use differing codes for the same values.

For example, you could have System A that records Sex as 1 for Male, 2 for Female, 0 for Not Known, and 9 for Not Specified, whereas System B uses M for Male, F for Female, and O for Other.

You could have a winding If/Else in a transform, or you could simply reference a lookup table in your DTL using the ..Lookup() function:

and then build up your lookup table to look like this:

As you can see, System A has more values than System B so the values for Not Known and Not Specified are being added as Other in my example.
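
A sketch of the pieces above, with the lookup table name "SexCodes" and the field PID:8 as assumptions:

```objectscript
 // DTL "set" action using the Lookup function
 set target.{PID:8} = ..Lookup("SexCodes",source.{PID:8})
```

with the table entries being 1 = M, 2 = F, 0 = O and 9 = O.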

Another example could be if you needed to filter messages in a router based on a code within the HL7 message. You could add the codes to a lookup table as the key and a description as the value, and then use the Exists() function within your routing rule.
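
As a sketch, with the lookup table name "AllowedCodes" and the field MSH:9.2 as assumptions, the rule condition could be:

```objectscript
 Exists("AllowedCodes",HL7.{MSH:9.2})
```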

Julian Matthews · May 17, 2023 go to post

Hey Jon.

One option could be to change the permissions on the source directory to be read-only for the account running Ensemble/HealthShare? This way, the adapter will copy the file from the directory but will then be unable to delete it, while also keeping track of what files it has copied.

The log for the passthrough service will be a bit messy at first, but you'll end up with something like:

However, if the directory you're going to be checking for documents will be an ever growing list of a large number of documents, then your own suggestion of copying the files into a secondary working directory before being picked up by Ensemble might actually be the best option, as there's a bit of an overhead for the adapter when it's scanning a directory that contains a large number of files.

Depending on how soon after the file's creation you need it for onward processing, you could create a scheduled task that copies all files from the previous day into your working directory and then sends an email if that fails for any reason?

Julian Matthews · Apr 19, 2023 go to post

Hey Kurro.
I'm not sure of a built in function for this, but if you wanted to have your own:

Class Demo.FunctionSets.Example
{

ClassMethod Format(InputString As %String, Params... As %String) As %String
{
	Set OutputString = InputString
	For i = 1 : 1 : $GET(Params, 0){
		Set OutputString = $Replace(OutputString,"{"_i_"}",Params(i))
	}
	
	Quit OutputString
}

}

And then:

Write ##class(Demo.FunctionSets.Example).Format("My name is {1} and I'm {2} years","Kurro","18")
My name is Kurro and I'm 18 years
Julian Matthews · Apr 19, 2023 go to post

Stack Overflow suggests using svglib and reportlab to achieve this with Python:

from svglib.svglib import svg2rlg
from reportlab.graphics import renderPM
drawing = svg2rlg("my.svg")
renderPM.drawToFile(drawing, "my.png", fmt="PNG")
Julian Matthews · Apr 13, 2023 go to post

I would recommend approaching this in three parts:

  1. Create a custom message class for your target system.
  2. Use a DTL to transform your HL7 into your custom message class
  3. Use a custom File Operation to write the file output based on your custom message class

For #1, this could be as basic as:

Class Demo.Messages.SystemX.CustomBody Extends Ens.Request
{

Property Param1 As %String;

Property Param2 As %String;

Property Param3 As %String;

Property Etc As %String;

}

For #2, your transform would then be something like:
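
As a rough sketch of the DTL class (the source fields picked here are arbitrary):

```objectscript
Class Demo.Transforms.SystemX.HL7ToCustom Extends Ens.DataTransformDTL [ DependsOn = (EnsLib.HL7.Message, Demo.Messages.SystemX.CustomBody) ]
{

Parameter SOURCE = "EnsLib.HL7.Message";

Parameter TARGET = "Demo.Messages.SystemX.CustomBody";

XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='EnsLib.HL7.Message' targetClass='Demo.Messages.SystemX.CustomBody' create='new' language='objectscript'>
<assign value='source.{PID:3(1).1}' property='target.Param1' action='set' />
<assign value='source.{PID:5(1).1}' property='target.Param2' action='set' />
<assign value='source.{PV1:2}' property='target.Param3' action='set' />
</transform>
}

}
```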

And then for #3, your file operation would be something along the lines of:

Class Demo.Operations.SystemX.FileWriter Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.File.OutboundAdapter";

Property Adapter As EnsLib.File.OutboundAdapter;

Parameter INVOCATION = "Queue";

Method OnMessage(pRequest As Demo.Messages.SystemX.CustomBody, Output pResponse As Ens.Response) As %Status
{

	Set Line1 = pRequest.Param1_"|"_pRequest.Param2
	Set Line2 = pRequest.Param3_"|"_pRequest.Etc
	Set outString = Line1_$C(13)_$C(10)_Line2_$C(13)_$C(10)
	
	Set fileName = "Filename"
	Set sc = ..Adapter.PutString(fileName_".dat",outString)
	Quit sc
}

}

Please note that the above is a super rough draft - there's no error handling, and you'd need to consider how you'd make the filename unique per message, but I'm hoping this gets you on the right path.

Julian Matthews · Mar 27, 2023 go to post

Hi @Evgeny Shvarov 

I don't have anything immediately to hand as I still feel that the reuse of code in Step 3 from the original class method is not best practice, although I do have this running in a live production for 2-3 operations where this was needed.

I will try and see if I can get something put together that can be exported and put onto the Open Exchange. Just a warning, I'm not one for Docker, so it'll be a Production deployment export 🙂

Julian Matthews · Mar 21, 2023 go to post

Hey Yuri.

The users are held within the SQL table "Security.Users" in the %SYS namespace, so you could use embedded SQL to return the information. However, as you're unlikely to be executing your code directly from the %SYS namespace, I'd suggest creating a function that you pass the email address to, and it returns the username.

Something like:

Class Demo.Utils.General.Users
{

ClassMethod UserFromEmail(Email As %String, Output Username As %String) As %Status
{
	//Initially set this to null, as we want to return it empty when we get no results
	Set Username = ""
	//Hold the Namespace within a variable so we can use the variable to set the namespace back once the SQL has been run.
	Set CurrNamespace = $NAMESPACE
	//Change NameSpace to %SYS
	Set $NAMESPACE = "%SYS"
	//Run query to get the Username based on the email address - note the use of the UPPER function to remove issues with case sensitivity
	&SQL(
	Select ID into :Username
	FROM Security.Users
	WHERE UPPER(EmailAddress) = UPPER(:Email)
	)
	
	//Set namespace back to the namespace the function was run from
	Set $NAMESPACE = CurrNamespace
	
	//Evaluate SQLCODE for result
	//Less than 0 is an error
	If SQLCODE < 0 {
		Write "SQLCODE="_SQLCODE_": "_$GET(%msg)
		Quit 0
	}
	
	//Greater than 0 can really only mean code 100, which is no results found
	If SQLCODE > 0 {
		Quit 1 //No result found
	}
	Else {
		Quit 1 //Result found
	}
}

}
DEMO> WRITE ##class(Demo.Utils.General.Users).UserFromEmail("YuriMarx@ACME.XYZ",.Output)
1

DEMO> WRITE Output
YMARX

This is by no means perfect as I have thrown it together for the example - please forgive the messy if/else's! 🙂