Julian Matthews · Dec 5, 2024

My approach would be to make use of the OAuth 2.0 Client configuration via the Management Portal.

You can configure the Issuer Endpoint here, as well as add the details of the Client, Secret, etc.

To make use of this configuration within an Operation, you can then do something like this:

Method AuthoriseMe(Output AccessToken As %String) As %Status
{
 //Set basic parameters
 Set tSC = $$$OK
 Set myscopes = "profile"
 Set clientName = ..Client
 Set AccessToken = ""
 //Check to see if client is already authenticated
 Set isAuth=##class(%SYS.OAuth2.AccessToken).IsAuthorized(clientName,,myscopes,.accessToken,.idtoken,.responseProperties,.error)

 //If we're not authorised already, we need to authorise ourselves.
 If isAuth=0 {
   //Not Authenticated - authenticate client
   //QuitOnError is used here as, if we're unable to get the token, there's no point in continuing
   $$$QuitOnError(##class(%SYS.OAuth2.Authorization).GetAccessTokenClient(clientName,myscopes,,.error))
   $$$QuitOnError(##class(%SYS.OAuth2.AccessToken).IsAuthorized(clientName,,myscopes,.accessToken,.idtoken,.responseProperties,.error))
 }

 Set AccessToken = accessToken
 Quit tSC
}

Wherever ..Client appears in the code snippet, its value will need to match the name of the client as configured in the Management Portal.

Julian Matthews · Nov 27, 2024

Hey Brad.

Apologies, I'm not entirely sure why I typed unicode in full upper-case when that's not present in the helper dialog or the drop down.

How confident are you that what you're receiving is actually unicode?

The adapter by default will look at what's in MSH:18 and will only use the selection in the adapter setting if this is blank in the message.

Firstly, try setting this to "!latin1" (without the quotes) to force it to operate as latin1 as per the support info for DefCharEncoding:

Putting ! before the encoding name will force the use of the named encoding and will ignore any value found in MSH:18.

If that fails, I'd then cycle through the other options, starting with "!utf-8" and then the variants of Unicode available in the drop down.

Be careful - there are some overlaps when it comes to some encodings where things look fine until certain symbols come into play, at which point you end up with some interesting outputs.

Julian Matthews · Nov 26, 2024

Hey Brad.

The adapter has two sets of options here, which can lead to confusion. We first have the Charset setting for the File adapter elements, and then the Default Char Encoding for the HL7 adapter elements.

As a starting point, I would try changing the Charset setting to Binary, and then setting the DefCharEncoding to UNICODE to match what is in your header.

Julian Matthews · Nov 15, 2024

So to do what you're trying to do in your DTL, add in a code block and paste in the following:

  Set CHUNKSIZE = 2097144
  Set outputStream=##class(%Stream.TmpCharacter).%New()
  Do source.EncodedPdf.Rewind()
  While ('source.EncodedPdf.AtEnd) {
    Set tReadLen=CHUNKSIZE
    Set tChunk=source.EncodedPdf.Read(.tReadLen)
    Do outputStream.Write($SYSTEM.Encryption.Base64Encode(tChunk,1))
  }
  Do outputStream.Rewind()
  Set Status = target.StoreFieldStreamRaw(outputStream,"OBXgrp(1).OBX:5.5")

Yours is almost doing the same thing but, as Enrico points out with your code sample, you have the "Set tSC = tStream.Write($C(10))" line adding in the line breaks whereas my example has this excluded.

Separately, as alluded to by Scott, when adding the Base64-encoded PDF stream to the HL7, you'll want to use the StoreFieldStreamRaw method for the HL7. Trying to do a traditional set with a .Read() risks the input being truncated.

Julian Matthews · Nov 15, 2024

Hey Smythe.

Your Base64 has line breaks, so it is wrapping onto new lines which are then being read as new lines in the HL7.

Depending on what method you are using to convert the PDF to Base64, you should have a setting to not use line breaks.
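If the conversion is happening within IRIS, the second argument to Base64Encode controls this. A quick sketch (the variable names here are just for illustration):

  //A flag of 1 tells Base64Encode not to insert line breaks into the output
  Set encoded = $SYSTEM.Encryption.Base64Encode(pdfData,1)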

Julian Matthews · Oct 22, 2024

It's a bodge, but can you create a new namespace with the same name, delete the task, and then delete the namespace again?
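Untested, but a sketch of the namespace part using Config.Namespaces (run in %SYS; the namespace and database names here are illustrative):

  //Recreate the namespace under its old name, pointing it at an existing database
  Set props("Globals")="USER"
  Set tSC = ##class(Config.Namespaces).Create("OLDNS",.props)
  //...delete the task via the Task Manager...
  //Then remove the namespace again
  Set tSC = ##class(Config.Namespaces).Delete("OLDNS")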

Julian Matthews · Oct 21, 2024

Hey Anthony.

Depending on your version of IRIS, I would recommend swapping out your use of %GlobalCharacterStream with %Stream.GlobalCharacter, as the former is deprecated. Additionally, I would recommend swapping them out for their temp counterparts so you're not inadvertently creating loads of orphaned global streams, especially where you're dealing with files of this size.
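As a sketch of the swap (with the stream content being illustrative):

  //%Stream.TmpCharacter is the temporary counterpart: the data is held in a
  //temp global and cleaned up automatically rather than being persisted
  Set tStream = ##class(%Stream.TmpCharacter).%New()
  Do tStream.Write("Some working data that doesn't need to persist")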

Julian Matthews · Oct 18, 2024

It's a wild shot in the dark, but this page in the documentation: https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=ESQL_adapter_methods_creating#ESQL_transactions

has a try/catch where the catch contains the following:

catch err {
    if (err.%ClassName(1)="common.err.exception") && ($$$ISERR(err.status)) {
      set tSC = err.status
    }
    else {
      set tSC = $system.Status.Error(err.Code,err.Name,err.Location,err.InnerException)
    }
}

If you try to recreate this, does the code you're looking for appear in any of err.Code, err.Name, err.Location, or err.InnerException?

Julian Matthews · Sep 24, 2024

I thought that this would be a case of the Tilde being a special character for your target document due to its common use in HL7 for repeating fields. However, I ran a test to see what I got when trying this.

I created a transform for a PV1 segment, and attempted to set the value of PV1:1 to the output of the replace function and the input string contained a few commas:
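The assign was along these lines (a sketch of the sort of expression described, using the DTL Replace function):

  Set target.{PV1:1} = ..Replace(source.{PV1:1},",","~")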

I then ran this, and got this result:

Not only did it successfully replace the commas with tildes, but the virtual document now sees it as a repeating field (even though the field is not repeating in its specification).

I know this doesn't directly help you, but wanted to share my results in case it helped lead you to finding a solution. (for ref, this is from version 2022.1 of IRIS for Health)

Julian Matthews · Sep 10, 2024

Ahh, I see what you mean.

I have not had to work in this way with IRIS and OpenEHR, so unfortunately I wouldn't be able to provide much insight.

Julian Matthews · Sep 9, 2024

Hey Joost.

What do you mean when you say "handle"?

Our interactions with OpenEHR for reading and creating/updating compositions are generally via traditional HTTP-based APIs, which are relatively simple to set up with IRIS.

Julian Matthews · Sep 9, 2024

I 100% agree with Eduard.

Even back when I had two mirrored instances sat running in the same physical location, we were saved many times by mirroring when there had been issues with either IRIS or the server/OS itself.

It's also very helpful for managing upgrades, and even server migrations (by adding in the new servers as async members, and then demoting a failover member on an old server and promoting a new server from async to failover).

Julian Matthews · Sep 4, 2024

Jumping off of the answer I have given here only earlier today and being in a country currently observing(?) BST, you'll want to use the following approach:

  1. Use $ZDATETIMEH with the dformat of 3 and tformat of 7 (Note for the tformat that it's expecting the input as UTC, which is what you have with your source timestamp)
  2. Use $ZDATETIME with the dformat of 8 and tformat of 1
  3. Realise quickly that the tformat includes separators between the date and time, and within the time itself, so wrap it all in $ZSTRIP to remove all punctuation and whitespace...

Basically this:

WRITE $ZSTRIP($ZDATETIME($ZDATETIMEH("2023-09-28T20:35:41Z",3,7),8,1),"*P")


Gives you this:

And demonstrating this with a date that isn't affected by BST, you'll note that the time stays the same as the input, because BST isn't in effect in the winter, taking the timezone I'm in back to GMT:
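That is, something like the following, where the December timestamp sits outside of BST:

  WRITE $ZSTRIP($ZDATETIME($ZDATETIMEH("2023-12-28T20:35:41Z",3,7),8,1),"*P")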

I hope this helps!

Julian Matthews · Sep 4, 2024

The link in my last reply actually contains the answer, which is always useful. I have tweaked it slightly so that it's a single line, but the output is the same.

To get the current date and time with milliseconds, you can do the following:

WRITE $ZDATETIME($ZDATETIMEH($ZTIMESTAMP,-3),3,1,3)

This is:

  1. Starting with the output of $ZTIMESTAMP
  2. Converting to a $HOROLOG format adjusted for your local timezone using $ZDATETIMEH
  3. Converting to the desired format using $ZDATETIME

I hope this helps!

Julian Matthews · Sep 3, 2024

This will be the caveat I warned of, which is detailed in the documentation.

You could do something like:

Write $ZDATETIME($h_"."_$P($NOW(),".",2),3,1,3)

Which takes the value of $H and appends the milliseconds from $NOW() to then form your datestamp:

However the documentation I linked to warns that there can be a discrepancy between $H and $NOW() so this approach could then lead to your timestamp being off by up to a second. As you are trying to work to the level of milliseconds, I suspect accuracy is very important and therefore I would not recommend this approach.

Take a look here and see if the example of comparing $h, $ZTIMESTAMP, and $NOW(), and the example of converting from UTC to the local timezone, help.

Julian Matthews · Sep 3, 2024

If you replace $h with $NOW(), this should do as you need.

However there is a caveat with regards to Timezones mentioned in the online documentation that you may want to review to ensure it works as you'd expect and need.

Julian Matthews · Aug 30, 2024

Have you tried opening the pdf in a text editor like Notepad++ to see what it looks like? It might be that the stream is incomplete, or that you're writing the base64 to the file output without decoding it?
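If it turns out the decoding is the issue, a rough sketch of decoding a Base64 stream out to a file would be (the stream and file names here are illustrative):

  Set outFile = ##class(%Stream.FileBinary).%New()
  Do outFile.LinkToFile("C:\Temp\output.pdf")
  Do base64Stream.Rewind()
  While ('base64Stream.AtEnd) {
    //Read in a multiple of 4 characters so each chunk decodes cleanly
    Set chunk = base64Stream.Read(32000)
    Do outFile.Write($SYSTEM.Encryption.Base64Decode(chunk))
  }
  Do outFile.%Save()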

Julian Matthews · Aug 1, 2024

Hi David.

As Luis has stated, this doesn't allow you to make direct changes to the message. However, you can use this to set a variable that can then be referenced within a transformation. The Property variable can only be "RuleActionUserData".

To use this in an action:

And then within the DTL, you can reference "aux.RuleActionUserData":
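As a sketch of the DTL side (the MSH field here is purely illustrative), a code or assign action can use the value directly:

  Set target.{MSH:4} = aux.RuleActionUserData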

Julian Matthews · Jul 26, 2024

Although I have seen environments where namespaces are used to separate Dev/Test/Prod etc., I have found that having the Prod namespace on the same instance as the Non-Prod namespaces is a risk to the Prod environment should an active piece of development take down the underlying machine. One example was a developer* making a mistake when working with Streams, creating an infinite loop that kept writing to a stream until the server very quickly gained a 10GB pdf that filled the disk to capacity and the environment stopped working.

A common use case for multiple namespaces for me would be instances where the activity within the namespace is significantly distinct from the others. For example, we have a namespace that is dedicated to DICOM activity. While we could have put this in a single "LIVE" themed namespace, the volume of activity we see for DICOM would have filled our server's disk if kept to the same retention period as our other namespaces. So we have a DICOM namespace with a retention period of around 14 days, compared to others that are between 30 and 120 days.

*It was me. I was that developer.

Julian Matthews · Jul 18, 2024

Thanks Scott.

I'm also not rushing to delete based on counts, but it's still interesting to review.

I ran the "Complete Ensemble Message Body Report" from Suriya's post's Gist against a namespace and it ran for about 8 hours, which now has me nervous to run the Delete option. Although, to be fair, this is a namespace that has been in operation for about 10 years, so I might start smaller and work my way up.

Julian Matthews · Jul 16, 2024

The key difference between the two scenarios is that you're running the application for the instances where you are getting the correct time.

Have you checked the server timezone settings for the user running Ensemble in your environment?

Julian Matthews · Jul 15, 2024

This feels like a timezone/locale issue - is the difference between your time zone and UTC also 5 hours apart?

Julian Matthews · Jul 4, 2024

Hi Joshua.

Is it possible that there is a group policy in place that is being applied to you and not your colleagues? Have you tried forcing an update of the group policies applied to your profile from the Windows terminal/command line:

gpupdate /force

Alternatively, do you have any extensions installed in Edge that you don't have installed for Chrome? Maybe an adblocker?

Finally, have you tried opening devtools on this page, refreshing, and then seeing if there are any meaningful errors appearing under the Console or Network tab?

Julian Matthews · Jul 2, 2024

The only way I can think of doing this would be to split out the Helper() ClassMethod into its own class, and then ensure the order of compilation is such that the class containing the Helper ClassMethod is compiled first. Depending on how you're achieving this, could you use the CompileAfter class keyword?

So something like:

Class 1:

Class Foo.Bar.One
{

ClassMethod Helper()
{
// do something
}

}

Class 2:

Class Foo.Bar.Two [ CompileAfter = Foo.Bar.One ]
{

ClassMethod Generated() [ CodeMode = objectgenerator ]
{
do ##class(Foo.Bar.One).Helper()
// do something else
quit $$$OK
}

}

Julian Matthews · Jun 14, 2024

There's a good thread from a few years ago that goes into various ways of converting date formats which you can find here.

My approach in that thread was to suggest the use of the class method ##class(Ens.Util.Time).ConvertDateTime()

In your case using this option, you could do this for your main question:

Set Output = ##class(Ens.Util.Time).ConvertDateTime(input,"%Y-%m-%d %H:%M:%S.%N","%Y%m%d")

And then for your followup to include seconds:

Set Output = ##class(Ens.Util.Time).ConvertDateTime(input,"%Y-%m-%d %H:%M:%S.%N","%Y%m%d%H%M%S")

Julian Matthews · May 21, 2024

If I were to write this in an operation using the EnsLib.HTTP.OutboundAdapter, my approach would be something similar to:


	Set tSC = ..Adapter.SendFormData(.webresponse,"GET",webrequest)

	//begin backoff algorithm
	
	//Get start time in seconds
	Set startInSeconds = $ZDTH($H,-2)
	
	//Set initial params for algorithm
	Set wait = 1, maximumBackoff=64, deadline=300
	
	//Only run while Status Code is 504
	While (webresponse.StatusCode = "504"){
		
		//HANG for x.xx seconds
		HANG wait_"."_$RANDOM(9)_$RANDOM(9)
		
		//Call endpoint
		Set tSC = ..Adapter.SendFormData(.webresponse,"GET",webrequest)
		
		//Increment potential wait periods
		If wait < maximumBackoff Set wait = wait*2

		//Adjust wait if previous action takes us above the maximum backoff
		If wait > maximumBackoff Set wait = maximumBackoff
		
		//Check if deadline has been hit, exiting the While loop if we have
		Set currentTimeInSeconds = $ZDTH($H,-2)
		If (currentTimeInSeconds-startInSeconds>=deadline){Quit}
	}

This is untested however, so massive pinch of salt is required 😅

Julian Matthews · May 8, 2024

I'm not sure if it's related, but my colleagues and I all notice random performance issues with the management portal when accessing our non-production environment that's using IIS.

It was deployed to our non-production environment for similar experimentation reasons, but I never took it further due to these issues (and it dropped from my radar due to still being on 2022.1.2 with no immediate pressure to upgrade).

I need to upgrade the version of the web gateway following a recent email from InterSystems, so I'm going to run that now, reboot the machine, and see if anything changes.

Beyond that, I'm going to be following this discussion closely to see if our issues are related and if there is a solution.