Hello @Yaron Munz 

Calling the label or subroutine directly without specifying the routine name—for example, set sc=wqm.Queue("subr1")—always works. However, using the label^routine syntax does not work as expected. If I need to execute a subroutine from a different routine, I have to wrap that subroutine inside another function and call that function instead.

test.mac
 set wqm = ##class(%SYSTEM.WorkMgr).%New()
 set sc = wqm.Queue("f1^test1")
 quit
 ;
test1.mac
 ;
f1()
 do subr1
 quit
subr1
 set ^test($NOW()) = ""
 quit

I agree that return values are never required when executing a routine, even when using methods like wqm.WaitForComplete() or wqm.Sync(). However, return values are mandatory for class methods when using wqm.WaitForComplete() or wqm.Sync().

Class

When invoking class methods, you can call them with or without parameters, and with or without return values. Return values are not required if you don't need to capture the status using wqm.WaitForComplete() or wqm.Sync().

Set wqm = ##class(%SYSTEM.WorkMgr).%New()
Set sc=wqm.Queue("..func1",1)
Set sc=wqm.Queue("##Class(Sample.Person).func1")

Return values are required if you capture the status using wqm.WaitForComplete() or wqm.Sync():

Set wqm = ##class(%SYSTEM.WorkMgr).%New()
Set sc=wqm.Queue("..func1",1)
Set sc=wqm.Queue("##Class(Sample.Person).func1",1)
Set sc=wqm.WaitForComplete() 
If 'sc Do $SYSTEM.OBJ.DisplayError(sc)

thanks!

Hi @Julian Matthews

The Ens.StreamContainer class is a %Persistent class, and it stores the different types of stream in it. So, if you delete the row/record by ID or by query, it deletes the entry as well as the stream contents (they are also part of the row) from the Ens.StreamContainer table, so be cautious before deleting a container. The %OnDelete callback method can be used to perform additional work while an object is being deleted, as in the sketch below.
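To illustrate what the %OnDelete callback looks like in general, here is a minimal sketch using a hypothetical persistent class and a hypothetical ^MyDeleteLog global (it is not part of Ens.StreamContainer itself; it only logs the deletion):

Class Sample.AuditedRecord Extends %Persistent
{

Property Info As %String;

/// Called automatically before an object of this class is deleted; returning an error aborts the delete.
ClassMethod %OnDelete(oid As %ObjectIdentity) As %Status [ Private, ServerOnly = 1 ]
{
	// The OID is a $LIST; its first element is the object ID
	Set id = $ListGet(oid,1)
	Set ^MyDeleteLog($ZDATETIME($ZTIMESTAMP,3)) = "Deleting Sample.AuditedRecord "_id
	Quit $$$OK
}

}

With this in place, Set sc = ##class(Sample.AuditedRecord).%DeleteId(id) writes to ^MyDeleteLog before the row is removed; the same callback hook is where you would put any extra work you need at delete time.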

Hello @Phillip Wu

The "status" is set to 1 for tasks that are either currently suspended ("Suspended Reschedule") or encounter an error during execution. If the "Suspend task on error?" option is set to "no" when scheduling the task, the error message is stored in the status column. However, the task status is not suspended.

From documentation

If not defined by the task default success will be 1
If the job is currently running (JobRunning) Status will be -1
If there was an untrapped error (JobUntrappedError) Status will be -2
If there was an error before execution (JobSetupError) Status will be -3
If the task timed out trying to job (JobTimeout) Status will be -4
If there was an error after execution (JobPostProcessError) Status will be -5
The text of the status code will be in the property Error.
 

SQL Query

select Name,displaystatus,Error,Suspended,%ID from %SYS.Task Where Suspended=1 or Suspended=2

 Query method

set tResult = ##class(%SYS.Task).TaskListFilterFunc("Suspend")
do tResult.%Display()
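If you prefer to inspect a single task programmatically instead of through SQL, a minimal sketch (the task ID 1001 is only an example) could look like this:

 Set task = ##class(%SYS.Task).%OpenId(1001)
 If $IsObject(task) {
     Write "Status: ", task.Status, !
     Write "Error: ", task.Error, !
     Write "Suspended: ", task.Suspended, !
 }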

Hello @Nezla,

The %SYS.Task class holds the task details, including the status of the task. The "Suspended" column has a value if the task errored out while running ("Suspend Leave") or if the task was suspended ("Suspend Reschedule"). Based on this column you can find the suspended/errored tasks. Use the query below to get the task name and additional details.

select Name,Status,TaskClass,Suspended from %SYS.Task
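If you want to run the same check from ObjectScript, here is a minimal sketch using %SQL.Statement; the WHERE clause assumes, as above, that a non-zero Suspended value marks the suspended/errored tasks:

 Set stmt = ##class(%SQL.Statement).%New()
 Set sc = stmt.%Prepare("select Name, Status, TaskClass, Suspended from %SYS.Task where Suspended > 0")
 If sc {
     Set rs = stmt.%Execute()
     While rs.%Next() {
         Write rs.%Get("Name"), " (Suspended=", rs.%Get("Suspended"), ")", !
     }
 }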

Hello  @Michael Wood,

There are several approaches you can take to handle this. One option is to create a file service that reads the file as a stream and sends it to a custom business process. You can then convert that stream into a JSON object and iterate through each DynamicObject entry. Alternatively, you could send the stream to a BPL and process the JSON there.

Here is a simplified sample of processing the JSON.


Class Samples.Introp.JSONFileService Extends Ens.BusinessService
{

Parameter ADAPTER = "EnsLib.File.InboundAdapter";

Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
{
	// Pass the file stream to the process; the response is returned by reference
	Quit ..SendRequestSync("JSONFileProcess",pInput,.pOutput)
}

}

Class Samples.Introp.JSONFileProcess Extends Ens.BusinessProcess [ ClassType = persistent ]
{

Method OnRequest(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
{
	// pRequest is the file stream sent by the service; parse it into a dynamic object
	Set json = {}.%FromJSON(pRequest)
	
	Set iter = json.%GetIterator()
	
	// Iterate over each top-level entry and store it (for demonstration only)
	While iter.%GetNext(.key,.val)
	{
		Set ^test($Classname(),$NOW(),key) = $Select($IsObject(val):val.%ToJSON(),1:val)
	}

	Quit $$$OK
}
}

Thanks!

Hello @Krishnaveni Kapu 

%SYS.Task is responsible for storing, suspending, and resuming all tasks, so you can call the methods below programmatically to achieve this. They expect the task ID as the first argument.

You can execute the query below to get the task ID, name, and additional information:

select Suspended,Name,id from %SYS.Task

suspend

Flag

1 - Suspend the task, but leave it in task queue (default)
2 - Suspend the task, remove from task queue, and reschedule for next time

Set taskId = 1001
Set flag = 2
Write ##class(%SYS.Task).Suspend(taskId,flag)

resume

Set taskId=1001
Write ##class(%SYS.Task).Resume(taskId)

Hi @Sebastian Thiele 

Here is the Encounter resource. I'm using IRIS for Windows (x86-64) 2024.1.1

url : http://localhost:52773/csp/healthshare/learning/fhir/r4/Encounter?&date=lt2024-10-27T15:29:00Z

 

Hello @Kevin Mayfield 

Instead of FromJSON(), write a custom class to parse the Binary resource and set it into the FHIR model class.

Here I convert the data (which is the Binary resource content, a PDF or anything else) to a stream and set it in the data field of the HS.FHIR.DTL.vR4.Model.Resource.Binary class.

ClassMethod SetBinaryR4(json As %DynamicObject)
{
    Set obj = ##class(HS.FHIR.DTL.vR4.Model.Resource.Binary).%New()
    Set obj.contentType = json.contentType
    #; convert to stream to prevent from the <MAXSTRING> error 
    Set dataAsStrm = json.%Get("data",,"stream")
    Set obj.data = dataAsStrm
    Set obj.id = json.id
    
    #; display a chunk of the stream content to verify the binary data element of the "Binary" resource
    ZWrite obj.data.Read()
}
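A possible way to call it, with a hypothetical file path and a hypothetical containing class name (use whichever class you saved the method in), is to parse the incoming JSON from a stream and pass the resulting dynamic object:

 Set stream = ##class(%Stream.FileCharacter).%New()
 Set sc = stream.LinkToFile("C:\Temp\binary-resource.json")
 Set json = {}.%FromJSON(stream)
 Do ##class(Sample.FHIR.BinaryUtil).SetBinaryR4(json)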

Thanks!

Hello @Kevin Mayfield 

%GetNext retrieves values from the JSON object/array and assigns them to a local variable, and the BLOB/stream values exceed the maximum string length (3,641,144 characters), which is what causes the error. MAXLEN is not the cause, because the model is a registered object and the values are held in memory. Therefore, AFAIK, FromJSON alone is not suitable for handling such a large dataset.
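To make the limit concrete, here is a small sketch, assuming json is a dynamic object holding a very large "data" value (it mirrors the %Get approach from my previous reply):

 // Plain %Get copies the value into a local string and can throw <MAXSTRING>:
 // Set data = json.%Get("data")
 // Requesting the value as a stream avoids the local string limit:
 Set dataStream = json.%Get("data",,"stream")
 Write dataStream.Size, !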

Hello @Ali Chaib

  1. You need to create your custom DTL under the "HS.Local.FHIR.DTL" package.
  2. Once the DTL is created, execute set status = ##class(HS.FHIR.DTL.Util.API.ExecDefinition).SetCustomDTLPackage("HS.Local.FHIR.DTL"). This configures the custom DTL package entry in the ^HS.XF.Config global, which is responsible for executing your DTL (see the sketch after this list).
  3. Here is the documentation, which covers most of the customization.
  4. As I mentioned earlier, you need to add the additional properties in the respective SDA extension class.
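As a small sketch of step 2 run from a terminal (only the status check is added beyond the call shown above):

 Set status = ##class(HS.FHIR.DTL.Util.API.ExecDefinition).SetCustomDTLPackage("HS.Local.FHIR.DTL")
 If 'status Do $SYSTEM.Status.DisplayError(status)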