Ashok Kumar T · May 8, 2025

I agree that both Streams extend from %Stream.Object, and yes, I noticed that the storage mechanisms are also different. I'm curious—what are the advantages of using DynamicBinary instead of TmpCharacter?

Ashok Kumar T · May 7, 2025

We can use the %SYS.LockQuery class and its List query function to check whether the global is already locked. If it is, we can skip attempting to acquire the lock.

Check for the specific process

ClassMethod LOCK(Lock, Mode)
{
    If '..IsLocked("^A","X") {
        Lock +^A
    }
    Else {
        Write "Locked"
    }
}
// X - Exclusive
// S - Shared
ClassMethod IsLocked(Global As %String, Mode As %String)
{
    #dim status = 0
    Set tResult = ##class(%SYS.LockQuery).ListFunc($J)
    While tResult.%Next() {
        If (tResult.Mode=Mode)&&(tResult.LockString=Global) Set status = 1
    }
    Return status
}

However, the above code only checks a specific process and does not account for other processes holding Exclusive or Shared locks. The sample below checks all acquired locks and returns the status together with the lock type.

ClassMethod IsLocked(Global As %String, Mode As %String)
{
	#dim status = 0
	Set tResult = ##class(%SYS.LockQuery).ListFunc()
	While tResult.%Next() {
		If tResult.LockString=Global {
			If tResult.Mode?1(1"X",1"S") Set status= 1_","_tResult.Mode
		}
	}
	Return status // status will be like "1,S" or "1,X"
}
Ashok Kumar T · May 7, 2025

The *C code removes control characters in the ASCII ranges (0-31 and 127-159). However, Unicode characters have code points greater than those control ranges, so they are not stripped.
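As a small illustration (the input string is made up), a $ZSTRIP sketch showing that a control character is removed while a higher code point survives:

```objectscript
 // Hypothetical input: a tab ($CHAR(9)) is a control character, "é" (U+00E9) is not
 set input = "abc"_$char(9)_"def é"
 // *C strips characters in the control ranges 0-31 and 127-159
 set clean = $zstrip(input, "*C")
 write clean  ; the tab is gone; "é" remains because its code point is above 159
```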

Ashok Kumar T · May 6, 2025

Hello @Scott Roth 
You can see the HTTP request format by using the test argument of the %Net.HttpRequest Send() method (its 3rd parameter): setting it to 1 outputs the request, 2 the response, and 3 the response headers. Can you check this from the post?
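A hedged sketch (server and path are placeholders) of dumping the outgoing request with that test argument:

```objectscript
 set req = ##class(%Net.HttpRequest).%New()
 set req.Server = "example.com"   ; placeholder host
 do req.SetHeader("Accept","application/json")
 // test=1 (3rd argument) outputs the request to the current device
 set sc = req.Send("GET","/api/data",1)
```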

Ashok Kumar T · May 6, 2025

Thank you, @Mark Hanson, your explanation clarified my question. Once the object goes out of scope (its reference count drops to zero and it is removed from memory), the entire queue is deleted. Therefore, using Sync()/WaitForComplete() is essential to ensure the work is properly completed.
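A minimal sketch (the class and method names are hypothetical) of keeping the work manager in scope until the queued work finishes:

```objectscript
 set workMgr = ##class(%SYSTEM.WorkMgr).%New()
 // Queue a unit of work; Some.Class/DoWork is a placeholder
 set sc = workMgr.Queue("##class(Some.Class).DoWork")
 // Block until all queued work completes; without this, letting workMgr
 // go out of scope (reference count zero) deletes the whole queue
 set sc = workMgr.WaitForComplete()
 if 'sc { do $SYSTEM.OBJ.DisplayError(sc) }
```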

Ashok Kumar T · May 5, 2025

Thank you for sharing this @Keren Skubach

The SDA to FHIR / FHIR to SDA DTLs (the primary method we're using for conversion) don't support primitive Extension conversion by default. I'll need to implement a custom DTL to achieve this. Alternatively, I can handle it in code if I generate the FHIR resource programmatically. Can you attach the documentation link for the set resource.property = $listbuild("original property value",<primitive extension index>,...) primitive extension index as well? I could see HS.FHIRModel.R4.Patient from the 2024.1 version.

Thank you!

Ashok Kumar T · May 4, 2025

When you extend a class with %JSON.Adaptor, you must also ensure that any child classes or classes used as object properties also extend %JSON.Adaptor. Otherwise, you'll encounter an error like:
"ERROR #9411: A class referenced by a %JSONENABLED class must be a subclass of %JSON.Adaptor" during JSON export.

In your case, %DynamicObject/%DynamicArray are system classes that do not extend %JSON.Adaptor, which is why you're unable to export them using methods like %JSONExport, %JSONExportString, or %JSONExportToStream.

To work around this, you can set the %JSONINCLUDE property parameter on the problematic field to exclude it from the JSON export: Property xAxis As %Library.DynamicObject(%JSONINCLUDE = "NONE");
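A small sketch (class and property names invented) of a class using that parameter:

```objectscript
Class Sample.Chart Extends (%RegisteredObject, %JSON.Adaptor)
{

Property Title As %String;

// %DynamicObject is not a %JSON.Adaptor subclass, so exclude it from JSON import/export
Property xAxis As %Library.DynamicObject(%JSONINCLUDE = "NONE");

}
```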

Ashok Kumar T · May 2, 2025

It depends on your implementation. You can add a condition, for example: if the date is "00010101", convert it to "1840-12-31" and pass that to $ZDH ($ZDH("1840-12-31",3)), or skip the $ZDH conversion and set 0 directly.

Ashok Kumar T · May 2, 2025

$ZDateH converts an actual date into the IRIS internal $HOROLOG format, and $ZDate converts a $HOROLOG value back to a date. The start date is $ZDate(0) = "12/31/1840", and $ZDH("12/31/1840") = 0. If the date is earlier than the allowed range, it throws a <VALUE OUT OF RANGE> error.
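To illustrate the boundary values (format code 3 is ODBC yyyy-mm-dd):

```objectscript
 write $zdate(0,3)              ; 1840-12-31, the earliest date in the default range
 write $zdateh("1840-12-31",3)  ; 0
 // Any earlier date throws <VALUE OUT OF RANGE> with the default range
```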

Ashok Kumar T · Apr 30, 2025

That's correct. I’ve explored the new UI feature, and it appears to be a commendable initiative. The following observations have been noted.

  • The shortcuts for Rules, DTL, and Production at the top of the page, along with their options (Partial/Full/Open in New Tab), greatly improve navigation and allow for quick access within a single page.
  • The file paths (File Path, Archive Path, and Work Path) within the File Service configuration are not selectable, and the target configuration name is not rendered correctly.
  • The absence of the target configuration results in improper rendering of the connection arrows.
  • In the new user interface, the Production Settings—specifically Queues, Logs, Messages, and Jobs—are not displaying any data, and the Action tab is missing. These elements function as expected in the standard interface.
  • While small Data Transformation Layers (DTLs) and mappings display correctly, larger DTLs and mappings exhibit instability and are not consistently reliable in the production configuration single-page application.
  • The popup displayed during the start or stop of a business host or production instance is helpful for identifying errors; however, the 'Update' button is absent.
  • The dotted background appears somewhat unusual.

It takes some time to adapt to the new UI; however, it is quite effective and well-designed.

Thanks! 

Ashok Kumar T · Apr 29, 2025

Hi @Pravin Barton

The {Contents} field holds the OID of the stream property, so we can open the stream object using that OID and then convert it to JSON as shown below. Note that writing JSON to the stream requires %ToJSON() for serialization: stream.Write({"msg":"hello world!"}.%ToJSON())


Trigger ExtractKeys [ Event = INSERT/UPDATE, Foreach = row/object, Time = AFTER ]
{
    new contentsJSON, id, msg
    
    if {Contents*C} {
        set contentsJSON = {}.%FromJSON(##class(%Stream.GlobalCharacter).%Open($lb({Contents},"%Stream.GlobalCharacter","^PAB.DebugStreamS")))
        set id = {ID}
        set msg = contentsJSON.msg
        &sql(update learn_Smp.NewClass13 set msg = :msg where Id = :id)
        $$$ThrowSQLIfError(SQLCODE, %msg)
    }
}

Thanks!

Ashok Kumar T · Apr 29, 2025

Hello @Evgeny Shvarov 

I intended to implement the same logic written by Mr @David Hockenbroch: fetching the `rowID` using a query, then opening the ID and invoking JSON adaptor methods to create a JSON object and write it to a stream. However, instead of following that approach, I tried the same in SQL by constructing a query with SQL's JSON_OBJECT and JSON_ARRAY functions to consolidate rows directly at the database level.

Unlike the JSON adaptor, which conveniently exports all columns and fields in a single method, this approach required me to manually specify each column and field in the query. Additionally, I had to use implicit joins to handle object properties, and I couldn't export the full values of stream properties either. If the JSON_OBJECT function offered a more direct and extensive way to gather all the necessary data, the process would be much more straightforward.

So, I’ve submitted an idea in the Ideas portal to have JSON_OBJECT() include all fields dynamically. It's simple and eliminates the need for object instances.

Sample SQL

SELECT JSON_OBJECT(
        'Name' :Name,
        'Email' :JSON_OBJECT('EmailType' :Email->EmailType->Type,'EmailId':Email->Email),
        'Phone':JSON_ARRAY($LISTTOSTRING(Phone)),
        'Address': JSON_OBJECT(
            'Door':Address->Door,
            'State':JSON_OBJECT(
                'stateId':Address->state->stateid,
                'state':Address->state->state
            ),
            'City':JSON_OBJECT(
                'cityId':Address->city->cityid,
                'city':Address->city->city
            ),
            'Country':JSON_OBJECT(
                'countryId':Address->Country->Countryid,
                'country':Address->Country->Country
            )
        )
    )
FROM Sample.Person
WHERE ID=1

Thank you!

Ashok Kumar T · Apr 29, 2025

Hello @Yaron Munz 

I’m using Set sc = workMgr.Queue("..Cleanup", QueueId). I intentionally commented out Set sc = workMgr.WaitForComplete() because I'm not concerned (for now) with the completion status. Is it mandatory to include WaitForComplete()/Sync() when working with WorkMgr?

Ashok Kumar T · Apr 24, 2025

Hello @Evgeny Shvarov 
I directly use %DynamicObject and %DynamicArray and their methods, depending on the response type (array or object), to set the values into the JSON response, and I use %WriteResponse to write that response to the API call.

 Class sample.Impl Extends %REST.Impl
{

ClassMethod GetAllPersons() As %Stream.Object
{
    do ..%SetContentType("application/json")
    set res = {"name":"test"}
    do ..%SetStatusCode(200)
    do ..%WriteResponse(res)
    quit 1
}

}

For a mix of stream data types and other data, I create a stream, convert it into a DynamicObject if required, and write that response.

Ashok Kumar T · Apr 15, 2025

Thank you @Eduard.Lebedyuk. Could you please paste the links for DP-422635 and DP-424156? That would be helpful.

Ashok Kumar T · Apr 11, 2025

Hello @Yaron Munz 

Calling the label or subroutine directly without specifying the routine name—for example, set sc=wqm.Queue("subr1")—always works. However, using the label^routine syntax does not work as expected. If I need to execute a subroutine from a different routine, I have to wrap that subroutine inside another function and call that function instead.

test.mac
set wqm = ##class(%SYSTEM.WorkMgr).%New()
set sc = wqm.Queue("f1^test1")
quit
;
test1.mac
 ;
f1()
 do subr1
 quit
subr1
 set ^test($NOW()) = ""
 quit

I agree that return values are not always required when executing a routine, even when using methods like wqm.WaitForComplete() or wqm.Sync(). However, return values are mandatory for class methods when using wqm.WaitForComplete() or wqm.Sync().

Class

When invoking class methods, you can call them with or without parameters, and with or without return values. Return values are not required if you don't need to capture the status using wqm.WaitForComplete() or wqm.Sync()

Set wqm = ##class(%SYSTEM.WorkMgr).%New()
Set sc=wqm.Queue("..func1",1)
Set sc=wqm.Queue("##Class(Sample.Person).func1")

Return values are required if you check the status using wqm.WaitForComplete() or wqm.Sync():

Set wqm = ##class(%SYSTEM.WorkMgr).%New()
Set sc=wqm.Queue("..func1",1)
Set sc=wqm.Queue("##Class(Sample.Person).func1",1)
Set sc=wqm.WaitForComplete() 
If ('sc) W $SYSTEM.OBJ.DisplayError(sc)

thanks!

Ashok Kumar T · Apr 11, 2025

Hi @Julian.Matthews7786

The Ens.StreamContainer class is a %Persistent class and stores the different types of streams. If you delete the row/record by ID or by query, it will delete the entry as well as the stream contents (they're also part of the row) from the Ens.StreamContainer table. So, be cautious before deleting the container. The %OnDelete callback method can be used to perform additional work while deleting the object.

Ashok Kumar T · Apr 1, 2025

Hello @Phillip Wu

The "status" is set to 1 for tasks that are either currently suspended ("Suspended Reschedule") or encounter an error during execution. If the "Suspend task on error?" option is set to "no" when scheduling the task, the error message is stored in the status column. However, the task status is not suspended.

From documentation

If not defined by the task default success will be 1
If the job is currently running (JobRunning) Status will be -1
If there was an untrapped error (JobUntrappedError) Status will be -2
If there was an error before execution (JobSetupError) Status will be -3
If the task timed out trying to job (JobTimeout) Status will be -4
If there was an error after execution (JobPostProcessError) Status will be -5
The text of the status code will be in the property Error.
 

SQL Query

select Name,displaystatus,Error,Suspended,%ID from %SYS.Task Where Suspended=1 or Suspended=2

 Query method

set tResult = ##class(%SYS.Task).TaskListFilterFunc("Suspend")
do tResult.%Display()
Ashok Kumar T · Mar 31, 2025

Hello @Nezla,

The %SYS.Task class holds the task details, including the status of the task. The "Suspended" column has a value if the task errored out while running ("Suspend Leave") or the task is suspended ("Suspend Reschedule"). Based on this column you can find the suspended/errored tasks. Use the query below for the task name and additional details.

select Name,Status,TaskClass,Suspended from %SYS.Task
Ashok Kumar T · Mar 31, 2025

You can use $STACK to get information about previous calls on the stack by using the level and a code string. Try the code below in your method "B"; it returns the previous call's stack information.

$STACK($STACK(-1)-1,"PLACE")
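For example, a minimal sketch (the method name follows the question's A/B scenario) of printing the caller's location:

```objectscript
ClassMethod B()
{
    // $STACK(-1) is the current stack depth; subtracting 1 addresses the caller's frame,
    // and "PLACE" returns the line reference of that frame
    write $stack($stack(-1)-1,"PLACE")
}
```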
Ashok Kumar T · Mar 28, 2025

You can use the $$$CurrentMethod macro from Ensemble.inc to get the current class method name inside your methods.
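A hedged sketch (the class name is invented) of using the macro; it assumes Ensemble.inc is included:

```objectscript
Include Ensemble

Class Sample.Demo Extends %RegisteredObject
{

ClassMethod WhereAmI()
{
    // $$$CurrentMethod expands to the name of the method being executed
    write $$$CurrentMethod
}

}
```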

Ashok Kumar T · Mar 27, 2025

Hello  @Michael Wood,

There are several approaches you can take to handle this. One option is to create a file service that reads the file as a stream and sends it to a custom business process. You can then convert that stream into a JSON object and iterate through each DynamicObject entry. Alternatively, you could send the stream to a BPL and process the JSON there.

Simplified sample of processing the JSON:


Class Samples.Introp.JSONFileService Extends Ens.BusinessService
{

Parameter ADAPTER = "EnsLib.File.InboundAdapter";

Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
{
	Do ..SendRequestSync("JSONFileProcess",pInput,.pOutput)
	Quit $$$OK
}

}

Class Samples.Introp.JSONFileProcess Extends Ens.BusinessProcess [ ClassType = persistent ]
{

Method OnRequest(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
{
	Set json = {}.%FromJSON(pRequest)
	
	Set iter = json.%GetIterator()
	
	while iter.%GetNext(.key,.val)
	{
		s ^test($Classname(),$NOW(),key)=val.%ToJSON()
	}

	Quit $$$OK
}
}

Thanks!

Ashok Kumar T · Mar 5, 2025

Hello @Krishnaveni Kapu 

%SYS.Task is responsible for storing, suspending, and resuming all tasks, so you can execute the methods below programmatically. They expect the task ID as the first argument.

You can execute the below query to get the task id, name and additional information

select Suspended,Name,id from %SYS.Task

suspend

Flag

1 - Suspend the task, but leave it in task queue (default)
2 - Suspend the task, remove from task queue, and reschedule for next time

Set taskId = 1001
Set flag = 2
Write ##class(%SYS.Task).Suspend(taskId,flag)

resume

Set taskId=1001
Write ##class(%SYS.Task).Resume(taskId)