I set $T to 0 before the if statement, and after the syntax error occurred I checked its value. It seems the interpreter doesn't differentiate between a block of code and a single line: it still executes the subsequent commands inside the condition.

I modified the post slightly. I ran a routine via the terminal, and as expected, it threw a <SYNTAX> error because the variable x is undefined. Then I continued the program, and now both the single-line if statement and the block if statement (with braces) set all the global variables. I thought that block of code wouldn't be executed.

 if $D(@x@(a,b,c)) s ^zz1=1,^x=1,^y=2,^xx=1
 ;
 ;
 if $D(@x@(a,b,c)) {
     set ^zz1=1212
     set ^dd=1
     set ^fg=2
 }

Could you try wrapping the HL7 message in the format below, setting the ContentType to "application/xml" in the additional settings of the EnsLib.HL7.Operation.HTTPOperation business operation, and checking the response headers as well if required?

<?xml version="1.0" encoding="UTF-8"?>
<HL7Message>
<![CDATA[
MSH|^~\&|... your message ...
PID|... etc ...
]]>
</HL7Message>

We can use the %IsDefined() method to check whether a key is defined, and %Size() to determine the number of elements in a dynamic object or array. These checks help prevent <INVALID OREF> and <UNDEFINED> errors.

// verify the "items" is present and it has values
If responseData.%IsDefined("items")&&(responseData.items.%Size()) {
    Set item1 = responseData.items.%Get(0)
    If $IsObject(item1) {
        Write item1.portalUrl
    }
    // alternatively, check that the property exists before dereferencing it
    If $IsObject(item1)&&(item1.%IsDefined("portalUrl")) {
        Write item1.portalUrl
    }
}

The %GetTypeOf method is used to determine the type of a key; it returns "unassigned" if the key does not exist.

If responseData.%GetTypeOf("items")="array" {
    Set item1 = responseData.items.%Get(0)
    If $IsObject(item1) {
        Write item1.portalUrl
    }
}

I agree that both stream classes extend %Stream.Object, and yes, I noticed that the storage mechanisms are also different. I'm curious: what are the advantages of using DynamicBinary instead of TmpCharacter?

We can use the %SYS.LockQuery class and its List query function to check whether the global is already locked. If it is, we can skip attempting to acquire the lock.

Check for the current process:

/// Mode: "X" - Exclusive, "S" - Shared
ClassMethod LOCK(Lock, Mode)
{
    If '..IsLocked("^A","X") {
        Lock +^A
    }
    Else {
        Write "Locked"
    }
}

ClassMethod IsLocked(Global As %String, Mode As %String) As %Boolean
{
    #dim status = 0
    ; list only the locks held by the current process ($JOB)
    Set tResult = ##class(%SYS.LockQuery).ListFunc($J)
    While tResult.%Next() {
        If (tResult.Mode=Mode)&&(tResult.LockString=Global) Set status = 1
    }
    Return status
}

However, the above code only checks for a specific process and does not account for other processes holding Exclusive or Shared locks. The sample below checks all acquired locks, returning the status and lock type.

ClassMethod IsLocked(Global As %String, Mode As %String)
{
    #dim status = 0
    Set tResult = ##class(%SYS.LockQuery).ListFunc()
    While tResult.%Next() {
        If tResult.LockString=Global {
            If tResult.Mode?1(1"X",1"S") Set status = 1_","_tResult.Mode
        }
    }
    ; status will be like "1,S" or "1,X"
    Return status
}

Hello @Scott Roth 
You can see the HTTP request format by using the test argument of the %Net.HttpRequest Send() method: it is the third argument, and setting it to 1 outputs the request, 2 the response, and 3 the response headers. Can you check this from the post?
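A quick sketch of the test argument (the server and path here are placeholders):

```objectscript
 ; dump the outgoing request to the terminal instead of inspecting it blind
 Set req = ##class(%Net.HttpRequest).%New()
 Set req.Server = "example.com"
 Do req.Send("GET", "/api/ping", 1)  ; test=1 prints the raw request to the device
 ; test=2 would print the response, test=3 only the response headers
```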

Thank you, @Mark Hanson, your explanation clarified my question. Once the object goes out of scope (is removed from memory) or its reference count drops to zero, the entire queue is deleted. Therefore, using Sync()/WaitForComplete() is essential to ensure the work completes properly.
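For example, a minimal pattern (Sample.Person.func1 is illustrative) that keeps the WorkMgr reference in scope until the work finishes:

```objectscript
 Set wqm = ##class(%SYSTEM.WorkMgr).%New()
 Set sc = wqm.Queue("##class(Sample.Person).func1")
 ; block until every queued unit completes; while wqm stays in scope,
 ; the underlying work queue is not deleted
 Set sc = wqm.WaitForComplete()
```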

Thank you for sharing this @Keren Skubach

The SDA to FHIR / FHIR to SDA DTL (which is the primary method we're using for conversion) doesn't support primitive extension conversion by default. I'll need to implement a custom DTL to achieve this; alternatively, I can handle it in code if I generate the FHIR resource programmatically. Could you attach the documentation link for the set resource.property = $listbuild(""original property value",<primitive extension index>,...) primitive extension syntax as well? I can see HS.FHIRModel.R4.Patient from the 2024.1 version.

Thank you!

When you extend a class with %JSON.Adaptor, you must also ensure that any child classes or classes used as object properties also extend %JSON.Adaptor. Otherwise, you'll encounter an error like:
" ERROR #9411: A class referenced by a %JSONENABLED class must be a subclass of %JSON.Adaptor" during JSON export.

In your case, %DynamicObject/%DynamicArray is a system class that does not extend %JSON.Adaptor, which is why you're unable to export it using methods like %JSONExport, %JSONExportString, or %JSONExportToStream.

To work around this, you can add the %JSONINCLUDE="NONE" parameter to the problematic property, e.g. Property xAxis As %Library.DynamicObject(%JSONINCLUDE="NONE");, to exclude it from the JSON export.
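A minimal sketch (the class and property names here are illustrative):

```objectscript
Class Sample.Chart Extends (%RegisteredObject, %JSON.Adaptor)
{

Property Title As %String;

/// %DynamicObject does not extend %JSON.Adaptor, so keep it out of the export
Property xAxis As %Library.DynamicObject(%JSONINCLUDE = "NONE");

}
```

With this parameter, %JSONExport() serializes Title and simply skips xAxis instead of raising ERROR #9411.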

It depends on your implementation. You can add a condition: if the date is "00010101", convert it to "1840-12-31" and pass that to $ZDH ($ZDH("1840-12-31",3)), or skip the $ZDH conversion and set 0 directly.

$ZDateH converts a display date into the IRIS internal $HOROLOG format, and $ZDate converts a $HOROLOG value back to a display date. The start of the range is day 0: $ZDate(0) = "12/31/1840" and $ZDH("12/31/1840") = 0. If the date is earlier than the allowed range, it throws a <VALUE OUT OF RANGE> error.
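A terminal sketch of the epoch boundary and the sentinel-date mapping suggested above (dateIn is an illustrative variable):

```objectscript
 ; day 0 of $HOROLOG is the earliest representable date
 Write $ZDATE(0)               ; 12/31/1840
 Write $ZDATEH("12/31/1840")   ; 0
 ; map the sentinel date to the epoch before converting (dformat 3 = ODBC "YYYY-MM-DD")
 Set dateIn = "00010101"
 Set h = $SELECT(dateIn="00010101":0,1:$ZDATEH($EXTRACT(dateIn,1,4)_"-"_$EXTRACT(dateIn,5,6)_"-"_$EXTRACT(dateIn,7,8),3))
```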

That's correct. I've explored the new UI feature, and it appears to be a commendable initiative. A few observations:

  • The shortcuts for Rules, DTL, and Production at the top of the page, along with their options (Partial/Full/Open in New Tab), greatly improve navigation and allow for quick access within a single page.
  • The file paths (File Path, Archive Path, and Work Path) within the File Service configuration are not selectable, and the target configuration name is not rendered correctly.
  • The absence of the target configuration results in improper rendering of the connection arrows.
  • In the new user interface, the Production Settings—specifically Queues, Logs, Messages, and Jobs—are not displaying any data, and the Action tab is missing. These elements function as expected in the standard interface.
  • While small Data Transformation Layers (DTLs) and mappings display correctly, larger DTLs and mappings exhibit instability and are not consistently reliable in the production configuration single-page application.
  • The popup displayed during the start or stop of a business host or production instance is helpful for identifying errors; however, the 'Update' button is absent.
  • The way the dots appear feels somewhat unusual.

It takes some time to adapt to the new UI; however, it is quite effective and well-designed.

Thanks! 

Hi @Pravin Barton

The {Contents} field holds the ID of the stream property, so we can build the OID, open the stream object, and convert it to JSON as below. Note that the stream write needs %ToJSON() for JSON serialization: stream.Write({"msg":"hello world!"}.%ToJSON())


Trigger ExtractKeys [ Event = INSERT/UPDATE, Foreach = row/object, Time = AFTER ]
{
    new contentsJSON, id, msg
    
    if {Contents*C} {
        set contentsJSON = {}.%FromJSON(##class(%Stream.GlobalCharacter).%Open($lb({Contents},"%Stream.GlobalCharacter","^PAB.DebugStreamS")))
        set id = {ID}
        set msg = contentsJSON.msg
        &sql(update learn_Smp.NewClass13 set msg = :msg where Id = :id)
        $$$ThrowSQLIfError(SQLCODE, %msg)
    }
}

Thanks!

Hello @Evgeny Shvarov 

I intended to implement the same logic written by Mr @David Hockenbroch - specifically, fetching the `rowID` using a query, then opening the ID and invoking JSON adaptor methods to create a JSON object and write it to a stream. However, instead of following that approach, I tried the same in SQL by constructing the query with SQL's JSON_OBJECT and JSON_ARRAY functions to consolidate rows directly at the database level.

Unlike the JSON adaptor, which conveniently exports all columns and fields in a single method, this approach required me to manually specify each column and field in the query. Additionally, I had to use implicit joins to handle object properties, and I couldn't export the full values of stream properties either. If the JSON_OBJECT function offered a more direct and extensive way to gather all the necessary data, the process would be much more straightforward.

So, I've submitted an idea in the Ideas Portal for JSON_OBJECT() to include all the fields dynamically. It's simple and eliminates the need for object instances.

Sample SQL

SELECT JSON_OBJECT(
        'Name' :Name,
        'Email' :JSON_OBJECT('EmailType' :Email->EmailType->Type,'EmailId':Email->Email),
        'Phone':JSON_ARRAY($LISTTOSTRING(Phone)),
        'Address': JSON_OBJECT(
            'Door':Address->Door,
            'State':JSON_OBJECT(
                'stateId':Address->state->stateid,
                'state':Address->state->state
            ),
            'City':JSON_OBJECT(
                'cityId':Address->city->cityid,
                'city':Address->city->city
            ),
            'Country':JSON_OBJECT(
                'countryId':Address->Country->Countryid,
                'country':Address->Country->Country
            )
        )
    )
FROM Sample.Person
WHERE ID=1

Thank you!

Hello @Yaron Munz 

I’m using Set sc = workMgr.Queue("..Cleanup", QueueId). I intentionally commented out Set sc = workMgr.WaitForComplete() because I'm not concerned (for now) with the completion status. Is it mandatory to include WaitForComplete()/Sync() when working with WorkMgr?

Hello @Evgeny Shvarov 
I directly use %DynamicObject and %DynamicArray and their methods, depending on the response type (array or object), to set the values into the JSON response, and I use %WriteResponse to write that response to the API call.

Class sample.Impl Extends %REST.Impl
{

ClassMethod GetAllPersons() As %Stream.Object
{
    Do ..%SetContentType("application/json")
    Set res = {"name":"test"}
    Do ..%SetStatusCode(200)
    Do ..%WriteResponse(res)
    Quit 1
}

}

For a mix of stream data types and other data, I create a stream, convert it into a %DynamicObject if required, and write that as the response.
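A minimal sketch of that flow (the stream class and payload are illustrative, and ..%WriteResponse assumes the code runs inside a %REST.Impl subclass):

```objectscript
 ; build a temporary stream, serialize JSON into it,
 ; then rehydrate it as a %DynamicObject before writing the response
 Set stream = ##class(%Stream.TmpCharacter).%New()
 Do stream.Write({"name":"test"}.%ToJSON())
 Set res = {}.%FromJSON(stream)   ; %FromJSON accepts a stream directly
 Do ..%WriteResponse(res)
```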

Thank you @Eduard.Lebedyuk. Could you please paste the links for DP-422635 and DP-424156? That would be helpful.

Hello @Yaron Munz 

Calling the label or subroutine directly without specifying the routine name, for example set sc=wqm.Queue("subr1"), always works. However, the label^routine syntax does not work as expected: if I need to execute a subroutine from a different routine, I have to wrap that subroutine inside another function and call that function instead.

test.mac
set wqm = ##class(%SYSTEM.WorkMgr).%New()
set sc = wqm.Queue("f1^test1")
quit
;
test1.mac
 ;
f1()
 do subr1
 quit
subr1
 set ^test($NOW()) = ""
 quit

I agree that return values are never required when executing a routine, even when using methods like wqm.WaitForComplete() or wqm.Sync(). However, return values are mandatory for class methods when using wqm.WaitForComplete() or wqm.Sync().

Class

When invoking class methods, you can call them with or without parameters, and with or without return values. Return values are not required if you don't need to capture the status via wqm.WaitForComplete() or wqm.Sync().

Set wqm = ##class(%SYSTEM.WorkMgr).%New()
Set sc=wqm.Queue("..func1",1)
Set sc=wqm.Queue("##Class(Sample.Person).func1")

Return values are required if you capture the status using wqm.WaitForComplete() or wqm.Sync():

Set wqm = ##class(%SYSTEM.WorkMgr).%New()
Set sc=wqm.Queue("..func1",1)
Set sc=wqm.Queue("##Class(Sample.Person).func1",1)
Set sc=wqm.WaitForComplete() 
If 'sc { Do $SYSTEM.OBJ.DisplayError(sc) }

Thanks!

Hi @Julian.Matthews7786

The Ens.StreamContainer class is a %Persistent class that stores various types of streams. If you delete a row/record by ID or by query, it will delete the entry as well as the stream contents (which are part of the row) from the Ens.StreamContainer table, so be cautious before deleting the container. The %OnDelete callback method can be used to perform additional logic while deleting the object.
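For instance, a minimal %OnDelete sketch (the subclass name and audit global are illustrative, not part of Ens.StreamContainer):

```objectscript
Class Sample.MyContainer Extends Ens.StreamContainer
{

/// Called automatically when an instance is deleted;
/// returning an error status would abort the delete.
ClassMethod %OnDelete(oid As %ObjectIdentity) As %Status [ Private, ServerOnly = 1 ]
{
    ; log the ID portion of the OID so removals can be audited
    Set ^MyAudit("deleted", $ZTIMESTAMP) = $LISTGET(oid, 1)
    Quit $$$OK
}

}
```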

Hello @Phillip Wu

The "Suspended" value is set for tasks that are either currently suspended ("Suspend Reschedule") or that encountered an error during execution. If the "Suspend task on error?" option is set to "No" when scheduling the task, the error message is stored in the Status column, but the task itself is not suspended.

From documentation

If not defined by the task default success will be 1
If the job is currently running (JobRunning) Status will be -1
If there was an untrapped error (JobUntrappedError) Status will be -2
If there was an error before execution (JobSetupError) Status will be -3
If the task timed out trying to job (JobTimeout) Status will be -4
If there was an error after execution (JobPostProcessError) Status will be -5
The text of the status code will be in the property Error.
 

SQL Query

select Name,displaystatus,Error,Suspended,%ID from %SYS.Task Where Suspended=1 or Suspended=2

 Query method

set tResult = ##class(%SYS.Task).TaskListFilterFunc("Suspend")
do tResult.%Display()

Hello @Nezla,

The %SYS.Task class holds the task details, including the status of each task. The "Suspended" column has a value if the task errored out while running ("Suspend Leave") or the task was suspended ("Suspend Reschedule"). Based on this column, you can find the suspended/errored tasks. Use the query below for the task name and additional details.

select Name,Status,TaskClass,Suspended from %SYS.Task