Thanks @Eduard Lebedyuk
Sure, I will create a WRC case.
I agree that both streams extend %Stream.Object, and yes, I noticed that the storage mechanisms also differ. I'm curious: what are the advantages of using DynamicBinary instead of TmpCharacter?
We can use the %SYS.LockQuery class and its List query function to check whether the global is already locked. If it is, we can skip attempting to acquire the lock.
Check for the current process only:
ClassMethod LOCK(Lock, Mode)
{
    If '..IsLocked("^A", "X") {
        Lock +^A
    }
    Else {
        Write "Locked"
    }
}

// X - Exclusive
// S - Shared
ClassMethod IsLocked(Global As %String, Mode As %String)
{
    #dim status = 0
    Set tResult = ##class(%SYS.LockQuery).ListFunc($JOB)
    While tResult.%Next() {
        If tResult.Mode = Mode && (tResult.LockString = Global) Set status = 1
    }
    Return status
}

However, the above code only checks the current process and does not account for other processes holding Exclusive or Shared locks. The sample below checks all acquired locks and returns the status along with the lock type.
ClassMethod IsLocked(Global As %String, Mode As %String)
{
    #dim status = 0
    Set tResult = ##class(%SYS.LockQuery).ListFunc()
    While tResult.%Next() {
        If tResult.LockString = Global {
            If tResult.Mode ? 1(1"X",1"S") Set status = 1_","_tResult.Mode
        }
    }
    // status will be like "1,S" or "1,X"
    Return status
}
Hello @Scott Roth
You can see the HTTP request format by using the test argument of the %Net.HttpRequest Send() method: it is the 3rd parameter, and setting it to 1 outputs the request, 2 the response, and 3 the response headers. Can you check this from the post?
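A minimal hedged sketch of that debugging flag; the server and path below are placeholders:

```objectscript
// Dump the raw HTTP request to the current device instead of sending it blind.
// "example.com" and "/api/data" are hypothetical values.
Set req = ##class(%Net.HttpRequest).%New()
Set req.Server = "example.com"
// 3rd argument of Send(): 1 = output the request, 2 = the response, 3 = the response headers
Set sc = req.Send("GET", "/api/data", 1)
If $$$ISERR(sc) Write $SYSTEM.Status.GetErrorText(sc)
```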
Thank you, @Mark Hanson, your explanation clarified my question. Once the object goes out of scope (is removed from memory) or its reference count drops to zero, the entire queue is deleted. Therefore, using Sync()/WaitForComplete() is essential to ensure the work is properly completed.
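For reference, a minimal hedged sketch of keeping the queue alive until the work finishes (Some.Class/DoWork is a placeholder):

```objectscript
// Keep a reference to the WorkMgr instance and block until all queued items complete.
// If wqm went out of scope first, unfinished queued items would be discarded with it.
Set wqm = ##class(%SYSTEM.WorkMgr).%New()
Set sc = wqm.Queue("##class(Some.Class).DoWork", 1)
Set sc = wqm.WaitForComplete()
If $$$ISERR(sc) Write $SYSTEM.Status.GetErrorText(sc)
```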
Thank you for sharing this @Keren Skubach
The SDA to FHIR / FHIR to SDA DTL (which is the primary method we're using for conversion) doesn't support primitive Extension conversion by default. I'll need to implement a custom DTL to achieve this. Alternatively, I can handle it in code if I generate the FHIR resource programmatically. Can you attach the documentation link for the set resource.property = $listbuild("original property value",<primitive extension index>,...) primitive extension index setting as well? I could see HS.FHIRModel.R4.Patient from the 2024.1 version.
Thank you!
When you extend a class with %JSON.Adaptor, you must also ensure that any child classes, or classes used as object properties, extend %JSON.Adaptor; otherwise the export fails with an error.
In your case, %DynamicObject/%DynamicArray are system classes that do not extend %JSON.Adaptor, which is why you're unable to export them using methods like %JSONExport, %JSONExportString, or %JSONExportToStream.
To work around this, you can set the %JSONINCLUDE = "NONE" property parameter on the problematic field to exclude it from the JSON export: Property xAxis As %Library.DynamicObject(%JSONINCLUDE = "NONE");
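A minimal class sketch of that workaround (Demo.Chart and its properties are hypothetical):

```objectscript
Class Demo.Chart Extends (%RegisteredObject, %JSON.Adaptor)
{
Property Title As %String;

// %JSONINCLUDE = "NONE" skips this property during %JSONExport*/%JSONImport,
// so the non-%JSON.Adaptor %DynamicObject no longer breaks the export.
Property xAxis As %Library.DynamicObject(%JSONINCLUDE = "NONE");
}
```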
It depends on your implementation. You can add a condition: if the date is "00010101", convert it to "1840-12-31" and pass that to $ZDH ($ZDH("1840-12-31",3)), or skip the $ZDH conversion and set 0 directly.
$ZDateH converts an external date into the IRIS internal $HOROLOG format, and $ZDate converts a $HOROLOG value back into a date. The start date is day 0: $ZDate(0) = "12/31/1840" and $ZDH("12/31/1840") = 0. If the date is earlier than the allowed range, it throws a <VALUE OUT OF RANGE> error.
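A short sketch of that boundary, assuming format 3 (YYYY-MM-DD) and format 8 (YYYYMMDD) inputs:

```objectscript
Write $ZDate(0, 3)              // day 0 of $HOROLOG is 1840-12-31
Write $ZDateH("1840-12-31", 3)  // converts back to 0
// Dates before the supported range raise <VALUE OUT OF RANGE>,
// so special-case them before converting, e.g.:
Set input = "00010101"
Set horolog = $Select(input = "00010101": 0, 1: $ZDateH(input, 8))
```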
Run iris list (or iris all) to list all the instances installed on the Linux machine, and check this documentation.
That's correct. I've explored the new UI, and it looks like a commendable initiative. A couple of observations: the way the dots appear feels somewhat unusual, and it takes some time to adapt to the new UI; however, it is quite effective and well-designed.
Thanks!
The {Contents} field holds the OID of the stream property, so we can open the stream object using that OID and then convert it to JSON, as below. Note that writes to the stream need %ToJSON() for JSON serialization: stream.Write({"msg":"hello world!"}.%ToJSON())
Trigger ExtractKeys [ Event = INSERT/UPDATE, Foreach = row/object, Time = AFTER ]
{
    new contentsJSON, id, msg
    if {Contents*C} {
        set contentsJSON = {}.%FromJSON(##class(%Stream.GlobalCharacter).%Open($lb({Contents},"%Stream.GlobalCharacter","^PAB.DebugStreamS")))
        set id = {ID}
        set msg = contentsJSON.msg
        &sql(update learn_Smp.NewClass13 set msg = :msg where Id = :id)
        $$$ThrowSQLIfError(SQLCODE, %msg)
    }
}
Thanks!
Hello @Evgeny Shvarov
I intended to implement the same logic written by Mr @David Hockenbroch - specifically, fetching the `rowID` using a query, then opening the ID and invoking JSON adaptor methods to create a JSON object and write it to a stream. However, instead of following that approach, I tried the same in SQL by constructing the query with SQL's JSON_OBJECT and JSON_ARRAY functions to consolidate rows directly at the database level.
Unlike the JSON adaptor, which conveniently exports all columns and fields in a single method, this approach required me to specify each column and field manually in the query. Additionally, I had to use implicit joins to handle object properties, and I couldn't export the full values of stream properties either. If JSON_OBJECT offered a more direct and extensive way to gather all the necessary data, the process would be much more straightforward.
So, I've submitted an idea in the Ideas portal to let JSON_OBJECT() include all fields dynamically. It's simple and eliminates the need for object instances.
Sample SQL
SELECT JSON_OBJECT(
'Name' :Name,
'Email' :JSON_OBJECT('EmailType' :Email->EmailType->Type,'EmailId':Email->Email),
'Phone':JSON_ARRAY($LISTTOSTRING(Phone)),
'Address': JSON_OBJECT(
'Door':Address->Door,
'State':JSON_OBJECT(
'stateId':Address->state->stateid,
'state':Address->state->state
),
'City':JSON_OBJECT(
'cityId':Address->city->cityid,
'city':Address->city->city
),
'Country':JSON_OBJECT(
'countryId':Address->Country->Countryid,
'country':Address->Country->Country
)
)
)
FROM Sample.Person
WHERE ID=1

Thank you!
Hello @Yaron Munz
I’m using Set sc = workMgr.Queue("..Cleanup", QueueId). I intentionally commented out Set sc = workMgr.WaitForComplete() because I'm not concerned (for now) with the completion status. Is it mandatory to include WaitForComplete()/Sync() when working with WorkMgr?
Hello @Evgeny Shvarov
I directly use %DynamicObject and %DynamicArray and their methods, depending on the response type (array or object), to set the values into the JSON response, and I use %WriteResponse to write that response to the API call.
Class sample.Impl Extends %REST.Impl
{
ClassMethod GetAllPersons() As %Stream.Object
{
    do ..%SetContentType("application/json")
    set res = {"name":"test"}
    do ..%SetStatusCode(200)
    do ..%WriteResponse(res)
    quit 1
}
}

For a mix of stream data types and other data, I create a stream, convert it into a %DynamicObject if required, and write that response.
Right, it worked! The issue was a mismatch in the superserver port.
Hello @Vitaliy Serdtsev,
I executed the routine invocation commands in a native Python script.
Thank you @Eduard.Lebedyuk. Could you please paste the links for DP-422635 and DP-424156? That would be helpful.
Hello @Yaron Munz
Calling the label or subroutine directly without specifying the routine name, for example set sc=wqm.Queue("subr1"), always works. However, using the label^routine syntax does not work as expected. If I need to execute a subroutine from a different routine, I have to wrap that subroutine inside another function and call that function instead.
test.mac
set wqm = ##class(%SYSTEM.WorkMgr).%New()
set sc = wqm.Queue("f1^test1")
quit
;
test1.mac
;
f1()
    do subr1
    quit
subr1
    set ^test($NOW()) = ""
    quit

I agree that return values are not always required when executing a routine, even when using methods like wqm.WaitForComplete() or wqm.Sync(). However, return values are mandatory for class methods when using wqm.WaitForComplete() or wqm.Sync().
Class
When invoking class methods, you can call them with or without parameters, and with or without return values. Return values are not required if you don't need to capture the status using wqm.WaitForComplete() or wqm.Sync().
Set wqm = ##class(%SYSTEM.WorkMgr).%New()
Set sc=wqm.Queue("..func1",1)
Set sc=wqm.Queue("##Class(Sample.Person).func1")

Return values are required if you capture the status using wqm.WaitForComplete() or wqm.Sync():
Set wqm = ##class(%SYSTEM.WorkMgr).%New()
Set sc=wqm.Queue("..func1",1)
Set sc=wqm.Queue("##Class(Sample.Person).func1",1)
Set sc=wqm.WaitForComplete()
If ('sc) W $SYSTEM.OBJ.DisplayError(sc)

Thanks!
Hi @Julian.Matthews7786
The Ens.StreamContainer class is a %Persistent class that stores the different stream types. So, if you delete the row/record by ID or by query, it deletes the entry as well as the stream contents (which are also part of the row) from the Ens.StreamContainer table. So be cautious before deleting the container. The %OnDelete callback method can be used to perform additional work while deleting the object.
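As a hedged illustration, deleting a container by ID (the ID value below is a placeholder) removes both the row and its stored stream:

```objectscript
// %DeleteId removes the Ens.StreamContainer row together with the stream it holds.
Set sc = ##class(Ens.StreamContainer).%DeleteId(1)
If $$$ISERR(sc) Write $SYSTEM.Status.GetErrorText(sc)
```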
Hello @Phillip Wu
The "status" is set to 1 for tasks that are either currently suspended ("Suspended Reschedule") or encounter an error during execution. If the "Suspend task on error?" option is set to "no" when scheduling the task, the error message is stored in the status column. However, the task status is not suspended.
From documentation
SQL Query
select Name,displaystatus,Error,Suspended,%ID from %SYS.Task Where Suspended=1 or Suspended=2

Query method
set tResult = ##class(%SYS.Task).TaskListFilterFunc("Suspend")
do tResult.%Display()

Hello @Nezla,
The %SYS.Task class holds the task details, which include the status of each task. The "Suspended" column has a value if the task errored out while running ("Suspend Leave") or the task was suspended ("Suspend Reschedule"). Based on this column you can find the suspended/errored tasks. Use the query below for the TaskName and additional details.
select Name,Status,TaskClass,Suspended from %SYS.Task

You can use $$$CurrentMethod from Ensemble.inc to get the current classmethod name inside your methods.
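A minimal sketch of that macro (Demo.Util is a hypothetical class; it must include Ensemble.inc):

```objectscript
Include Ensemble

Class Demo.Util
{
ClassMethod WhoAmI() As %String
{
    // $$$CurrentMethod expands to the name of the current classmethod
    Quit $$$CurrentMethod
}
}
```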
Hello @Michael Wood,
There are several approaches you can take to handle this. One option is to create a file service that reads the file as a stream and sends it to a custom business process. You can then convert that stream into a JSON object and iterate through each DynamicObject entry. Alternatively, you could send the stream to a BPL and process the JSON there.
Simplified sample of processing the JSON:
Class Samples.Introp.JSONFileService Extends Ens.BusinessService
{
Parameter ADAPTER = "EnsLib.File.InboundAdapter";
Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
{
Do ..SendRequestSync("JSONFileProcess",pInput,.pOutput)
Quit $$$OK
}
}
Class Samples.Introp.JSONFileProcess Extends Ens.BusinessProcess [ ClassType = persistent ]
{
Method OnRequest(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
{
Set json = {}.%FromJSON(pRequest)
Set iter = json.%GetIterator()
While iter.%GetNext(.key, .val)
{
    Set ^test($Classname(), $NOW(), key) = val.%ToJSON()
}
}
Quit $$$OK
}
}

Thanks!
I use the same code, and also use $ZU(41,-2) to get stack information.
Kudos to all participants and winners🎉!
Hello @Krishnaveni Kapu
%SYS.Task is responsible for storing, suspending, and resuming all tasks, so you can execute the method below programmatically to achieve this. It expects the task ID as the first argument.
You can execute the below query to get the task id, name and additional information
select Suspended,Name,id from %SYS.Task
Flag
1 - Suspend the task, but leave it in task queue (default)
2 - Suspend the task, remove from task queue, and reschedule for next time
Set taskId = 1001
Set flag = 2
Write ##class(%SYS.Task).Suspend(taskId,flag)
Set taskId=1001
Write ##class(%SYS.Task).Resume(taskId)

Can you click the Spoiler to view the message?