go to post Ashok Kumar T · 4 hr ago The *C (control characters) code removes characters in the control ranges (0-31, 127-159). However, Unicode characters have code points greater than those control-character ranges.
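A small sketch, assuming this refers to the $ZSTRIP "*C" action code (the sample string is made up for illustration):

 // remove control characters with the "*C" action code
 set input = "AB"_$CHAR(9)_"CD"_$CHAR(300)   ; tab (code 9) plus a Unicode character (code 300)
 write $ZSTRIP(input, "*C")                  ; the tab is stripped; the Unicode character remains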
go to post Ashok Kumar T · 13 hr ago Hello @Scott Roth You can see the HTTP request format by using the test argument of the %Net.HttpRequest Send() method (its 3rd parameter): setting it to 1 outputs the request, 2 the response, and 3 the response headers. Can you check this from the post?
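A minimal sketch of using the test argument (the server and path below are placeholders, not from the original post):

 // dump the raw outgoing request to the current device instead of inspecting it elsewhere
 set req = ##class(%Net.HttpRequest).%New()
 set req.Server = "www.example.com"
 set sc = req.Send("GET", "/api/ping", 1)    ; test=1 writes the request
 // set sc = req.Send("GET", "/api/ping", 2) ; test=2 writes the response
 // set sc = req.Send("GET", "/api/ping", 3) ; test=3 writes the response headers
 if $system.Status.IsError(sc) do $system.Status.DisplayError(sc)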
go to post Ashok Kumar T · May 6 Thank you, @Mark Hanson, your explanation clarified my question. Once the object goes out of scope (is removed from memory) or its reference count drops to zero, the entire queue is deleted. Therefore, using Sync()/WaitForComplete() is essential to ensure the work is properly completed.
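A minimal sketch of that point (the class and method names here are hypothetical): the work queue object must stay in scope until WaitForComplete() returns.

 // keep the WorkMgr object in scope and wait for completion
 set workMgr = ##class(%SYSTEM.WorkMgr).%New()
 set sc = workMgr.Queue("##class(MyApp.Worker).DoWork", 1)
 set sc = workMgr.WaitForComplete()          ; blocks until all queued work items finish
 if $system.Status.IsError(sc) do $system.Status.DisplayError(sc)
 // if workMgr went out of scope before this point, the queue would be deleted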
go to post Ashok Kumar T · May 5 Thank you for sharing this @Keren Skubach The SDA to FHIR / FHIR to SDA DTL (which is the primary method we're using for conversion) doesn't support primitive extension conversion by default, so I'll need to implement a custom DTL to achieve this. Alternatively, I can handle it in code if I generate the FHIR resource programmatically. Could you also attach the documentation link for setting the primitive extension index, i.e. set resource.property = $listbuild("original property value",<primitive extension index>,...)? I could see HS.FHIRModel.R4.Patient from the 2024.1 version. Thank you!
go to post Ashok Kumar T · May 4 When you extend a class with %JSON.Adaptor, you must also ensure that any child classes, or classes used as object properties, also extend %JSON.Adaptor. Otherwise, you'll encounter an error like "ERROR #9411: A class referenced by a %JSONENABLED class must be a subclass of %JSON.Adaptor" during JSON export. In your case, %DynamicObject/%DynamicArray is a system class that does not extend %JSON.Adaptor, which is why you're unable to export it using methods like %JSONExport, %JSONExportToString, or %JSONExportToStream. To work around this, you can set the %JSONINCLUDE="NONE" parameter on the problematic property to exclude it from the JSON export: Property xAxis As %Library.DynamicObject(%JSONINCLUDE = "NONE");
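A minimal sketch of that workaround (the class and property names here are hypothetical):

 /// Hypothetical class for illustration: the dynamic-object property is excluded from JSON export
 Class Sample.Chart Extends (%RegisteredObject, %JSON.Adaptor)
 {
 Property Title As %String;
 Property xAxis As %Library.DynamicObject(%JSONINCLUDE = "NONE");
 }

With this in place, set sc = chart.%JSONExportToString(.json) exports Title but simply skips xAxis instead of raising ERROR #9411.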
go to post Ashok Kumar T · May 2 It depends on your implementation. You can add a condition, for example: if the date is "00010101", convert it to "1840-12-31" and pass that date to $ZDH ($ZDH("1840-12-31",3)), or skip the $ZDH conversion and set 0 directly.
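A minimal sketch of that conditional handling, assuming the incoming value is in YYYYMMDD format as in the example above:

 // map the placeholder date "00010101" to the earliest supported $HOROLOG date
 set input = "00010101"
 if input = "00010101" {
     set horolog = 0                          ; same value as $ZDATEH("1840-12-31",3)
 } else {
     // reformat YYYYMMDD to YYYY-MM-DD (ODBC format 3) before converting
     set horolog = $ZDATEH($EXTRACT(input,1,4)_"-"_$EXTRACT(input,5,6)_"-"_$EXTRACT(input,7,8),3)
 }
 write horolog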
go to post Ashok Kumar T · May 2 $ZDateH is used to convert an actual date into the IRIS internal $HOROLOG format, and $ZDate converts a $HOROLOG value back to a date. The start date is $ZDate(0) = "12/31/1840", and $ZDH("12/31/1840") = 0. If the date is earlier than the allowed range, it throws a <VALUE OUT OF RANGE> error.
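A quick terminal sketch of the boundary behaviour described above:

 write $ZDATE(0)              ; 12/31/1840 - the earliest supported date
 write $ZDATEH("12/31/1840")  ; 0
 // dates before that boundary raise <VALUE OUT OF RANGE>, so trap it for untrusted input
 try { write $ZDATEH("01/01/1700") }
 catch ex { write "error: ", ex.Name }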
go to post Ashok Kumar T · May 2 Run iris list (or iris all) to list all the instances installed on the Linux machine, and check this documentation.
go to post Ashok Kumar T · Apr 30 That's correct. I've explored the new UI feature, and it appears to be a commendable initiative. The following observations have been noted:
- The shortcuts for Rules, DTL, and Production at the top of the page, along with their options (Partial/Full/Open in New Tab), greatly improve navigation and allow quick access within a single page.
- The file paths (File Path, Archive Path, and Work Path) within the File Service configuration are not selectable, and the target configuration name is not rendered correctly. The absence of the target configuration results in improper rendering of the connection arrows.
- In the new user interface, the Production Settings (specifically Queues, Logs, Messages, and Jobs) are not displaying any data, and the Action tab is missing. These elements function as expected in the standard interface.
- While small Data Transformation Layers (DTLs) and mappings display correctly, larger DTLs and mappings exhibit instability and are not consistently reliable in the production configuration single-page application.
- The popup displayed during the start or stop of a business host or production instance is helpful for identifying errors; however, the 'Update' button is absent, and the way the dots appear feels somewhat unusual.
It takes some time to adapt to the new UI; however, it is quite effective and well-designed. Thanks!
go to post Ashok Kumar T · Apr 29 Hi @Pravin Barton The {Contents} holds the id of the stream property, so we can open the stream object by using its OID and then convert it to JSON as shown below. Also, the stream write needs to use %ToJSON() for JSON serialization: stream.Write({"msg":"hello world!"}.%ToJSON())

Trigger ExtractKeys [ Event = INSERT/UPDATE, Foreach = row/object, Time = AFTER ]
{
    new contentsJSON, id, msg
    if {Contents*C} {
        // build the stream OID from the stored stream id and open it, then parse the JSON
        set contentsJSON = {}.%FromJSON(##class(%Stream.GlobalCharacter).%Open($lb({Contents},"%Stream.GlobalCharacter","^PAB.DebugStreamS")))
        set id = {ID}
        set msg = contentsJSON.msg
        &sql(update learn_Smp.NewClass13 set msg = :msg where Id = :id)
        $$$ThrowSQLIfError(SQLCODE, %msg)
    }
}

Thanks!
go to post Ashok Kumar T · Apr 29 Hello @Evgeny Shvarov I intended to implement the same logic written by Mr @David Hockenbroch - specifically, fetching the rowID using a query, then opening the ID and invoking JSON adaptor methods to create a JSON object and write it to a stream. However, instead of following that approach, I tried the same in SQL by constructing the query with SQL's JSON_OBJECT and JSON_ARRAY functions to consolidate rows directly at the database level. Unlike the JSON adaptor, which conveniently exports all columns and fields in a single method, this approach required me to manually specify each column and field in the query. Additionally, I had to use implicit joins to handle object properties, and I couldn't export the entire values of stream properties either. If the JSON_OBJECT function offered a more direct and extensive way to gather all necessary data, the process would be much more straightforward. So, I've submitted an idea in the Ideas portal to let JSON_OBJECT() include all the fields dynamically. It's simple and eliminates the need for object instances.

Sample SQL

SELECT JSON_OBJECT(
    'Name': Name,
    'Email': JSON_OBJECT('EmailType': Email->EmailType->Type, 'EmailId': Email->Email),
    'Phone': JSON_ARRAY($LISTTOSTRING(Phone)),
    'Address': JSON_OBJECT(
        'Door': Address->Door,
        'State': JSON_OBJECT(
            'stateId': Address->state->stateid,
            'state': Address->state->state
        ),
        'City': JSON_OBJECT(
            'cityId': Address->city->cityid,
            'city': Address->city->city
        ),
        'Country': JSON_OBJECT(
            'countryId': Address->Country->Countryid,
            'country': Address->Country->Country
        )
    )
)
FROM Sample.Person
WHERE ID = 1

Thank you!
go to post Ashok Kumar T · Apr 29 Hello @Yaron Munz I’m using Set sc = workMgr.Queue("..Cleanup", QueueId). I intentionally commented out Set sc = workMgr.WaitForComplete() because I'm not concerned (for now) with the completion status. Is it mandatory to include WaitForComplete()/Sync() when working with WorkMgr?
go to post Ashok Kumar T · Apr 24 Hello @Evgeny Shvarov I directly use %DynamicObject and %DynamicArray and their methods, depending on the response type (array or object), to set the values into the JSON response, and I use %WriteResponse to write that response to the API call:

Class sample.Impl Extends %REST.Impl
{

ClassMethod GetAllPersons() As %Stream.Object
{
    do ..%SetContentType("application/json")
    set res = {"name":"test"}
    do ..%SetStatusCode(200)
    do ..%WriteResponse(res)
    quit 1
}

}

For a mix of stream data types and other data, I create a stream, convert it into a %DynamicObject if required, and write that as the response.
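A minimal sketch of that stream variant (the method name and payload below are hypothetical, following the %REST.Impl pattern used above):

ClassMethod GetLargePayload() As %Stream.Object
{
    // build the payload in a temporary stream and hand it to %WriteResponse
    do ..%SetContentType("application/json")
    set stream = ##class(%Stream.TmpCharacter).%New()
    do stream.Write({"status":"ok","rows":[]}.%ToJSON())
    do ..%SetStatusCode(200)
    do ..%WriteResponse(stream)
    quit 1
}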
go to post Ashok Kumar T · Apr 15 Right, it worked! The issue was a mismatch in the superserver port.
go to post Ashok Kumar T · Apr 15 Hello @Vitaliy Serdtsev, I executed the routine invocation commands in a native Python script.
go to post Ashok Kumar T · Apr 15 Thank you @Eduard Lebedyuk. Could you please paste the links for DP-422635 and DP-424156? That would be helpful.
go to post Ashok Kumar T · Apr 11 Hello @Yaron Munz Calling the label or subroutine directly without specifying the routine name, for example set sc=wqm.Queue("subr1"), always works. However, using the label^routine syntax does not work as expected. If I need to execute a subroutine from a different routine, I have to wrap that subroutine inside another function and call that function instead.

 ; test.mac
    set wqm = ##class(%SYSTEM.WorkMgr).%New()
    set sc = wqm.Queue("f1^test1")
    quit

 ; test1.mac
f1()
    do subr1
    quit
subr1
    set ^test($NOW()) = ""
    quit

I agree that return values are not always required when executing a routine, even when using methods like wqm.WaitForComplete() or wqm.Sync(). However, return values are mandatory for class methods when using wqm.WaitForComplete() or wqm.Sync().

Class methods can be invoked with or without parameters, and with or without return values. Return values are not required if you don't need to capture the status using wqm.WaitForComplete() or wqm.Sync():

    Set wqm = ##class(%SYSTEM.WorkMgr).%New()
    Set sc = wqm.Queue("..func1", 1)
    Set sc = wqm.Queue("##Class(Sample.Person).func1")

Return values are required if you check the status by using wqm.WaitForComplete() or wqm.Sync():

    Set wqm = ##class(%SYSTEM.WorkMgr).%New()
    Set sc = wqm.Queue("..func1", 1)
    Set sc = wqm.Queue("##Class(Sample.Person).func1", 1)
    Set sc = wqm.WaitForComplete()
    If ('sc) Do $SYSTEM.OBJ.DisplayError(sc)

thanks!
go to post Ashok Kumar T · Apr 11 Hi @Julian Matthews The Ens.StreamContainer class is a %Persistent class and it stores the different types of streams. So, if you delete the row/record by ID or by query, it will delete the entry as well as the stream contents (they are also part of the row) from the Ens.StreamContainer table. So, be cautious before deleting the container. The %OnDelete callback method can be used to perform additional work when an object is deleted.
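As an illustration of that callback, a minimal sketch of %OnDelete in a hypothetical persistent class (not Ens.StreamContainer itself):

 /// Hypothetical class for illustration only
 Class Sample.Attachment Extends %Persistent
 {
 Property FileName As %String;

 /// Called automatically by %Delete()/%DeleteId(); do extra cleanup or auditing here
 ClassMethod %OnDelete(oid As %ObjectIdentity) As %Status [ Private, ServerOnly = 1 ]
 {
     set id = $$$oidPrimary(oid)
     set ^Sample.AttachmentDeleteLog($NOW()) = "deleted id "_id
     quit $$$OK
 }
 }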
go to post Ashok Kumar T · Apr 1 Hello @Phillip Wu The "Suspended" column is set for tasks that are either suspended ("Suspend Reschedule") or that errored during execution ("Suspend Leave"). If the "Suspend task on error?" option is set to "No" when scheduling the task, the error message is stored in the Status column, but the task is not suspended.

From the documentation:
- If not defined by the task, the default success Status will be 1
- If the job is currently running (JobRunning), Status will be -1
- If there was an untrapped error (JobUntrappedError), Status will be -2
- If there was an error before execution (JobSetupError), Status will be -3
- If the task timed out trying to job (JobTimeout), Status will be -4
- If there was an error after execution (JobPostProcessError), Status will be -5
- The text of the status code will be in the property Error.

SQL query

    select Name, displaystatus, Error, Suspended, %ID from %SYS.Task where Suspended = 1 or Suspended = 2

Query method

    set tResult = ##class(%SYS.Task).TaskListFilterFunc("Suspend")
    do tResult.%Display()
go to post Ashok Kumar T · Mar 31 Hello @Nezla, The %SYS.Task class has the task details, which include the status of the task. The "Suspended" column has a value if the task errored out while running ("Suspend Leave") or the task was suspended ("Suspend Reschedule"). Based on this column you can find the suspended/errored tasks. Use the query below for the task name and additional details.

    select Name, Status, TaskClass, Suspended from %SYS.Task