I did get my code to work as expected, but there are still some unknowns.

I used the example as shown above but also added another block of code to capture the properties that changed into a log.  I also did a KILL on the ^test array before the lines of code were generated.

Either the KILL on the array and/or the number of saves and background calls happening in our system had something to do with my initial issues. My most apparent issue, however, was that I wasn't using %code.WriteLine to produce the generated code.

At the end of the day this will probably go into a local array rather than a global, so the update operations don't compete.

Hi @Pravin Barton, I was looking for this exact advice; however, I am having trouble debugging. I have looked at the generated INT code and I'm a bit stumped.

All our %Persistent objects are updated via object access in CSP pages calling class methods. I utilized the example you have below, but I used set tableName = %compiledclass.NameGet() to get the table name. The properties wrote to ^test no problem, so that's not an issue.

The issue is that after updating objects via the CSP pages (i.e., the front end), all the checks for {property*C} are false (0). I see the expected operations being passed into the zFunctions with the pChanged array set up. All our inserts and updates are accomplished by %Save(); we never %Delete().

I also included Time=After as a parameter in the trigger generator, so according to the doc, the trigger should fire similarly to the %OnAfterSave() method.
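For reference, here's a minimal sketch of the kind of generated trigger I'm describing (trigger, class, and global names are just placeholders, not my real code):

```objectscript
/// Sketch only; names are placeholders. Foreach = row/object is what makes
/// the trigger fire for object access (%Save) as well as SQL.
Trigger LogChanges [ CodeMode = objectgenerator, Event = INSERT/UPDATE, Foreach = row/object, Time = After ]
{
    Set tableName = %compiledclass.Name
    For i = 1:1:%compiledclass.Properties.Count() {
        Set prop = %compiledclass.Properties.GetAt(i).Name
        // skip internal % members
        Continue:$Extract(prop)="%"
        // at runtime, {prop*C} is 1 if the property changed in this operation
        Do %code.WriteLine(" If {"_prop_"*C} {")
        Do %code.WriteLine("  Set ^test("""_tableName_""","""_prop_""") = {"_prop_"}")
        Do %code.WriteLine(" }")
    }
}
```

If I understand the docs right, leaving Foreach at its default means the trigger only fires for SQL statements, which could explain object-filer saves not tripping it.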

Any thoughts on what I may have missed or why this isn't working in my setup?

To state the issue another way: the doc says you might want to use sessions to:

  • Preserve data across REST calls — in some cases, preserving data across REST calls may be necessary to efficiently meet your business requirements.

Well, I'm not finding an easy way to do this using spec-first development, because the %session variable is undefined in the generated impl.cls, even when I extend impl.cls to use %CSP.REST.

@Eduard Lebedyuk I did catch one article you wrote that said sessions are basically only good for authentication (I assume you just mean keeping one logged into the IRIS server over the session). However, since the doc does mention preserving data, I would like to see if I can utilize that feature.

For now I've reverted to manually creating the REST services, which gives me the %session variable to use.
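For anyone else landing here, the piece the manual dispatch class needs is the UseSession parameter; a sketch of what I mean (class, route, and global names are my own placeholders):

```objectscript
/// Sketch of a hand-written dispatch class; names are placeholders.
Class MyApp.REST Extends %CSP.REST
{

/// Keep one %CSP.Session across calls instead of a new session per request
Parameter UseSession As Integer = 1;

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
  <Route Url="/counter" Method="GET" Call="Counter"/>
</Routes>
}

ClassMethod Counter() As %Status
{
    // %session.Data is a multidimensional property for per-session state
    Set count = $Get(%session.Data("hits"), 0) + 1
    Set %session.Data("hits") = count
    Write {"hits": (count)}.%ToJSON()
    Quit $$$OK
}

}
```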

Welp, this was total user error. I was trying to read a class property that didn't exist. Also, there's some good debugging advice here: Debugging Web on the InterSystems Developer Community.

@Rich Taylor maybe you can help me?

I am passing a very simple structure in the body of a POST request, and I'm getting an error returned in Postman saying that the dynamic object I'm trying to create from the stream has a property that doesn't exist.

HTTP/1.1 500 Internal Server Error
Content-Type: application/json; charset=utf-8
{
    "errors":[ {
            "code":5002,
            "domain":"%ObjectErrors",
            "error":"ERROR #5002: ObjectScript error: <PROPERTY DOES NOT EXIST>

...

Sample data: {"quote_id":2000}

ClassMethod deleteQuote(quote As %Stream.Object) As %DynamicObject
{
    s quoteObj={}.%FromJSON(quote)
    s quoteId=quoteObj."quote_id"
    ...

So a few problems:

1) I can't debug this easily. I can't pass in a %Stream.Object in VS Code using the debugger, and on the command line I can't seem to %New() a %Stream.Object (my guess is because it's an abstract interface). Any tips on debugging when you're passing in a stream?

2) One thing I did do was save my stream data to a global, and I'm getting some hints as to what's wrong, but I'm not understanding them. A ZW of the global shows me "9430@%Library.DynamicObject" after I have done the %FromJSON() method (I'm pretty sure a global can't store an object, so no wonder it's quoted, but there's more). If instead I save the Read() of the stream to the global, then go to the command line, set that string to a variable, and do {}.%FromJSON(data), I get the expected data=<OBJECT REFERENCE>[3304@%Library.DynamicObject] and can access the property.

I did notice that when using Postman, if you 'beautify' the JSON string you get lots of whitespace and other characters that need to be stripped, so I'm aware that could be a problem; but right now I have the data on a single line with no extra characters. It works on the command line when I break it down, but I can't see what the issue is when running the actual POST request.
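In case it helps anyone else, I can at least reproduce the call on the command line with a scratch stream instead of trying to %New() the abstract %Stream.Object (the impl class name here is a placeholder for the real one):

```objectscript
// Command-line harness; MyApp.impl stands in for the real impl class.
Set stream = ##class(%Stream.TmpCharacter).%New()
Do stream.Write("{""quote_id"":2000}")
Do stream.Rewind()
// call the generated method the same way the dispatcher would
Set result = ##class(MyApp.impl).deleteQuote(stream)
Write result.%ToJSON()
```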

Any experience with these kinds of issues?

Can this be used to temporarily set a dynamic entity to a DocDB so one could run a query on it and return a result set object?

I'm trying to find a solution for returning a result set object to a CSP page when I have a dynamic entity or JSON object.

It's basically a complex, ad hoc query that's easier (for me) to write in ObjectScript than as a SQL query that aggregates and transforms all kinds of data. I really need something in memory just to run the report, nothing saved to disk, but it seems creating a database using %DocDB will create a new class definition whether I like it or not.
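One route I'm considering instead of DocDB is %SQL.CustomResultSet, which (as I read the docs) lets you hand the CSP page a result-set object backed purely by in-memory data, with nothing persisted. A rough, untested sketch; the class name, columns, and the exact override signatures are my assumptions:

```objectscript
/// Rough sketch only; signatures are my reading of the %SQL.CustomResultSet docs.
Class MyApp.ReportRS Extends %SQL.CustomResultSet
{

/// each public property becomes a column of the result set
Property Name As %String;
Property Total As %Numeric;

Property data As %DynamicArray [ Private ];
Property idx As %Integer [ Private ];

/// arguments passed to %New() after the first are forwarded here
Method %OpenCursor(data As %DynamicArray) As %Library.Status
{
    Set ..data = data, ..idx = 0
    Quit $$$OK
}

Method %Next(ByRef sc As %Library.Status) As %Library.Integer
{
    Set sc = $$$OK
    // return 0 when the in-memory array is exhausted
    Quit:..idx'<..data.%Size() 0
    Set row = ..data.%Get(..idx), ..idx = ..idx + 1
    Set ..Name = row.%Get("name"), ..Total = row.%Get("total")
    Quit 1
}

}
```

The CSP page would then do something like Set rs = ##class(MyApp.ReportRS).%New(myDynamicArray) and loop with rs.%Next(), same as any other result set.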

I'm just poking around here to gain some knowledge. When you say routine, do you mean a compiled MAC file? I'm curious how ZR would work if you've compiled the routine and, as Robert said, it's not on the stack. Any time I've run a routine from the CLI and done a ZPRINT, there's nothing there, so I'm not sure what ZR would remove if ZP isn't showing anything in the buffer. When you call an entry point and there's a QUIT, isn't there an inherent ZR to get the routine out of the buffer?

I guess I'm simply asking: did ZR solve your issue?

Well, we are on IRIS 2021 and that's the documentation I was looking in, so I'm not sure what's going on.

Here's where I landed:

Given a %Stream.FileBinary, I calculate the size to read (rounded up to a multiple of 12000) as follows:

s readSize=($J((stream.SizeGet()/12000),"",0)+1)*12000

And then I 'cast' my stream to a %xsd.base64Binary datatype (ODBC type VARBINARY) like so:

s base64string=##class(%xsd.base64Binary).LogicalToJSON(stream.Read(readSize,.sc))

In my command-line testing I'm able to decode this base64 string, write it to a file stream, and save it, and I end up with a very much intact PDF. This is new to me, however, so I hope I'm not tricking myself into thinking it's working correctly.

When I run w ##class(%xsd.base64Binary).IsValid(myVarBinaryData) I get 1 so I think it's working correctly! 

I'm wondering, however, about reading 12,000 at a time. As long as the read length is divisible by three (so each chunk encodes to base64 without '=' padding in the middle), it should work?
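For larger files, the same idea can run chunk by chunk rather than in one Read(). A sketch of what I mean, with the output going to a temp stream to dodge string-length limits (variable names are placeholders):

```objectscript
// Sketch: encode a binary stream to base64 in chunks.
// 12000 is a multiple of 3, so each chunk encodes with no interior padding.
Set chunkSize = 12000, sc = $$$OK
Set out = ##class(%Stream.TmpCharacter).%New()
While 'stream.AtEnd {
    Set raw = stream.Read(chunkSize, .sc)
    Quit:$$$ISERR(sc)
    // second argument 1 suppresses the CR/LF inserted every 76 characters
    Do out.Write($System.Encryption.Base64Encode(raw, 1))
}
```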

@Evgeny Shvarov and @Alex Woodhead 

Lots of good progress here, but still a few issues:

- Thanks to @Dmitry Maslennikov for his help, I am using a Dockerfile and compose to do the build, so everything is a little more contained and standard (I used that coffee store template as a starting place)

- After running App.Installer.setup() via the Dockerfile commands, I run do ##class(%EnsembleMgr).EnableNamespace("$NAMESPACE"). After that I do some application setup logic, including a recompile of the classes that depend on the Ensemble classes, and I finally get a clean compile of those classes with no error messages saying HL7.Class does not exist.

- As I mentioned before, the namespace element in the Installer.cls manifest has interoperability turned on <Namespace Name="${NAMESPACE}" Code="${NAMESPACE}-CODE" Data="${NAMESPACE}-DATA" Create="yes" Ensemble="1">

- How can I run the command do ##class(%EnsembleMgr).EnableNamespace("$NAMESPACE") after the creation of the application's namespace but BEFORE loading and compiling the source code? My only workaround is to recompile after running App.Installer.setup() and do ##class(%EnsembleMgr).EnableNamespace("$NAMESPACE"). I'm not sure how and where the manifest in App.Installer is used or where I can insert this enable method. That said, I feel like the <Namespace> tag should be turning it on. I can't find a reference in the documentation for this manifest.

- Another question/issue is CSP pages. Following the template, in the Dockerfile I COPY csp csp. In the installer I map the source directory to all the code I put in the src folder (classes and MAC files). Should I copy my CSP files there as well so they compile? My workaround is similar to the above: I use a utility to compile the CSP pages in the csp directory, since the installer doesn't pick them up because they aren't in the source dir.
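On the EnableNamespace timing question above: I believe the %Installer manifest supports an <Invoke> element inside <Namespace>, which should run in the new namespace between its creation and any <Import> of source. A sketch (untested; the element placement and Import attributes are my assumptions from the %Installer docs):

```objectscript
XData Install [ XMLNamespace = INSTALLER ]
{
<Manifest>
  <Namespace Name="${NAMESPACE}" Code="${NAMESPACE}-CODE" Data="${NAMESPACE}-DATA" Create="yes" Ensemble="1">
    <!-- runs in the new namespace, before the source import below -->
    <Invoke Class="%EnsembleMgr" Method="EnableNamespace" CheckStatus="true">
      <Arg Value="${NAMESPACE}"/>
    </Invoke>
    <Import File="${SRCDIR}" Flags="ck" Recurse="true"/>
  </Namespace>
</Manifest>
}
```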

@Dmitry Maslennikov 

It's a bit much and there's probably a better way, but long story short, many of our static files are in a folder in the root, so I have to build the container, run a docker exec command as root to copy my files to the root, exit out of that, then log in again as the regular user to start the IRIS session and run the build script.

The script uses the sleep command to wait for IRIS to finish starting (now at 70 seconds and working fine) so I can start the session and run the script; otherwise I get the startup error.