Member since Jan 25, 2022
Replies:

@Timothy Leavitt to the rescue! 

Thank you, this has been the most successful approach by far.  I got this to work by defining a <script language='cache' runat='server'> block, and within it I wrote out a <script language='javascript'> block where I set my global variable, which I later use in my $(function() { ... }) handler.

The only challenge is that since I was using write commands to emit a script from within a script, it wouldn't let me simply w "</script>" at the end, so I had to hack around it with w "</"_"script>".
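Roughly, the pattern looks like this (myConfig and myDynamicObject are illustrative names, not from my actual page):

```
<script language='cache' runat='server'>
    ; server-side CSP block that emits a client-side script
    write "<script language='javascript'>",!
    write "var myConfig = ",myDynamicObject.%ToJSON(),";",!
    ; writing the literal closing tag would end the server block early,
    ; so build it by concatenation
    write "</"_"script>",!
</script>
```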

That said, I obviously designed this poorly: the payload coming back didn't scale, and it's making the webpage slow overall.  This will get us up and working again, but I need to redesign it so it's more efficient.

For what it's worth, for this particular tool I thought I would just return a whole payload to the client representing the configuration for this particular setting, not realizing how quickly it scales up.  The idea was that instead of making calls to the server over and over, we would let the user modify the whole configuration on the client and, once done, send it back over.

I guess fodder for another post!

Hi @Stephen Canzano and thank you for the reply!

I essentially tried a similar approach, doing this:

set stream = ##class(%Stream.TmpCharacter).%New()

do myDynamicObject.%ToJSON(stream)

Where now the whole dynamic object is serialized into the stream.  The problem comes when I send that stream back to the client.  I need the data in that stream for a JavaScript function, so I read the stream into a variable on the server.  However, because it's so big, reading the stream into a variable for use in the JavaScript causes the MAXSTRING error as well.
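A sketch of the alternative I'm considering: never materialize the string at all, and just write the stream out to the device in chunks (continuing from the stream above; the 16 KB chunk size is arbitrary, well under the string limit):

```
    do stream.Rewind()
    while 'stream.AtEnd {
        ; write one chunk at a time so no single string ever
        ; approaches MAXSTRING
        write stream.Read(16000)
    }
```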

I tried wrapping it in a REST API, but under the hood in the %Rest.Impl class, %WriteResponse just calls %ToJSON on the dynamic object, which causes the same error.  It doesn't seem to handle streams, or if it does, it still wants to serialize my dynamic object into a single JSON string, and it's just too big.
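One way around that might be to bypass the automatic serialization and copy the stream straight to the response device yourself in the dispatch method (a sketch; BuildConfig() is a hypothetical helper standing in for however the dynamic object gets built):

```
ClassMethod GetConfig() As %Status
{
    set %response.ContentType = "application/json"
    ; serialize into a stream, never into a single string
    set stream = ##class(%Stream.TmpCharacter).%New()
    do ..BuildConfig().%ToJSON(stream)    ; BuildConfig() is hypothetical
    do stream.Rewind()
    ; copy the stream to the output device
    do stream.OutputToDevice()
    quit $$$OK
}
```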

The data does change over time, so I don't think the JSON adaptor is the right approach.

Thanks for any additional advice!

@Ali Nasser digging into this more this morning, I am trying to determine the best use cases for SQLProcs versus calculated data.

There are some obvious differences, like whether you want something calculated when it's saved or updated.

But in general, since a lot of my data already exists, it's going to be calculated on the fly (I suppose I could write a script to re-save/update all the existing records to force the calculated fields to populate). I start to wonder, though: are calculated fields and SQLProcs actually saving me query performance, or am I just offloading my post-processing into the query itself?  Does that make sense?
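For reference, the two flavors I'm weighing would look something like this in a class definition (Demo.Person, FirstName, and LastName are illustrative, not my actual schema):

```
Class Demo.Person Extends %Persistent
{
Property FirstName As %String;
Property LastName As %String;

/// Computed on the fly at query time, never stored
Property FullName As %String [ Calculated, SqlComputed,
    SqlComputeCode = { set {*} = {FirstName}_" "_{LastName} } ];

/// Recomputed and stored whenever FirstName or LastName changes
Property SortName As %String [ SqlComputed,
    SqlComputeCode = { set {*} = {LastName}_","_{FirstName} },
    SqlComputeOnChange = (FirstName, LastName) ];
}
```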

The reality is that our swizzled references will knock us down a few pegs no matter how I configure it but I do hope wrapping it all in the query can be more efficient.  I have a feeling we'll see some gains from the filtering and sorting end since I'll be able to simply select this, that, this and that and then the filters and sorts are working with that table data already calculated.
