@Julie Bolinsky and @Stephen Canzano 
 

Thanks for your replies!

To back up and offer more clarity: the issue was that the WHERE clause of our query used a stored procedure to calculate a data element. This caused the query to churn and the Zen reports to time out. We wanted to break the query apart so that it would first select a subset of the data by date range, then loop through that subset and run the stored proc on only those rows to filter them further. A subquery didn't end up being any more efficient here, and views and CREATE TABLE ... AS wouldn't work at scale. Since Zen wants a %Library.ResultSet object, I researched fetching each record and deleting data as needed, or creating a new %Library.ResultSet object and adding data to it, but there are no methods to support that.

The question was: can you give a Zen report a dynamic object as a data source?

Sounds like the answer is only a %Library.ResultSet or an XML file (or stream). The stream idea is OK, but it's a big report and it would have been a big lift to transform the %Library.ResultSet into XML.

The solution was to 1) move the stored proc out of the WHERE clause and into the SELECT, and 2) use filter='%val(""DataElement"")=VALUE' in the report tag to evaluate the column alias created by the stored proc in the SELECT and skip the rows we didn't want to generate PDFs for.
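For anyone who finds this later, here's roughly the shape it ended up in (hedged sketch: MySchema.MyProc, MyTable, and DataElement are placeholders for our real proc, table, and alias, and the report definition is trimmed to the relevant bits):

```
/// Hedged sketch of the fix -- the proc runs once per row in the SELECT,
/// and the group filter skips rows after the query instead of in WHERE.
XData ReportDefinition [ XMLNamespace = "http://www.intersystems.com/zen/report/definition" ]
{
<report xmlns="http://www.intersystems.com/zen/report/definition"
  name="MyReport"
  sql="SELECT ID, MySchema.MyProc(ID) AS DataElement
       FROM MySchema.MyTable
       WHERE EventDate BETWEEN ? AND ?">
  <!-- filter is evaluated per fetched row, so the expensive proc is no
       longer inside the WHERE clause that was churning -->
  <group name="record" filter='%val("DataElement")=1'>
    <attribute name="id" field="ID"/>
  </group>
</report>
}
```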

Happy to talk more about this as I'm sure I'll be touching some Zen stuff again.

@Benjamin De Boe 
 

I went nuts today working with some of this stuff.  Can I confirm a few points:

- $SYSTEM.SQL.Schema.QueryToTable() does not exist in my IRIS version, but $SYSTEM.SQL.QueryToTable() does. I'm on IRIS 2021. Are these different things?

- I ended up using CREATE OR REPLACE VIEW, because CREATE TABLE ... AS SELECT would create the table but return no data (I would even do a DROP TABLE first to make sure all was clear). Am I correct in saying that CREATE TABLE ... AS SELECT is strictly a COPY operation of an existing table? Since I was effectively creating a new table with new headers, the view worked better here. Is that the difference?

- To clarify, I used the $SYSTEM command to create a table from a query, because I kept getting errors with CREATE TABLE ... AS SELECT (it didn't like my AS clause). What could have gone wrong there?
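For concreteness, this is the shape of the view that ended up working for me (hedged; the table, column, and proc names are placeholders for our real ones):

```sql
-- The view that worked: the new "headers" come from the SELECT list,
-- including the alias over the stored proc call (names are placeholders).
CREATE OR REPLACE VIEW MySchema.MyFilteredView AS
  SELECT t.ID, t.EventDate, MySchema.MyProc(t.ID) AS DataElement
  FROM MySchema.MyTable t
  WHERE t.EventDate BETWEEN '2023-01-01' AND '2023-12-31'
```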

@Timothy Leavitt 

Another question popping up. How do I access the test result on the command line to see if it failed? When I run DebugRunTestCase, the URL prints out with the result index in it. I created an instance of Manager and then ran the method, but I don't seem to get any properties or output that say "this failed."
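In case it helps to see, this is the direction I was poking at in the terminal (hedged: I'm reading the result global directly, and the node layout of ^UnitTest.Result is something I inferred by ZWRITE-ing it, not a documented API):

```
// Hedged sketch: DebugRunTestCase prints a result index, and the run is
// persisted under ^UnitTest.Result(resultIndex). Grab the most recent
// run and dump it to eyeball pass/fail per suite/case/method.
set resultIndex = $order(^UnitTest.Result(""), -1)   // latest result index
zwrite ^UnitTest.Result(resultIndex)                 // inspect status nodes
```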

Thoughts here?

@David Hockenbroch 

Thanks!

I think the major assumption I forgot to mention was that this is an existing web app built on CSP that we would like to modernize with REST APIs.

So to begin with, the methods are defined in the CSP file, and now we want to port them over to a REST API. Since it's a generated class, I get nervous calling it like I am, but I don't see another way.

@Eduard Lebedyuk 

Thanks for this article. I have followed it along with the online documentation, and I'm having an issue that maybe you can offer guidance on.

I have my web app set up, enabled, set to no authentication (just testing and learning right now, so trying to keep it as simple as possible), and my generated "Package.disp" class in the dispatch parameter.

It WAS working. Then I tried to call it from a React app using Axios, and there were some CORS issues. So I started playing with those settings: 1) added "Access-Control-Allow-Origin=*" to the header of the request, 2) added "Parameter HandleCorsRequest = 1;" to Package.spec, and 3) now that I've seen it in your article, added "x-ISC_CORS": true to the path setting in the spec.
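For reference, here's roughly where those last two settings ended up (hedged sketch: Package.spec and the /widgets path are placeholders, and the OpenAPI XData is trimmed to just the relevant bits):

```
/// Trimmed sketch of my spec class with both CORS settings in place.
Class Package.spec Extends %REST.Spec
{

/// 2) class-level switch the generated disp class picks up
Parameter HandleCorsRequest = 1;

XData OpenAPI [ MimeType = application/json ]
{
{
  "swagger": "2.0",
  "paths": {
    "/widgets": {
      "x-ISC_CORS": true,
      "get": {
        "operationId": "GetWidgets",
        "responses": { "200": { "description": "OK" } }
      }
    }
  }
}
}

}
```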

401 error every time. 

I even changed the auth in Postman from no auth to basic auth and put in my localhost creds. 401. I've looked at the Audit Log and I see some activity every time I make the call, but I'm not sure how to interpret it. Even when I try to revert to my setup from before playing around with CORS, I still get the 401.

What could be causing a 401 when everything is set up to be so relaxed?

@Brett Saviano 

Sorry, back again. Feeling like a dunce here . . . I updated my API using my OpenAPI 2.0 JSON spec and the /api/mgmt POST call, and looking in Studio, I can see my spec.cls is showing v1.5 as I expect. When I refresh the explorer view (which is the live server, yes?), I'm still on v1.4, which is my old spec.

Is there some other setting that might be preventing this refresh?

I looked here: Settings Reference - InterSystems ObjectScript for VS Code (intersystems-community.github.io), but I didn't find anything that stood out as a setting that would stop a refresh of the explorer view from showing updated classes on the server.

Thank you, Brett.  I must still be missing something.

In the ObjectScript explorer, if I right click on the project I have an option to 'Export Project Contents'.  If I right click on any package or file, I only have three options: Remove from Project, Server Source Control and Server Command Menu.

I've skimmed the doc again but don't see anything obvious in terms of settings that I might be missing. 

Any ideas?

@Brett Saviano 

Thanks for your reply! I am familiar with that doc but didn't reference it because we're already set up for client-side development. Yes, I could just edit the spec and impl classes on the server, but since our source control isn't hooked into the server, I would still have the issue of getting the code generated and updated on the server (from /api/mgmt/) into my local repo.

I tried the ObjectScript Explorer to export, but it seems I can only export ALL the files. That's not practical when I just want to export one package.

However, using the Export Code From Server command does let me select individual classes so that's great! 

Question: if we scale up our use of OpenAPI 2.0 and /api/mgmt/ and stick with client-side dev, how can we make source control smoother? In other words, is there a way to automatically export the impl and spec classes to the client whenever they are updated?
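One idea I've been toying with (hedged sketch: the class list and export path are placeholders, and I haven't wired this into anything automatic yet) is a one-liner that pulls the generated classes down after each /api/mgmt update:

```
// Hedged sketch: export the generated REST classes to the local working
// tree as XML. Class names and the target path are placeholders.
set classes = "Package.spec.cls,Package.impl.cls,Package.disp.cls"
set sc = $SYSTEM.OBJ.Export(classes, "C:\repo\src\Package.xml")
if 'sc { do $SYSTEM.Status.DisplayError(sc) }
```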

Good to know about the disp class; that makes sense. One less thing to worry about.

While I was playing around with this, a few times after compiling changes to the impl class, the disp class was removed from the server (this was in Studio, before I figured out how to get the classes on the client). Have you experienced this before, and do you know what might cause it?

Opening this back up because getting the classes to my local repo is proving to be challenging.

In VS Code I was able to open the code on the server, click 'Download' on the package, and download it to my local repo. The problem is that the only two classes that show up are impl and spec, but not disp.

In a similar vein, when I used the API to update the spec, the updated code was on the server (just as it first got there when I did the create via the API). But when I try to download the updated package to my local, it won't let me. I have to download each file individually, which isn't too cumbersome, but it's one more manual step.

How can I:

1) Get the disp class to my local repo so I can ship it to the remote repo and other servers?

2) Make the update process using the API smoother

3) Bonus: create a web app programmatically, so that when my colleague starts using it for his front-end dev, he's ready to go?
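On 1) and 3), the directions I've been exploring so far (hedged sketches: the application name, namespace, paths, and the AutheEnabled value are all assumptions to verify, and the Security.Applications call needs %SYS privileges):

```
// 1) Hedged sketch: export the generated disp class directly. It doesn't
//    show in the editor views because it's flagged as generated, but
//    $SYSTEM.OBJ.Export will still take it by name.
set sc = $SYSTEM.OBJ.Export("Package.disp.cls", "C:\repo\src\Package.disp.xml")

// 3) Hedged sketch: create the web app programmatically from %SYS.
new $namespace
set $namespace = "%SYS"
set props("NameSpace") = "MYAPP"               // placeholder app namespace
set props("DispatchClass") = "Package.disp"
set props("AutheEnabled") = 64                 // assumption: 64 = unauthenticated
set sc = ##class(Security.Applications).Create("/api/myapp", .props)
if 'sc { do $SYSTEM.Status.DisplayError(sc) }
```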