I don't really understand the best practices for building IRIS REST apps and passing authentication through.

I.e., do people just tend to authenticate the CSP page it goes to and leave it at that?

Or, after authenticating, do they tend to use that same logged-in user to make the API call?

I.e., do REST apps tend to check whether IRIS is already authenticated, or should they just use a dedicated username and password (which should live in a .env file) to make the API call?
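For the second option, a dedicated service account, here is a minimal Python sketch of making the API call with HTTP Basic auth (which IRIS REST applications accept when password authentication is enabled). The endpoint path, user name, and password below are made up for illustration; the real values depend on your application:

```python
import base64
import urllib.request

def basic_auth_request(url: str, user: str, password: str) -> urllib.request.Request:
    """Build a request carrying an HTTP Basic Authorization header."""
    req = urllib.request.Request(url)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req

# Hypothetical endpoint and service account, for illustration only;
# in practice the credentials would be read from .env, not hard-coded.
req = basic_auth_request(
    "http://localhost:52773/api/myapp/v1/items",
    "svc_rest_user",
    "s3cret",
)
# urllib.request.urlopen(req) would then perform the call
```

The alternative (reusing the already-authenticated IRIS session) avoids storing a second credential, but ties the REST call to the interactive user's privileges.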

Thanks, I had looked at the documentation on inserting data from another table and on defining a table from another table.

Yes, the syntax wasn't quite like the documentation (it was valid standard SQL syntax, I think). It still doesn't seem to like it via a linked table:

ERROR #5475: Error compiling routine: %sqlcq.SRFT.cls301. Errors: %sqlcq.SRFT.cls301.cls ERROR: %sqlcq.SRFT.cls301.1(19) : <UNDEFINED>parseExtFromNode+1^%qaqcmx *mt("f","1^SAMPLE.TEST") :

EnumerateJobStatus returns what the portal sees:

while resultset.%Next() { do resultset.%Print() }
"PAS Outbound Process"  OK  "2024-12-05 14:22:55.320" 1
"PAS Outbound Process"  OK  "2024-12-05 14:22:55.322" 1
"PAS Outbound Process"  OK  "2024-12-05 14:23:00.425" 1
"PAS Outbound Process"  OK  "2024-12-05 14:37:09.665" 1
"PAS Outbound Process"  OK  "2024-12-05 14:37:09.666" 1
"PAS Outbound Process"  OK  "2024-12-05 14:37:09.667" 1
"PAS Outbound Process"  OK  "2024-12-05 14:37:09.669" 1
"PAS Outbound Process"  OK  "2024-12-05 14:37:09.670" 1
"PAS Outbound Process"  OK  "2024-12-05 14:37:09.671" 1
"PAS Outbound Process"  OK  "2024-12-05 14:37:09.672" 1
"PAS Outbound Process"  OK  "2024-12-05 14:37:09.673" 1
"PAS Outbound Process"  OK  "2024-12-05 14:37:09.674" 1
"PAS Outbound Process"  OK  "2024-12-05 14:37:09.676" 1
"PAS Outbound Process"  OK  "2024-12-05 14:37:09.677" 1
"PAS Outbound Process"  OK  "2024-12-05 14:37:43.162" 1
"PAS Outbound Process"  OK  "2024-12-05 14:38:31.334" 1
"PAS Outbound Process"  OK  "2024-12-05 14:39:08.718" 1
"PAS Outbound Process"  OK  "2024-12-05 14:39:31.265" 1
"PAS Outbound Process"  OK  "2024-12-05 14:41:07.591" 1
"PAS Outbound Process"  OK  "2024-12-05 14:42:07.282" 1
"PAS Outbound Process"  OK  "2024-12-05 14:47:07.486" 1
"PAS Outbound Process"  OK  "2024-12-05 14:47:55.697" 1
"PAS Outbound Process"  OK  "2024-12-05 14:50:31.971" 1
"PAS Outbound Process"  OK  "2024-12-05 14:50:55.610" 1
"PAS Outbound Process"  OK  "2024-12-05 14:51:12.143" 1
"PAS Outbound Process"  OK  "2024-12-05 14:51:49.692" 1
"PAS Outbound Process"  OK  "2024-12-05 14:51:59.348" 1
"PAS Outbound Process"  OK  "2024-12-05 14:52:08.554" 1
"PAS Outbound Process"  OK  "2024-12-05 14:52:32.065" 1
"PAS Outbound Process"  OK  "2024-12-05 14:53:00.146" 1
"PAS Outbound Process"  OK  "2024-12-05 14:53:56.552" 1
"PAS Outbound Process"  OK  "2024-12-11 06:59:22.608" 1
"PAS Outbound Process"  OK  "2024-12-11 06:59:46.682" 1
"PAS Outbound Process"  OK  "2024-12-11 07:00:34.744" 1
"PAS Outbound Process"  OK  "2024-12-11 07:00:46.823" 1
"PAS Outbound Process"  OK  "2024-12-11 07:01:22.856" 1
"PAS Outbound Process"  OK  "2024-12-11 07:03:46.967" 1
"PAS Outbound Process"  OK  "2024-12-11 07:07:53.180" 1
"PAS Outbound Process"  OK  "2024-12-11 07:09:41.327" 1
"PAS Outbound Process"  OK  "2024-12-11 07:15:35.702" 1
"PAS Outbound Process"  OK  "2024-12-11 07:17:47.825" 1
"PAS Outbound Process"  OK  "2024-12-11 07:18:35.831" 1
"PAS Outbound Process"  OK  "2024-12-11 07:23:24.191" 1
"PAS Outbound Process"  OK  "2024-12-11 07:27:00.354" 1
"PAS Outbound Process"  OK  "2024-12-11 07:33:18.946" 1
"PAS Outbound Process"  OK  "2024-12-11 07:50:25.979" 1
"PAS Outbound Process"  OK  "2024-12-11 07:50:38.109" 1
"PAS Outbound Process"  OK  "2024-12-11 07:51:14.008" 1
"PAS Outbound Process"  OK  "2024-12-11 07:52:38.147" 1
"PAS Outbound Process"  OK  "2024-12-11 07:54:26.438" 1
"PAS Outbound Process"  OK  "2024-12-11 17:56:14.557" 1
"PAS Outbound Process"  OK  "2024-12-13 15:22:56.105" 1
"PAS Outbound Process" 12648 OK  "2024-12-18 14:15:16.105" 18

I think it might be my misunderstanding of how the rule editor works with foreach, but the documentation is not clear.

I would expect to see trace "1" then a send, trace "2" then a send; not the foreach running through to the end, keeping a count of how many iterations succeeded, and then sending with userdata "2" twice. Is there a way to change this? Why is it this way around?
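To make the difference concrete, here is a small Python sketch. This is not the rule engine itself, just an analogy for the two behaviours: the first function sends inside each iteration (what I expected), while the second only tallies a count and keeps the last userdata value, dispatching everything after the loop (what I observe):

```python
def immediate_sends(items):
    """Expected behaviour: a send fires inside each iteration."""
    sent = []
    for i, _ in enumerate(items, 1):
        sent.append(str(i))  # send with this iteration's userdata
    return sent

def deferred_sends(items):
    """Observed behaviour: the loop only counts successes; sends
    fire afterwards, all carrying the final userdata value."""
    count = 0
    userdata = None
    for i, _ in enumerate(items, 1):
        count += 1
        userdata = str(i)  # each iteration overwrites the last value
    return [userdata] * count

two_items = ["a", "b"]
immediate_sends(two_items)  # ["1", "2"]
deferred_sends(two_items)   # ["2", "2"] - userdata "2" sent twice
```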

Hi:

The reason for counts rather than days was to allow a more granular approach to deleting data that has no date information attached to it, e.g. stream data, which would not have a date assigned to it.

Also, a lot of custom data ends up in the same table if you don't define a new storage location for the message data. That makes it impossible to run proper SQL against it to clear out the data, so a count-based approach keyed on the last message ID was required.
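As a rough illustration of the count-based idea (a hypothetical helper, not the actual purge task code): given the last message ID and how many of the newest messages to keep, everything at or below the cutoff is a deletion candidate, and no date column is needed:

```python
def purge_cutoff(last_id: int, keep_count: int) -> int:
    """Return the highest ID eligible for deletion.
    Rows with ID <= the cutoff are purge candidates; the
    newest keep_count IDs are retained."""
    return max(last_id - keep_count, 0)

# e.g. last message ID 12648, keep the newest 10000 messages:
cutoff = purge_cutoff(12648, 10000)  # rows with ID <= 2648 get deleted
```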

Hi Josh:

It's interesting that you find it an intermittent issue. We had some very large rules, and when we turned on the new rule editor we would get the :( face no matter what.

If we deleted rules down to a certain size, it would load up correctly.

If you look in your devtools when this happens, it is a timeout issue: the connection is being closed before the rule has finished loading.

After raising it with the WRC and pointing out the issue, the :( issue was replicated in 2023.1 but did not happen in 2024.x.

What I think is happening is that the gateway timeout is too small; if you increase the timeout a bit, it might allow the web application to load.

Alternatively, go to the web application /ui/interop/rule-editor and disable it to fall back to the old rule editor. This issue and the lack of Ctrl+F find functionality have pushed many people away from the new editor until it is optimised, as well as (in 2023, I believe) there being too much whitespace.