Does anyone have a query that I could run to show a vendor the time difference between when a message was sent out of a BO and when we received the HL7 ACK associated with that message?

I am trying to prove to this vendor the delay we are seeing in getting the ACK back, which is resulting in a timeout.

I know how to pull Ens.MessageHeader and EnsLib.HL7.Message, but I am not sure how to match up the message with the HL7 acknowledgement that was received.
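
Something along these lines is what I have in mind, assuming the ACK's Ens.MessageHeader row points back at the original request header via CorrespondingMessageId; the BO name and the choice of timestamp columns are my guesses at what best approximates "sent" versus "ACK received":

 // Sketch: elapsed time between the outbound message and the header of its ACK
 set sql = "SELECT req.ID, req.TimeCreated, ack.TimeCreated AS AckTime,"
 set sql = sql_" DATEDIFF('ms', req.TimeCreated, ack.TimeCreated) AS MsToAck"
 set sql = sql_" FROM Ens.MessageHeader req"
 set sql = sql_" JOIN Ens.MessageHeader ack ON ack.CorrespondingMessageId = req.ID"
 set sql = sql_" WHERE req.TargetConfigName = 'MyHL7Out'"  ; placeholder BO name
 set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
 while rs.%Next() { write rs.ID, ": ", rs.MsToAck, " ms", ! }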


Hi community.

I was wondering if it was possible to use something like EnsLib.SQL.InboundAdapter with tables in IRIS.

This library monitors when a record has been inserted into a table in an external database, so it requires a DSN to connect to that database.

My goal is to make a call to an external API that takes a long time; it could spend nearly an hour (or more) completing its processing, but I don't want to block the main process.
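
What I am picturing is wrapping the slow API call in its own business operation and sending the request to it asynchronously, so the caller is not blocked while it runs; a rough sketch, where Slow.API.Operation and Demo.SlowRequest are placeholder names:

 // Inside a business service or process: fire the slow call and move on
 // Slow.API.Operation and Demo.SlowRequest stand in for my real operation and request class
 set req = ##class(Demo.SlowRequest).%New()
 set req.Payload = "whatever the external API needs"
 set sc = ..SendRequestAsync("Slow.API.Operation", req)
 if $System.Status.IsError(sc) { $$$LOGERROR($System.Status.GetErrorText(sc)) }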


I am testing vector search. While doing so, I am trying to paginate my result set for a "next page" function that gives me the first, second, and third sets of 15 entries within a table.

For this I have two embedding classes: one with an HNSW index (vectornomicembedtextlatest) and one without (vectornomicembedtexttest).

Calling SELECT ID, PRIMKEY FROM SQLUser.vectornomicembedtexttest LIMIT 5 OFFSET 1 works fine, with the first entry having the row ID 486448 (I deleted old entries at the beginning and reused the table).
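
One direction I am considering is keyset paging: keep a stable ORDER BY on ID and remember the last ID of the previous page, rather than relying on OFFSET after rows have been deleted; roughly (column names as above, the starting ID is just an example):

 // Keyset-style "next page": fetch 15 rows after the last ID already returned
 set lastId = 486447  ; placeholder: last ID of the previous page
 set sql = "SELECT TOP 15 ID, PRIMKEY FROM SQLUser.vectornomicembedtexttest"
 set sql = sql_" WHERE ID > ? ORDER BY ID"
 set rs = ##class(%SQL.Statement).%ExecDirect(, sql, lastId)
 while rs.%Next() { set lastId = rs.ID  write rs.ID, " ", rs.PRIMKEY, ! }

This should keep page boundaries stable even though earlier rows were deleted, since the page is defined by the last ID seen rather than by a row offset.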


Hi, I am unsure how to remove the restriction I am hitting when performing dynamic SQL using ##class(%SQL.Statement).%ExecDirectNoPriv(, .query, args...).

It works fine, but the moment I add specific properties of the persistent class I am selecting from into the WHERE clause, I get: ERROR #5540: SQLCODE: -99 Message: User UnknownUser is not privileged for the operation, despite using %ExecDirectNoPriv. I've tried with a prepared statement as well; the situation is exactly the same.
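
In case it turns out to be a plain missing table privilege rather than something %ExecDirectNoPriv should be bypassing, the workaround I am considering is granting SELECT on the table to the user (or a role it holds) from a privileged session; the schema, table, and user names below are placeholders:

 // Run from a session that has sufficient SQL privileges on the schema
 // My_Schema.MyTable and WebUser are placeholders for my real table and the web user
 set rs = ##class(%SQL.Statement).%ExecDirect(, "GRANT SELECT ON My_Schema.MyTable TO WebUser")
 if rs.%SQLCODE < 0 { write "GRANT failed: ", rs.%Message, ! }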

Question · Jul 17, 2018: Cache Dynamic SQL Pagination

Would like to know if there is an alternative or better way to paginate through a dataset using dynamic SQL than what I am using below. The problem is that as the potential pool of data gets larger, this code slows down to the point of not being usable. In analyzing each line of code below, it appears the slowdown is related to the initial rset.%Next() iteration. Is there anything available that does not require a subquery/%VID, such as a simple LIMIT/OFFSET?

My code is similar to:

s sql=##class(%SQL.Statement).%New()
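
Something like the following is the kind of alternative I am hoping exists: a stable ORDER BY on ID plus TOP, seeking past the last ID of the previous page instead of windowing with %VID, so the first %Next() does not have to skip rows (MySchema.MyTable and its columns are placeholders):

 // Sketch: page through MySchema.MyTable without %VID by seeking past the last ID seen
 set lastId = 0  ; last ID of the previous page; 0 for the first page
 set sql = ##class(%SQL.Statement).%New()
 set sc = sql.%Prepare("SELECT TOP 100 ID, Name FROM MySchema.MyTable WHERE ID > ? ORDER BY ID")
 if $System.Status.IsError(sc) { quit }
 set rset = sql.%Execute(lastId)
 while rset.%Next() { set lastId = rset.ID }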


I was wondering if someone could help me. In the past, I have been able to call external stored procedures through a SQL outbound connection and have them return an EnsLib.SQL.Snapshot to use within a BPL to extract data.

But this time, instead of using a SQL outbound BO to make the stored procedure call, I decided to create a linked stored procedure through the %JDBC_Server that points to the stored procedure on MS SQL.

However, I am struggling to get the code just right to return the column value from the linked stored procedure.
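
The shape of code I am trying to arrive at is a CALL through dynamic SQL followed by walking the returned result set; a rough sketch, where MySchema.MyLinkedProc, its parameter, and ResultColumn are placeholders, and I am not certain whether the rows come back on the returned object itself or behind %NextResult():

 // Sketch: call the linked procedure and read one column from its result set
 set inputValue = "ABC123"  ; placeholder parameter
 set proc = ##class(%SQL.Statement).%ExecDirect(, "CALL MySchema.MyLinkedProc(?)", inputValue)
 if proc.%SQLCODE < 0 { write proc.%Message, ! }
 // if the rows are not on proc itself, try the first result set behind %NextResult()
 set rs = proc.%NextResult()
 if $isobject(rs) { while rs.%Next() { write rs.%Get("ResultColumn"), ! } }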


Hello,

Our software commonly returns a full result set to the client, and we use the DataTables plugin to display table data. This has worked well, but as datasets grow larger, we are trying to move some of these requests server-side so the server handles the bulk of the work rather than the client. This has had me scratching my head in so many ways.
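
The shape I have in mind on the IRIS side is a handler that takes DataTables' start and length parameters and returns only that window plus the row counts; a very rough sketch using the %VID windowing pattern (class, table, and column names are placeholders, and the draw echo and search filtering are left out):

ClassMethod GetPage(pStart As %Integer = 0, pLength As %Integer = 25) As %DynamicObject
{
    // pStart and pLength map to DataTables' "start" and "length" request parameters
    set total = ##class(%SQL.Statement).%ExecDirect(, "SELECT COUNT(*) AS C FROM MyApp.Data")
    do total.%Next()
    // %VID windowing: TOP ALL keeps the ORDER BY, the outer WHERE picks the page
    set sql = "SELECT *, %VID AS rn FROM (SELECT TOP ALL ID, Name FROM MyApp.Data ORDER BY ID)"
    set sql = sql_" WHERE %VID BETWEEN "_(pStart + 1)_" AND "_(pStart + pLength)
    set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
    set rows = []
    while rs.%Next() { do rows.%Push({"id": (rs.ID), "name": (rs.Name)}) }
    // recordsFiltered equals recordsTotal here because no search filter is applied yet
    quit {"recordsTotal": (total.C), "recordsFiltered": (total.C), "data": (rows)}
}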

I'm hoping I can get a mix of general best practice advice but also maybe some IRIS specific ideas.

Some background


I need to query an external database and write the result set/snapshot to an internal %Persistent [ DdlAllowed ] table that I built. I have built inbound SQL services before and written the data out externally to replace SSIS jobs, but how would querying a database via a service and writing the data to an internal table work?

Can I just take the inbound query structure and write it to the class file of the internal table in a DTL? If so, what would be the Target? Or does this need to be done within a BPL as a Code block?
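
What I am picturing is a code block (in the BPL, or in the service itself) that loops over the EnsLib.SQL.Snapshot rows and saves each one into the persistent class directly; a rough sketch, assuming the snapshot's Next()/Get() iteration and with My.Internal.Table and its columns as placeholders:

 // Sketch: copy rows from the EnsLib.SQL.Snapshot into my internal persistent class
 // pSnapshot is the snapshot returned by the SQL call
 while pSnapshot.Next() {
     set row = ##class(My.Internal.Table).%New()
     set row.Name = pSnapshot.Get("NAME")
     set row.LastUpdated = pSnapshot.Get("LAST_UPDATED")
     set sc = row.%Save()
     if $System.Status.IsError(sc) { $$$LOGWARNING($System.Status.GetErrorText(sc)) }
 }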


Hi Community,

I've created a method in my file service to do a cleanup on each file load. Currently, I've set it to delete data when the LastUpdated date is greater than maxdate. However, I want the cleanup to run for every new file load. Any suggestions or advice on how to do this? Thanks!

Method Cleanup()
{
	&sql(DELETE FROM My.FileData WHERE LastUpdated > :maxdate)  ; My.FileData stands in for my real table
}
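
What I am leaning towards is calling this at the top of OnProcessInput so it runs once for each file the adapter hands me; a rough sketch (the signature should match whatever my service already declares):

Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
{
    do ..Cleanup()  ; purge the previous load before handling the new file
    // ... existing file processing goes here ...
    quit $$$OK
}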


Hello,

So I want to use the INSERT OR UPDATE command so I can update a COUNTER for a given name:

INSERT OR UPDATE myTable
SET name='Omer', counter = counter + 1;


As you can see with the above code, if the row does not yet exist we get an error because COUNTER is NULL!
I tried the following to fix this, but all of my attempts have failed:
