As we're starting to create more custom message classes to represent our JSON-based integrations, I was pondering how to implicitly grant SELECT privileges to a specific Security Role so that its members can use the Message Viewer to search through the message history.
Sometimes we need to import data into InterSystems IRIS from CSV. It can be done, for example, via the csvgen tool, which generates a class and imports all the data into it.
But what if you already have your own class and want to import data from CSV into your existing table?
There are numerous ways to do that, but you can use csvgen (or csvgen-ui) again! I prepared an example and am happy to share it. Here we go!
I have an ObjectScript class with a Log property of type %Stream.GlobalCharacter. I need to search the end of each of the logs - is there a way to extract the last 10 lines of the Log stream for each object via SQL? Or is my only option to write an ObjectScript routine to walk the table and inspect each Log property one by one?
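If it has to be ObjectScript, what I have in mind is something along these lines: a class method projected as an SQL procedure that walks the stream and keeps only the tail (MyApp.LogEntry and its Log property are stand-ins for my real class):

ClassMethod LastLines(id As %String, count As %Integer = 10) As %String [ SqlProc ]
{
    // sketch only: MyApp.LogEntry and its Log property are placeholders for the real class
    set obj = ##class(MyApp.LogEntry).%OpenId(id)
    quit:'$isobject(obj) ""
    do obj.Log.Rewind()
    // read forward, keeping a rolling buffer of the last <count> lines
    set n = 0
    while 'obj.Log.AtEnd {
        set n = n + 1
        set buf(n # count) = obj.Log.ReadLine()
    }
    // stitch the buffered lines back together in order
    set out = ""
    for i=n-count+1:1:n {
        continue:i<1
        set out = out _ $select(out="":"", 1:$char(10)) _ buf(i # count)
    }
    quit out
}

With [ SqlProc ] it should then be callable from SQL, something like SELECT MyApp.LogEntry_LastLines(ID, 10) FROM MyApp.LogEntry, if I've got the default procedure naming right.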
So I am working with an inherited SQL query that queries 8 different tables, 5 of which have over a million records. I have 3 different servers: Server 1, Server 2, and Server 3. They all have the same data/tables/structure. Server 3 runs an IRIS database engine, and the other two run Caché 2015.1.4.
I have a problem with views, and performance across the servers is inconsistent.
Thankfully Server 1 is the current live server that performs "fast enough". Server 2 is being synced from Server 1 and acts as a report server.
The index we want to use is called "iFilter". Currently we use the following technique of ignoring all other indices because the automatically chosen index is always too slow.
This is a full example of how to use %ScrollableResultSet for result pagination with %DynamicQuery:SQL and build a JSON response that includes page details.
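The core of the pattern, condensed into a sketch (MyApp.Person, its columns, and the exact CurrRow positioning are illustrative assumptions; error handling is omitted):

ClassMethod GetPage(page As %Integer = 1, pageSize As %Integer = 10) As %DynamicObject
{
    // sketch: MyApp.Person and its columns are placeholders
    set rs = ##class(%ScrollableResultSet).%New("%DynamicQuery:SQL")
    do rs.Prepare("SELECT ID, Name FROM MyApp.Person ORDER BY Name")
    do rs.Execute()
    set total = rs.Count()                  // total number of rows in the scrollable set
    set rs.CurrRow = (page - 1) * pageSize  // jump past the previous pages (check CurrRow semantics on your version)
    set items = []
    for i=1:1:pageSize {
        quit:'rs.Next()
        do items.%Push({ "id": (rs.Get("ID")), "name": (rs.Get("Name")) })
    }
    set response = {}
    set response.page = page
    set response.pageSize = pageSize
    set response.total = total
    set response.items = items
    quit response
}

Calling write ..GetPage(2, 10).%ToJSON() would then emit the JSON payload, page details included.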
I am new to Caché. In an interview I was asked how to optimize an SQL query.
I just said that I would create an index on the conditions present in the WHERE clause. But according to the interviewer, I should check how the query plan is being executed, as this will help in optimizing SQL queries.
I want to know what the answer would be for how to optimize an SQL query in Caché.
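My guess at what they were after combines both points: add indices that match the WHERE clause, then confirm with the query plan that they are actually used. From a Caché terminal, I believe you can see the plan with something like the following (Sample.Person is a placeholder, and the exact Explain() signature may differ by version):

set sql = "SELECT Name, DOB FROM Sample.Person WHERE Name %STARTSWITH 'Sm'"
do $SYSTEM.SQL.Explain(sql, .plan)   // the plan comes back in the local array
zwrite plan                          // check whether the expected index is actually used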
After linking in an Oracle table whose columns have a data type of NUMBER, my updates to these fields result in data that is rounded to 2 decimal places. I insert a record with 1234.1234 and 1234.12 is stored.
It appears Caché xDBC might be manipulating my values before sending them to Oracle. Is there a setting or system parameter controlling this? If so, is there a way to relax it so the values I send are stored in Oracle unchanged?
Use case: small in-house hospital systems that query patient demographics via SQL. The new PAS being implemented in the near future will only support query/response via HL7. SQL access is available, but it is intended for reporting and so can be up to 24 hours behind.
Probably the easiest option is to set up a database in Ensemble and keep it up to date via a standard ADT feed. This will have quite a large footprint, though, and carries a risk of getting out of sync.
I want to write an SQL query that returns a row with a count for each day in a given month or year for a specific operation or ConfigName. The following is a start, but I'm not finding what I want in the documentation to parse out the TimeLogged field of the table. Any help is appreciated.
SELECT count(TimeLogged), ConfigName FROM Ens_Util.Log where TimeLogged like '2021-07%' and ConfigName = 'operation_Name' group by TimeLogged
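My guess is that I need to group on just the date part of TimeLogged, something like the following run as dynamic SQL (CAST(... AS DATE) and grouping on an expression are assumptions on my part):

set sql = "SELECT CAST(TimeLogged AS DATE) AS LogDay, COUNT(*) AS Total FROM Ens_Util.Log"
set sql = sql _ " WHERE TimeLogged LIKE '2021-07%' AND ConfigName = 'operation_Name'"
set sql = sql _ " GROUP BY CAST(TimeLogged AS DATE) ORDER BY LogDay"
set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
while rs.%Next() { write rs.%Get("LogDay"), " ", rs.%Get("Total"), ! }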
I'm working in an application that uses %SIMILARITY to find matches among a set of documents that vary greatly in length. It's generally good but I've noticed issues with ranking short partially-matching documents over longer documents that match the search string entirely.
When, for whatever reason, I need to update a record and don't want to fire the triggers, the %NOTRIGGER keyword can be used. I've been trying to do the same when I change the row using the object approach, but I can't find an equivalent. Does anyone know if it's possible to avoid firing triggers when working with objects?
Sadly, the utility "DISABLE^%NOTRIGGER" doesn't seem to exist ;-)
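For reference, the SQL-side version I use looks like this (MyApp.Person and its Name column are placeholders, and name/id are assumed to be set already); what I'm after is the equivalent when saving the object:

// embedded SQL update that skips the triggers
&sql(UPDATE %NOTRIGGER MyApp.Person SET Name = :name WHERE ID = :id)
if SQLCODE < 0 { write "error: ", SQLCODE, ! }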
When I'm asked questions about SQL, I, like InterSystems staff at the WRC, generally ask about the query plan. My feeling is that even before you run a query, you should examine the results of Show Plan to confirm that the code is going to give you the results in the manner you expect. If I expect it to leverage an index and I see that it is not, I go back, look at what I might have done incorrectly, and try to rewrite the query to get the index behavior I'm looking for. To that end, there's a poll below asking whether you think moving the Show Plan button to be the first button would be worthwhile.
Benjamin De Boe wrote this great article about Universal Cached Queries, but what the heck is a Universal Cached Query (UCQ), and why should I care about it if I am writing good old embedded SQL? In Caché and Ensemble, cached queries would be generated to resolve xDBC and Dynamic SQL. Now, in InterSystems IRIS, embedded SQL goes through cached queries as well.
Apologies if this isn't the correct forum but here goes..
I'm accessing an InterSystems database from MS SQL Server using the InterSystems ODBC driver, and using an OPENQUERY statement to read table metadata:
SELECT * FROM OPENQUERY([<linked server name>], 'SELECT * FROM INFORMATION_SCHEMA.COLUMNS')
but I get an error message saying INFORMATION_SCHEMA.COLUMNS cannot be found.
The InterSystems SQL documentation does reference this table, so is there some other way to access it, or is it perhaps a permissions issue?
I'm close with this but I'm not sure how to grab JUST the GT1.3 data. I know I can do a substring but finding the 3rd | is a tad tricky. I've not been this deep in SQL for 15 years.
SELECT SUBSTRING(hm.RawContent, (CHARINDEX('GT1',hm.RawContent)), 50) as NameDesc FROM Ens.MessageHeader as em, EnsLib_HL7.Message as hm where em.Status = 'Suspended' and em.MessageBodyId = hm.id
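One idea I'm toying with, assuming I can post-process on the Caché side rather than in pure SQL, is to cut the field out with $PIECE (the join mirrors my query above; splitting on 'GT1|' assumes the field separator is the standard pipe):

set sql = "SELECT hm.RawContent FROM Ens.MessageHeader em"
set sql = sql _ " JOIN EnsLib_HL7.Message hm ON em.MessageBodyId = hm.ID"
set sql = sql _ " WHERE em.Status = 'Suspended'"
set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
while rs.%Next() {
    set seg = $piece(rs.%Get("RawContent"), "GT1|", 2)   // the part of the message after "GT1|"
    write $piece(seg, "|", 3), !                         // GT1-3 is then the third |-delimited field
}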
I am trying to work out whether there are any methods available to import a result set returned by an SQL query into a persistent class.
I have to connect to some legacy SQL databases through the SQL Gateway and run some queries. I need to insert the rows returned into a class so I can then call %JSONExport to produce a JSON object. I know I can iterate through the result set and insert one row at a time into the class, but I was wondering if there is any other, more direct way of importing the result-set rows into a class.
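The iterate-one-row-at-a-time version I have in mind looks roughly like this (MyApp.Customer is a placeholder class extending %Persistent and %JSON.Adaptor, and LegacySchema.Customer stands in for the linked legacy table); I'm hoping there's something more direct:

set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT Id, Name, City FROM LegacySchema.Customer")
while rs.%Next() {
    set obj = ##class(MyApp.Customer).%New()   // extends %Persistent and %JSON.Adaptor
    set obj.LegacyId = rs.%Get("Id")
    set obj.Name = rs.%Get("Name")
    set obj.City = rs.%Get("City")
    do obj.%Save()
    do obj.%JSONExport()                       // writes this object's JSON to the current device
    write !
}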
How strange this is to me. I'm querying a table against a numeric(15,2) column called max_client_resp_service. When I query max_client_resp_service > 0, or any other number, I get results, but they are not always right; for example, max_client_resp_service > 5 returns values less than 5, e.g.
I'm running into an issue performing UPDATEs that I'm not getting on INSERTs. It's probably obvious, but I'm just not seeing it and could use a little help.
I'm going over an HL7 message and, depending upon varying criteria, the relevant variables get items added to them like the following:
Set patientId = pRequest.GetValueAt("PID:3")
Set sqlColumns = sqlColumns_",patient_id"
Set sqlValues = sqlValues_",?"
Set par($i(p)) = patientId
After compiling the variables, I check to see if the accession number is found in the table.
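For context, after building the lists the INSERT gets executed roughly like this (a simplified sketch of the pattern rather than the actual code; Lab.Result is a placeholder for the real table, and error handling is trimmed):

set cols = $extract(sqlColumns, 2, *)          // drop the leading comma
set vals = $extract(sqlValues, 2, *)
set sql = "INSERT INTO Lab.Result (" _ cols _ ") VALUES (" _ vals _ ")"
set par = p                                    // argument passthrough needs the count in the top node
set rs = ##class(%SQL.Statement).%ExecDirect(, sql, par...)
if rs.%SQLCODE < 0 { write rs.%Message, ! }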