I am trying to create a Cache Trigger on a FACS table using the SQL CREATE TRIGGER command but I am getting the message [SQLCODE: <-300>:<DDL not allowed on this table definition>].
I am not creating a persistent class but using the SQL CREATE TRIGGER command.
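For reference, here is a minimal sketch of the kind of statement I am issuing, run as dynamic SQL. Demo.Facs and Demo.FacsLog are made-up stand-ins, not the actual FACS table definition:

// Sketch only: the table, trigger, and log table names are hypothetical
set ddl = "CREATE TRIGGER Demo_FacsAfterIns AFTER INSERT ON Demo.Facs FOR EACH ROW "
set ddl = ddl_"INSERT INTO Demo.FacsLog (LoggedAt) VALUES (CURRENT_TIMESTAMP)"
set rs = ##class(%SQL.Statement).%ExecDirect(, ddl)
// Against the FACS table this is where I see SQLCODE -300, DDL not allowed on this table definition
if rs.%SQLCODE < 0 { write "SQLCODE ", rs.%SQLCODE, ": ", rs.%Message, ! }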
Hi, we are a veterinary lab and we use both the LAB and FIN systems of Antrim. We are now looking to expose the data in a SQL/Object-compatible way, so we were wondering whether the same or similar things have already been done by other community members. If so, could you please share your approach, experience, and gotchas with us? We are all ears. I can be reached at yang.jiao@antechmail.com . Thank you!
I am pleased to announce the next 2016.2 field test kit, 2016.2.0.595.0.
It may look like a slow week, with fewer than fifty changes checked in, but this kit includes the following fixes to problems found by you, the people running the kits in the field:
Hi, I have used CSP to execute SQL SELECTs in my own namespace. But on our servers we have many SQL Gateway connections.
I'd like to create a CSP page that could execute SQL through these gateway connections; only administrators will use that page to launch SELECTs against many DSNs. I'm not sure whether we must deploy that CSP page in the %SYS namespace, or how to use the DSNs (SQL Gateway connections) that are defined on the server.
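Roughly, this is what I had in mind for the page's server-side logic, using %Library.SQLGatewayConnection; the DSN name, credentials, and query below are placeholders, and the exact API details are my assumption based on the SQL Gateway documentation:

// Placeholder DSN and credentials; run a SELECT through a server-defined SQL Gateway connection
set gc = ##class(%Library.SQLGatewayConnection).%New()
set sc = gc.Connect("MyDSN", "gatewayUser", "gatewayPassword", 0)
if $system.Status.IsError(sc) { do $system.Status.DisplayError(sc) quit }
do gc.AllocateStatement(.hstmt)
do gc.Prepare(hstmt, "SELECT Field1 FROM SomeRemoteTable")
do gc.Execute(hstmt)
for {
    do gc.Fetch(hstmt)
    quit:gc.sqlcode=100                // 100 = no more rows
    do gc.GetData(hstmt, 1, 1, .val)   // column 1 as character data
    write val, !
}
do gc.DropStatement(hstmt)
do gc.Disconnect()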
I have a Service utilising the adapter EnsLib.SQL.InboundAdapter, which uses a Credentials item set with the details of a local SQL account. This currently works; however, we're looking to use the credentials of an AD domain account.
The domain account is a member of an AD security group, which has the required permissions on the source SQL database. I've checked that access is possible with this account via SQL studio.
We have a few queries which are simple SELECTs. For simplicity, let's say there is a query that joins two tables and gets a few columns, and neither table has any indexes.
Select Tab1.Field1, Tab2.Field2 From Table1 Tab1 Join Table2 Tab2 On Tab2.FK = Tab1.PK
When we run the query plan for this, it shows approximately 6 million; however, if we make a simple adjustment to the query
This is happening for all the namespaces. It looks like a permission issue. The same issue occurs with the Data Export wizard. Help resolving this would be appreciated.
I am using
Cache for Windows (x86-64) 2017.2.2 (Build 865_0_18763U)
What I find really useful about IRIS when teaching my subject of post-relational databases is the fact that it is a multi-model database, which means that I can go into architecture, structure, and all that only once, and then show the usage of different models (object, document, hierarchical) using the same language and approach. And it is not a huge leap to go from an object-oriented programming language (like C#, Java, etc.) to an object-oriented database.
However, along with the advantages (which are many) come some drawbacks when we switch from the object-oriented model to the relational one. When I say that you can access the same data using different models, I also need to explain how it is possible to work with lists and arrays from the object model in a relational table. With arrays it is very simple: by default they are represented as separate tables, and that's the end of it. With lists it's harder, because by default a list is projected as a single string. But one still wants to do something about it without damaging the structure and without making the list unreadable in the object model.
So in this article I will showcase a couple of predicates and a function that are useful when working with lists, and not just as fields.
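As a quick taste, here is roughly what I mean, sketched against a hypothetical Demo.Student class whose Courses property is a list collection and therefore projects to SQL as a single $LIST-encoded field:

// Demo.Student is hypothetical: Property Courses As list Of %String;
set sql(1) = "SELECT Name || ': ' || $LISTTOSTRING(Courses, ', ') FROM Demo.Student"
set sql(2) = "SELECT Name FROM Demo.Student WHERE 'Databases' %INLIST Courses"
set sql(3) = "SELECT Name FROM Demo.Student WHERE FOR SOME %ELEMENT(Courses) (%VALUE = 'Databases')"
for i=1:1:3 {
    set rs = ##class(%SQL.Statement).%ExecDirect(, sql(i))
    while rs.%Next() { write rs.%GetData(1), ! }
    write "----", !
}

The first query keeps the list readable for SQL consumers, while the two predicates filter on list membership without touching the stored structure, so the object side stays intact.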
I am trying to work out whether there are any methods available to import a result set returned by a SQL query into a persistent class.
I have to connect to some legacy SQL databases through the SQL Gateway and run some queries. I need to insert the rows returned into a class so that I can then do a %JSONExport to produce a JSON object. I know I can iterate through the result set and insert one row at a time into the class, but I was wondering if there is any other, more direct way of importing the result set rows into a class.
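For context, this is the row-by-row pattern I already know about, sketched with a placeholder class Demo.LegacyRow (a %Persistent class that also extends %JSON.Adaptor, with properties Field1 and Field2) and a placeholder linked table:

// All names here are placeholders
set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT Field1, Field2 FROM Linked.RemoteTable")
while rs.%Next() {
    set obj = ##class(Demo.LegacyRow).%New()
    set obj.Field1 = rs.%Get("Field1")
    set obj.Field2 = rs.%Get("Field2")
    set sc = obj.%Save()
    if $system.Status.IsError(sc) { do $system.Status.DisplayError(sc) continue }
    do obj.%JSONExport()   // writes this row's JSON projection to the current device
    write !
}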
Impedance mismatch is a term commonly used to describe the problem of an object-oriented (OO) application housing its data in legacy relational databases (RDBMS). C++ programmers have dealt with it for years, and it is now a familiar problem to Java and other OO programmers.
FOREIGN TABLES are a rather fresh feature in IRIS (2023.?), so I was motivated to try something new with my own hands. The documentation on Foreign Table from File is a good starting point. The related promotional video is also fine to start with.
For volatile tables (tables with many INSERTs and DELETEs), storage for bitmap indexes can become inefficient over time.
For example, suppose there are thousands of rows of data with the following definition, and that the data is retained for a certain period of time and then bulk-deleted with TRUNCATE TABLE, with this cycle repeated over and over.
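As a hypothetical stand-in for such a definition (not the actual class from this article), picture a persistent class like the following, whose table is bulk-deleted with TRUNCATE TABLE Demo.Volatile at regular intervals:

Class Demo.Volatile Extends %Persistent
{

Property Region As %Integer;

Property CreatedAt As %TimeStamp;

Index RegionIdx On Region [ Type = bitmap ];

}

After enough of these cycles, the bitmap chunks behind RegionIdx can become sparse; %SYS.Maint.Bitmap is the utility typically used to compact them (the exact arguments here are an assumption, so check the class reference): do ##class(%SYS.Maint.Bitmap).OneClass("Demo.Volatile", 0, 1)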
I've set up an ODBC connection so I can access Cache data within SQL Server.
I want to be able to write SQL queries for internal monitoring purposes, similar to what's possible with SQL Server. Specifically I want to be able to check mirroring status (i.e. check which is the current primary mirror member), check the status of any Ensemble productions (started/stopped), check the status of business hosts etc. I want to do all of this from SQL Server to go with our other system monitoring solutions.
https://www.youtube.com/embed/owUeiGtLixE
https://www.youtube.com/embed/S7v4zDn0N1c
https://www.youtube.com/embed/YxpMc3sODxk
Pandas is not just a popular software library. It is a cornerstone in the Python data analysis landscape. Renowned for its simplicity and power, it offers a variety of data structures and functions that are instrumental in transforming the complexity of data preparation and analysis into a more manageable form. It is particularly relevant in such specialized environments as ObjectScript for Key Performance Indicators (KPIs) and reporting, especially within the framework of the InterSystems IRIS platform, a leading data management and analysis solution.
I have created a view to stage some data in a different format, and I now want to reference that view in a SQL query against a table, filtering the data from the view using a property of the table.
Example:
select MsgId, FileName, (select ReportName from custom_view where MsgId = ReportId ) as ReportName from main_table
Is this even possible? When I try this, I get a "table not found" error for the view.
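For reference, here is a sketch of the pattern I am attempting, with every name replaced by a placeholder and the view referenced by its full schema-qualified name:

// Placeholder names throughout
set ddl = "CREATE VIEW Custom.ReportView (ReportId, ReportName) AS "
set ddl = ddl_"SELECT MsgId, SomeNameColumn FROM Custom.SourceTable"
do ##class(%SQL.Statement).%ExecDirect(, ddl)

set q = "SELECT m.MsgId, m.FileName, "
set q = q_"(SELECT v.ReportName FROM Custom.ReportView v WHERE v.ReportId = m.MsgId) AS ReportName "
set q = q_"FROM Main.MainTable m"
set rs = ##class(%SQL.Statement).%ExecDirect(, q)
while rs.%Next() { write rs.%Get("MsgId"), " ", rs.%Get("ReportName"), ! }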
In Cache 2018, we were using a macro in a query that looked like this:
select $$GetExtraSQL^GetExtra('B',bddtl.odnumb,bddtl.odsnum,bddtl.oddsc1) as "Description", * from sqluser.bddtl
We could save that query as a view, and there was no problem with it.
In IRIS, if we put that query into SQL in the management portal, it still works, but if we save that query as a view, when we try to run a query on that view, we get a big error message:
If you are a customer of the new InterSystems IRIS® Cloud SQL and InterSystems IRIS® Cloud IntegratedML® cloud offerings and want access to the metrics of your deployments and send them to your own Observability platform, here is a quick and dirty way to get it done by sending the metrics to Google Cloud Platform Monitoring (formerly StackDriver).
Why, when I use SQL on Cache with a condition BETWEEN "the expression" AND "the expression", does it not permit me to delete all the IDs of this data class?
I would like to capture any NACKs that are sent back to the Operation. The Operation is already set up to "Save Replies/IndexNotOK's", but I would like to see if we can query Cache and pull those NACKs into an extract.