I have several BPLs that act like SQL Server Integration Services (SSIS) packages in that they transfer data. When I do this, everything is written under one session ID. Is it possible to make the BPL create multiple session IDs as it cycles through a Snapshot, so the trace is easier to read?
OK, so I am way outside of my comfort zone and had to build an application using CSP to give users the ability to access SQL configuration tables. These configuration tables affect the data that is sent to the downstream system.
I saw in the examples that we are able to import GIFs/images into the CSP folders to reference in our CSP pages. My question is: how do you do that? If I try to import one through Studio, it tells me the file is invalid.
Just trying to make it a little more user-friendly than blocks on a page.
I've checked the syntax with some public tools after adjusting the table name "Fehlermeldung" -> `Fehlermeldung`. It seems to be valid, but Caché does not accept it.
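In case it matters for the question: what I actually tried looks roughly like the sketch below. As far as I can tell, Caché SQL delimits identifiers with double quotes (with delimited identifier support enabled, which I believe is the default) rather than MySQL-style backticks.

```
 // Minimal sketch of the quoting I tried; only the table name "Fehlermeldung"
 // comes from my real schema, everything else is illustrative.
 Set sql = "SELECT TOP 5 * FROM ""Fehlermeldung"""
 Set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
 If rs.%SQLCODE < 0 {
     Write "SQLCODE ", rs.%SQLCODE, ": ", rs.%Message, !
 } Else {
     Do rs.%Display()
 }
```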
I have been trying for a while to pull data through a linked server in SSMS from an InterSystems Caché database; this is to enable us to join the data to other source systems in our data warehouse.
I have set up an ODBC connection and a linked server to the database and can execute queries through OPENQUERY in Management Studio, but the data is huge (> 100 million rows), so when I execute a SQL query with a WHERE clause the query just spins.
We are trying to convert some of our SQL Server Integration Services jobs from Visual Studio to Ensemble. If we execute a stored procedure within SQL Server Management Studio, it returns approximately 12,000 rows. However, when Ensemble executes the same stored procedure, it returns only 250 rows.
Has anyone found an Eclipse plug-in that provides the ability to connect to a Caché server and gives the user a way to write SQL queries against the tables on that server? I'm picturing something like a WinSQL-style client built as an Eclipse plug-in.
I've found and tried the following, but I couldn't get it to connect to my local Caché instance.
I have a Recordset object which contains data from a table "XYZ".
Currently I use this object to extract data with %Get(COL1, COL2, ...) in a loop and then pass it to a function that inserts the data into another, dynamically created table "ABC" for each record. This takes a lot of time when there are hundreds of records.
Is there a way I can copy a RecordSet directly into a dynamic table without looping through it?
Something like: copy RecordSet (COL1, COL2, ...) --> "ABC"
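The kind of set-based copy I have in mind would look something like the sketch below, assuming both XYZ and ABC are projected as SQL tables with compatible columns (the names are just the placeholders from above):

```
 // One set-based statement instead of a %Get()/insert loop per record.
 // Table and column names are the placeholders used in the question.
 Set sql = "INSERT INTO ABC (COL1, COL2) SELECT COL1, COL2 FROM XYZ"
 Set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
 If rs.%SQLCODE < 0 {
     Write "Copy failed: ", rs.%Message, !
 } Else {
     Write rs.%ROWCOUNT, " rows copied", !
 }
```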
I am trying to return the maximum of the values of two fields, LastViewed and LastDownloaded, AS a local variable, LastAccessed, for each row using a SQL query. These values are stored in $H format. Is there an existing SQL function that compares two column values? I could not find one, so I tried using a $SELECT statement and got an error that said: A term expected, beginning with either of: identifier, constant, aggregate, $$, (, :, +...
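For reference, the closest I have come so far is a CASE expression, which seems to do the per-row comparison; a rough sketch (MyApp.Document is a made-up table name, and I'm assuming the stored $H values compare correctly as-is; newer versions may also offer a GREATEST function):

```
 // Pick the larger of the two columns row by row with a CASE expression.
 Set sql = "SELECT ID, CASE WHEN LastDownloaded IS NULL OR LastViewed >= LastDownloaded"
 Set sql = sql_" THEN LastViewed ELSE LastDownloaded END AS LastAccessed FROM MyApp.Document"
 Set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
 While rs.%Next() { Write rs.ID, ": ", rs.LastAccessed, ! }
```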
I have a class that has a property called Tags (like DescriptiveWords, but tags), where multiple tags are possible. I am trying to decide between a list Of objects and an array Of objects.
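For reference, these are the two declarations I am weighing (a sketch with a made-up class name, shown with %String elements to keep it self-contained; the same choice applies with object-valued elements):

```
Class Demo.Tagged Extends %Persistent
{

/// list Of: an ordered collection addressed by position, e.g.
///   Do obj.Tags.Insert("sql")   Write obj.Tags.GetAt(1)
Property Tags As list Of %String;

// The keyed alternative (only one of the two declarations would be used):
// array Of is addressed by a key you choose, e.g.
//   Do obj.Tags.SetAt("", "sql")   Write obj.Tags.IsDefined("sql")
// Property Tags As array Of %String;

}
```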
Most of my classes are mapped from globals. I want to access the Caché classes from BI software through an ODBC connection.
'Last update' information does not exist in most of the classes. My question is whether there is a 'last update' timestamp that is automatically generated for each row of a class, which I could extract to external systems.
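If there is no built-in timestamp, what I am considering is adding one to each class myself, roughly like the sketch below (class name made up). I realize that for classes mapped from globals the new property would also need a slot in the storage map, and that this object callback would not fire for pure SQL updates, which would need a trigger instead.

```
Class Demo.Mapped Extends %Persistent
{

Property LastUpdated As %TimeStamp;

/// Stamp the row on every object save (insert or update).
/// Note: $ZTIMESTAMP is UTC; $HOROLOG could be used for local server time.
Method %OnBeforeSave(insert As %Boolean) As %Status [ Private, ServerOnly = 1 ]
{
    Set ..LastUpdated = $ZDateTime($ZTimestamp, 3, 1, 3)
    Quit $$$OK
}

}
```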
I am trying to create an ODBC-accessible class that includes all of these as records, but I don't see how I can, since the first record has three subscripts and the rest have four:
Hello, I need to use IRIS to connect to an MS SQL Server database. It has to be done via ODBC; I can't use JDBC at this time, per the client's decision.
I am trying to use the Microsoft driver libmsodbcsql-13.1.so.9.2.
But I can't; my attempts result in: Connection failed. SQLState: () NativeError: [11001] Message:
I have done all the DSN configuration, and my configuration is listed in SQL Gateway Connections. I know the DSN itself works, because when I run a test with isql it connects to the database.
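To narrow down whether the failure is specific to the IRIS process rather than the shell where isql runs, I have been testing the gateway connection from within IRIS with something like the sketch below (DSN name, user, and password are placeholders; I'm assuming the %SQLGatewayConnection API here, as I understand it from the ODBC gateway documentation):

```
 // Quick connectivity check through the ODBC SQL Gateway from inside IRIS.
 // "MSSQL_DSN", "user" and "password" stand in for the real DSN entry.
 Set gc = ##class(%SQLGatewayConnection).%New()
 Set sc = gc.Connect("MSSQL_DSN", "user", "password", 0)
 If 'sc {
     Do $System.Status.DisplayError(sc)
 } Else {
     Write "Connected", !
     Do gc.Disconnect()
 }
```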
I would like to start a discussion regarding Caché Objects and Caché SQL.
It is my understanding that the creators of Caché Objects see Caché SQL as the reporting arm of Objects and as such SQL is essential to Caché Objects.
I once met a Caché Objects programmer who was writing code to $Order through the Globals because that person thought that Caché SQL was too slow and inefficient. I attempted to convince the person otherwise.
So, what say you? Is SQL essential to Caché Objects?
I'm looking to find out whether there is a data type conversion equivalent in ObjectScript to the SQL CONVERT function. I have a VarBinary string coming in from a source application (which is really performing a SQL dump). The source application uses the standard SQL CONVERT function to convert from varchar to varbinary on its side.
I know &sql(CONVERT()) should work in ObjectScript, but I am wondering if there is a better way of doing this.
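One consideration while weighing the options: ObjectScript strings are already counted byte strings, so a varbinary value can usually be held and passed around as-is without any cast. A small illustration:

```
 // Binary data survives assignment and concatenation unchanged; no
 // varchar/varbinary cast is needed just to store or move the value.
 Set bin = $Char(0, 1, 2, 255) _ "plain bytes"
 Write $Length(bin), !      // 15 - the embedded null and high bytes count too
 Write $Ascii(bin, 1), !    // 0  - the leading null byte is preserved
```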
I am getting data in via a flat file (Record Map), then using a data transformation to convert this data to SDA3.
I would like to work with the tables from Caché in SQL Developer.
Is this possible with a JDBC driver?
What is the whole process to get this working in SQL Developer? I have the access information for Caché, but SQL Developer only lets me choose the Oracle or MySQL database types, so I think I have to install some other tool.
I've been accessing Caché tables from a developer/reporting side, but am now involved in a project to create a data warehouse for our application. I'm trying to find a query I can use to return the sizes of all the tables in the database, so we can identify the largest tables and handle those individually. Can someone give me a query I can run against our Caché database to return the sizes of all the tables from largest to smallest?
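What I have pieced together so far is that sizes live at the global level rather than in an SQL system table, so my starting point has been mapping each class/table to its data global and then measuring those globals (for example with the ^%GSIZE utility). A rough sketch of that first step, assuming default storage (mapped or custom storage may point elsewhere):

```
 // List each compiled class alongside the global holding its data, as a
 // starting point for sizing; the globals themselves can then be measured
 // (e.g. with ^%GSIZE) to rank the tables by size.
 Set sql = "SELECT parent AS ClassName, DataLocation FROM %Dictionary.CompiledStorage"
 Set sql = sql_" WHERE DataLocation IS NOT NULL"
 Set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
 While rs.%Next() {
     Write rs.ClassName, " -> ", rs.DataLocation, !
 }
```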