When I use escape logic while inserting into or updating an Oracle table, I get a max-length exceeded error. With the original value the length is fine, but after I add the escape logic the value becomes longer than the maximum length. The original value was "I visited O'Brien before heading out of town." and after adding escape logic it became "I visited O''Brien before heading out of town." The max length is 45.
INSERT INTO MyText
(text)
VALUES
('I visited O''Brien before heading out of town.')
/\
right here (pointing at the doubled quote in O''Brien)
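Note that the doubled quote only exists in the SQL literal; the value Oracle would actually store is still the 45-character original, so the max-length check is presumably being applied to the escaped string rather than to the raw value. If the statement can be issued with a bind parameter instead of a hand-built literal, no escaping is needed at all and any length check sees the raw 45 characters. A minimal sketch, assuming the INSERT goes out through an EnsLib.SQL.OutboundAdapter pointed at the Oracle DSN (an assumption; the post doesn't say how the statement is executed):

// Sketch only: a business operation method that binds the raw value.
// The driver quotes bound parameters itself, so no '' doubling is needed.
Method InsertText(pValue As %String) As %Status
{
    Set tSQL = "INSERT INTO MyText (text) VALUES (?)"
    Quit ..Adapter.ExecuteUpdate(.tRows, tSQL, pValue)
}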
How do I add the following to a stored procedure (Caché Studio)?
1. Select from a few tables and insert the results into TableA.
2. Select data from TableA, apply some SQL logic, and insert the results into TableB.
3. Return SELECT * FROM TableA UNION ALL SELECT * FROM TableB.
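One way to structure this, as a rough sketch (the source tables, columns, and the step-2 logic below are placeholders), is a class method projected as an SQL stored procedure: it runs the two INSERT...SELECT statements with embedded SQL and then hands the combined rows back through the procedure context:

ClassMethod BuildAndReturn() As %Integer [ ReturnResultsets, SqlName = BuildAndReturn, SqlProc ]
{
    // Step 1: populate TableA from the source tables (join and columns are placeholders).
    &sql(INSERT INTO TableA (Col1, Col2)
         SELECT s1.Col1, s2.Col2
         FROM Source1 s1 JOIN Source2 s2 ON s1.Id = s2.Id)

    // Step 2: derive TableB from TableA (put the "SQL logic" in this SELECT).
    &sql(INSERT INTO TableB (Col1, Col2)
         SELECT Col1, Col2 FROM TableA WHERE Col2 IS NOT NULL)

    // Step 3: return the combined rows to the caller as a result set.
    Set tRS = ##class(%SQL.Statement).%ExecDirect(,
        "SELECT * FROM TableA UNION ALL SELECT * FROM TableB")
    Do %sqlcontext.AddResultSet(tRS)
    Quit 1
}

Once compiled, it should be callable from SQL as CALL <package>.BuildAndReturn().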
I know that you can use Do $SYSTEM.SQL.Schema.ImportDDL() to import SQL files into IRIS, but is there a way to load .sqlite files into IRIS? I have about 20 .sqlite files that I need to get into my database. I tried the ImportDDL method, but it reported "SKIPPING non-SQL SOURCE:".
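One approach to sketch (not tested against these particular files): ImportDDL only understands DDL/DML text, so the binary .sqlite files would need to be dumped to SQL first, for example with the sqlite3 command line (sqlite3 mydb.sqlite .dump > mydb.sql), and the dump may need some editing since SQLite DDL isn't fully IRIS-compatible. The file paths, log file, and dialect argument below are assumptions:

// Sketch: import one dumped file; loop over the ~20 files as needed.
// Prerequisite (outside IRIS): sqlite3 mydb.sqlite .dump > /data/mydb.sql
Set tSC = $SYSTEM.SQL.Schema.ImportDDL("/data/mydb.sql", "/data/mydb_import.log", "IRIS")
If $SYSTEM.Status.IsError(tSC) { Do $SYSTEM.Status.DisplayError(tSC) }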
I'm trying to use the EnsLib.SQL.Operation.GenericOperation component in a production to read a column from a Redshift table that is defined as VARCHAR(65535), and I'm getting the following error.
An error was received : ERROR #5023: Remote Gateway Error: JDBC Gateway getClob(0,1) errorRemote JDBC error: Cannot convert the column of type VARCHAR to requested type long..
I need to make an ODBC connection to fetch data from SQL Server into IRIS 2021.1. Once we are able to query the data from SQL Server, we need to pass it into an Ensemble production.
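A common pattern for this (sketched here with assumed names; the DSN, query, and target are placeholders) is a business service whose adapter is EnsLib.SQL.InboundAdapter: the adapter polls SQL Server over the ODBC DSN and hands each row to OnProcessInput, from which the data is sent into the production:

Class Demo.SQLServer.Service Extends Ens.BusinessService
{
Parameter ADAPTER = "EnsLib.SQL.InboundAdapter";

// Settings on the production item (example values):
//   DSN          = MySQLServerDSN   (the ODBC DSN defined on the server)
//   Query        = SELECT ID, Name FROM dbo.Patients WHERE Processed = 0
//   KeyFieldName = ID               (so rows aren't re-delivered)

Method OnProcessInput(pInput As EnsLib.SQL.Snapshot, Output pOutput As %RegisteredObject) As %Status
{
    // Each call delivers one row; pull columns by name and forward them.
    Set tMsg = ##class(Ens.StringContainer).%New(pInput.Get("Name"))
    Quit ..SendRequestAsync("TargetProcess", tMsg)
}
}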
I am trying to embed dynamic SQL in ObjectScript code, but %Next() returns 0. However, when I copy the same query and run it directly in SQL, it gives me the result I want. The code:
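The original snippet isn't reproduced above, but when %Next() returns 0 even though the same query returns rows when run directly, it is usually worth checking the %Prepare status, the result's %SQLCODE, and whether the ObjectScript variables being concatenated or bound actually contain the expected values. A minimal sketch of the usual pattern (table and column names are placeholders):

Set tStatement = ##class(%SQL.Statement).%New()
// Bind values with ? rather than concatenating them into the query text.
Set tSC = tStatement.%Prepare("SELECT Name FROM Demo.Person WHERE City = ?")
If $SYSTEM.Status.IsError(tSC) {
    Do $SYSTEM.Status.DisplayError(tSC)
} Else {
    Set tResult = tStatement.%Execute("Boston")
    // %SQLCODE < 0 is an SQL error; %Next() returning 0 with %SQLCODE 100 just means no rows matched.
    If tResult.%SQLCODE < 0 {
        Write "SQL error ",tResult.%SQLCODE,": ",tResult.%Message,!
    } Else {
        While tResult.%Next() { Write tResult.%Get("Name"),! }
    }
}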
I'm using EnsLib.SQL.OutboundAdapter in my InterSystems Ensemble operation to execute an INSERT statement against SQL Server (Management Studio 20). While inline queries (e.g., constructing the query string with _ concatenation) work fine, I'm having trouble getting parameterized queries to work.
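For reference, the pattern that generally works with this adapter (a sketch; the table, columns, and request message class are placeholders) is to put ? markers in the statement and pass the values as trailing arguments to ExecuteUpdate, rather than concatenating them into the string:

Method OnMessage(pRequest As Demo.PatientRequest, Output pResponse As Ens.Response) As %Status
{
    Set tSQL = "INSERT INTO dbo.Patients (MRN, LastName) VALUES (?, ?)"
    // Each ? is bound positionally from the trailing arguments.
    Set tSC = ..Adapter.ExecuteUpdate(.tRowsAffected, tSQL, pRequest.MRN, pRequest.LastName)
    If $SYSTEM.Status.IsError(tSC) Quit tSC
    Set pResponse = ##class(Ens.Response).%New()
    Quit $$$OK
}

If the ODBC parameter types need to be controlled explicitly, I believe the adapter also offers ExecuteUpdateParmArray(), which takes a parameter array instead of positional arguments.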
I have a new SQL class that displays entries based on the first two nodes of a global. I found out that the class allows for additional node(s) to be inserted in the "User Specification Node:" along with the delimiter and the piece in the NewStorage Map1 for "ModifyDDDD" shown below. It is not working.
The goal is to get data (from five hundred to 3-4 thousand rows) from the DB, calculate the standard deviation, and then use it as a logical condition in Analyzer.
For example: IF std > custom_value THEN show_the_result ELSE null.
There is a STDDEV MDX function used in Analyzer, but it is a measure and cannot be used as a logical condition (correct me if I am wrong).
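One workaround worth sketching, assuming the underlying data is reachable with SQL (the table and column names below are placeholders, and IRIS SQL's STDDEV aggregate is assumed to be available): compute the standard deviation in ObjectScript and apply the threshold there, rather than trying to express the condition in MDX:

ClassMethod ShowIfVolatile(pThreshold As %Numeric) As %Status
{
    // STDDEV over the rows of interest (500 to ~4000 rows is cheap for SQL).
    Set tResult = ##class(%SQL.Statement).%ExecDirect(,
        "SELECT STDDEV(Reading) AS Sd FROM Demo.Measurements")
    If tResult.%SQLCODE < 0 Quit $$$ERROR($$$GeneralError, tResult.%Message)
    Do tResult.%Next()
    Set tSd = tResult.%Get("Sd")
    // The "IF std > custom_value THEN show_the_result ELSE null" logic:
    If tSd > pThreshold {
        Write "std=",tSd," exceeds ",pThreshold,": show the result",!
    }
    Quit $$$OK
}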
I tried executing a SQL JSON_TABLE query with a large JSON string (more than 200,000 characters) and got the error below. I'm curious about the under-the-hood workflow and how it reaches MAXSTRING.
I have a business service which is responsible for some batch operations on an SQL table. The process is generally slow, but it is possible to scale the performance using multithreading and/or parallel processing and logical partitioning (Postgres):
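The snippet itself isn't reproduced above, but as a rough illustration of the logical-partitioning idea on the IRIS side (the class, table, and column names are all assumptions), each worker can be handed a key range and the ranges fanned out with the work queue manager:

ClassMethod ProcessInParallel(pChunk As %Integer = 1000) As %Status
{
    // Find the key range, split it into chunks, and queue one worker per chunk.
    Set tRow = ##class(%SQL.Statement).%ExecDirect(,
        "SELECT MIN(ID) AS MinId, MAX(ID) AS MaxId FROM Demo.BatchTable")
    Do tRow.%Next()
    Set tQueue = $SYSTEM.WorkMgr.%New()
    For tStart = tRow.%Get("MinId") : pChunk : tRow.%Get("MaxId") {
        Set tSC = tQueue.Queue("##class(Demo.Batch).ProcessRange", tStart, tStart + pChunk - 1)
        If $SYSTEM.Status.IsError(tSC) Quit
    }
    // Block until every queued range has been processed.
    Quit tQueue.WaitForComplete()
}

ClassMethod ProcessRange(pFrom As %Integer, pTo As %Integer) As %Status
{
    // Each worker touches only its own partition of the table.
    &sql(UPDATE Demo.BatchTable SET Processed = 1 WHERE ID BETWEEN :pFrom AND :pTo)
    Quit $$$OK
}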
I have a little problem: I need to display the data based on input status (first time and after).
This is the data:
And this is what I need to display:
The notes become "new" because it is the first time the data has been entered, and they become "old" if we already have the data (for the 2nd and 3rd entries the notes become "old").
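Without the sample data it is hard to be exact, but the usual SQL shape for this (a sketch; the table name and the key/date columns are guesses) is a CASE that marks a row 'new' when no earlier row exists for the same key, and 'old' otherwise:

// Sketch: label each row 'new' on its first appearance for a key, 'old' afterwards.
Set tResult = ##class(%SQL.Statement).%ExecDirect(,
    "SELECT t.KeyValue, t.InputDate, "_
    "       CASE WHEN NOT EXISTS (SELECT 1 FROM Demo.Input p "_
    "                             WHERE p.KeyValue = t.KeyValue "_
    "                               AND p.InputDate < t.InputDate) "_
    "            THEN 'new' ELSE 'old' END AS Notes "_
    "FROM Demo.Input t ORDER BY t.KeyValue, t.InputDate")
Do tResult.%Display()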