Hi!

I was trying to create a query that can be exposed as a stored procedure (a function, actually) that would return a result set with a variable number of columns.

Unfortunately, it seems that unless I specify the ROWSPEC parameter on the query, I won't get any columns exposed. I was hoping to implement the QueryNameGetInfo method and specify the names and number of columns I would be returning dynamically, but it seems that the GetInfo information is simply ignored.

Here is my code:
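
(Reduced to a sketch; the class, query, and column names below are placeholders rather than the real ones.)

Class Demo.DynamicQuery Extends %RegisteredObject
{

/// No ROWSPEC on purpose; the hope was that DynColsGetInfo below would supply the column metadata
Query DynCols() As %Query [ SqlProc ]
{
}

ClassMethod DynColsExecute(ByRef qHandle As %Binary) As %Status
{
    set qHandle=0
    quit $$$OK
}

ClassMethod DynColsFetch(ByRef qHandle As %Binary, ByRef Row As %List, ByRef AtEnd As %Integer = 0) As %Status
{
    if qHandle>=3 {
        set AtEnd=1,Row=""
        quit $$$OK
    }
    set qHandle=qHandle+1
    // one $List item per column reported by DynColsGetInfo
    set Row=$ListBuild("value"_qHandle,qHandle)
    quit $$$OK
}

ClassMethod DynColsClose(ByRef qHandle As %Binary) As %Status
{
    quit $$$OK
}

/// Build the column metadata at run time instead of in a static ROWSPEC
ClassMethod DynColsGetInfo(ByRef colinfo As %List, ByRef parminfo As %List, ByRef idinfo As %List, ByRef qHandle As %Binary, extoption As %Integer = 0, ByRef extinfo As %List) As %Status
{
    // each $List element describes one column (here just its name)
    set colinfo=$ListBuild($ListBuild("ColA"),$ListBuild("ColB"))
    set parminfo="",idinfo=""
    quit $$$OK
}

}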

Hi! I am working on a Java project (Spring Boot + Maven + Hibernate) that uses JPA/Hibernate to manage persistence with the IRIS database from the Docker image (store/intersystems/iris:2019.2.0.107.0-community). I've found an issue with the IRIS instance: when I define tables with OneToMany, ManyToOne, or ManyToMany relationships and I try to fetch all the rows of the tables using the default findAll method (JpaRepository provides that method out of the box), the query automatically exceeds the limit of available licenses.

I defined the property in Caché as follows:

Property Content As %Stream.GlobalCharacter [ Required, SqlColumnNumber = 3, SqlFieldName = B_Content ];

And then use the SQL statement to add:

insert into table_name (B_Content) values ('very long string');
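
Driving the same insert from ObjectScript, with the long value bound as a parameter rather than written inline, would look roughly like this (table and field names as above):

set longvalue = "very long string"
set rs = ##class(%SQL.Statement).%ExecDirect(, "insert into table_name (B_Content) values (?)", longvalue)
if rs.%SQLCODE<0 { write rs.%SQLCODE,": ",rs.%Message,! }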

But I get an error.

Hi everyone,

I have a problem with the ExecuteUpdateParmArray() method of the EnsLib.SQL.OutboundAdapter adapter when I try to update a SQL Server 2008 table over a JDBC connection.
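
For context, the parameter array is built more or less like this (the statement, table, and values below are simplified placeholders):

// EnsLib.SQL parameter-array convention, as I understand it: top node = parameter count,
// param(i) = value, param(i,"SqlType") = ODBC type code (12 = VARCHAR, 4 = INTEGER)
set sqlUpdate = "UPDATE MyTable SET SomeCol = ? WHERE Id = ?"
set param = 2
set param(1) = "new value", param(1,"SqlType") = 12
set param(2) = 42, param(2,"SqlType") = 4

and then the call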

..Adapter.ExecuteUpdateParmArray(.nRows,sqlUpdate,.param)

returns an error.

Hi

I created a JDBC connection in Caché 2010.2.3 to SQL Server 2008 R2.

The connection to this DB works correctly (connection successful).

I try to perform table binding, but this connection, even though it is successful, does not load my tables and schemas.

I did the same test on Caché 2015, with the same JAR driver files, and it works perfectly!

Any idea?

Caché 2010.2.3

Red Hat Enterprise Linux Server release 6.2 (Santiago)

java version "1.7.0_09"

Caché 2015

Connect PRTG to Cache

I am trying to set up a sensor in PRTG to connect to Cache, specifically the ens_util.log, so that I can have a live feed of my error count. I am having trouble getting the sensor to log in to Cache. Has anyone had any luck getting PRTG to connect on the database level? Thanks!

Without installing Kerberos, has anyone authenticated a SQL JDBC connection? Currently we are using local SQL accounts to sign on to external SQL databases, but we are being told that we need to switch to service accounts that live on an Active Directory domain.

With a little help, I wrote a ZAUTHENTICATE routine to do the authentication for Ensemble. Can I use something like that to connect to an external SQL database using a service account on an Active Directory domain?

Thanks

Scott

I want to INSERT a record into a database using JDBC from ObjectScript. At the same time, I want to obtain the insert ID. Is there a way to achieve this using the SQL Outbound adapter?

My code is something like this now:


Property Adapter As EnsLib.SQL.OutboundAdapter;

set sql = " INSERT INTO Prenotazioni_CUP "_
" (ID, cf
" VALUES (SEQTAB.NextVal, ?) "
set status = ..Adapter.ExecuteUpdate(.rs, sql, pRequest.cfAssistito)

Hi,

When I use the JDBC driver to query column info, the "REMARKS" field always shows the same value as the "COLUMN_NAME" field.

When I use the SQL "select * from INFORMATION_SCHEMA.columns a where a.table_name='some table name'" to query column info, there is a "DESCRIPTION" field whose value is a comment on the current field.
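
As far as I can tell, those DESCRIPTION values come from the class definition, i.e. the /// comment on a property, e.g. something like:

Class Demo.MyTable Extends %Persistent
{

/// This comment becomes the property's Description, which is what
/// shows up in the DESCRIPTION column of INFORMATION_SCHEMA.columns
Property FullName As %String(MAXLEN = 100) [ SqlFieldName = full_name ];

}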

So is there any way to return the description in the JDBC REMARKS field? And is there any way to update the description or REMARKS field? Or is there some other way that I don't know of to manage column comment info?

Hi all,

I'm trying to use LOAD DATA to insert 11k (11,377) rows of data. LOAD BULK DATA is not available for the version of IRIS I am using.
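
What I'm running is roughly this (the file and table names are placeholders):

set rs = ##class(%SQL.Statement).%ExecDirect(, "LOAD DATA FROM FILE '/data/rows.csv' INTO MySchema.MyTable")
write "SQLCODE: ",rs.%SQLCODE,", rows: ",rs.%ROWCOUNT,!

// afterwards, checking the diagnostics the docs point to
set diag = ##class(%SQL.Statement).%ExecDirect(, "SELECT * FROM %SQL_Diag.Result")
do diag.%Display()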

After calling LOAD DATA, it says only 5,500 rows have been inserted. The LOAD DATA docs say any error rows are skipped and that a count of skipped rows can be found in %SQL_Diag.Result; however, there are no results there. There are no errors in the xDBC error log either.

Why have over half the rows been skipped?

Once a week we attempt to load an XML file from Workday into an MS SQL table using JDBC and stored procedures. There are approximately 102,999 records in this XML file. We are struggling to process the entire file within a reasonable amount of time. We feed the XML through a BPL to populate values for a stored procedure, then call the stored procedure through a Business Operation. I have tried splitting the work across two Business Operation calls, but we still see an issue loading the XML into MS SQL.
