I am under the impression that inside HealthShare you can have Services, Processes, and Operations. The service takes in incoming data, the processes can transform that data, and the operations can then send that data out.

I am curious whether anyone has any experience, guides, or advice on how to send data to a service from a Java application. I intend for the data to be in XML format. I am also curious how, if I succeed at picking up the data in the service, I can send data back to the Java application using an Operation.
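For context, this is roughly what I picture on the Java side: POSTing the XML over HTTP to a Business Service that is exposed on an HTTP endpoint (the host, port, path, and payload below are made up, and this assumes an HTTP-capable service such as EnsLib.HTTP.GenericService is configured; the Operation's reply could then come back in the HTTP response body):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SendXmlToService {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint: assumes an HTTP-based Business Service is
        // configured and reachable at this URL.
        String endpoint = "http://healthshare-host:57772/csp/myns/MyApp.InboundService.cls";
        String xml = "<Patient><Id>123</Id><Name>Test</Name></Patient>";

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(endpoint))
                .header("Content-Type", "application/xml")
                .POST(HttpRequest.BodyPublishers.ofString(xml))
                .build();

        // Whatever the production sends back over HTTP arrives here as the body.
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Status: " + response.statusCode());
        System.out.println("Body:   " + response.body());
    }
}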


Hi Caché team, I need to list all the user-defined schemas present in my Caché database, as well as the user-defined tables and views and the columns of those tables and views, through queries, so that I can write some JDBC code to run the queries and fetch this metadata. Any help is appreciated.

Thanks in Advance,

Kranthi kiran.
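For illustration, this is the kind of JDBC code I have in mind; the standard java.sql.DatabaseMetaData calls (getSchemas, getTables, getColumns) may already cover it, with the Caché JDBC jar on the classpath (the URL, namespace, and credentials below are placeholders):

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class CacheMetadataDump {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- adjust host, port, namespace, credentials.
        String url = "jdbc:Cache://localhost:1972/USER";
        try (Connection conn = DriverManager.getConnection(url, "_SYSTEM", "SYS")) {
            DatabaseMetaData md = conn.getMetaData();

            // Tables and views; system schemas can be filtered out by name.
            try (ResultSet tables = md.getTables(null, null, "%", new String[] {"TABLE", "VIEW"})) {
                while (tables.next()) {
                    String schema = tables.getString("TABLE_SCHEM");
                    String name = tables.getString("TABLE_NAME");
                    String type = tables.getString("TABLE_TYPE");
                    System.out.println(type + ": " + schema + "." + name);

                    // Columns of each table/view.
                    try (ResultSet cols = md.getColumns(null, schema, name, "%")) {
                        while (cols.next()) {
                            System.out.println("    " + cols.getString("COLUMN_NAME")
                                    + " " + cols.getString("TYPE_NAME"));
                        }
                    }
                }
            }
        }
    }
}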

Question
· Apr 24, 2017
Connect PRTG to Cache

I am trying to set up a sensor in PRTG to connect to Caché, specifically to the Ens_Util.Log event log, so that I can have a live feed of my error count. I am having trouble getting the sensor to log in to Caché. Has anyone had any luck getting PRTG to connect at the database level? Thanks!
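If it helps to show what I am after: a small JDBC program like the sketch below could count the errors and be wrapped in a PRTG EXE/Script sensor as a fallback if the built-in database sensors won't authenticate. The connection details and the Type filter are assumptions on my part:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Prints a single numeric value that a PRTG EXE/Script sensor can graph.
public class EnsErrorCount {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details.
        String url = "jdbc:Cache://ensemble-host:1972/MYNAMESPACE";
        try (Connection conn = DriverManager.getConnection(url, "monitor_user", "password");
             Statement stmt = conn.createStatement();
             // The Type filter may need adjusting (logical vs. display values) --
             // check what SELECT DISTINCT Type FROM Ens_Util.Log returns first.
             ResultSet rs = stmt.executeQuery(
                     "SELECT COUNT(*) AS ErrCount FROM Ens_Util.Log WHERE Type = 'Error'")) {
            rs.next();
            System.out.println(rs.getInt("ErrCount"));
        }
    }
}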


Consider a Natural Key with an Identity (Serial) field.

I cannot seem to acquire the generated value after persisting my entity. That is, the entity returned by Spring Data's save() does not have the generated value.

The value is generated by the database, and I can query it after repository.save(entity).

I have done some testing and created a GitHub repo with it...
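One thing I have not tried yet is explicitly marking the SERIAL column as database-generated so that Hibernate re-reads it after the INSERT. A sketch of that idea, assuming Hibernate 5 annotations (the entity and column names are made up):

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.annotations.Generated;
import org.hibernate.annotations.GenerationTime;

// Hypothetical entity: the natural key is the @Id, and "serialNo" is the
// database-generated SERIAL column that should be re-read after INSERT.
@Entity
public class Document {

    @Id
    @Column(name = "DocumentCode")
    private String documentCode;   // natural key, supplied by the application

    // Tells Hibernate the column is populated by the database on INSERT,
    // so it issues a follow-up SELECT to refresh the value after save().
    @Generated(GenerationTime.INSERT)
    @Column(name = "SerialNo", insertable = false, updatable = false)
    private Long serialNo;

    // getters/setters omitted
}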


When using "IDENTITY" as my primary key, I can select the last inserted ID with

SELECT LAST_IDENTITY() FROM %TSQL_sys.snf;

Actually, this is how Hibernate plus the IRIS driver acquires the inserted ID when mapping with

@GeneratedValue(strategy = GenerationType.IDENTITY)

Now, considering that I am using the type "SERIAL" as my primary key instead, how can I get the last inserted ID?

Note that with "SERIAL" I can forcefully insert any value for this ID, from which IRIS will continue generating values...
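I have not confirmed it against a SERIAL column yet, but the standard JDBC generated-keys mechanism might be worth testing; a sketch (the table and connection details are made up):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class SerialKeyExample {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details.
        String url = "jdbc:IRIS://localhost:1972/USER";
        try (Connection conn = DriverManager.getConnection(url, "_SYSTEM", "SYS");
             // Ask the driver to hand back whatever key it generated for the row.
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO MyApp.Person (Name) VALUES (?)",
                     Statement.RETURN_GENERATED_KEYS)) {
            ps.setString(1, "Alice");
            ps.executeUpdate();
            try (ResultSet keys = ps.getGeneratedKeys()) {
                if (keys.next()) {
                    System.out.println("Generated key: " + keys.getLong(1));
                }
            }
        }
    }
}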

Question
· Apr 14, 2021
Java Connection Strategy

We are planning to build a REST API with the Java Quarkus Framework.

  • There'll be a connection pool;
  • The DataSource instance will be automatically injected from the connection pool;
  • We will use JDBI to issue SQL statements, which gives us automatic Connection and Statement management;

This Caché instance already has COS applications running and consuming connections and licenses.

After doing this, we will migrate to IRIS.
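For clarity, the layering we have in mind looks roughly like the sketch below; the class, query, and the javax namespace are assumptions, and the pooled DataSource itself would be configured in application.properties:

import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;
import javax.sql.DataSource;
import org.jdbi.v3.core.Jdbi;

// Quarkus injects the pooled DataSource and JDBI wraps it, so connections
// and statements are opened and closed per call.
@ApplicationScoped
public class PatientRepository {

    private final Jdbi jdbi;

    @Inject
    public PatientRepository(DataSource dataSource) {
        this.jdbi = Jdbi.create(dataSource);
    }

    public java.util.List<String> patientNames() {
        // withHandle borrows a connection from the pool and returns it when done.
        return jdbi.withHandle(handle ->
                handle.createQuery("SELECT Name FROM MyApp.Patient")
                      .mapTo(String.class)
                      .list());
    }
}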

Question
· Jun 14, 2018
Single Row SQL.Snapshot

I have several stored procedures that return only a single row when I execute them. In my BP I have been setting the result to a Snapshot variable and then looping through that snapshot with a WHILE just to get that single value.

Since it is only a single row, is there an easier way to pull the values out of that row without a WHILE loop? Can I call something like a First Row method to get just that row from the Snapshot variable?


Hi,

When I use the JDBC driver to query column info, the "REMARKS" field always shows the same value as the "COLUMN_NAME" field.

When I use the SQL "select * from INFORMATION_SCHEMA.columns a where a.table_name='some table name'" to query column info, there is a "DESCRIPTION" field whose value is the comment on the current field.

So is there any way to return the description in the JDBC REMARKS field? And is there any way to update the description or remarks field? Or is there some other way that I don't know of to manage column comment information?
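As a possible workaround until the driver maps DESCRIPTION into REMARKS, the comments can be read straight from INFORMATION_SCHEMA over the same JDBC connection; a sketch (the schema and table names are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ColumnComments {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details.
        String url = "jdbc:Cache://localhost:1972/USER";
        try (Connection conn = DriverManager.getConnection(url, "_SYSTEM", "SYS");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT COLUMN_NAME, DESCRIPTION "
                   + "FROM INFORMATION_SCHEMA.COLUMNS "
                   + "WHERE TABLE_SCHEMA = ? AND TABLE_NAME = ?")) {
            ps.setString(1, "MyApp");
            ps.setString(2, "Patient");
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("COLUMN_NAME")
                            + " -- " + rs.getString("DESCRIPTION"));
                }
            }
        }
    }
}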


Hello,

I work on deploying IRIS using the Kubernetes operator and Red Hat OpenShift. I encouraged another team working on a Java application to consider using IRIS as their database. My team deployed an IRIS cluster with two mirrored data pods for the other team. The other team asked me for the connection information.

To learn how to use Java with IRIS, I attempted to deploy two apps from Open Exchange:

https://openexchange.intersystems.com/package/CRUD-GLOBALS-IRISNATIVEAPI...
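For the connection information itself, the JDBC side should boil down to a URL, user, and password, with the intersystems-jdbc jar on the classpath; a sketch (the service hostname, namespace, and credentials are assumptions, and in a mirrored setup the hostname should be whatever service routes to the current primary):

import java.sql.Connection;
import java.sql.DriverManager;

public class IrisConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical values: the host would be whatever Kubernetes Service
        // fronts the mirror primary; 1972 is the default superserver port and
        // USER is a placeholder namespace.
        String url = "jdbc:IRIS://iris-svc.my-namespace.svc.cluster.local:1972/USER";
        try (Connection conn = DriverManager.getConnection(url, "appuser", "change-me")) {
            System.out.println("Connected to "
                    + conn.getMetaData().getDatabaseProductName() + " "
                    + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}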


Hi!

I'm trying to connect to the Caché database of one of our Ensemble servers from a C# Windows Forms application. I'm running the client from my local computer, which runs Windows 7, and the client uses .NET Framework 4.5.2.

The local ODBC setup uses the "InterSystems ODBC35" driver.

In this ODBC configuration view I can enter my user ID and password and run a test connection (or ping), and that succeeds.

However, we don't want to leave the credentials in the ODBC configuration itself (that leaves them open for anybody to use), but instead send them from the clients.


I use DataGrip (a JDBC client) to query a Caché server via JDBC.

The problem I encountered is that if I wait 10 or more minutes between queries, I get an error:

[08S01][461] Communication link failure: Socket closed

To fix that I need to disconnect and reconnect to the server. Is there a JDBC timeout setting somewhere that I can change?
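For code I control (DataGrip itself has a keep-alive query option in the data source settings), one workaround I am considering is pinging the connection periodically so the idle timeout never fires; a sketch, assuming a 5-minute interval is short enough:

import java.sql.Connection;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ConnectionKeepAlive {

    // Periodically validates the connection so it never sits idle long enough
    // for the server side to close the socket.
    public static void keepAlive(Connection conn) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(() -> {
            try {
                // isValid() sends a lightweight round trip to the server.
                if (!conn.isValid(5)) {
                    System.err.println("Connection no longer valid - reconnect needed");
                }
            } catch (Exception e) {
                System.err.println("Keep-alive failed: " + e.getMessage());
            }
        }, 5, 5, TimeUnit.MINUTES);
    }
}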


I was attempting to test a linked JDBC view to an MS SQL database and noticed I could not connect. When I looked at the JDBC Gateway, I noticed that at the server level it was down. However, the page keeps timing out whenever I attempt to make any changes or start it; it will not respond.

I thought I had found the Caché command to start it, but it will not start. Are the following steps correct?


Hello,

I have a question regarding the InterSystems Caché database and its JDBC driver. I need to set the connection timeout for the database, but I couldn't find any documentation stating that the JDBC driver for Caché supports setting a connection timeout. However, I noticed that the JDBC driver for the InterSystems IRIS database appears to support this feature.

My question is: can I use the IRIS JDBC driver to set the connection timeout for the Caché database?

The JDBC driver I use: cache-jdbc-2.0.0.jar
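One thing worth testing first, independent of which driver jar is used: the JDBC API itself exposes a login timeout, although whether the Caché driver honours it is exactly my open question; a sketch (the connection details are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;

public class LoginTimeoutExample {
    public static void main(String[] args) throws Exception {
        // Standard JDBC login timeout in seconds; support is driver-specific,
        // so this needs to be verified against the Caché driver in use.
        DriverManager.setLoginTimeout(10);

        // Placeholder connection details.
        String url = "jdbc:Cache://localhost:1972/USER";
        try (Connection conn = DriverManager.getConnection(url, "_SYSTEM", "SYS")) {
            System.out.println("Connected within the timeout");
        }
    }
}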


Hello everyone!

I am writing a SQL CALL (using JDBC) to a stored procedure that outputs a structured object (an Oracle object type).

However, the adapter method does not accept the corresponding JDBC data type STRUCT for the output parameter, and returns the following error:

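For comparison, this is roughly how a STRUCT output parameter is registered in plain JDBC, which may help narrow down whether the limitation sits in the adapter rather than in the driver; the procedure, object type, and connection details below are invented:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Struct;
import java.sql.Types;

public class StructOutParamExample {
    public static void main(String[] args) throws Exception {
        // Placeholder Oracle connection details, procedure, and type names.
        String url = "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB1";
        try (Connection conn = DriverManager.getConnection(url, "app", "secret");
             CallableStatement cs = conn.prepareCall("{call MY_PKG.GET_PATIENT(?, ?)}")) {
            cs.setLong(1, 42L);
            // For object types the out parameter is registered as STRUCT,
            // together with the fully qualified SQL type name.
            cs.registerOutParameter(2, Types.STRUCT, "APP.PATIENT_T");
            cs.execute();

            Struct result = (Struct) cs.getObject(2);
            for (Object attr : result.getAttributes()) {
                System.out.println(attr);
            }
        }
    }
}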