Question
· Jan 7

HELP: get and parse Caché database logs

Hi all,
Does anyone have experience with obtaining logs from Caché databases and parsing them? If so, please leave me a message.

I ask because my project involves some hospitals that built their databases on Caché more than 10 years ago. We cannot replace the databases at this point, but we need to read and parse the logs from Caché.

Thanks.

Discussion (12)

Hi there, thanks for your quick reply. 

We are currently developing a Data Center to collect business data from hospital HIS systems, and I realized that some hospitals use Caché databases. But we use Flickcdc to collect data; it supports databases such as Oracle and MySQL, but not Caché.

Is there a tool available to collect data in real time? I mean something like OGG (GoldenGate) provided by Oracle.

Many thanks.

There are no such tools at the moment, nor anything similar to Oracle GoldenGate.

I suppose you meant Apache Flink: https://github.com/ververica/flink-cdc-connectors

Yeah, there is no support for Caché/IRIS. But it can be implemented; I already have experience with that.

But such an old system may not have been adapted to SQL, and that would require some additional work on the Caché side.

You may try using a JDBC connection to check whether the data is available through SQL. If it is, it could be worth implementing a Flink CDC connector for Caché/IRIS.
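
For example, a minimal connectivity check might look like the sketch below. This is only a sketch: the host, namespace, credentials, and table name are placeholders, the port 1972 is just the common superserver default, and it assumes the legacy com.intersys.jdbc.CacheDriver that ships with Caché (CacheDB.jar):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class CacheSqlCheck {
    public static void main(String[] args) throws Exception {
        // Legacy Caché JDBC driver; ships with the Caché install as CacheDB.jar
        Class.forName("com.intersys.jdbc.CacheDriver");

        // Placeholders: adjust host, port (1972 is the usual superserver default),
        // namespace, and credentials for your installation
        String url = "jdbc:Cache://cache-host:1972/HIS";
        try (Connection conn = DriverManager.getConnection(url, "_SYSTEM", "password");
             Statement stmt = conn.createStatement();
             // SQLUser.Patient is a hypothetical table -- substitute your own
             ResultSet rs = stmt.executeQuery("SELECT TOP 5 * FROM SQLUser.Patient")) {
            ResultSetMetaData md = rs.getMetaData();
            while (rs.next()) {
                StringBuilder row = new StringBuilder();
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    row.append(md.getColumnName(i)).append('=')
                       .append(rs.getString(i)).append("  ");
                }
                System.out.println(row);
            }
        }
    }
}
```

If the query returns sensible rows, the application's data is reachable over SQL and a CDC-style connector becomes feasible.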

And in case you can use other options: I've already adapted Trino and Presto.

There were multiple ways to build applications back then:

  • Using globals to store data directly; this approach can be called NoSQL, and in this case additional work on the server side will be required.
  • Using persistent classes, whose data can be accessed via SQL.

So it depends on how old this application is, given that it has not been updated for a long time. A quick way to check which case you have is sketched below.
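
Persistent classes are projected as SQL tables, so one way to tell the two cases apart is to enumerate what is visible over JDBC with standard metadata calls. A minimal sketch, reusing the same placeholder connection settings as above; an app built on bare globals will expose few or none of its data structures here:

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class CacheTableList {
    public static void main(String[] args) throws Exception {
        Class.forName("com.intersys.jdbc.CacheDriver");
        // Placeholder host, namespace, and credentials
        String url = "jdbc:Cache://cache-host:1972/HIS";
        try (Connection conn = DriverManager.getConnection(url, "_SYSTEM", "password")) {
            DatabaseMetaData meta = conn.getMetaData();
            // List all SQL-visible tables in the namespace
            try (ResultSet rs = meta.getTables(null, null, "%", new String[] {"TABLE"})) {
                while (rs.next()) {
                    System.out.println(rs.getString("TABLE_SCHEM") + "."
                            + rs.getString("TABLE_NAME"));
                }
            }
        }
    }
}
```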

If you already know for certain that the data you need can be accessed through SQL, then it's possible to make Flink work.

I'd like to collect tables based on logs.

In general, such an approach is far from optimal for Caché-like databases, because the logs (usually called "journals" in the Caché/M world) are written at the global (= lowest possible) level, regardless of the data model used by the app (SQL, persistent classes, etc.). Reconstructing app-level data from the underlying globals can be a tricky task even for a Caché guru.

That was one of the reasons why colleagues of mine took another approach for close-to-real-time data export from Caché to an external system. In a few words, they used triggers to form application-level data packets on the Caché side and pipe them to the receiver. This approach saved CPU time, since none is wasted filtering out unnecessary journal records, and it minimized cross-system network traffic.
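
For illustration only, the receiving end of such a pipeline could be as simple as the sketch below. It assumes (my assumption, not part of the original setup) that the Caché-side triggers POST one JSON packet per changed row to an HTTP endpoint; the JDK's built-in HTTP server is enough for a prototype:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class ChangeReceiver {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        // Hypothetical endpoint the Caché-side triggers would post packets to
        server.createContext("/changes", exchange -> {
            try (InputStream body = exchange.getRequestBody()) {
                String packet = new String(body.readAllBytes(), StandardCharsets.UTF_8);
                // In a real pipeline the packet would be queued or written to the
                // Data Center; here we just log what was received
                System.out.println("received: " + packet);
            }
            exchange.sendResponseHeaders(204, -1); // 204 No Content, empty body
            exchange.close();
        });
        server.start();
        System.out.println("listening on :8080/changes");
    }
}
```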