HELP: get and parse Caché database logs
Hi all,
Does anyone have experience with obtaining and parsing logs from Caché databases? Please leave me a message.
I ask because my project involves some hospitals that built their databases on Caché more than 10 years ago. We cannot replace the database at the moment, but we need to read and parse the database logs from Caché.
Thanks.
What do you mean by parsing logs? What do you expect to extract from them?
Hi there, thanks for your quick reply.
We are currently developing a Data Center to collect business data from hospital HIS systems. I realized that some hospitals use Caché databases, but we use Flink CDC to collect data, which supports databases such as Oracle and MySQL but not Caché.
Is there a tool available to collect data in real time? I mean something like OGG (GoldenGate) provided by Oracle.
Many thanks.
There are no such tools at the moment, nor anything similar to Oracle GoldenGate.
I suppose you meant Apache Flink, https://github.com/ververica/flink-cdc-connectors
Yeah, there is no support for Caché/IRIS. But it can be implemented; I already have experience with that.
However, such an old system may not have been adapted to SQL, and that would require some additional work on the Caché side.
You may try using a JDBC connection to check whether the data is available through SQL. Then it could be worth implementing a Flink CDC connector for Caché/IRIS.
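If it helps, here is a minimal sketch of such a check, assuming the Caché JDBC driver (com.intersys.jdbc.CacheDriver) is on the classpath; the host, port (1972 is the usual superserver default), namespace, and credentials are placeholders you would replace with your own:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class CacheSqlCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details: "cache-host" and the "HIS" namespace are made up.
        Class.forName("com.intersys.jdbc.CacheDriver");
        String url = "jdbc:Cache://cache-host:1972/HIS";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            // List the tables visible through SQL to see whether the application data
            // was ever projected to SQL tables at all.
            ResultSet rs = conn.getMetaData()
                               .getTables(null, null, "%", new String[] { "TABLE" });
            while (rs.next()) {
                System.out.println(rs.getString("TABLE_SCHEM") + "." + rs.getString("TABLE_NAME"));
            }
        }
    }
}
```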
Just in case you can use other options: I've already adapted Trino and Presto.
What is the additional work you mentioned on the Caché side? I hope to be able to monitor and collect data in real time when data is added or modified, rather than just querying it.
There were multiple ways to build applications back then
So, it depends on how old this application is and whether it has gone without updates for a long time.
If you already know for certain that the data you need can be accessed through SQL, then it's possible to make Flink work.
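For reference, a new connector would need to expose a Flink source similar to the existing ones in that repository. This is roughly how the already-supported MySQL connector is wired up (shown only as an analogue, not Caché code); the host, database, and table names below are placeholders:

```java
import com.ververica.cdc.connectors.mysql.source.MySqlSource;
import com.ververica.cdc.debezium.JsonDebeziumDeserializationSchema;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class MySqlCdcExample {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details for an already-supported source;
        // a Caché/IRIS connector would have to offer an equivalent builder.
        MySqlSource<String> source = MySqlSource.<String>builder()
                .hostname("mysql-host")
                .port(3306)
                .databaseList("his_db")
                .tableList("his_db.patients")
                .username("user")
                .password("password")
                .deserializer(new JsonDebeziumDeserializationSchema())
                .build();

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.enableCheckpointing(3000);
        env.fromSource(source, WatermarkStrategy.noWatermarks(), "MySQL CDC Source")
           .print();
        env.execute("mysql-cdc-example");
    }
}
```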
Is there a detailed method for that? Sorry, we are not very familiar with Caché databases.
You can try using DBeaver, connect to Caché, and see if there are any tables you are interested in.
Yes, it can connect to Caché, but I'd like to monitor data in real time and place it in Kafka.
Have you checked Open Exchange?
https://openexchange.intersystems.com/?search=cache%20log&sort=r
Yes, thank you. But I'm not sure whether there is a tool that supports Caché databases and can collect data in real time, such as the OGG provided by Oracle.
Hi, I have checked the link, but it doesn't seem to be available.
I'd like to capture table changes from the logs, parse them, and place them in Kafka.
In general, such an approach is far from optimal for Caché-like databases, because the logs (usually called "journals" in the Caché/M world) are written at the global (= lowest possible) level, regardless of the data model used by the app (SQL, persistent classes, etc.). Reconstructing app-level data from the underlying globals can be a tricky task even for a Caché guru.
That was one of the reasons why colleagues of mine took another approach for close-to-real-time data export from Caché to an external system. In a few words, they used triggers to form application-level data packets on the Caché side and pipe them to the receiver. This approach saved CPU time by not wasting it on filtering out unnecessary journal records, and it minimized cross-system network traffic.
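As a very rough illustration of the receiving end of such a scheme (not my colleagues' actual code, and a pull-based variant rather than a push): suppose the Caché-side triggers append change records to a staging table, here a made-up App_Data.ChangeQueue with an incrementing ID and a ready-made payload. A small external process could then poll it over JDBC and forward the packets to Kafka. Broker address, connection string, table name, and topic are all assumptions:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class CacheChangeForwarder {
    public static void main(String[] args) throws Exception {
        // Kafka producer configuration (placeholder broker address).
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-host:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        Class.forName("com.intersys.jdbc.CacheDriver");
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props);
             Connection conn = DriverManager.getConnection(
                     "jdbc:Cache://cache-host:1972/HIS", "user", "password")) {
            long lastId = 0;
            while (true) {
                // Fetch only the change records the triggers have written since the last poll.
                try (PreparedStatement ps = conn.prepareStatement(
                        "SELECT ID, Payload FROM App_Data.ChangeQueue WHERE ID > ? ORDER BY ID")) {
                    ps.setLong(1, lastId);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            lastId = rs.getLong("ID");
                            producer.send(new ProducerRecord<>(
                                    "cache-changes", Long.toString(lastId), rs.getString("Payload")));
                        }
                    }
                }
                producer.flush();
                Thread.sleep(1000); // simple polling interval
            }
        }
    }
}
```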