An InterSystems Caché database is a file where all the data, application scripts, and user, role, and security configurations are stored. Typically the file is named CACHE.DAT.
Thanks in advance for all replies. We have a security vulnerability that we have to get rid of. We use the PuTTY software as a terminal to connect to Caché, allowing several users to do maintenance work in Caché, and this uses Telnet in plain text. I know that we can configure Telnet to be encrypted using the superserver service, and I'm looking for software that can work like PuTTY as a terminal, using encryption compatible with Caché Telnet encryption. If I have Caché installed on my PC, set up a connection to the server using Kerberos with encryption, and use the terminal option to connect to
I am trying to get the time difference between two timestamps: one recorded earlier and one happening now. The problem is that SQL expects a string while I have the other value stored in a variable, and if I do the following I get errors. Any help please.
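A hedged sketch of one way around that: bind the ObjectScript variable as a SQL parameter and let DATEDIFF compare it with the current timestamp (the variable name, timestamp value and 'ss' datepart below are illustrative, not taken from the original post):
set earlier = "2019-03-12 08:00:00"    ; timestamp captured earlier, in ODBC format
set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT DATEDIFF('ss', ?, CURRENT_TIMESTAMP) AS Secs", earlier)
if rs.%Next() { write "Elapsed seconds: ", rs.Secs, ! }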
We have been storing raw messages in a MySQL database for DR and ad hoc purposes. We are thinking of using an Ensemble instance as our data lake instead. We could segregate the source data by namespace or by global. But either way we'll want a custom global to index the data for data retrieval performance purposes.
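The kind of custom index global mentioned above might look something like this sketch (the global name ^MsgIndex and the MRN/message-ID keys are illustrative assumptions, not part of the original post):
set mrn = "123456", msgId = 42, day = +$horolog
set ^MsgIndex(mrn, day, msgId) = ""            ; index raw messages by patient and day
// retrieval then walks the index rather than scanning the raw data
set id = ""
for {
    set id = $order(^MsgIndex(mrn, day, id))
    quit:id=""
    write "Found message ", id, !              ; fetch the raw message body for id here
}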
InterSystems Caché globals provide very convenient features for developers. But why are globals so fast and efficient?
Theory
Basically, a Caché database is a directory with the same name as the database, containing the CACHE.DAT file. On UNIX systems, the database can also be an ordinary disk partition.
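To make that concrete, here is a trivial illustration (not from the article) of how data lands in that CACHE.DAT: a global reference is persisted directly into the database's B*-tree blocks, with no intermediate mapping layer:
set ^Person(1,"Name") = "Smith,John"         ; written straight into the database file
set ^Person(1,"DOB") = $zdate($horolog,3)    ; subscripts become the tree key
write ^Person(1,"Name"),!                    ; read back directly by subscript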
I am trying to convert a string to a date but cannot get it to work. I have a function that I would like to take in a date string and convert it to a date object.
Here is the example so far; I cannot get it to work. Any help appreciated.
set p="12/03/2019"
w $System.SQL.TODATE(p,"YYYY-MM-DD")
<ILLEGAL VALUE>todate+32^%qarfunc
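The <ILLEGAL VALUE> error usually means the mask does not describe the input string: the second argument tells TODATE how to parse the value it is given. A hedged guess at a working variant, assuming the string is day/month/year (use "MM/DD/YYYY" if it is month/day/year):
set p="12/03/2019"
w $System.SQL.TODATE(p,"DD/MM/YYYY")   ; mask matches the input format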
I am planning to implement Business Intelligence based on the data in my instances. What is the best way to set up my databases and environment to use DeepSee?
We have a very old version of Ensemble with one of our clients, and they have no desire to upgrade anytime soon. We have gotten the all-clear to purge really old messages from the database, changing the days kept from 60 to 30. The Compact/Truncate option is displayed in this version of Ensemble, but it does not execute, reporting that it is not actually present in this version.
There is an option in the ^DATABASE utility (run with do ^DATABASE) that returns unused space; however, this does not return nearly as much free space as the more refined Compact/Truncate procedure.
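For reference, that utility is run from the %SYS namespace in a terminal, roughly as below (the menu wording and numbering vary between versions, so treat this as an illustrative sketch):
zn "%SYS"
do ^DATABASE
// pick the "Return unused space for a database" menu item,
// then supply the database directory when prompted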
Is there a way to use conditional logic in SQL, like so?
Query Q1(formal As %String) As %SQLQuery [ Final ]
{
SELECT patientnumber, ID, CASE
WHEN ID = 50 THEN 'This is 50'
WHEN ID = 30 THEN 'This is 30'
ELSE 'The quantity is under 30'
END FROM Audit.Table WHERE ID = :formal AND EndDate IS NULL}
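Assuming that query compiles in a class named, say, Audit.Queries (a made-up name for illustration), it can be exercised from the terminal with a %ResultSet along these lines:
set rs = ##class(%ResultSet).%New("Audit.Queries:Q1")
do rs.Execute(50)                  ; supplies the :formal parameter
while rs.Next() {
    write rs.GetData(1), " ", rs.GetData(2), " ", rs.GetData(3), !
}
do rs.Close()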
How can we reduce the size of a CACHE.DAT file? Even after deleting the globals of a particular database from the Management Portal, the size of its CACHE.DAT file is not reduced.
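Deleting globals only frees blocks inside the file; CACHE.DAT itself shrinks only when that free space is returned to the operating system (via the ^DATABASE utility, or Compact/Truncate on versions that have it). A sketch of first checking how much space is actually free, assuming the %SYS.DatabaseQuery helper and an example directory path (check the class reference for the exact signature on your version):
do ##class(%SYS.DatabaseQuery).GetDatabaseFreeSpace("c:\intersystems\cache\mgr\mydb\",.freeMB)
write "Free space (MB): ", freeMB, !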
I have this query that I am trying to use in my class. When testing in the terminal I expect to get the results printed, but I am only getting zero printed. Please can anyone out there advise on what I am doing wrong?
We are currently using Ensemble on AIX, version 2015.2.2. If I install the Field Test on a Windows desktop, is it possible to import the CACHE.DAT from my AIX server so I can do some proof-of-concept development?
The following post outlines an architectural design of intermediate complexity for DeepSee. As in the previous example, this implementation includes separate databases for storing the DeepSee cache, DeepSee implementation and settings. This post introduces two new databases: the first to store the globals needed for synchronization, the second to store fact tables and indices.
Let's say you have about 100 TB of data spread across multiple CACHE.DAT files. The biggest one is about 30 TB, but most are more than 1 TB. You have limited time for maintenance during the day, only a few hours at night. You have to check integrity as often as possible, and of course back it all up.
We are trying to evaluate IRIS and work out how we could use it.
As a standard application database; object or relational etc. does not matter.
The issue is ObjectScript.
So:
1) Can we develop, maintain and use an IRIS database and never use ObjectScript, i.e. use only Java, Python, C++ interfaces etc. (exactly which one does not matter)? Would that make designing and using the IRIS database more prone to inefficiency and error?
I want to create an IRIS document database with Atelier, with some properties, where I can import my JSON-formatted data from an API into the database I created. Right now I know how to import my local JSON-formatted data into my created database:
This is the third article (see Part 1 and Part 2) where I continue to introduce you to the internal structure of Caché databases. This time, I will tell you a few interesting things and explain how my Caché Blocks Explorer project can help make your work more productive.
I need an example of what I need to map to have a common dashboard defined so it will be visible/usable in multiple namespaces.
I have created a dashboard in SAMPLES (namespace and database), and I would like this dashboard to be accessible/usable from a second namespace, but I'm not having any success with the mappings (global/package/routine/data) needed for DeepSee to see and display the dashboard.
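One way to experiment is to create the mapping programmatically. The sketch below rests on two assumptions that are not stated above: that the dashboard definitions live in the ^DeepSee.Folder* globals of the SAMPLES database, and that the second namespace is called MYAPP:
zn "%SYS"
set props("Database") = "SAMPLES"
set sc = ##class(Config.MapGlobals).Create("MYAPP","DeepSee.Folder*",.props)
write $system.Status.GetErrorText(sc),!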
We have a new requirement being pushed down by our data security team to no longer use local SQL accounts to access our databases. So they asked me to create a service account on the domain for our connections to each database.
I tried just changing my JDBC connection to use this service account and password, but I am not having any luck connecting to the database.
" Connection failed. Login failed for user 'osumc\CPD.Intr.Service'. ClientConnectionId:ade97239-c1c8-4ed1-8230-d274edb2e731 "
Hi, I am getting the error below while upgrading a Caché instance. Please advise.
Error: ERROR #70: *** Error while formatting volume because
ERROR #18: failed creating a new volume initializing CACHETEMP, /*****/databases/cachetemp/ - Shutting down the system
An error was detected during Cache startup.
** Startup aborted **
I'm just wondering if there is any possibility to "listen" to a Caché DB. Our Caché DB is hosted somewhere else, provided by a different company; we are given the interface to connect to that Caché DB so we can extract from it every night.
I'm just curious if there's a way to "listen" to the Caché DB, so that if there are any changes to a table in the Caché DB, I could trigger an extract of that table again.
I know I could just set my ETL to run every hour or so, but that would extract all the tables in the Caché DB.
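If the company hosting the Caché DB is willing to add it, one common approach is a row-level SQL trigger on the table of interest that records which rows changed, so the nightly ETL only has to pull those rows. A rough sketch; the global name ^ChangeLog is invented for illustration, and the trigger would live in the class that projects the table:
Trigger LogChange [ Event = INSERT/UPDATE/DELETE, Foreach = row/object, Time = AFTER ]
{
    // record the changed row's ID and a timestamp in a change-log global
    set ^ChangeLog($increment(^ChangeLog)) = $listbuild({ID}, $horolog)
}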
I work for a large NHS Trust in the UK. We are using HealthShare and we process thousands of messages each day. Many of these are standard HL7 messages; however, for several months now we have also been picking up and dropping off thousands of PDF files.
We have our message purge set to 365 days, as we have to keep a year's worth of messages: we have a retrieve-and-send process that enables us to replay any set of messages to any destination, which we use to pre-populate end systems with activity and result history.