We are currently using Ensemble 2015.2.2 on AIX. If I install the Field Test on a Windows desktop, is it possible to import the CACHE.DAT from my AIX server so I can do some proof-of-concept development?
Thanks
Scott
The following post outlines an architectural design of intermediate complexity for DeepSee. As in the previous example, this implementation includes separate databases for storing the DeepSee cache, DeepSee implementation and settings. This post introduces two new databases: the first to store the globals needed for synchronization, the second to store fact tables and indices.

Let's say you have about 100TB of data spread across multiple CACHE.DAT files. The biggest one is about 30TB, but most are more than 1TB. You have limited time for maintenance during the day, only a few hours at night. You have to check integrity as often as possible, and of course back it all up.
How would you do it?
Hi
Totally new to IRIS and Caché.
Trying to evaluate them and work out how we could use them.
As a standard application database; object or relational etc. does not matter.
The issue is ObjectScript.
So:
1) Can we develop, maintain and use an IRIS database and never use ObjectScript, i.e. use only the Java, Python, C++ interfaces etc. (exactly which one does not matter)? Would that make designing and using the IRIS database more prone to inefficiency and error?
2) Can we import an existing Cache database into IRIS and convert its ObjectScript code into Java, Python whatever?
Hello everyone,
I want to create an IRIS document database with Atelier, with some properties, into which I can import JSON-formatted data from an API. Right now I know how to import my local JSON-formatted data into the database I created:

Class User.Classtest
{

ClassMethod getFile() As %Status
{
    set filename = "/home/student/Downloads/own_scrobble.json"
    if $SYSTEM.DocDB.Exists("db.Streamingdatabase") {
        set db = ##class(%DocDB.Database).%GetDatabase("db.Streamingdatabase")
    }
    ...
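Fetching from an API instead of a local file can follow the same pattern, with the file read swapped for an HTTP request. A sketch under assumptions: the endpoint `api.example.com/scrobbles` is hypothetical, and the response is assumed to be a JSON array of documents.

```objectscript
 // Hypothetical API endpoint; fetch JSON over HTTP and save each array
 // element into the existing DocDB database instead of reading a file.
 set req = ##class(%Net.HttpRequest).%New()
 set req.Server = "api.example.com"
 do req.Get("/scrobbles")
 // Parse the response body (a stream) into a dynamic array
 set data = ##class(%DynamicAbstractObject).%FromJSON(req.HttpResponse.Data)
 set db = ##class(%DocDB.Database).%GetDatabase("db.Streamingdatabase")
 set iter = data.%GetIterator()
 while iter.%GetNext(.key, .doc) {
     do db.%SaveDocument(doc)
 }
```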
Hello,
I have a question about creating properties with curl.
I already did create properties in Java with the following command.
It created the property TotalSteps with the type %Integer and the data path $.TotalSteps (since the header of my data source is also TotalSteps).
Now I would like to create the same property in curl with the following command
I have already mentioned my project CacheBlocksExplorer recently in two articles

Now I would like to share that this project can easily be run with Docker.
Versions for both Caché and InterSystems IRIS are now publicly available on Docker Hub.
Remember that you need the appropriate license key (for RedHat Linux) to be able to run this project.
docker run -d --rm \
  -p 8080:57772 \
  -v ~/cache.key:/usr/cachesys/mgr/cache.key \
  -v /some/your/cache/db:/opt/blocks/db/test \
  daimor/blocksexplorer:cache
This is the third article (see Part 1 and Part 2) where I continue to introduce you to the internal structure of Caché databases. This time, I will tell you a few interesting things and explain how my Caché Blocks Explorer project can help make your work more productive.
Hi -
I need an example of what I need to "map" to have a common dashboard defined so it will be visible/usable in multiple namespaces.
I have created a dashboard in "SAMPLES" (namespace and database) and I would like this dashboard to be accessible/usable from a second namespace, but I'm not having any success with the mappings (global/package/routine/data) needed to get DeepSee to see/display the dashboard.
What is the minimum that I need to map?
We have a new requirement being pushed down by our Data Security team to no longer use local SQL accounts to access our databases. So they asked me to create a service account on the domain for our connections to each database.
I tried just changing my JDBC connection to use this service account and password, but I am not having any luck connecting to the database.
" Connection failed.
Login failed for user 'osumc\CPD.Intr.Service'.
Hi, I am getting the below error while upgrading a Caché instance. Please advise.
Error: ERROR #70: *** Error while formatting volume because ERROR #18: failed creating a new volume initializing CACHETEMP, /*****/databases/cachetemp/ - Shutting down the system An error was detected during Cache startup. ** Startup aborted **
Hello everyone,
I'm just wondering if there is any possibility to "listen" to a Caché DB. Our Caché DB is hosted elsewhere, provided by a different company; we are given an interface to connect to it so we can extract the data every night.
I'm just curious whether there's a way to "listen" to the Caché DB, so that if there are any changes to a table I could trigger extracting that table again.
I know I could just run my ETL every hour or so, but that would extract all the tables in the Caché DB.
Thanks a lot for any help and information.
Kind regards,
mark
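There is no built-in external change feed to subscribe to, but if code can be deployed on the Caché side, class triggers can record changes into a log global that the nightly ETL polls cheaply. A sketch under assumptions: the class `Demo.Patient` and global `^ChangeLog` are hypothetical, and the trigger only fires for changes made through SQL or object access on this class.

```objectscript
/// Hypothetical persistent class; the trigger appends a change record to
/// ^ChangeLog, which an external ETL job can poll instead of re-extracting
/// whole tables. %oper holds the operation name in a trigger.
Class Demo.Patient Extends %Persistent
{

Property Name As %String;

Trigger LogChange [ Event = INSERT/UPDATE/DELETE, Foreach = row/object ]
{
    // {%%ID} is the affected row's ID; record ID, operation and timestamp
    set ^ChangeLog($increment(^ChangeLog)) = $listbuild({%%ID}, %oper, $zdatetime($horolog, 3))
}

}
```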
I work for a large NHS Trust in the UK and we are using Healthshare and we process 1000s of messages each day. Many of these are standard HL7 messages however for several months now we also pickup and drop off 1000s of PDF files.
We have our message purge set to 365 days, as we have to keep a year's worth of messages: we have a retrieval-and-send process that lets us replay any set of messages to any destination, which we use to prepopulate end systems with activity and result history.
Is there any COS API to create a database (with a designated path) and a namespace?
Also, is there an API to attach a given database to a given namespace?
Thanks.
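A minimal sketch of the sort of thing that is possible with the `%SYS` configuration classes (`SYS.Database`, `Config.Databases`, `Config.Namespaces`); the names and paths below are hypothetical, and return statuses should be checked in real code.

```objectscript
 // Run in %SYS. Hypothetical names and paths throughout.
 zn "%SYS"
 // 1) Create the physical CACHE.DAT at a designated path
 set sc = ##class(SYS.Database).CreateDatabase("/data/mydb/")
 // 2) Register it in the configuration under a logical name
 kill p
 set p("Directory") = "/data/mydb/"
 set sc = ##class(Config.Databases).Create("MYDB", .p)
 // 3) Create a namespace attached to that database for globals and routines
 kill p
 set p("Globals") = "MYDB"
 set p("Routines") = "MYDB"
 set sc = ##class(Config.Namespaces).Create("MYNS", .p)
```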
Hey, the question is simple!
Is it possible for one class not to be journaled?
I would like a given class not to generate journal data, but to generate it when configured to. Is that possible?
I'm creating a new namespace via the installation manifest XML, and in the "Database" tag configuration I don't see an attribute to configure whether or not the globals in this database are journaled.
The database wizard in the Management Portal does have this option.
Regards,
Lucas Boeing Scarduelli
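Journaling in Caché is a per-database setting, not a per-class one, so one route is to place the non-journaled classes in their own database and switch that database's journal state after the manifest creates it. A hedged sketch, assuming a hypothetical database directory and the `SYS.Database` property `GlobalJournalState` (2 = No, 3 = Yes; worth verifying against the documentation for your version):

```objectscript
 // Run in %SYS. Hypothetical directory; journaling is controlled per
 // database via SYS.Database, not per class.
 zn "%SYS"
 set db = ##class(SYS.Database).%OpenId("/data/nojrndb/")
 if $isobject(db) {
     set db.GlobalJournalState = 2   // assumed value for "not journaled"; check docs
     do db.%Save()
 }
```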
I have a CACHE.DAT file that is working under CACHE version 2015.1.0.429.0
I have a second machine with CACHE version 2017.2.0.741.0
When I attempt to add/use the CACHE.DAT in the new version of CACHE, it will not mount.
How do I upgrade/convert the DAT file to make it work under the new CACHE version?
Thx. Larry...
Hi,
I have a client who is considering encryption options in order to comply with a tendering requirement.
Were they to encrypt the production database, what would be a reasonable expectation for the impact on message throughput? Or, possibly more easily answered: what would the expected impact be on I/O rate and CPU utilization? Are there any benchmarks which could support an estimate?
How would this compare with plan B: using disk encryption?
Thanks
The following post concludes the series with a list of all databases seen in the example for the fully flexible architecture.

Hi
The following message is logged constantly in cconsole.log:
(12660) 0 Failed to mount c:\intersystems\ensemble\mgr\ensemble\ because its default collation (20) is not available...(repeated 60 times)
What is this? Please help.
Thanks
The following post is a guide to implement a basic architecture for DeepSee. This implementation includes a database for the DeepSee cache and a database for the DeepSee implementation and settings.

Is there something in Cache that is equivalent to partitioning a table in Oracle? I'm trying to break some big tables into groups so that the most frequently accessed data is faster to retrieve.
Here is some information on this concept from Oracle.
https://docs.oracle.com/cd/B28359_01/server.111/b32024/partition.htm
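The closest Caché analogue is subscript-level global mapping: different subscript ranges of one global can be mapped to different databases, roughly like Oracle range partitioning. A hypothetical sketch using `Config.MapGlobals` in `%SYS`; the namespace, database names, data global, and especially the subscript-range syntax are assumptions that should be checked against the documentation.

```objectscript
 // Run in %SYS. Hypothetical namespace MYAPP, databases ORDERS-HOT and
 // ORDERS-COLD, and data global ^MyApp.OrderD. Each mapping sends one
 // subscript range of the global to a different database.
 zn "%SYS"
 kill p
 set p("Database") = "ORDERS-HOT"
 set sc = ##class(Config.MapGlobals).Create("MYAPP", "MyApp.OrderD(1):(499999)", .p)
 kill p
 set p("Database") = "ORDERS-COLD"
 set sc = ##class(Config.MapGlobals).Create("MYAPP", "MyApp.OrderD(500000):(999999)", .p)
```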
I am trying to read data from MySQL Server 2012 into a stream, and I keep getting an error. Here is my code so far:
ERROR
An error was received : ERROR <Ens>ErrException: <INVALID OREF>zGetBatchDetails+14^DDQTools.DQTGetBatchOpp.3 -- logged as '-' number - @' set sc=reStream.WriteLine(rec) '< set tSC=$$$OK
set ^tvalue=1
set ^tvalue=$INCREMENT(^tvalue)
#dim rs As EnsLib.SQL.GatewayResultSet
#dim reStream As %Stream.GlobalCharacter
if $$$ISERR(pRequest) quit pRequest
set sc=..Adapter.ExecuteQuery(.rs,..GetTheSubmissionData(pRequest.pMonth, pRequest.pApp, pRequest.pRef, pRequest.pInPat))
if $$$ISERR(sc) quit sc
; reStream must be a real stream object before WriteLine is called;
; assigning "" and then calling WriteLine on it is what raises <INVALID OREF>
set reStream=##class(%Stream.GlobalCharacter).%New()
while rs.Next() {
    set (comma,rec)=""
    for i=1:1:rs.GetColumnCount() {
        set rec=rec_comma_""""_rs.GetData(i)_""""
        set comma=","
    }
    set sc=reStream.WriteLine(rec)
}
do reStream.%Save()
if $$$ISOK(sc) {
    set pResponse.pReqDetails=reStream
} else {
    $$$TRACE("There is nothing on the stream")
}
set tSC=pResponse.%Save()
set tSC=..SendRequestSync(..TargetConfigNames,pResponse,.pOutput)
Quit tSC

Is there a way to pull a user name and password from the Credentials list that is kept in Ensemble? Right now I have an LDAP user hard-coded into my ZAUTHENTICATE, which I would like to get away from. I am not too familiar with setting globals, or calling them.
Thanks
Scott
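One approach, sketched under the assumption that a Credentials entry named "LDAPServiceAccount" has been defined in the production's Credentials list (the `Ens.Config.Credentials` class backs that list, with the entry name as its ID):

```objectscript
 // Hypothetical credentials ID "LDAPServiceAccount"; define it first under
 // Ensemble > Configure > Credentials. Run with the Ensemble namespace active.
 set cred = ##class(Ens.Config.Credentials).%OpenId("LDAPServiceAccount")
 if $isobject(cred) {
     set user = cred.Username
     set pass = cred.Password
     // use user/pass for the LDAP bind instead of hard-coded values
 }
```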
We have noticed that over the last 18 days our CACHE.DAT has grown by 20 GB. Is there a way to break down the data in CACHE.DAT to see what could be growing in size?
Let me state it another way: is there a way to see how much space an Operation/Service/Process is taking up within a certain Production?
Thanks
Scott Roth
The Ohio State University Wexner Medical Center
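One low-tech starting point is the standard global-size utility, which reports how much space each global in a database consumes; growth in an Ensemble production often shows up in the message header and message body globals. A sketch (the utility is interactive and prompts for which globals to include):

```objectscript
 // Run in the namespace mapped to the growing CACHE.DAT.
 // ^%GSIZE lists each global and its size, which usually points at the
 // message/stream globals behind a production.
 do ^%GSIZE
```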
Hello,
I have a property which I need to move from one class definition to another, as follows:
Old definition:

Class SCHED.SchedEntry
{
Property Experiment As %String;
Property ScanSlot As list Of TracerEntry;
}

Class SCHED.TracerEntry
{
Property Tracer As %String;
}

I want to move the Experiment property to the TracerEntry class so that there is a different Experiment allowed for each ScanSlot, like this:

Class SCHED.SchedEntry
{
Property ScanSlot As list Of TracerEntry;
}

Some of our interfaces use globals for lookups, but we are currently looking at putting a group of (document) interfaces in a separate production with a shared 'Default Database for Routines' to reduce code duplication.
Hello everyone,
I want automatic unidirectional synchronization of multiple databases (some tables from the .DAT file, not the whole .DAT file).
So far I have tried everything in the %SYNC package, and the best-working class is SyncSet, with journals and GUIDs. The problem is the initial database transport, for example when I want to add another server. The easiest solution I have found is to transport the sync globals ^OBJ.SYNC.N to the other database and then call %SYNC.SyncSet.Import(); however, it seems not to work with only the global structure, although it works fine when using import files.
Hi, community
I am defining a new role on my system. I want this role to have access only to my own database. Do you know which roles or resources I should add to it?
thank you
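Access to a single database is usually granted through that database's resource rather than by stacking other roles. A hedged sketch, assuming the database was created with a resource named %DB_MYDB and that the role name "MyDBUser" is hypothetical:

```objectscript
 // Run in %SYS. Hypothetical role "MyDBUser" granted read/write on the
 // %DB_MYDB resource; the resource name depends on how the DB was created.
 zn "%SYS"
 set sc = ##class(Security.Roles).Create("MyDBUser", "Access to MYDB only", "%DB_MYDB:RW")
 if $$$ISERR(sc) do $system.Status.DisplayError(sc)
```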