I've been reading the 2018.1 documentation guide on frozen query plans several times over the last few days (link), and there is an answer I can't seem to find directly.
I want to make the LOCATION for a %Stream.FileCharacterGzip property we have in a class parameterisable (as in, stored in a global), so it can be different for different customers. Is that possible? I've tried doing it a few different ways: first creating a parameter, Parameter STREAMLOC = {^websystem("auditstream-location")};, and then trying to reference that parameter in the property definition:
Property HTMLZIPDoc As %Stream.FileCharacterGzip(LOCATION = ..#STREAMLOC);
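From what I can tell, a curly-brace parameter expression like that is evaluated at compile time, so the LOCATION never changes per customer. Here is a rough sketch of the runtime workaround I'm considering instead, pointing the stream at the directory stored in ^websystem("auditstream-location") when the object is created; it assumes that the initial value passed to %New() on a file stream becomes the stream's directory, so please treat it as unverified:

Class App.AuditDoc Extends %Persistent
{

Property HTMLZIPDoc As %Stream.FileCharacterGzip;

/// Sketch only: read the per-customer directory from the global at object
/// creation time instead of hard-coding LOCATION in the property definition.
Method %OnNew() As %Status [ Private ]
{
    // "/tmp/streams/" is an illustrative fallback, not a real location
    Set dir = $Get(^websystem("auditstream-location"), "/tmp/streams/")
    // Assumption: the initial value passed to %New() becomes the stream directory
    Set ..HTMLZIPDoc = ##class(%Stream.FileCharacterGzip).%New(dir)
    Quit $$$OK
}

}

Would that be a sensible direction, or is there a supported way to make LOCATION itself dynamic?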
Wouldn't the Caché database be too heavy for developing simple applications? Or are there simpler, less complex options that can serve the same purpose?
How do I configure a remote database on our local instance? While connecting to the remote database I am getting "ERROR #463: Database C:\InterSystems\HealthShare2\mgr\Remote\ is not allowed for ECP Mirror Connection". Can anyone help me sort this out?
We have some legacy ZEN applications built on CSS2 style definitions. We moved the application to a newer version of Caché (2017.2.1) as part of an upgrade. I have in mind that in one of the prior releases ZEN enforced CSS3 style interpretation, and that you could explicitly tell the framework to use CSS2 by setting a global, but I can't find any hints on that in the docs. Do any of the %ZEN gurus remember this?
In earlier Caché versions, I could see the full data values in the SQL Management Portal.
But IRIS restricts the view to only 100 characters.
"If the data in a field is longer than 100 characters, the first 100 characters of the data are displayed followed by an ellipsis (...) indicating additional data." - From Documentation.
Is there a way to change this behavior? I would like to see all of the values in a particular SQL field.
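As a stopgap, the only thing I've found is to pull the value outside the portal. A minimal sketch, assuming a hypothetical table Sample.Person with a long Notes column:

    Set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT Notes FROM Sample.Person WHERE ID = ?", 1)
    While rs.%Next() {
        // Writes the complete value; only the portal display truncates at 100 characters
        Write rs.%Get("Notes"), !
    }

But I'd still prefer a setting that changes the portal display itself, if one exists.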
In the Windows Resource Manager I can observe multiple parallel processes from cache.exe performing read operations on journal files.
All but one of these processes show the same read rate (bytes/s). The processes point to different journal files and constantly read between 200 and 3,000 bytes/s.
The corresponding process, matched by PID in the Caché Management Portal, is %SYS.Monitor.Control.1. In 3 days of server uptime it has run 181,632,583 commands and modified 32,140,642 globals.
How can I get the current date and time on the destination file when I'm using Stream.CopyFrom? Stream.CopyFrom preserves the date and time of the source file.
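The workaround I'm experimenting with is to write the destination as a brand-new file, on the assumption that the operating system then stamps it with the current date and time; the paths below are illustrative:

    Set src = ##class(%Stream.FileBinary).%New()
    Do src.LinkToFile("/tmp/source.txt")
    Set dst = ##class(%Stream.FileBinary).%New()
    Set dst.Filename = "/tmp/destination.txt"
    // CopyFrom copies the contents; the timestamps come from the new file
    // created by %Save(), not from the source (assumption on my part)
    Do dst.CopyFrom(src)
    Set sc = dst.%Save()

Is there a cleaner way to do this, or a way to set the file timestamps explicitly?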
Hello, I'm curious to see how other people deal with this: we have a text file that was created on someone's Windows machine and was copied and pasted into a text file on someone's Mac. After some examination we realized that the line endings were originally CRLF (Windows) and that the copy-and-paste changed them to LF (Mac). The diff program we used didn't pick up on this, and the program we wrote to read the file handled each line of the CRLF file correctly but treated the entire LF file as a single line.
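What we've settled on for now is normalizing the line endings before processing; a sketch, assuming the file is small enough to read into memory in one go and with an illustrative path:

    Set stream = ##class(%Stream.FileCharacter).%New()
    Do stream.LinkToFile("/tmp/input.txt")
    Set text = stream.Read(stream.Size)
    // Collapse CRLF to LF so both variants split the same way
    Set text = $Replace(text, $Char(13,10), $Char(10))
    For i=1:1:$Length(text, $Char(10)) {
        Set line = $Piece(text, $Char(10), i)
        // ... process line ...
    }

I'm curious whether others detect the terminator instead (for example via the stream's LineTerminator property) rather than rewriting the text.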
When we go to a specific namespace to search messages in the Message Viewer, add a search criterion with criterion type SearchTable Field, and click the dropdown in the Class field, we get the error shown in the image below:
Could anyone please let me know what exactly this error comes from? Thanks so much.
Some time ago, I changed the SQL Runtime Statistics configuration to "Turn on Stats code generation to gather stats at the Open and Close of a query". With this change, the CACHE database (cache/mgr/cache/) grew considerably, reaching 198 GB.
Yesterday, I returned the SQL Runtime Statistics configuration to the default, "Turn off Stats code generation", and the CACHE database is no longer growing.
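To see whether that growth left reclaimable space inside the database, the check I've been running is just this (a sketch; it assumes %SYS.DatabaseQuery.GetDatabaseFreeSpace() is available on this version, and the directory is illustrative):

    // Reports the free space (in MB) inside the CACHE database file
    Set sc = ##class(%SYS.DatabaseQuery).GetDatabaseFreeSpace("/cache/mgr/cache/", .freeMB)
    If $System.Status.IsOK(sc) {
        Write "Free space in CACHE.DAT (MB): ", freeMB, !
    }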
I am writing a report for a client that will list the current processes in a format that mimics the Management Portal process display. I am writing a cterm script file to generate the report.
In the loop that processes the results I am writing the columns in a formatted manner, producing a CSV very similar in content and order to the process page. However, it appears that the write statement is limited in size, such that I cannot write out all of the elements of the sys.process query. My query result processing that works correctly is of this format:
Some IF statements reference the macro $$$WindowsCacheClient as a boolean flag marking whether the client calling the LDAP server is running Windows. Other IF statements reference $$$ISWINDOWS. Are these not the same thing? That is, does the routine need $$$WindowsCacheClient at all?
I am working with a client database that is growing exceptionally fast (about 15 GB/day). As I understand it, using the global ^STMONITS to gather statistics can consume a large amount of database space, and this client is using that global. Of the database's 1.7 TB, I would like to determine how much is consumed by statistics data in ^STMONITS. Is there a method to get this value, either as a size or as a percentage of the overall database size?
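The closest thing I've found so far is measuring the global's footprint directly. Here is a sketch, assuming %GlobalEdit.GetGlobalSize() exists on this version and reports sizes in MB (otherwise the interactive ^%GSIZE utility gives the same kind of numbers); the database directory is illustrative:

    Set dir = "/data/clientdb/"
    Set sc = ##class(%GlobalEdit).GetGlobalSize(dir, "STMONITS", .allocatedMB, .usedMB)
    If $System.Status.IsOK(sc) {
        Write "STMONITS allocated (MB): ", allocatedMB, !
        Write "STMONITS used (MB): ", usedMB, !
        // Rough share of a 1.7 TB database -- purely illustrative arithmetic
        Write "Approx share of database: ", $FNumber(allocatedMB / (1.7*1024*1024) * 100, "", 2), "%", !
    }

Is there a better-supported way to get this number for a single global?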