Scott Roth · Jan 30, 2018

Managing and Monitoring CACHE.dat

We have noticed that over the last 18 days our CACHE.dat has grown by 20 GB. Is there a way we can break down the data in CACHE.dat to see what could be growing in size?

Let me state it another way: is there a way to see how much space an Operation/Service/Process is taking up within a given Production?


Scott Roth

The Ohio State University Wexner Medical Center

Discussion (3)

Scott, there's a utility called %GSIZE that does just this; it analyzes global storage and reports allocated vs. actual use.

namespace> d ^%GSIZE

It will prompt you for the database directory (defaulting to your current namespace's), whether you want to include all globals (Yes), whether to include globals that contain no data (No), and whether it should show details (Yes). Hit Enter at the Device: prompt, or specify a path/filename if you want the report written to disk, and Enter again for the right margin.

If your environment is mirrored, you can run it against the mirror. It can take a while to run; on a 10 TB Ensemble database I worked with recently, it took a month to complete.

EDIT: I should have paid attention to your second question, too ... that one will take a bit of query development: think in terms of message volume between specific source and target config items, along with an analysis of message content size. You'd be working with Ens.MessageHeader for that ...
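As a starting point, something along these lines counts messages per source/target pair over your 18-day window (a sketch only; the 18-day window and the column-name accessors are my assumptions, so adjust to your version and retention settings):

```objectscript
    // Sketch: message volume per source/target config item pair,
    // restricted to the last 18 days. Ens.MessageHeader can be very
    // large, so keep a TimeCreated range in the WHERE clause.
    Set sql = "SELECT SourceConfigName, TargetConfigName, COUNT(*) AS MsgCount"
    Set sql = sql_" FROM Ens.MessageHeader"
    Set sql = sql_" WHERE TimeCreated >= DATEADD('dd',-18,CURRENT_TIMESTAMP)"
    Set sql = sql_" GROUP BY SourceConfigName, TargetConfigName"
    Set sql = sql_" ORDER BY MsgCount DESC"
    Set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
    While rs.%Next() {
        Write rs.%Get("SourceConfigName"), " -> ", rs.%Get("TargetConfigName")
        Write ": ", rs.%Get("MsgCount"), !
    }
```

Message *size* is a separate analysis: the header only references the body (MessageBodyClassName/MessageBodyId), so you would join to or open the body class to measure content.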

Assuming that you did not disable journaling for the CACHE.DAT in question,
you can take a look in Mgmt Portal / System Operation / Journals.

There, click PROFILE and sort by Size.
This shows you the fastest-moving parts.
And although the profile also contains updates to existing nodes, your fastest-GROWING globals will leave their traces there.

As a next step, you can analyze the journal itself, filtered by your CACHE.DAT and by SET operations,
but that takes more time and effort.
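A sketch of that journal analysis using the %SYS.Journal classes (the journal path is a placeholder, and the property names here are from memory; check the %SYS.Journal.File and %SYS.Journal.Record class reference for your version before relying on this):

```objectscript
    // Sketch: tally SET records per global in one journal file.
    Set file = ##class(%SYS.Journal.File).%OpenId("/path/to/journal/file")
    Set rec = file.FirstRecord
    While $IsObject(rec) {
        If rec.TypeName = "SET" {
            // Strip subscripts so we aggregate by global name;
            // rec.DatabaseName could also be used to filter to one CACHE.DAT
            Set gbl = $Piece(rec.GlobalNode, "(", 1)
            Set count(gbl) = $Get(count(gbl)) + 1
        }
        Set rec = rec.Next
    }
    Set gbl = ""
    For {
        Set gbl = $Order(count(gbl)) Quit:gbl=""
        Write gbl, ": ", count(gbl), !
    }
```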

Hi Scott,

Here is some code that stores the sizes of all globals in a database. If you create an entry in the Task Manager to run this code every day, the global ^tempSize will show you which globals grow faster than others:

do ##class(Utils.Database).GlobalSize()

(please change the code to store the results in a global that does not already exist in your namespace, or better, in a persistent class)

Class Utils.Database
{

ClassMethod GlobalSize(dir As %String = "")
{
    If dir="" Set dir = $zu(12,"") ;default to the manager directory
    #Dim today As %Integer = $zdate($H,8) ;yyyymmdd
    #Dim sqlStatement As %SQL.Statement = ##class(%SQL.Statement).%New()
    #Dim sqlResult As %SQL.StatementResult
    #Dim sc As %Status = sqlStatement.%PrepareClassQuery("%SYS.GlobalQuery","Size")
    If $$$ISERR(sc) Quit
    Set sqlResult = sqlStatement.%Execute(dir,,,,,1) ;1=fast mode (allocated size only)
    While sqlResult.%Next() {
        Set ^tempSize(sqlResult.%GetData(1),today)=sqlResult.%GetData(2) ;^tempSize(global,yyyymmdd)=allocated size
    }
    Quit
}

}
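Once a few days of samples have accumulated, a quick pass over the first and last sample per global shows which ones are actually growing. This is a sketch that assumes the ^tempSize(global, yyyymmdd) = size layout built by the method above:

```objectscript
    // Sketch: report growth per global between the earliest and
    // latest samples stored in ^tempSize(global, yyyymmdd) = size.
    Set gbl = ""
    For {
        Set gbl = $Order(^tempSize(gbl)) Quit:gbl=""
        Set first = $Order(^tempSize(gbl, ""))      ; earliest sample date
        Set last = $Order(^tempSize(gbl, ""), -1)   ; latest sample date
        Set delta = ^tempSize(gbl, last) - ^tempSize(gbl, first)
        If delta > 0 Write gbl, " grew by ", delta, " between ", first, " and ", last, !
    }
```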