Sep 9

Compacting and Truncating very large datasets

Does anyone have experience compacting and truncating IRIS datasets that are greater than 10 TB in size? How long did it take, and what was the final size? Did you run into any issues?



Product version: IRIS 2020.1
Discussion (1)

First, my experience is with IRIS.DAT files that were 2 TB in size, and results will vary widely, since where the free space sits is what matters.

There are two different compact operations:
Compact globals in a database
Compact free space in a database

If all you want to do is compact free space, things are easier.
Do it in small steps with ^DATABASE:

Current Size: 2307950 MB
Total freespace: 8564 MB
Freespace at end of file: 18 MB

Target freespace at end of file, in MB (18-8564):200

One great feature is that you can cancel if you need to:
Press 'Q' to return, 'P' to pause, 'C' to cancel:
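For anyone who has not run it, the prompts above come from the interactive ^DATABASE utility in the %SYS namespace. A rough sketch of the session (menu wording and numbering vary by IRIS version, and the directory path here is just an example):

```
%SYS>do ^DATABASE

// Choose the "Compact free space in a database" option from the menu,
// then answer the prompts, for example:
//
//   Database directory? /iris/mgr/mydb/       <- example path
//   Target freespace at end of file, in MB (18-8564): 200
//
// Keeping the target small (a few hundred MB per pass) is what lets
// you work in increments and pause or cancel between passes.
```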

Truncation can be done the same way, in steps.
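Truncation lives in the same ^DATABASE menu, under an option named something like "Return Unused Space for a Database" (the exact wording and option number depend on your IRIS version, so check the menu; the path below is an example):

```
%SYS>do ^DATABASE

// Choose the "Return Unused Space for a Database" option, then:
//
//   Database directory? /iris/mgr/mydb/       <- example path
//
// Like compaction, it reports progress as it runs and can be
// paused or cancelled, so it can also be done in increments.
```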

If you have one, I suggest doing this on your DR instance first, failing over to the DR, and then doing it on what was production.

If you are using 8 KB blocks for IRIS you should start thinking about the 32 TB limit for 8 KB databases.
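For context, the 32 TB figure comes from the block addressing: a database can address at most 2**32 blocks, so at 8 KB per block the file tops out at 32 TB. You can sanity-check the arithmetic in an IRIS terminal (note ObjectScript evaluates strictly left to right, hence the parentheses):

```
// 2**32 blocks * 8 KB/block, converted from KB to TB
%SYS>write 2 ** 32 * 8 / (1024 ** 3), " TB"
32 TB
```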

You need to run an integrity check after you are done.
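The integrity check can be run from the Management Portal, the Task Manager, or the terminal in %SYS; a sketch of the terminal route (the ^Integrity utility and its programmatic entry points are described in the IRIS data-integrity documentation, so verify the details for your version):

```
%SYS>do ^Integrity

// The interactive prompts let you pick databases/globals and an
// output file for the report. The documentation also describes
// programmatic entry points (e.g. CheckList^Integrity) for
// unattended runs; check the reference for your IRIS version.
```

On a multi-terabyte database, expect the check itself to take a long time; plan it into the maintenance window.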

If you want to run compact globals, that is much more complex.

If the underlying storage does deduplication, make sure you have extra space at that layer.