At the heart of IRIS and Cache is a very interesting database architecture that we, at M/Gateway Developments, refer to as "Global Storage". If you've ever wanted to know more about the fundamentals and capabilities of this underlying database, you might want to read the in-depth analysis we've put together:
https://github.com/robtweed/global_storage
Amongst other things you'll discover that:
- Global Storage provides the underpinnings of a full multi-model database, something we refer to as "Universal NoSQL", though, as you already know from IRIS and Cache, it supports the relational model as well (there's a short sketch of what a global looks like after this list).
- Global Storage can be implemented successfully on top of a number of other databases, in particular Redis, BerkeleyDB and LMDB. These also happen to be regarded as some of the fastest databases on the planet.
- However, even when used as a simple key/value store, these databases are comprehensively out-performed by native Global Storage databases such as IRIS.
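For anyone who hasn't worked with globals before, here's a minimal, illustrative sketch in ObjectScript of what Global Storage looks like from the developer's point of view. The ^Patient global and its subscripts are made-up names for this example, not something taken from the repository; the point is simply that a global is a persistent, sparse, hierarchical set of key/value nodes, which is why document, key/value, tabular and other models can all be projected onto it.

```
    ; A global is a persistent, sparse, hierarchical array.
    ; ^Patient and its subscripts are illustrative names only.
    SET ^Patient(123,"name")="Smith, John"
    SET ^Patient(123,"visits",1,"date")="2023-05-01"
    SET ^Patient(456,"name")="Jones, Mary"

    ; Any node can be read directly by its full key path...
    WRITE ^Patient(123,"name"),!

    ; ...and sibling keys can be walked in collating order with $ORDER,
    ; the primitive on which higher-level data models are built.
    SET id=""
    FOR {
        SET id=$ORDER(^Patient(id))
        QUIT:id=""
        WRITE "Patient ",id,": ",^Patient(id,"name"),!
    }
```

The repository goes into detail on how nodes like these can be mapped onto flat key/value stores such as Redis, BerkeleyDB and LMDB, and why the native implementations still come out ahead.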
So, if you want to understand why IRIS (and Cache before it) is such a capable database technology, you should read our analysis.
Nice repository and a must-read, uncovering the best-kept secret about the fastest databases on the IT planet!
Yes, I think that everyone who uses Cache or IRIS should take the time to discover what lies beneath! The power and flexibility of Global Storage are way beyond anything else out there - something that has been true ever since I first started working with such databases way back in the early 1980s. In all that time, I've never come across any other database architecture that is as quick and simple to grasp and yet as devastatingly powerful. I'm hoping our efforts in putting together these resources will help a new generation of developers discover and harness that unique magic.
Fantastic, Rob. Really nice resource. Thanks!
That was a good read, and I really appreciated the visuals for helping me wrap my head around Globals. As someone new to this, coming from a relational background, the article really solidified my understanding. Thanks.