Announcement
· Mar 26

mg-dbx-napi: Game-changing JavaScript performance for IRIS

You may have heard about our mg-dbx-napi interface for IRIS, which provides insanely fast access from Node.js. If you've been following recent developments in the server-side JavaScript world, you'll be excited to know that mg-dbx-napi also works with Bun.js, which is proving to be significantly faster than Node.js for many, if not most, purposes.

Of course, if you're a Node.js user, you'll probably wonder how mg-dbx-napi compares with the Native API for Node.js that is included with IRIS.

With all that in mind, we've created a GitHub repository: mg-showcase

mg-showcase includes Dockerfiles that you can use to quickly and easily build an IRIS container that includes all our key JavaScript technologies, pre-installed and ready to try out and explore.

mg-showcase also includes a growing set of documentation and tutorials to take you through all our technologies and show you how to use them with IRIS, Node.js and Bun.js.

The containers include lots of benchmark tests that you can try out for yourself, but just to give a flavour of the kind of performance we're talking about:

- on a standard M1 Mac Mini, running Node.js on our ARM64-based IRIS container, a simple loop saving key/value pairs to IRIS and then reading them back:

  - using our mg-dbx-napi interface: 1.1 million sets/sec and 1.8 million reads/sec

  - by comparison, using the Native API for Node.js on the same container: 110,000 sets/sec and 96,000 reads/sec
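To make the shape of such a benchmark concrete, here is a minimal, generic timing harness. The mg-dbx-napi calls shown in the comments are assumptions based on its README (they need a live local IRIS instance), so the sanity check at the end uses a plain in-memory Map instead:

```javascript
// Generic key/value benchmark harness: times n sets followed by n gets.
// With mg-dbx-napi you would pass closures over an mglobal node, e.g.
// (assumed API shapes, per the mg-dbx-napi README):
//   const { server, mglobal } = require('mg-dbx-napi');
//   const db = new server();
//   db.open({ type: 'IRIS', path: '/usr/irissys/mgr', username: '_SYSTEM', password: 'SYS', namespace: 'USER' });
//   const glb = new mglobal(db, 'kv');
//   benchmark(100000, (k, v) => glb.set(k, v), (k) => glb.get(k));
function benchmark(n, setFn, getFn) {
  let t = Date.now();
  for (let i = 0; i < n; i++) setFn('key' + i, 'value' + i);
  const setsPerSec = Math.round((n * 1000) / Math.max(1, Date.now() - t));
  t = Date.now();
  for (let i = 0; i < n; i++) getFn('key' + i);
  const getsPerSec = Math.round((n * 1000) / Math.max(1, Date.now() - t));
  return { setsPerSec, getsPerSec };
}

// Quick sanity check against an in-memory Map instead of IRIS:
const map = new Map();
const result = benchmark(1000, (k, v) => map.set(k, v), (k) => map.get(k));
console.log(result.setsPerSec > 0 && result.getsPerSec > 0); // true
```

The harness is deliberately decoupled from any particular store, so the same loop can be pointed at mg-dbx-napi, the Native API, or Redis for a like-for-like comparison.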

So that's orders of magnitude higher performance. And in terms of what's possible with our technologies, take a look at our glsdb abstraction: you'll find a complete set of tutorials showing you how to use it with Node.js and the Fastify web framework, and also with the insanely fast Bun.serve HTTP server. glsdb allows you to consider IRIS data storage in a radically different way: think persistent JavaScript objects and you're on the right track.

For more information, start exploring the mg-showcase repository.  It's a work in progress, so keep watching it as we add more stuff to it!

Discussion

Bear in mind that a superserver of any sort is only necessary if you configure mg-dbx-napi to connect to IRIS over the network.  The best performance is realized by connecting to a local IRIS instance via its API.
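To illustrate the distinction, here are the two connection styles side by side. The parameter names below are assumptions based on the mg-dbx-napi README, and the host name and port are purely illustrative:

```javascript
// Fastest: in-process API connection to a local IRIS instance.
// No superserver of any kind is involved in this mode.
const localOptions = {
  type: 'IRIS',
  path: '/usr/irissys/mgr',   // the local IRIS installation's mgr directory
  username: '_SYSTEM',
  password: 'SYS',
  namespace: 'USER'
};

// Network connection: the only case where a superserver is needed.
const networkOptions = {
  type: 'IRIS',
  host: 'iris.example.com',   // hypothetical remote host
  tcp_port: 7041,             // example superserver port
  username: '_SYSTEM',
  password: 'SYS',
  namespace: 'USER'
};

// Either object would then be passed to db.open(...)
```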

That said, the reason why we developed our own superserver is because, as far as I know, the IRIS superserver does not have a public interface.  In other words, it only works for InterSystems products.  If this situation ever changed, we would of course converge on the superserver embedded in IRIS.  This would make our installation process much simpler.

One important thing to note about mg-dbx-napi: it not only gives you access to the underlying Global storage, it also provides APIs for accessing IRIS Classes and SQL from within JavaScript too.  See:

https://github.com/chrisemunt/mg-dbx-napi#direct-access-to-intersystems-...

https://github.com/chrisemunt/mg-dbx-napi#direct-access-to-sql-mgsql-and...
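As a rough sketch of the class-access calling pattern (the method names and class name here are assumptions based on the mg-dbx README family, not verified against a live system), wrapped so it can be exercised against a stand-in object without IRIS:

```javascript
// Sketch of the assumed class-access pattern. With a real connection,
// db would be an open mg-dbx-napi server instance:
//   const person = db.classmethod('User.Person', '%New');
//   person.setproperty('Name', 'John Smith');
//   person.method('%Save');
function savePerson(db, name) {
  const person = db.classmethod('User.Person', '%New');  // assumed API
  person.setproperty('Name', name);
  return person.method('%Save');
}

// Stand-in object so the calling pattern can be tried without IRIS:
const stubDb = {
  classmethod(cls, method) {
    const props = {};
    return {
      setproperty(p, v) { props[p] = v; },
      method(m) { return (m === '%Save' && props.Name) ? 1 : 0; }
    };
  }
};
console.log(savePerson(stubDb, 'John Smith')); // 1
```

See the linked README sections for the authoritative signatures, including the SQL query interface.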

These APIs are also made available if you're using our QOper8 package for handling requests in a Node.js or Bun.js child process, via the this.mgdbx object.
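A sketch of what such a worker-side message handler might look like. The handler signature and the shape of this.mgdbx are assumptions (check the QOper8 documentation for your version); the code at the end exercises the handler with stand-in objects so no IRIS or QOper8 is needed:

```javascript
// Hypothetical QOper8 message handler using the this.mgdbx object,
// which is assumed here to expose an open connection (db) and the
// mg-dbx-napi classes (e.g. mglobal):
const handler = function (messageObj, finished) {
  const { db, mglobal } = this.mgdbx;
  const kv = new mglobal(db, 'kv');            // assumed mglobal constructor
  kv.set(messageObj.data.key, messageObj.data.value);
  finished({ ok: true, key: messageObj.data.key });
};

// Exercising the handler with stand-in objects:
const store = new Map();
const ctx = {
  mgdbx: {
    db: {},
    mglobal: function (db, name) {
      this.set = (k, v) => store.set(name + ':' + k, v);
    }
  }
};
let response;
handler.call(ctx, { data: { key: 'a', value: 1 } }, (r) => { response = r; });
console.log(response.ok && store.get('kv:a') === 1); // true
```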

Of course, in the Node.js world, caching of key/value data is usually done using Redis, widely considered one of the fastest key/value stores available. So how does it compare in an equivalent loop creating and reading key/value pairs? Well, you can test it for yourself: the IRIS containers include a pre-installed copy of Redis and both benchmark tests. On our M1 Mac Mini, using the standard Redis connector for Node.js, we get a mere 17,000/sec: that's for both reads and writes. Even in pipelined mode it only maxes out at around 250,000/sec.

And of course, IRIS can do so much more than just key/value pairs.

Try our benchmarks out for yourself and perhaps let us know your results.