I would recommend checking what's actually stored in the database. For this you can use ^%GSIZE to measure all the globals, and you may find which global(s) got so big.
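For example, a terminal session might look like this (the directory path is a placeholder, and the exact prompts may differ slightly between versions):

```
USER>do ^%GSIZE

Directory name: /path/to/db/ =>
All Globals? No => Yes
Show details? No => Yes
```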

And you can stop the server, delete the CACHE.DAT, and start it again; in this case it should be recreated anyway.

Try these two variants

write $extract($random(10000) _ "0000", 1, 4)
write $random(10)_$random(10)_$random(10)_$random(10)

You should know that even in a database with journaling switched off, data will still appear in the journals if it is changed inside transactions.

My question was exactly about that data stored for Ensemble: should I keep it in a mirror, or may I omit it?

Dmitry Maslennikov · May 28, 2021 go to post

Which component did you use for the connection?

There are a few technologies that were deprecated with IRIS, or that are now under an additional license option.

Dmitry Maslennikov · May 27, 2021 go to post

Have you seen these examples? I'm not an expert in .NET, but if I understand the documentation correctly, you should pass the stream as is to the .NET method that accepts byte[], and the Gateway should map it for you.

Dmitry Maslennikov · May 27, 2021 go to post

For performance reasons, it's possible to define an index in such a way that some of the columns are part of the index itself, just for searching, while some data is kept in the data part of that index, to be returned directly in the result when requested. So, if your index is somehow corrupted, the SQL engine will still expect the values there and will look for them in the index rather than going to the place where the data originally sits. As a result, you may not see some of the data in the output rows.
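A minimal sketch of such a definition (the class and property names are hypothetical):

```objectscript
Class User.Person Extends %Persistent
{

Property Name As %String;

Property City As %String;

/// Name is the searchable part of the index; City is stored in the
/// data part, so a query like SELECT Name, City ... WHERE Name = ?
/// can be answered from the index alone, without reading the row
Index NameIdx On Name [ Data = City ];

}
```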

Dmitry Maslennikov · May 27, 2021 go to post

Sure, it’s possible to do so. A React application is just the frontend side, and IRIS itself can act as the backend server. Or you can write the backend server in some other language, e.g. NodeJS, Python, Java or .NET, which will connect to IRIS as a database.

You can look at my RealWorld project, in particular the implementation of the backend server. The project itself offers a wide variety of frontends and backends in different languages, using different databases. So you can find a React frontend that will work with a backend on IRIS.

And have a look at my article about this project.

Dmitry Maslennikov · May 27, 2021 go to post

In any case, the first thing to try in such situations would be to rebuild the indices. It should not take too much time; if you are able to, purge the indices first, then rebuild from scratch.

If it still shows different results after that, in this unexpected case I would like to see your table definition.
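A minimal sketch of purging and rebuilding, assuming a persistent class named User.MyTable (replace it with your own class):

```objectscript
  ; remove the existing index entries first
  Do ##class(User.MyTable).%PurgeIndices()
  ; then rebuild all indices from the stored data
  Set tSC = ##class(User.MyTable).%BuildIndices()
  If 'tSC Write $System.Status.GetErrorText(tSC)
```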

Dmitry Maslennikov · May 27, 2021 go to post

It looks like an issue with indexing. Did you add some new indexes while you already had data?

IRIS requires you to rebuild an index manually after adding or changing it.

Have a look at the documentation.

Dmitry Maslennikov · May 26, 2021 go to post

If you currently have a running instance with all the namespaces together, you may look at ^GLOBUFF after it has run for some time, to see how your global buffers are being used now, and decide how to split that buffer between the instances.
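A minimal sketch of running it (the utility lives in the %SYS namespace and prompts for how many of the top buffer consumers to display):

```objectscript
  ZNspace "%SYS"
  Do ^GLOBUFF
```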

Dmitry Maslennikov · May 24, 2021 go to post

It’s a binary file, and its contents are not so important in the current context; what’s important is the ownership of the file. So:

ls -la /usr/local/etc/irissys/iris.reg
Dmitry Maslennikov · May 22, 2021 go to post

Check the ownership of the file iris.reg, which should be in /usr/local/etc/irissys.

The owner of this file is the user that is supposed to control IRIS.

Dmitry Maslennikov · May 21, 2021 go to post

It depends on what you are trying to achieve.

Import as is, with an iterator

Class User.Test Extends (%RegisteredObject, %JSON.Adaptor)
{

Property name As %String;

ClassMethod Import()
{
  Set data = [{
    "name": "test1"
  },
  {
    "name": "test2"
  }]

  Set iter = data.%GetIterator()
  While iter.%GetNext(.key, .value) {
    Set obj = ..%New()
    Set tSC = obj.%JSONImport(.value)
    Write !,obj.name
  }
}

}

Import with a wrapper object

Class User.TestList Extends (%RegisteredObject, %JSON.Adaptor)
{

Property items As list Of User.Test;

ClassMethod Import()
{
  Set data = [{
    "name": "test1"
  },
  {
    "name": "test2"
  }]

  #; wrap to object
  Set data = {
    "items": (data)
  }

  Set list = ..%New()
  Set tSC = list.%JSONImport(.data)

  For {
    set obj = list.items.GetNext(.key)
    Quit:key=""
    Write !,obj.name
  }
}

}
Dmitry Maslennikov · May 20, 2021 go to post

jsonProvider was invented before native JSON support was added, and there is no reason to use it anymore. If you need a JSON representation of an object, look at %JSON.Adaptor.
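A minimal sketch of exporting with it, assuming a class that extends %JSON.Adaptor with a single name property (like the User.Test class in the import examples above):

```objectscript
  Set obj = ##class(User.Test).%New()
  Set obj.name = "test1"
  ; write the JSON representation to the current device
  Do obj.%JSONExport()
  ; or export it into a string variable instead
  Set tSC = obj.%JSONExportToString(.json)
  Write !, json
```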

Dmitry Maslennikov · May 20, 2021 go to post

Jeffrey, thanks. But suppose I had only a 16KB-block buffer configured, with a mix of 8KB databases (mostly system, or CACHETEMP/IRISTEMP) and some of my application data stored in 16KB blocks. The 8KB databases will in any case be buffered in the 16KB buffer, and they will be stored one to one: 8KB of data in a 16KB buffer. Is that correct?

So, if I needed to separate the global buffers for streams, I would just need a block size distinct from any other data, plus a significantly smaller amount of global buffer for that block size, and that would be enough for more efficient usage of the global buffer, at least for non-stream data, with a higher priority?

Dmitry Maslennikov · May 19, 2021 go to post

This is only for external access to the label; without it, you would not be able to call it from the terminal or from another routine or class.

Curly braces make it a procedure block, so its variables are private, scoped to the procedure's own stack frame.
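A minimal sketch, in a hypothetical routine test.mac:

```objectscript
test ; test.mac
Hello() public {
  ; the public keyword allows calling this externally: do Hello^test()
  Write "hello", !
}
Internal() {
  ; without public, a procedure label is callable only from inside this routine
  Write "internal", !
}
```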

Dmitry Maslennikov · May 19, 2021 go to post

Having multiple global buffers for different block sizes does not make sense: IRIS will use the bigger-block buffers for the smaller blocks inefficiently. The only way to separate them is to use a separate server just for streams.

Dmitry Maslennikov · May 18, 2021 go to post

I suppose the issue is in your settings.json, which I see you have opened. Could you check whether it is really valid, correct JSON? If possible, could you attach a screenshot of it as well?

Dmitry Maslennikov · May 18, 2021 go to post

I've mentioned above a system with a significant amount of streams stored in the database, and I just checked how the global buffers are used there: streams take just around 6%. The system is very active, including with files. Tons of objects are created every minute, files are attached, files are changed (yes, our users can change MS Word files online on the fly, and we keep all the versions).

So I still see no reason to change it, and I still see tons of benefits in keeping it as is.

Dmitry Maslennikov · May 17, 2021 go to post

Fragmentation, with SSD disks, is not an issue anymore.

But in any case, I agree with storing files in the database. I have a system in production where we have about 100TB of data, more than half of which is just files stored in the database. Some of our .dat files are, by mapping, used exclusively for streams, and we take care of them by periodically cutting them off at some point and continuing with an empty database. Mirroring helps us not worry too much about backups. But if we had to store that amount of files as files on the filesystem, we would lose our minds caring about backups and integrity.

Dmitry Maslennikov · May 17, 2021 go to post
LuhnMCheckSum(input) public {
  ; Luhn mod N check character, over the alphabet in codePoints (n = 38)
  Set input = $Piece(input, "#", 1)
  Set codePoints = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789/:"
  Set n = $Length(codePoints)

  Set sum = 0
  Set factor = 2
  Set len = $Length(input)
  For i = len:-1:1 {
    ; $Find returns the position after the match, so -2 yields a 0-based code point
    Set codePoint = $Find(codePoints, $Extract(input, i)) - 2
    Set addend = factor * codePoint
    ; alternate the factor between 2 and 1
    Set factor = $Case(factor, 2: 1, : 2)
    ; sum the base-n "digits" of the addend
    Set addend = (addend \ n) + (addend # n)
    Set sum = sum + addend
  }
  Set remainder = sum # n
  Set checkCodePoint = (n - remainder) # n
  Return $Extract(codePoints, checkCodePoint + 1)
}
LuhnMValidate(input) public {
  Set checksum = $Piece(input, "#", 2)
  Set input = $Piece(input, "#")
  Return $$LuhnMCheckSum(input) = checksum
}
Dmitry Maslennikov · May 14, 2021 go to post

In most cases, it's enough to just create an empty certificate with default values.

How do you use it?