Dmitry Maslennikov · May 27, 2021

For performance reasons, it's possible to define an index so that some columns are part of the index key itself, used for searching, while other columns are stored in the data part of the index, to be returned directly in the result when requested. So, if such an index is corrupted, the SQL engine will still expect the values there and will read them from the index, without going to the place where the data originally sits. As a result, you may not see some of the data in the output rows.
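As a sketch of such an index definition (the class and property names here are illustrative):

Class User.Person Extends %Persistent
{

Property Name As %String;

Property City As %String;

/// City is the index key; Name is kept in the data part of the
/// index, so a query selecting Name by City can be answered from
/// the index alone, without reading the master data.
Index CityIdx On City [ Data = Name ];

}

If this index is damaged, a query like SELECT Name FROM User.Person WHERE City = 'London' returns whatever is stored in the index, not what is in the table itself.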

Dmitry Maslennikov · May 27, 2021

Sure, it’s possible to do so. A React application is just the frontend side, and IRIS itself can act as the backend server. Or you can write the backend server in some other language, e.g. NodeJS, Python, Java or .Net, which will connect to IRIS as a database.

You can look at my Realworld project, in particular the implementation of the backend server. The project itself offers a wide variety of frontends and backends in different languages, using different databases. So you'll find a React frontend that will work with the backend on IRIS.

And take a look at my article about this project.

Dmitry Maslennikov · May 27, 2021

In any case, the first thing to try in such situations is rebuilding the indexes. It should not take too much time; even better, if you are able to, purge the indexes first, then rebuild them from scratch.
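With a hypothetical persistent class User.Person, that would be:

  #; Remove the existing (possibly corrupted) index entries
  Do ##class(User.Person).%PurgeIndices()
  #; Rebuild all indexes of the class from the stored data
  Do ##class(User.Person).%BuildIndices()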

If it still shows different results after that, in this unexpected case I would like to see your table definition.

Dmitry Maslennikov · May 27, 2021

It looks like an issue with indexing. Did you add some new indexes while you already had data?

IRIS requires you to rebuild indexes manually after adding or changing them.

Take a look at the documentation.

Dmitry Maslennikov · May 26, 2021

If you currently have a running instance with all the namespaces together, you may watch ^GLOBUFF for some time to see how your global buffers are used now, and decide how to split that buffer between the instances.
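For reference, the utility is run from the %SYS namespace and shows which globals occupy the most buffers:

  Set $Namespace = "%SYS"
  #; Displays the top globals by the number of global buffers they occupy
  Do ^GLOBUFF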

Dmitry Maslennikov · May 24, 2021

It’s a binary file, and its content is not so important in the current context; what’s important is the ownership of the file. So:

ls -la /usr/local/etc/irissys/iris.reg
Dmitry Maslennikov · May 22, 2021

Check the ownership of the file iris.reg, which should be in /usr/local/etc/irissys

The owner of this file is supposed to be the account used to control IRIS.
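If the ownership turns out to be wrong, it can be fixed with chown; the user name below is just an assumption, use whatever account is supposed to manage IRIS:

sudo chown irisowner /usr/local/etc/irissys/iris.reg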

Dmitry Maslennikov · May 21, 2021

It depends on what you are trying to achieve.

Import as is, with an iterator

Class User.Test Extends (%RegisteredObject, %JSON.Adaptor)
{

Property name As %String;

ClassMethod Import()
{
  Set data = [{
    "name": "test1"
  },
  {
    "name": "test2"
  }]

  Set iter = data.%GetIterator()
  While iter.%GetNext(.key, .value) {
    Set obj = ..%New()
    Set tSC = obj.%JSONImport(.value)
    Write !,obj.name
  }
}

}

Import with a wrapper object

Class User.TestList Extends (%RegisteredObject, %JSON.Adaptor)
{

Property items As list Of User.Test;

ClassMethod Import()
{
  Set data = [{
    "name": "test1"
  },
  {
    "name": "test2"
  }]

  #; wrap to object
  Set data = {
    "items": (data)
  }

  Set list = ..%New()
  Set tSC = list.%JSONImport(.data)

  For {
    Set obj = list.items.GetNext(.key)
    Quit:key=""
    Write !,obj.name
  }
}

}
Dmitry Maslennikov · May 20, 2021

jsonProvider was invented before native JSON support was added, and there is no reason to use it anymore. If you need a JSON representation of an object, look at %JSON.Adaptor.
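A minimal sketch of %JSON.Adaptor in use (the class here is illustrative):

Class User.Demo Extends (%RegisteredObject, %JSON.Adaptor)
{

Property name As %String;

ClassMethod Run()
{
  Set obj = ..%New()
  Set obj.name = "test"
  #; Write the JSON representation of the object to the current device
  Do obj.%JSONExport()
}

}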

Dmitry Maslennikov · May 20, 2021

Jeffrey, thanks. But suppose I had only a 16KB-block global buffer configured, with a mix of 8KB databases (mostly system, or CACHETEMP/IRISTEMP) and some of my application data stored in 16KB blocks. The 8KB databases will still be buffered in the 16KB buffer, stored one to one: 8KB of data in a 16KB buffer. Is that correct?

So, if I need separate global buffers for streams, I would just need a block size distinct from any other data and a relatively small amount of global buffer for that block size, and that would be enough for more efficient global buffer usage? At least for non-stream data, with a higher priority?

Dmitry Maslennikov · May 19, 2021

This is only for external access to this label; without it, you would not be able to call the label from the terminal or from another routine or class.

The curly braces make it a procedure, so its variables are private on the stack.
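To illustrate (the routine and label names are made up):

MyDemo ; example routine
  Quit
Sum(a, b) public {
  #; x is private to this call: the curly braces make Sum a procedure,
  #; and the public keyword allows calling it from outside the routine
  Set x = a + b
  Quit x
}

With the public keyword, Write $$Sum^MyDemo(1,2) works from the terminal; without it, such a call from outside the routine would fail.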

Dmitry Maslennikov · May 19, 2021

Having multiple different global buffers for different block sizes does not make sense here: IRIS will place the smaller blocks in the bigger-block buffer, which is inefficient. The only way to separate them is to use a separate server just for streams.

Dmitry Maslennikov · May 18, 2021

I suppose the issue is in your settings.json, which I see you have opened. Could you check whether it's really valid, correct JSON? If possible, could you attach a screenshot of it as well?

Dmitry Maslennikov · May 18, 2021

I've mentioned above a system with a significant amount of streams stored in the database, and I've just checked how the global buffers are used there: streams take just around 6%. The system is very active, including files. Tons of objects are created every minute, with attached files and changes to files (yes, our users can change MS Word files online on the fly, and we keep all the versions).

So, I still see no reason to change it, and I still see tons of benefits in keeping it as is.

Dmitry Maslennikov · May 17, 2021

Fragmentation, with SSD disks, is not an issue anymore.

But in any case, I agree with storing files in the database. I have a system in production where we have about 100TB of data, and more than half of it is just files stored in the database. Some of our .dat files are, through mappings, used exclusively for streams, and we take care of them periodically by cutting them off at some point to continue with an empty database. Mirroring helps us not to worry too much about backups. If we had to store that amount of files as files on the filesystem, we would lose our minds caring about backups and integrity.

Dmitry Maslennikov · May 17, 2021
LuhnMCheckSum(input) public {
  Set input = $Piece(input, "#", 1)
  Set codePoints = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789/:"
  Set n = $Length(codePoints)

  Set sum = 0
  Set factor = 2
  Set len = $Length(input)
  For i = len:-1:1 {
    #; $Find returns the position after the match, so subtract 2
    #; to get the zero-based index of the character in codePoints
    Set codePoint = $Find(codePoints, $Extract(input, i)) - 2
    Set addend = factor * codePoint
    #; Alternate the factor between 2 and 1, right to left
    Set factor = $Case(factor, 2: 1, : 2)
    #; Sum the base-n "digits" of the addend
    Set addend = (addend \ n) + (addend # n)
    Set sum = sum + addend
  }
  Set remainder = sum # n
  Set checkCodePoint = (n - remainder) # n
  Return $Extract(codePoints, checkCodePoint + 1)
}
LuhnMValidate(input) public {
  Set checksum = $Piece(input, "#", 2)
  Set input = $Piece(input, "#")
  Return $$LuhnMCheckSum(input) = checksum
}
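Assuming the routine above is saved as, say, LuhnM, usage could look like:

  Set code = "ABC123"
  Set check = $$LuhnMCheckSum^LuhnM(code)
  #; A value with its own checksum appended validates successfully
  Write $$LuhnMValidate^LuhnM(code _ "#" _ check)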
Dmitry Maslennikov · May 14, 2021

In most cases, it’s enough to just create an empty certificate with only the default values.

How do you use it?

Dmitry Maslennikov · May 14, 2021

What does the locale command in the OS show?

So, your filesystem may not accept Unicode, and you would need to convert the Unicode names to a more suitable codepage.
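For example, with $ZCONVERT; the target table "CP1251" here is just an assumption, use whatever codepage your locale reports:

  Set filename = "файл.txt"
  #; Translate the internal Unicode string to the external encoding
  Set external = $ZCONVERT(filename, "O", "CP1251")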

Dmitry Maslennikov · May 13, 2021

So, you just need help finding the place in the class which causes the error?

I would suggest trying to remove each class member one by one until you narrow it down to a single member, and maybe then you'll realize why it's happening.

Dmitry Maslennikov · May 12, 2021

Well, I just noticed the version you have. Any chance of upgrading such an old version to at least 2016.2?

Dmitry Maslennikov · May 12, 2021

I would not recommend using such undocumented functions. Instead, you can switch to something else, like $system.OBJ.ExportUDL, or some other internal methods:

##class(%Atelier.v2.Utils.TextServices).GetTextAsArray() 

##class(%Compiler.UDL.TextServices).GetTextAsArray()
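For example, exporting a class definition in UDL form to a file (the class name and path are illustrative):

  Do $system.OBJ.ExportUDL("User.Test.cls", "/tmp/User.Test.cls")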

Dmitry Maslennikov · May 11, 2021

Yeah, sure, it's quite simple to do. A JWT token contains three parts separated by periods:

  • Header, with the algorithm of the signature and the type of token
  • Payload, any data in JSON format
  • Signature, used to verify the token

All of those parts are encoded with Base64 (JWT uses the URL-safe Base64 variant).

  Set token = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c" 

  #; Extract parts of the token
  Set $ListBuild(header, payload, sign) = $ListFromString(token, ".")

  #; Decode and parse Header
  Set header = $System.Encryption.Base64Decode(header)
  Set header = {}.%FromJSON(header)
  Write !,"header"
  Write !,"alg = ",header.alg
  Write !,"typ = ",header.typ

  #; Decode and parse Payload
  Set payload = $System.Encryption.Base64Decode(payload)
  Set payload = {}.%FromJSON(payload)
  Write !!,"data"
  Write !,"name = ", payload.name 
  Write !,"iat = ", payload.iat
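To go one step further and verify the signature of an HS256 token, a sketch could look like this. The secret here is illustrative; note that, since JWT uses the URL-safe Base64 alphabet, the computed signature is translated to it before the comparison:

  Set secret = "your-256-bit-secret"
  Set $ListBuild(header, payload, sign) = $ListFromString(token, ".")
  #; Sign the first two parts, still Base64-encoded, with HMAC-SHA256
  Set raw = $System.Encryption.HMACSHA(256, header _ "." _ payload, secret)
  #; Re-encode as URL-safe Base64 without padding and compare
  Set check = $Translate($System.Encryption.Base64Encode(raw, 1), "+/=", "-_")
  Write !,"signature valid: ", check = sign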