Article
· Feb 5 8m read

How to set up a sharded cluster of IRIS nodes in 1 minute

I'm going to show you how you can very quickly set up a sharded cluster of InterSystems IRIS nodes on your own machine. My goal in this article is not to discuss sharding in detail, nor to define a deployment architecture for a real-world case, but to show you how to quickly stand up, on your own computer, a cluster of sharded IRIS instances you can play with and run tests against. If you are interested in digging deeper into sharding in IRIS, you can take a look at the documentation here.

Let me say up front that IRIS sharding technology gives us two things to start with:

  • Define, load, and query sharded tables, whose data is transparently distributed across the nodes of the cluster
  • Define federated tables, which offer a global, composite view of data belonging to different tables that are physically stored on different, distributed nodes.

So, as I said, we'll leave playing with shards and federated tables for other articles and focus now on the prerequisite step: setting up the sharded cluster of nodes.

For our example we'll use Docker Desktop (for Windows or Mac) and rely on the IRIS CPF Merge feature (configuration file merging), which lets us use a plain-text file containing the IRIS configuration sections and properties we want to apply in order to modify the current configuration of the IRIS instance. Basically, this file is overlaid on top of the iris.cpf file that defines the instance's default configuration.

This merge is "activated" automatically by adding the environment variable ISC_CPF_MERGE_FILE, which must be set to a valid path to a file containing the cpf sections we want to change. When IRIS starts, it checks whether it has been told to perform a merge (basically, whether that environment variable exists and points to a valid file). If so, it performs the merge and then starts up.
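As a minimal illustration (the file path and the overridden parameter below are my own examples, not taken from this article), the variable and a merge file could look like this:

```ini
; Environment variable telling IRIS where to find the merge file:
;   ISC_CPF_MERGE_FILE=/opt/merge/merge.cpf

; merge.cpf contains only the sections/parameters to override, e.g.
; enlarging the pool of 8KB global buffers to 800 MB:
[config]
globals=0,0,800,0,0,0
```

Everything not listed in the merge file keeps its value from the instance's iris.cpf.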

Without further ado, here is the docker-compose.yml file that does the magic:

 
docker-compose.yml
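The compose file itself was attached to the original post and is not reproduced here; below is a hedged sketch of its likely shape, using only the service and image names mentioned in the article (the port, mount paths, and the Web Gateway image tag are illustrative assumptions):

```yaml
services:
  irisnode1:
    build: .
    image: shardnode:latest
    hostname: irisnode1
    environment:
      # Point IRIS at the merge file for the first (node1) node
      - ISC_CPF_MERGE_FILE=/opt/merge/merge_first_data-node.cpf
  irisnode2:
    image: shardnode:latest
    hostname: irisnode2
    depends_on:
      - irisnode1
    environment:
      # Data nodes use their own merge file
      - ISC_CPF_MERGE_FILE=/opt/merge/merge_data-node.cpf
  webgateway:
    image: intersystems/webgateway:latest
    hostname: webgateway
    ports:
      - "8080:80"
    volumes:
      # Mount the gateway configuration files (mount path is illustrative)
      - ./CSP.conf:/webgateway-shared/CSP.conf
      - ./CSP.ini:/webgateway-shared/CSP.ini
```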

And also an example of the CSP.conf and CSP.ini files:

 
CSP.conf
 
CSP.ini

In this case, we are creating 3 services:

  • irisnode1 - First node of the cluster, which has a special role, hence we refer to it specifically as node1
  • irisnode2 - Additional data node of the cluster, whose role is data (we can have as many of these as we want)
  • webgateway - Web server preconfigured for access to the IRIS instances (Apache + Web Gateway)

To build the shardnode:latest image, we used the following Dockerfile:

 
Dockerfile
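The Dockerfile was attached to the original post; here is a sketch of the usual pattern for this kind of image (the base image tag and file paths are my assumptions):

```dockerfile
ARG IMAGE=intersystemsdc/iris-community:latest
FROM $IMAGE

# Copy the setup script and the CPF merge files into the image
COPY iris.script /tmp/iris.script
COPY *.cpf /opt/merge/

# Start IRIS, run the ObjectScript setup, and shut down cleanly
RUN iris start IRIS \
 && iris session IRIS < /tmp/iris.script \
 && iris stop IRIS quietly
```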

Inside the Dockerfile we call iris.script, which lets us run ObjectScript commands against the IRIS image we are building, in order to apply configuration, import and compile code, and so on:

 
iris.script
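The script was attached to the original post; a minimal sketch of what such a setup script typically contains (these particular commands are common examples for a local sandbox, not the article's actual content — for instance, unexpiring passwords avoids the forced password change on first login):

```objectscript
zn "%SYS"
Do ##class(Security.Users).UnExpireUserPasswords("*")
halt
```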

The files used to perform the merge for node1 and for the data nodes of the IRIS cluster are:

 
merge_first_data-node.cpf
 
merge_data-node.cpf
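Those two merge files were attached to the post; a hedged sketch of their likely content, based on the documented CPF sharding parameters (ShardRole for every node, plus a cluster URL pointing at node1 for additional data nodes) — treat the hostname, port, and namespace as illustrative:

merge_first_data-node.cpf:

```ini
[Startup]
ShardRole=NODE1
```

merge_data-node.cpf:

```ini
[Startup]
ShardRole=DATA
ShardClusterURL=IRIS://irisnode1:1972/IRISCLUSTER
```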

We could have more data nodes in the cluster just by adding more services with the same definition as irisnode2 (changing the name, of course).

On the other hand, for routing to work correctly on our web server, so that we can access the Management Portal of each instance, we have to change the CSPConfigName parameter on each of them. We do this with the files configure_first_data-node.sh and configure_data-node.sh. In this example they are identical, but I've kept them separate because at some point we might want to perform different actions at the startup of each IRIS instance, depending on whether it is node1 or a data node of the cluster.

 
configure_data-node.sh
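The script was attached to the post; here is a sketch of one way CSPConfigName could be set at startup, by generating a small merge file of its own and applying it with the iris merge CLI (the [config] section placement and the use of the hostname as the value are my assumptions):

```shell
#!/bin/bash
# Generate a one-off merge file that sets this instance's web application
# name to the container's hostname, so the Web Gateway can route to it.
cat > /tmp/cspconfig.cpf <<EOF
[config]
CSPConfigName=$(hostname)
EOF

# Apply it to the running instance (requires IRIS inside the container):
# iris merge IRIS /tmp/cspconfig.cpf
cat /tmp/cspconfig.cpf
```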

And that's basically it.

The nodes could also be defined using the API available in the %SYSTEM.Cluster class, but honestly, the ability to include actions alongside the CPF Merge feature simplifies the task enormously. I recommend you take a look here, in particular at the part covering the [Actions] section.
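For reference, an [Actions] section lets a merge file perform one-time operations in addition to plain parameter overrides; a hedged example of the kind of entries the documentation describes (the database and namespace names and the directory path here are made up):

```ini
[Actions]
CreateDatabase:Name=MYDATA,Directory=/usr/irissys/mgr/mydata
CreateNamespace:Name=MYAPP,Globals=MYDATA,Routines=MYDATA
```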

To build the images and deploy the cluster, we could build our shardnode:latest image and launch docker-compose from VS Code or, from our shell, run these commands from the directory containing the docker-compose.yml file:

docker compose build
docker compose up

It will take a little while the first time, because the instance marked as node1 has to come up before any other data nodes of the cluster start for the first time. But everything should be up and ready in a minute or less.

If everything went well, you should be able to access the Management Portal of each of the instances:

And that's it! From here on, the limit on database storage volume and table size is set by your hardware. You now have a cluster of IRIS nodes ready for defining sharded or federated tables.

I hope you find it useful! See you around...

Question
· Feb 5

Error encountered while calling an API

Hello, I have a problem with a call and I would need some help.

When I call an API with the SendFormDataArray method of the EnsLib.HTTP.OutboundAdapter adapter, I pass it a %Net.HttpRequest object and I receive the following error:

ERROR #5002: ObjectScript error: <SUBSCRIPT>MatchSuffix+1^%Net.HttpRequest.1 ^%qPublicSuffix("")

I have correctly entered the url in my business operation and I pass it in the last parameter of the SendFormDataArray method. Do you have any idea, please?

 

Method XRPUAuthentification(pInput As ANCV.msg.BO.XRPUAuthentificationDmde, Output pOutput As ANCV.msg.BO.XRPUAuthentificationRpse) As %String
{
	
 Set pOutput = ##class(ANCV.msg.BO.XRPUAuthentificationRpse).%New()
 
 Try{
 	Set httpRequest = ##class(%Net.HttpRequest).%New()
 	Set utilisateur = ##class(ANCV.WebObjet.wsXRPUUtilisateur).%New()
	Set utilisateur.login = ..RecupererLogin()
	Set utilisateur.password = ..RecupererMdp()
	Do httpRequest.SetHeader("Content-Type", "application/json")

 	//Transformation du message d'entrée en JSON
 	Set tSC = ..ObjectToJSONStream(utilisateur, .entityBody)
 	$$$ThrowDecomposeIfError(tSC, "Impossible de transformer le message", $$$ErreurRecuperationToken)
 	
 	Set httpRequest.EntityBody = entityBody
 	//Appel à l'api security/authentication
 	set tSC = ..Adapter.SendFormDataArray(.response, "POST",httpRequest,,,..Adapter.URL_"/security/authentication")
 	$$$ThrowDecomposeIfError(tSC, "Impossible d'appeler l'api", $$$ErreurRecuperationToken)
 	//Transformation du JSON de retour en message
 	set tSC = ..JSONStreamToObject(response.Data, .pOutput, "ANCV.msg.BO.XRPUAuthentificationRpse", 1)
 	$$$ThrowDecomposeIfError(tSC, "Impossible de récupérer le token", $$$ErreurRecuperationToken)
 	set pOutput.codeRetour = "OK"
 } Catch Exception {
		Set pOutput.codeRetour = "KO"
		Set pOutput.libErreur = Exception.DisplayString()
		Set pOutput.codeErreur = Exception.Code
	 }
	 Quit $$$OK
}

Method RecupererLogin() As %String
{
	Quit ##class(Ens.Config.Credentials).GetValue(..Adapter.Credentials, "Username")
}

Method RecupererMdp() As %String
{
	Quit ##class(Ens.Config.Credentials).GetValue(..Adapter.Credentials, "Password")
}
Article
· Feb 5 8m read

Using DocDB in SQL, almost

From the previous article, we identified some issues when working with JSON in SQL.

IRIS offers a dedicated feature for handling JSON documents, called DocDB.

InterSystems IRIS® data platform DocDB is a facility for storing and retrieving database data. It is compatible with, but separate from, traditional SQL table and field (class and property) data storage and retrieval. It is based on JSON (JavaScript Object Notation) which provides support for web-based data exchange. InterSystems IRIS provides support for developing DocDB databases and applications in REST and in ObjectScript, as well as providing SQL support for creating or querying DocDB data.

By its nature, InterSystems IRIS Document Database is a schema-less data structure. That means that each document has its own structure, which may differ from other documents in the same database. This has several benefits when compared with SQL, which requires a pre-defined data structure.

The word “document” is used here as a specific industry-wide technical term, as a dynamic data storage structure. “Document”, as used in DocDB, should not be confused with a text document, or with documentation.

Let's explore how DocDB can help store JSON in the database and integrate it into projects that rely solely on xDBC protocols.

Let's start

DocDB defines two key components:

  • %DocDB.Database - Although it expects the creation of a "Database," which can be confusing since we already have a database in SQL terms, it is essentially a class in ObjectScript. For those more familiar with SQL, it functions as a table.
  • %DocDB.Document - A base class for a "database" that extends the %Persistent class and introduces DocDB-specific properties:
    • %DocumentId - IdKey
    • %Doc As %DynamicAbstractObject - The actual storage for the JSON document
    • %LastModified - An automatically updated timestamp for each insert and update

Creating a Table (Database)

Now, let's create our first table, or rather, our first "Database." It seems that the expectation was not for someone to create a DocDB.Database using only SQL. As a result, there is no way to create a new "Database" using SQL alone. To thoroughly test this, we will use a plain ObjectScript approach. Below is an example of how to define a class that extends %DocDB.Document:

Class User.docdb Extends %DocDB.Document [ DdlAllowed ]
{

}

Checking the newly created table using SQL shows that it is functional.

Time to give it a first try and insert some data

We can insert any data without validation, meaning there are no restrictions on what can be inserted into %Doc. Implementing validation would be beneficial.
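For instance, all of the following inserts are accepted, including values that are not JSON at all (the table name comes from the SqlNameQualified value shown later; the statements themselves are illustrative):

```sql
INSERT INTO SQLUser.docdb (%Doc) VALUES ('{"name":"test", "some_value":12345}')
INSERT INTO SQLUser.docdb (%Doc) VALUES ('[1,2,3]')
INSERT INTO SQLUser.docdb (%Doc) VALUES ('test')
```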

Extracting Values from a Document

%DocDB.Database allows properties to be extracted from documents, making them available as dedicated columns. This also enables indexing on these properties.

First, we need to get a reference to the database:

USER>set docdb=##class(%DocDB.Database).%GetDatabase("User.docdb")

<THROW>%GetDatabase+5^%DocDB.Database.1 *%Exception.StatusException ERROR #25351: DocDB Database 'User.docdb' does not exist.

USER 2e1>w $SYSTEM.DocDB.Exists("User.docdb")
0

Hmm, the database "does not exist"; okay, let's create it then:

USER>set docdb=##class(%DocDB.Database).%CreateDatabase("User.docdb")

<THROW>%CreateDatabase+13^%DocDB.Database.1 *%Exception.StatusException ERROR #25070: The generated class name for the database 'User.docdb' conflicts with another class: User.docdb
USER 2e1>

This suggests that a simple class definition is not sufficient. Instead, we must use %DocDB.Database from the beginning, which is inconvenient, especially when using source control.

To resolve this, we delete the existing class and create the database correctly:

USER>do $system.OBJ.Delete("User.docdb")

Deleting class User.docdb
USER>set docdb=##class(%DocDB.Database).%CreateDatabase("User.docdb")

USER>zwrite docdb
docdb=6@%DocDB.Database  ; <OREF,refs=1>
+----------------- general information ---------------
|      oref value: 6
|      class name: %DocDB.Database
|           %%OID: $lb("3","%DocDB.Database")
| reference count: 1
+----------------- attribute values ------------------
|       %Concurrency = 1  <Set>
|          ClassName = "User.docdb"
|       DocumentType = ""
|               Name = "User.docdb"
|           Resource = ""
|   SqlNameQualified = "SQLUser.docdb"
+-----------------------------------------------------

This time, it works, and previously inserted data remains intact.

Assuming we have a document like this

{"name":"test", "some_value":12345}

Let's extract both of these fields using the %CreateProperty method:

USER>do docdb.%CreateProperty("name","%String","$.name",0)

USER>do docdb.%CreateProperty("someValue","%String","$.some_value",0)

And check the table

Upon checking the table, we see two new columns, but they contain null values. It appears that these properties do not apply retroactively to existing data. If a developer later adds properties and indexes for optimization, the existing data will not automatically reflect these changes.

If we update the row with the same %Doc value (which triggers the compute) and confirm that %Doc is valid JSON, the extracted columns are now populated and we get our values.
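Rewriting the document to itself is enough to trigger the recompute; for example (illustrative SQL):

```sql
UPDATE SQLUser.docdb SET %Doc = %Doc WHERE %DocumentId = 1
```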

Let's have a look at the class now, which is fully created and updated by %DocDB.Database

Class User.docdb Extends %DocDB.Document [ Owner = {irisowner}, ProcedureBlock ]
{

Property name As %String [ SqlComputeCode = { set {*}=$$%EvaluatePathOne^%DocDB.Document({%Doc},"$.name")
}, SqlComputed, SqlComputeOnChange = %Doc ];
Property someValue As %String [ SqlComputeCode = { set {*}=$$%EvaluatePathOne^%DocDB.Document({%Doc},"$.some_value")
}, SqlComputed, SqlComputeOnChange = %Doc ];
Index name On name;
Index someValue On someValue;
}

So, the created properties contain code that extracts the value from %Doc, and indeed they are only populated when %Doc changes. It also created indexes on both fields, which nobody asked for. With many extracted values, global usage will grow just from the number of indexes.

It is possible to update these created properties directly, with no harm to the original %Doc, but their values will then no longer match the document.

 
Insert invalid data

%DocDB.Database has a %GetPropertyDefinition method:

USER>zw docdb.%GetPropertyDefinition("someValue")

{"Name":"someValue","Type":"%Library.String"}  ; <DYNAMIC OBJECT> 
USER>zw docdb.%GetPropertyDefinition("name")

{"Name":"name","Type":"%Library.String"}  ; <DYNAMIC OBJECT>

The path that was passed to %CreateProperty is gone, so there is no way to verify it. If the path turns out to be incorrect, updating it requires calling %DropProperty first and then %CreateProperty again.

%FindDocuments

%DocDB.Database offers a way to search through the documents

To find one or more documents in a database and return the document(s) as JSON, invoke the %FindDocuments() method. This method takes any combination of three optional positional predicates: a restriction array, a projection array, and a limit key:value pair.

Most importantly, %FindDocuments does not look at %Doc itself; it only works on the extracted properties. It is also quite fragile, throwing exceptions on anything that does not match its expectations. In fact, it just constructs an SQL query and executes it.

USER>do docdb.%FindDocuments(["firstName","B","%STARTSWITH"]).%ToJSON() 

<THROW>%FindDocuments+37^%DocDB.Database.1 *%Exception.StatusException ERROR #25541: DocDB Property 'firstName' does not exist in 'User.docdb'

USER>do docdb.%FindDocuments(["name","test","in"],["name"]).%ToJSON()

{"sqlcode":100,"message":null,"content":[{"name":"test"}]}
USER>do docdb.%FindDocuments(["name","","in"],["name"]).%ToJSON() 

<THROW>%FindDocuments+37^%DocDB.Database.1 *%Exception.SQL -12 -12   A term expected, beginning with either of:  identifier, constant, aggregate, $$, (, :, +, -, %ALPHAUP, %EXACT, %MVR %SQLSTRING, %SQLUPPER, %STRING, %TRUNCATE, or %UPPER^ SELECT name FROM SQLUser . docdb WHERE name IN ( )

USER>do docdb.%FindDocuments(["name","test","="]).%ToJSON()

{"sqlcode":100,"message":null,"content":[{"%Doc":"{\"name\":\"test\", \"some_value\":12345}","%DocumentId":"1","%LastModified":"2025-02-05 12:25:02.405"}]}
USER 2e1>do docdb.%FindDocuments(["Name","test","="]).%ToJSON() 

<THROW>%FindDocuments+37^%DocDB.Database.1 *%Exception.StatusException ERROR #25541: DocDB Property 'Name' does not exist in 'User.docdb'

USER>do docdb.%FindDocuments(["%Doc","JSON","IS"]).%ToJSON() 

<THROW>%FindDocuments+37^%DocDB.Database.1 *%Exception.StatusException ERROR #25540: DocDB Comparison operator is not valid: 'IS'
USER 2e1>do docdb.%FindDocuments(["%Doc","","IS JSON"]).%ToJSON() 

<THROW>%FindDocuments+37^%DocDB.Database.1 *%Exception.StatusException ERROR #25540: DocDB Comparison operator is not valid: 'IS JSON'

Plain SQL would be much more reliable.

Storage

Another interesting question is how efficiently JSON is stored in the database.

^poCN.bvx3.1(1)=$lb("","2025-02-05 12:25:02.405","test",12345)
^poCN.bvx3.1(1,"%Doc")="{""name"":""test"", ""some_value"":12345}"
^poCN.bvx3.1(2)=$lb("","2025-02-05 12:25:02.405")
^poCN.bvx3.1(2,"%Doc")="[1,2,3]"
^poCN.bvx3.1(3)=$lb("","2025-02-05 12:01:18.542")
^poCN.bvx3.1(3,"%Doc")="test"
^poCN.bvx3.1(4)=$lb("","2025-02-05 12:01:19.445")
^poCN.bvx3.1(4,"%Doc")=$c(0)
^poCN.bvx3.1(5)=$lb("","2025-02-05 12:01:20.794")

JSON is stored as plain text, while other databases use binary formats for more efficient storage and searching. IRIS's DocDB does not support direct searching within document content unless JSON_TABLE is used, which still requires parsing JSON into an internal binary format.
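For completeness, a hedged sketch of what querying into the documents with JSON_TABLE could look like (availability and exact syntax depend on the IRIS version; treat this as illustrative rather than tested):

```sql
SELECT jt.name, jt.some_value
FROM SQLUser.docdb,
     JSON_TABLE(%Doc, '$' COLUMNS (
       name VARCHAR(50) PATH '$.name',
       some_value INTEGER PATH '$.some_value'
     )) AS jt
```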

In version 2025.1, %DynamicAbstractObject introduces %ToPVA and %FromPVA methods, which seem to store JSON in a binary format.

USER>do ({"name":"value"}).%ToPVA($name(^JSON.Data(1))) 

USER>zw ^JSON.Data
^JSON.Data(1,0,0)="PVA1"_$c(134,0,6,0,2,0,0,0,0,0,14,0,15,0,2,0,21,9,6,136,0,1,6,0,1,0,2,1,137,0,1,5,8,1,6)_"value"_$c(6,0,6)_"name"_$c(5) 

USER>zw {}.%FromPVA($name(^JSON.Data(1)))

{"name":"value"}  ; <DYNAMIC OBJECT,refs=1>

However, there are inconsistencies when handling certain structures.

USER>do ({}).%ToPVA($name(^JSON.Data(1)))

<SYSTEM>%ToPVA+1^%Library.DynamicAbstractObject.1

USER>do ({"name":{}}).%ToPVA($name(^JSON.Data(1)))

<SYSTEM>%ToPVA+1^%Library.DynamicAbstractObject.1

Conclusion

As of now, %DocDB is only practical within ObjectScript and has limitations in SQL. Performance concerns arise when dealing with large datasets. Everything %DocDB offers can be achieved using plain SQL while maintaining full SQL support. Given the current implementation, there is little incentive to use DocDB over standard SQL approaches.
