There are two solutions: either you use the property numDossiersMER as an array instead of a list, as suggested by David Hockenbroch, or, if existing applications use list methods like Insert() and FOR loops to access list elements, you can change this property to a kind of list-table property (see below).

Either of the above lets you use queries like:

select Titre->ID, Titre->numTitre, Titre->millesime, Titre->codeProduit, Titre->numDossierMer, numDossiersMER
from User_TestList_Data.Titre_numDossiersMER
where numDossiersMER in (123, 234, 345)

The following guidance is based on the fact that Cache/IRIS uses so-called "schema evolution" in class storage; see also: https://docs.intersystems.com/latest/csp/docbook/Doc.View.cls?KEY=GOBJ_d...

I speak of a list-table property if, in a class definition, a property shows up as

Property PropName As list of WhateverDataType;

but the SQL-projection is array-like

Property PropName As array Of WhateverDataType;
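
The difference also shows up in the data global (a sketch with made-up names; default storage assumed): a plain list property is stored inline in the object's $lb() data node, while an array (and hence a list-table property) gets its own subscript level:

^Package.ClassD(id)=$lb(...,$lb(item1,item2))   // plain list: stored inline
^Package.ClassD(id,"PropName",key)=item         // array / list-table: own subnodes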

The steps to create a list-table property depend on the state of your project:

a) You do not yet have any data (or the data you have can be deleted):

a1) Delete any existing data

a2) Delete the storage definition (Studio-->Inspector-->Storage-->RightClick-->Delete)

a3) Change the property definition to array:

Property numDossiersMER As array of %Integer;

a4) Compile the class

a5) Change the property definition to list:

Property numDossiersMER As list Of %Integer;

a6) Compile the class

Voilà, you have a list-table property:

use do obj.MyProp.Insert(data) to add data items
and query the property data as if it were a table: select * from class.name_MyProp
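
A quick smoke test (a sketch, assuming the Titre class from your question and no other required properties):

set obj=##class(User.TestList.Data.Titre).%New()
do obj.numDossiersMER.Insert(123)
do obj.numDossiersMER.Insert(234)
write obj.%Save()   // --> 1

and then, via SQL:

select * from User_TestList_Data.Titre_numDossiersMER where numDossiersMER = 234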

b) You want to keep your data and you want to retain the property name numDossiersMER (because you don't want to change existing applications). Before proceeding, make a backup of your class globals, then:

b1) Rename the existing property and then add it again as a new array property:

from: Property numDossiersMER as list of %Integer
to  : Property OLDnumDossiersMER as list of %Integer

change the property name in the storage definition too

from:  <Value>numDossiersMER</Value>
to  :  <Value>OLDnumDossiersMER</Value>

then add the new property as array

Property numDossiersMER as array of %Integer;

b2) Compile the class

b3) Change the property's collection from array to list

Property numDossiersMER as list of %Integer;

b4) Compile the class

b5) Transfer the list data from old storage to the new and potentially delete the old list data

set id=0
for  {
    set id=$order(^User.TestList.Data.TitreD(id)) quit:'id
    set obj=##class(User.TestList.Data.Titre).%OpenId(id)
    if '$isobject(obj) { write id,"  ??",! continue }
    for i=1:1:obj.OLDnumDossiersMER.Count() { do obj.numDossiersMER.Insert(obj.OLDnumDossiersMER.GetAt(i)) }
    // uncomment the next line to delete the old list data
    // do obj.OLDnumDossiersMER.Clear()
    set sc=obj.%Save()
    if 'sc { write "save failed for ID ",id,! }
}

or you use an SQL statement instead of $order(...)
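
For example (a sketch, untested):

set rs=##class(%SQL.Statement).%ExecDirect(,"select ID from User_TestList_Data.Titre")
while rs.%Next() {
    set obj=##class(User.TestList.Data.Titre).%OpenId(rs.ID)
    // ... transfer the list items exactly as in the $order() loop above ...
}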

b6) Rebuild the indexes.

c) You want to keep your data and you want to have a new property name too. Again, before proceeding, make a backup of your class globals, then:

c1) Add the new property as an array    

Property numNewDossiersMER As array Of %Integer;

c2) Compile the class

c3) Change the new property collection from array to list    

Property numNewDossiersMER As list Of %Integer;

c4) Compile the class

c5) Transfer the list data from numDossiersMER to numNewDossiersMER according to b5)

It's IMPORTANT to follow the above steps in the given sequence!

Just to keep things complete, the other way around (array items stored as list items) is also possible. You just have to swap the definition sequence: define as list, compile, redefine as array, compile.

Both possible structures are considered. Here, I use the  examples from my previous posting:

set obj=##class(DC.Rick.MemberData).%OpenId(1)
do obj.%JSONExport() --> {"members":[{"dob":"1990-07-18","firstName":"Bob","memberId":123956}]}
set obj=##class(DC.Rick.MemberData).%OpenId(2)
do obj.%JSONExport() --> {}

The second example outputs {} only and not {"members":null}; I don't know why. Maybe there is a parameter which controls this behavior; please ask WRC.

From the point of view of the data value, you can consider {} and {"members":null} as equal.

write {"members":null}.%GetTypeOf("members") --> null
write {}.%GetTypeOf("members") ----------------> unassigned

Both representations mean that the members property has no value. But, yes, you can philosophize about it ...

I assume (according to the error message you show) that you are trying to import some JSON-formatted data into an IRIS class. In addition, I recommend reading https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls...

To achieve this, you must define two IRIS classes:

Class DC.Rick.MemberData Extends (%Persistent, %JSON.Adaptor)
{
Property members As list Of Member;
}

Class DC.Rick.Member Extends (%SerialObject, %JSON.Adaptor)
{
Property dob As %Date;
Property firstName As %String;
Property middleName As %String;
Property nameSuffix As %String;
Property genderCode As %String;
Property lastName As %String;
Property memberId As %Integer;
Property relationship As %String;
}

Furthermore, I assume you have data like this (I shortened your example to keep things simple):

set memb0={"dob":"1990-07-18", "firstName":"Bob", "memberId":123956}
set memb1={"dob":"1990-05-25", "firstName":"Bill", "memberId":12345}
set memb2={"dob":"1990-10-30", "firstName":"Tommy", "memberId":4567}
set data(1)={"members":[(memb0)]}.%ToJSON()         // one member
set data(2)={"members":null}.%ToJSON()              // no member at all
set data(3)={"members":[(memb1),(memb2)]}.%ToJSON() // two members

check the examples:

for i=1:1:3 write data(i),!

the output should be:

{"members":[{"dob":"1990-07-18","firstName":"Bob","memberId":123956}]}
{"members":null}
{"members":[{"dob":"1990-05-25","firstName":"Bill","memberId":12345},{"dob":"1990-10-30","firstName":"Tommy","memberId":4567}]}

now import those data

for i=1:1:3 {
   set oref=##class(DC.Rick.MemberData).%New()
   if oref.%JSONImport(data(i)), oref.%Save() { write "OK",! } else { write "ERR",! }
}

If everything goes well, you should get three "OK"s and your data global looks like this

zwrite ^DC.Rick.MemberDataD
^DC.Rick.MemberDataD=3
^DC.Rick.MemberDataD(1)=$lb("",$lb($lb($lb(54620,"Bob","","","","",123956,""))))
^DC.Rick.MemberDataD(2)=$lb("","")
^DC.Rick.MemberDataD(3)=$lb("",$lb($lb($lb(54566,"Bill","","","","",12345,"")),$lb($lb(54724,"Tommy","","","","",4567,""))))

check member sizes:

for i=1:1:3 set oref=##class(DC.Rick.MemberData).%OpenId(i) write oref.members.Size,!

and the output should be:

1
0
2

I hope this is a good starting point for you...

It depends on...

Who is sitting at the other end? A Cache/IRIS server or a third-party product?

If Cache/IRIS: mirroring and shadowing are the keywords to look for. In the case of a third-party SQL DB: how fast (how often) do you want to do your updates? Once a day or (nearly) realtime?

I did something like that several years ago... the procedure is (just as a starting point):

Our application uses objects, so all the involved classes have an %OnAfterSave() method, something like this

Method %OnAfterSave(insert As %Boolean) As %Status
{
   do ..addToTransfer(..%Id())
   quit $$$OK
}

with some smartness, like not adding a record that is already in the transfer queue, etc. If you use SQL instead of objects, triggers are your friend.
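
For the SQL route, a row trigger can feed the same transfer queue; a minimal sketch (class and method names are made up, following the addToTransfer idea above):

Trigger FeedTransferQueue [ Event = INSERT/UPDATE, Foreach = row/object ]
{
    // {ID} refers to the affected row's ID inside trigger code
    do ##class(My.App.Transfer).addToTransfer({ID})
}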

We also have a task which creates (based on the class definition) a series of INSERT/UPDATE statements and does the transfer with the help of %SQLGatewayConnection.
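
The %SQLGatewayConnection part looks roughly like this (a sketch only; DSN, credentials, and statement text are made up, parameter binding and most error handling omitted):

set gc=##class(%SQLGatewayConnection).%New()
set sc=gc.Connect("TargetDSN","user","password",0)  // ODBC DSN defined on the server
if $$$ISERR(sc) quit sc
set sc=gc.AllocateStatement(.hstmt)
set sc=gc.Prepare(hstmt,"UPDATE member SET name='Bob' WHERE id=42")
set sc=gc.Execute(hstmt)
do gc.Disconnect()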

The simplest solution was already given by Robert Cemper in https://community.intersystems.com/post/how-select-random-row-table. I just want to show a more "universal" variant of that solution.

First, create an SQL stored procedure

Class SP.Utils Extends %RegisteredObject
{
ClassMethod Random(number As %Integer, dummy As %String) As %Integer [SqlProc]
{
   quit $random(number) // we do not use dummy but we need it!!
}
}

then make your query as follows:

select top 10 * from whatever.table where SP.Utils_Random(100,ID)<50

This has the following advantages:

1) it works with all classes, i.e. the ID column does not have to be an integer (greater than 0); it can be a compound column too (like part1||part2, etc.)

2) by adjusting the comparison:

Random(1000,ID)<50   // returns fewer rows, i.e. greater distances between the returned rows

Random(1000,ID)<500  // returns more rows, i.e. smaller distances

For testing edge conditions you can use

Random(1000,ID)<0    // no rows will be returned, or

Random(1000,ID)<1000 // all rows will be returned

With the right side of the comparison you can fine tune the distances between the returned rows.

For the dummy argument in the above function you can use an arbitrary column name; the simplest is to use ID, because the ID column always exists. Its purpose is to force the SQL compiler to call the function for each row (thinking the result of the Random() function is row-dependent). A comparison like Random(100)<50 is executed just once. Robert's solution works too, because he uses Random(100)<ID, but this works only for tables where ID is an integer (>0). You can verify this by issuing a simple query:

select top 10 * from your.table where SP.Utils_Random(100)<50

You will see (by repeatedly executing the above query) either 10 consecutive rows or nothing.

Just my 2-cent suggestion: CACHE.DAT and IRIS.DAT files can (usually) be compressed well; the keywords are winzip and winrar (I prefer winrar over winzip). Winrar, despite the "win" in its name, is also available for Linux.


An example: winzip turned a 16GB CACHE.DAT into 3.45GB; winrar (mode=best) topped this with 2.2GB. But, as always, your values will depend on your data. And mind the time you need to compress and decompress the files, which, of course, will depend on your hardware...

For example (command line)

rar a -m4 -md512 -v4g <pathTo>cachetransfer <pathTo>cache.dat

will create as many compressed files as needed, each (but the last one) with a size of 4GB, with good compression using a dictionary of 512KB.

You will get, in total, roughly 250 (*.rar) files (each with a size of 4GB), assuming the 4TB compresses to 1TB.
When the first 4GB (rar) file is ready, start the transfer in parallel (one job does the compression, the other(s) work(s) on the transfer - maybe you even have multiple internet connections). Further, suppose you have a continuous (internet) connection between your system and the target system at 100 Mbps; then, again roughly, the job is done in 28 hours... better than transferring 4TB in a week or more (and it's easier to restart a 4GB file than a 4TB file).
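
On the target side, extraction starts from the first volume and rar picks up the remaining volumes automatically; for example (assuming default volume naming):

rar x <pathTo>cachetransfer.part1.rar <restorePath>/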

Counterquestion: do you have an example of a 'non-ASCII' char?

Codepoints 0x00-0x7F (0-127) form the "C0 Controls and Basic Latin" block, aka ASCII.

Codepoints 0x80-0xFF (128-255) form the "C1 Controls and Latin-1 Supplement" block, aka Latin-1.

Take a look at https://www.unicode.org/charts/PDF/U0080.pdf

For example, Ä and ä are the German umlaut-A and umlaut-a, respectively:

$ascii("Ä") --> 196 and $ascii("ä") --> 228

Type in a terminal session on your system: write $char(196) --> Ä

Download the above PDF and compare it with your ISO-8859-1 data; there should be no difference.

If you get data as ISO-8859-1 (aka Latin-1) and have a Unicode (IRIS/Cache) installation, then usually you have nothing to do (except to process the data). What do you mean by "convert the text to UTF-8"? In IRIS/Cache you have (and work with) Unicode codepoints; UTF-8 comes into play only when you export your data, but in your case it will rather be ISO-8859-1 - or do I misunderstand something?
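
If you ever really do need UTF-8 (for example, when writing an export file), $zconvert does the conversion; a small sketch:

set str="Äpfel"                       // 5 Unicode characters
set utf8=$zconvert(str,"O","UTF8")    // encode for output: Ä becomes two bytes
write $length(utf8)                   // --> 6
write $zconvert(utf8,"I","UTF8")=str  // decode on input --> 1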

By the way, if you return your data back to your Latin-1 source (as Latin-1), then you have to take some precautions: because you have a Unicode installation, during data processing you could mix your Latin-1 data with true Unicode data from other sources!

See: https://unicode.org/charts/

Also, you may download and read:

https://www.unicode.org/versions/Unicode13.0.0/UnicodeStandard-13.0.pdf

Of course, if you don't want to check each and every Write() for error or success, you can do the check just once at the beginning:

set str=##class(%Stream.FileCharacter).%New()
do str.LinkToFile("/root/my_file.txt")
set sts=str.Write("")
if 'sts { write "We have a problem",! quit }

Writing a null string to the stream does not change the stream, but the file-opening sequence will be executed.

Checking status codes is a good starting point...

set str=##class(%Stream.FileCharacter).%New()
write str  --> 4@%Stream.FileCharacter
write $system.OBJ.DisplayError(str.LinkToFile("/root/my_file.txt")) --> 1
write $system.OBJ.DisplayError(str.WriteLine("line-1"))  --> ERROR #5005: Cannot open file '/root/my_file.txt'1

Your %Save() returns with 1 (OK), because there is nothing to save...

Note: on Linux, a standard user (like me right now) is not allowed to write to the '/root' directory.

I think there is a "small problem" at the start...

zn "%SYS",n=SL ...

will not work as expected ;-))

But what about this line with 552 chars:

zn "%SYS" s n="SL" d ##class(Security.SSLConfigs).Create(n):'##class(Security.SSLConfigs).Exists(n),##class(%Net.URLParser).Parse("https://pm.community.intersystems.com/packages/zpm/latest/installer",.c) s h=##class(%Net.HttpRequest).%New(),h.Server=c("host"),h.Port=443,h.Https=1,h.SSLConfiguration=n,s=h.Get(c("path")) q:'s $System.Status.GetErrorText(s) s x=##class(%File).TempFilename("xml"),f=##class(%Stream.FileBinary).%New(),f.Filename=x d f.CopyFromAndSave(h.HttpResponse.Data) d h.%Close(),$system.OBJ.Load(x,"ck") do ##class(%File).Delete(x)

I only rewrote your line; I haven't tried to execute it.

I think your solution is an "ad hoc" solution for a particular task; anyway, I want to point out two problems.

First, the solution fails if the size of <source> is less than the size of <target>:

set source={"Name":"Joe", "Age":50 }
set target={"Name":"Joe", "Age":50, "Phone":"123-456"}

write CompareJSON(source,target) ---> 1
write CompareJSON(target,source) ---> 0

The same goes for data like:

set source={"Name":"Joe", "Age":50, "Data":[10,20] }
set target={"Name":"Joe", "Age":50, "Data":[10,20,30]}

Maybe your data do not have such cases.

A quick check could be:

if source.%Size()-target.%Size() { quit "Size-problem" }

Second, in a more "general" case, comparing lists can sometimes lead to a philosophical question:
are two lists with the same elements but in a different sequence equal or not?

list1: [aaa, bbb]
list2: [bbb, aaa]

The answer MAY depend on what is stored in those lists.

If the lists contain, for example, some kind of instructions, then the sequence of those instructions will matter; but if those lists are just lists of, say, my hobbies, then the sequence is unimportant (except if one implies some kind of weighting, like the first entry being my preferred hobby).

This implies to me that, to compare two lists, you MAY need some extra info, something like:
dependsOnSequence = yes | no
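
If the sequence doesn't matter, a helper along these lines could be a starting point (just a sketch with a hypothetical method name; it counts each element's JSON serialization, so numbers and numeric strings may collide, and null elements are not distinguished from ""):

ClassMethod ArraysEqualUnordered(a As %DynamicArray, b As %DynamicArray) As %Boolean
{
    if a.%Size()'=b.%Size() quit 0
    // count every element: +1 for a, -1 for b, keyed by its serialization;
    // the "x" prefix avoids an illegal empty local subscript
    set iter=a.%GetIterator()
    while iter.%GetNext(.pos,.val) {
        set k="x"_$select($isobject(val):val.%ToJSON(),1:val)
        set cnt(k)=$get(cnt(k))+1
    }
    set iter=b.%GetIterator()
    while iter.%GetNext(.pos,.val) {
        set k="x"_$select($isobject(val):val.%ToJSON(),1:val)
        set cnt(k)=$get(cnt(k))-1
    }
    // equal if, and only if, all counts balanced out
    set k=""
    for {
        set k=$order(cnt(k)) quit:k=""
        if cnt(k) quit
    }
    quit k=""
}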

One more hint: in the line

s tSC= ..CompareJSON(value,target.%Get(key),RefNo)  

what happens if target.%Get(key) is empty (or not a corresponding object)?
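
A type check in front of the recursion could guard against that; a sketch, using the names from your line (assuming source and target are in scope at that point):

// stop early if the key is missing on the target side ("unassigned") or the types differ
if source.%GetTypeOf(key)'=target.%GetTypeOf(key) {
    // handle the mismatch here instead of calling ..CompareJSON(...)
}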

Just my 2 cents