Julius Kavay · Jul 13, 2021 go to post

Good Morning Vietnam... ach, I meant Good Morning Julius!

After 1977 (the year I first met MUMPS), now is the time to learn M the right way and in its entirety!

OK, the truth is, I have never used either the call or the expression CodeMode, hence there was no need to check how parameter passing works... angry

Julius Kavay · Jul 13, 2021 go to post

Nice solution, but just one question: how does your S routine get the parameter <t>?

Is there some trick I don't know about? I would have written it this way

ClassMethod ToNato(t) [ CodeMode = call ]
{
^S(t)
}

but then the body makes 5 chars

Julius Kavay · Jul 13, 2021 go to post

Oh, believe me, I can top even myself

include macrodefs
ClassMethod ToNato(t  = "If, you can read?" ) [ CodeMode = expression ]
{
$$$S
}

ClassMethod s(t)
{
// whatever solution you have, put it there.
}

macrodefs.inc
#define S ..s(t)

Are four chars short enough?  

Today, I'm just cheeky and devilish

Julius Kavay · Jul 12, 2021 go to post

The possibilities to get a corrupted file are:

- you do not read the (Base64) encoded file in chunks of N*4 (where N is an integer)
- your stream wasn't rewound before you started reading
- your (incoming) stream is (already) corrupted (but this would in some cases trigger an error in Base64Decode() method). 

Just for a test, try this

// str is your incoming Base64 stream

set filename = "test.tiff"
open filename:"nwu":1
if '$test write "Can't open",! quit

do str.Rewind()
use filename
while 'str.AtEnd { write $system.Encryption.Base64Decode(str.Read(32000)) } // 32000 = 4 * 8000
close filename

If the incoming stream is not corrupted, the newly created TIFF file should be readable

Julius Kavay · Jul 12, 2021 go to post

I bet this one, with only 6 chars, is shorter

ClassMethod ToNato(t  = "If, you can read?" ) [ CodeMode = expression ]
{
..s(t)
}

ClassMethod s(t)
{
// whatever solution you have, put it there.
}
Julius Kavay · Jun 21, 2021 go to post

I have no idea what the date format of PID 7.1 is, but I'm sure you can convert this date to $h format, so the answer to your question is

set age = $h - $zdh(PID7.1Date,?) \ 365.25

now <age> contains the patient's age in full years
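HL7 dates of birth usually arrive as YYYYMMDD; under that assumption (check your feed!), $zdateh with dformat 8 does the conversion:

```objectscript
 // assumption: PID-7.1 comes as YYYYMMDD, e.g. "19570118" (sample value only)
 set dob = "19570118"
 set age = $h - $zdateh(dob, 8) \ 365.25   // M evaluates strictly left to right
 write age                                 // the age in full years
```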

Julius Kavay · Jun 10, 2021 go to post

Can you please give us an example of each of those "variantes" (I mean, those JSON strings)?
Something like:

{"sent":"2021-06-10 09:00:00", "received":"2021-06-10 09:05:00", variante1... }
{"sent":"2021-06-10 09:00:00", "received":"2021-06-10 09:05:00", variante2... }

Thank you.

Julius Kavay · Jun 9, 2021 go to post

If it helped you to understand how things work, then everything is OK. Have a nice day.

Julius Kavay · Jun 8, 2021 go to post

Somehow I don't get you right. To save obj.%Size() in a variable, just do a simple assignment

set myVariable = obj.%Size()

but I'm pretty sure this was not your intended question.

I suppose, you have JSON formatted data (a string or a stream) and you want to store those data in a table. Am I right?

If yes, then follow the next steps:

1) create a class which describes your JSON objects (strings)

Class DC.SehindeRaji Extends (%Persistent, %JSON.Adaptor)
{
Property byr As %String(%JSONFIELDNAME = "byr:");
Property iyr As %String(%JSONFIELDNAME = "iyr:");
Property eyr As %String(%JSONFIELDNAME = "eyr:");
// do the same for all other fields

ClassMethod Import(data)
{
    set obj=..%New()                    // create a new DC.SehindeRaji object
    set sts=obj.%JSONImport(data,"")    // import the (JSON) data
    
    if sts {
        set sts = obj.%Save()
        if sts {
            write "Saved, ID=",obj.%Id(),!
            quit 1
            
        } else {
            write "Not saved, Err=",$system.Status.GetOneErrorText(sts),!
            quit 0
        }
        
    } else {
        write "Can't import: ",$system.Status.GetOneErrorText(sts),!
        quit 0
    }
}
}

2) You can create some test data (interactively) in a terminal session

set dynObj = {"byr:":"1937", "iyr:":"2017", "eyr:":"2020"}
set data = dynObj.%ToJSON()

or get your data somehow from an input (possibly from a file); the only important thing is that your data should look like this

write data  -->  {"byr:":"1937","iyr:":"2017","eyr:":"2020"}

3) import those data

write ##class(DC.SehindeRaji).Import(data) --> Saved, ID=1

4) Now open the saved data and check the result

set oref =  ##class(DC.SehindeRaji).%OpenId(1)

write oref.byr  --> 1937
write oref.iyr  --> 2017

write oref.%JSONExportToString(.exported,"") --> 1
write exported  --> {"byr:":"1937","iyr:":"2017","eyr:":"2020"}

zw ^DC.SehindeRajiD
^DC.SehindeRajiD=1
^DC.SehindeRajiD(1)=$lb("","1937","2017","2020")

I hope this is what you want to do...

Julius Kavay · Jun 8, 2021 go to post

The facts:
1) According to the error message: "The system cannot find the file specified."
2) Furthermore, the error message shows slashes and backslashes; mixing is rarely good. Windows uses "\", Unix "/"

What to do is:
1) check the filename you want to send (including the path)
2) check the existence of the file
3) Under which user account is IRIS/Cache running?
4) May this user read the file?

Julius Kavay · Jun 8, 2021 go to post

It's not clear to me what you want to do.

A property like

Property MyData as %(Global-or-File)Stream;

means, the size of MyData can be something between 0 and the free space on your (hard) drive.
That's the reason why MyData is defined as a stream and not as a %String.

On the other hand, an Excel cell can hold no more than 32767 characters, hence the plan to extract those data to a spreadsheet will work only if the MyData properties do not exceed 32767 chars, see
https://support.microsoft.com/en-us/office/excel-specifications-and-lim…

Nevertheless, you could use the following stored procedure to extract the first 32767 chars from those stream data:

Class Your.Table Extends %Persistent
{
Property StreamData As %GlobalCharacterStream;
// other properties
ClassMethod StreamDataAsText(ID) As %String [ SqlProc ]
{
    set obj = ..%OpenId(ID,0), text = ""
    if obj { do obj.StreamData.Rewind() set text = obj.StreamData.Read(32767) }
    quit text
}
}

Now you can get, besides the other data, the first 32767 chars of those stream data too

select Your.Table_StreamDataAsText(ID), * from Your.Table
Julius Kavay · Jun 2, 2021 go to post

If you can call a JavaScript function, then you could do something like this...

<html>
<head><title>Test</title>
<link id="fav" rel="icon" href="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAIAAACQkWg2AAAABnRSTlMAAAAAAABupgeRAAAACXBIWXMAAA7EAAAOxAGVKw4bAAAAF0lEQVQokWP8z0AaYCJR/aiGUQ1DSAMAQC4BH5CRCM8AAAAASUVORK5CYII=">

<script>
    function changeFavicon() {
        var lid=document.getElementById("fav");
        if (lid) {
            lid.setAttribute("href","data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAIAAACQkWg2AAAABnRSTlMAAAAAAABupgeRAAAACXBIWXMAAA7EAAAOxAGVKw4bAAAAGElEQVQokWNk+M9AEmAiTfmohlENQ0kDAD8vAR+xLJsiAAAAAElFTkSuQmCC");
        }
    }
</script>
</head>
<body>
<button onclick="changeFavicon();">Change my favicon</button><br>
</body>
</html>

The (red and green) icons are just a demo example.

Julius Kavay · May 31, 2021 go to post

 Oh yes, the idea ... I still have to think about that and get some sleep
 

Julius Kavay · May 27, 2021 go to post

I can't provide you with .NET help, but there are methods in IRIS/Cache to create QR codes:

##class(%SYS.QRCode).GenerateFile(...) and
##class(%SYS.QRCode).GenerateImage(...)

so your developers could use them directly instead of messing with passing data back and forth between IRIS/Cache and .NET

Julius Kavay · May 27, 2021 go to post

The correct timestamp format is YYYY-MM-DD HH:MM:SS, but according to the error message, your data does not meet this format.

104    Field validation failed in INSERT, or value failed ...MyTimeStampField' (value '2021-05-26 11:45:40 ') 

Do you see the space (or tab) character after the seconds?
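A possible quick fix before the INSERT is to strip that stray trailing whitespace with $zstrip (just a sketch; the variable name is made up):

```objectscript
 set ts = "2021-05-26 11:45:40 "   // note the trailing space
 set ts = $zstrip(ts, ">W")        // ">W" removes trailing whitespace only
 write "[", ts, "]"                // --> [2021-05-26 11:45:40]
```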

Julius Kavay · May 25, 2021 go to post

If I got you correctly... for IRIS (and newer Cache Versions) you can use

select * from  INFORMATION_SCHEMA.ROUTINES where ROUTINE_NAME='...'

and for older Cache versions try

select * from %Dictionary.CompiledMethod where SqlProc=1 and Name='...'

(but be patient, this takes some time)

Julius Kavay · May 21, 2021 go to post

You have a string of digits... like

set result="12345678900987654321"

then you can easily extract groups of four digits as

for i=1:4:$length(result) write $extract(result,i,i+3),!

this gives you

1234
5678
9009
8765
4321

assuming there are no other characters between those numbers...

Julius Kavay · May 21, 2021 go to post

[1,2,,3] is just as arguable as [1,2,3,,] or [1,2,3,,,,,], and IRIS/Cache accepts all of them.

Nothing against a system which is input tolerant (forgiving, in your words), but then this tolerance should be obvious and in some way logical. An example: I tolerate trailing comma(s) because they could be leftovers of editing. So I would say, all the arrays

[1,2,3]
[1,2,3,]
[1,2,3,,,]

have a size of 3 - there are three elements, no more. But IRIS/Cache says the sizes are 3, 4 and 6. So let's check the last one

set obj=[].%FromJSON("[1,2,3,,,]")
write obj.%Size() --> 6
for i=0:1:9 write i,?3,obj.%Get(i),?7,obj.%GetTypeOf(i),!

The output of the FOR-Loop is:

0  1   number
1  2   number
2  3   number
3      unassigned
4      unassigned
5      unassigned
6      unassigned
7      unassigned
8      unassigned
9      unassigned


The elements with index 3, 4 and 5 are unassigned, and in some way I can understand that. But if the higher indices, like 6, 7, 88 or 1000 etc., are also unassigned, then I ask you: why is the size 6 and not, say, 12 or 573?
For me the logical size should be 3, because there are three intended elements; the others are a result of tolerated delimiters!
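If you want that "logical" size, you can count the assigned elements yourself via %GetTypeOf() (AssignedSize is just a made-up helper name):

```objectscript
ClassMethod AssignedSize(arr As %DynamicArray) As %Integer
{
    // count only the elements whose type is not "unassigned"
    set n = 0
    for i = 0:1:arr.%Size()-1 if arr.%GetTypeOf(i)'="unassigned" set n = n + 1
    quit n
}
```

With the array above, write ..AssignedSize(obj) returns 3.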

Finally, I don't want to start a war about how to interpret JSON strings. It was just my two cents on a topic which, in my opinion, is out of round.

Julius Kavay · May 20, 2021 go to post

OK, take a simpler case:

set obj=[1,2,,3]  // again, this gives a <SYNTAX> error
set obj=[].%FromJSON("[1,2,,3]") // this is OK

but in both cases, the problem was the bouncing comma-key on my keyboard.

The first was flagged by the compiler, the second was "forgiven" by the JSON reader! BUT the real question is: WHAT IS the third item in the above array? The latter shows obj has a size of 4, so the desired third element could be null, 3 or maybe something else!

I wrote my very first program somewhere in 1971 or 1972, I can't remember anymore. But one thing I have learned is: one should accept checked data only.

Imagine you accept those incorrect (aka forgiven) data and, besides processing them, store the data in your database; then later, for whatever reason, you send the (original string) data to an external party... bang! They can't read it, because it's not JSON conformant.

Julius Kavay · May 20, 2021 go to post

For the sake of completeness, there is one more validator: https://jsonlint.com/ (which shows the above settings.json file as incorrect).

One more problem: it seems IRIS (and Cache) speaks with a forked tongue (but this has nothing to do with the above problem):

set string = "{""Value"":123, }"    // note the extra comma!
set stream=##class(%Stream.TmpCharacter).%New()
do stream.WriteLine(string)
set obj1={"Value":123, }  --> gives a <SYNTAX> error
set obj2={}.%FromJSON(string) --> accepts the incorrect (json)string!
Julius Kavay · May 19, 2021 go to post

I'm not sure... but I think your settings.json does NOT conform to the JSON specification. It seems you like extra commas, but JSON does not like them. Take a look at:

...
    "active":true , <----extra comma
  }, <----------------------- extra comma
}

In IRIS/Cache this produces a <SYNTAX> error (which is why the error starts with "<")

Julius Kavay · May 14, 2021 go to post

A long time ago I did some connections to external databases (MySQL and PostgreSQL).
The essential parts of such a connection are:

1) First, you have to create in your OS the corresponding ODBC Data Source entries
   (System-DSN) after installing the required DB-Driver

2) The connection

    set gtwConn=##class(%SQLGatewayConnection).%New(), gtwHandle=0
    
    if gtwConn.Connect(OdbcName, OdbcUser, OdbcPass) {
        if gtwConn.AllocateStatement(.gtwHandle) {
            // check gtwConn.GatewayStatus
            // etc.
        } else { write "Can't Allocate: "_OdbcName }
    } else { write "Can't connect to "_OdbcName }

3) SQL-Commands

    do gtwConn.CloseCursor(gtwHandle)
    if gtwConn.PrepareW(gtwHandle, sqlStatement) {
        if gtwConn.Execute(gtwHandle) {
           ...
           ...
        } else { /* check gtwConn.GatewayStatus */ }
    } else { /* check.gtwConn.GatewayStatus */ }

   
4) Finish

    if gtwConn {
        do gtwConn.DropStatement(gtwHandle), gtwConn.Disconnect()
        set gtwConn="", gtwHandle=""
    }
Julius Kavay · May 12, 2021 go to post

There is a keyword %NOINDEX indexname1, indexname2, ... to prevent the SQL engine from using specific indices, but sadly there is no keyword for the opposite, something like %USEINDEX indexname.

Maybe someone with more SQL experience knows what prevents the SQL engine from using the existing index on the numDossiersMER property...

But, and this is the great thing with IRIS and Cache, if everything else fails, you can always create your custom query.

Class User.TestList.Data.Titre Extends (%Persistent, %Populate)
{
Property numTitre As %Integer;
Property millesime As %Integer;
Property codeProduit As %String;
/// Old field which will be replaced by the next one
Property numDossierMER As %Integer;
Property numDossiersMER As list Of %Integer;
Index titreIdx On (numTitre, millesime, codeProduit) [ PrimaryKey ];
/// Old index
Index numDossierMERIdx On numDossierMER;
Index numDossiersMERIdx On numDossiersMER(ELEMENTS);

Query Select(num...) As %Query(CONTAINID = 1, ROWSPEC = "ID:%Integer,Dossier:%Integer,codeProd:%String") [ SqlProc ]
{
}

ClassMethod SelectExecute(par As %Binary, num...) As %Status
{
   kill par, ^||tmpSelectQry
   for i=1:1:$g(num) set nr=$g(num(i)) merge:nr]"" ^||tmpSelectQry(nr)=^User.TestList.Data.TitreI("numDossiersMERIdx",nr)
   set par=$na(^||tmpSelectQry)
   quit $$$OK
}

ClassMethod SelectFetch(par As %Binary, row As %List, end As %Integer) As %Status
{
   set par=$query(@par)
   if par="" { set end=1, row="" }
   else {
      set end=0, id=$qs(par,2)
      set row=$lb(id, $qs(par,1), ..codeProduitGetStored(id)) // and other fields...
   }
   quit $$$OK
}

ClassMethod SelectClose(par As %Binary) As %Status
{
   kill par, ^||tmpSelectQry
   quit $$$OK
}

ClassMethod Test()
{
   write "Using a ResultSet...",!
   set rs=##class(%ResultSet).%New("User.TestList.Data.Titre:Select")
   if rs.Execute(230,3590,40110,507550,6094,70071,820096,9380148,8,592) {
   set t=$zh
   while rs.Next() { write rs.Data("ID"),?10,rs.Data("Dossier"),?30,rs.Data("codeProd"),! }
   }
   write "Time: ",$zh-t*1E3,!!
   write "Direct usage of the query methods...",!
   do ..SelectExecute(.par,230,3590,40110,507550,6094,70071,820096,9380148,8,592)
   set t=$zh
   for  do ..SelectFetch(.par,.row,.end) quit:end  zwrite row
   write "Time: ",$zh-t*1E3,!
}

Storage Default
{
<Data name="TitreDefaultData">
<Value name="1">
<Value>%%CLASSNAME</Value>
</Value>
<Value name="2">
<Value>numTitre</Value>
</Value>
<Value name="3">
<Value>millesime</Value>
</Value>
<Value name="4">
<Value>codeProduit</Value>
</Value>
<Value name="5">
<Value>numDossierMER</Value>
</Value>
</Data>
<Data name="numDossiersMER">
<Attribute>numDossiersMER</Attribute>
<Structure>subnode</Structure>
<Subscript>"numDossiersMER"</Subscript>
</Data>
<DataLocation>^User.TestList.Data.TitreD</DataLocation>
<DefaultData>TitreDefaultData</DefaultData>
<ExtentSize>1000000</ExtentSize>
<IdLocation>^User.TestList.Data.TitreD</IdLocation>
<IndexLocation>^User.TestList.Data.TitreI</IndexLocation>
<Property name="%%CLASSNAME">
<AverageFieldSize>1</AverageFieldSize>
<Selectivity>100.0000%</Selectivity>
</Property>
<Property name="%%ID">
<AverageFieldSize>5.88</AverageFieldSize>
<Selectivity>1</Selectivity>
</Property>
<Property name="codeProduit">
<AverageFieldSize>4.89</AverageFieldSize>
<Selectivity>0.0004%</Selectivity>
</Property>
<Property name="millesime">
<AverageFieldSize>8.89</AverageFieldSize>
<Selectivity>0.0001%</Selectivity>
</Property>
<Property name="numDossierMER">
<AverageFieldSize>8.89</AverageFieldSize>
<Selectivity>0.0001%</Selectivity>
</Property>
<Property name="numTitre">
<AverageFieldSize>8.89</AverageFieldSize>
<Selectivity>0.0001%</Selectivity>
</Property>
<SQLMap name="IDKEY">
<BlockCount>-63088</BlockCount>
</SQLMap>
<SQLMap name="numDossierMERIdx">
<BlockCount>-7912</BlockCount>
</SQLMap>
<SQLMap name="titreIdx">
<BlockCount>-19940</BlockCount>
</SQLMap>
<StreamLocation>^User.TestList.Data.TitreS</StreamLocation>
<Type>%Storage.Persistent</Type>
} }

Some examples after do ##class(..).Populate(1E6)

USER>d ##class(User.TestList.Data.Titre).Test()
Using a ResultSet...
700556    8                   R7369
696384    230                 R6776
952257    592                 E8624
209184    3590                Q7863
239874    6094                N7969
497500    40110               W6490
188796    70071               O9708
145090    507550              S3705
803994    820096              S20
97986     9380148             W6598
Time: .787

Direct usage of the query methods...
row=$lb("700556","8","R7369")
row=$lb("696384","230","R6776")
row=$lb("952257","592","E8624")
row=$lb("209184","3590","Q7863")
row=$lb("239874","6094","N7969")
row=$lb("497500","40110","W6490")
row=$lb("188796","70071","O9708")
row=$lb("145090","507550","S3705")
row=$lb("803994","820096","S20")
row=$lb("97986","9380148","W6598")
Time: .894

Julius Kavay · May 7, 2021 go to post

There are two solutions: either you use the property numDossiersMER as an array instead of a list, as suggested by David Hockenbroch, or, in case existing applications use list methods like Insert() and FOR loops to access list elements, you can change this property to a kind of list-table property (see below).

Either of the above gives you the possibility to use queries like:

select Titre->ID, Titre->numTitre, Titre->millesime, Titre->codeProduit, Titre->numDossierMer, numDossiersMER
from User_TestList_Data.Titre_numDossiersMER
where numDossiersMER in (123, 234, 345)

The following guidance is based on the fact that Cache/IRIS uses the so called "schema evolution" in class storage, see also:  https://docs.intersystems.com/latest/csp/docbook/Doc.View.cls?KEY=GOBJ_…

I use the term list-table property when in a class definition a property shows up as

Property PropName As list of WhateverDataType;

but the SQL-projection is array-like

Property PropName As array Of WhateverDataType;

The steps to create a list-table property depends on the state of your project:

a) You do not have any data yet (or the data you have can be deleted):

a1) Delete the possibly existing data

a2) Delete the storage definition (Studio-->Inspector-->Storage-->RightClick-->Delete)

a3) Change the property definition to array:

Property numDossiersMER As array of %Integer;

a4) Compile the class

a5) Change the property definition to list:

Property numDossiersMER As list Of %Integer;

a6) Compile the class

Voila, you got a list-table property:

use do obj.MyProp.Insert(data) to add data items
and query the property data as if it were a table: select * from class.name_MyProp

b) You want to keep your data and you want to retain the property name numDossiersMER (because you don't want to change existing applications). Before proceeding, make a backup of your class globals, then:

b1) Rename the existing property and then add it again as a new array property:

from: Property numDossiersMER as list of %Integer
to  : Property OLDnumDossiersMER as list of %Integer

change the property name in the storage definition too

from:  <Value>numDossiersMER</Value>
to  :  <Value>OLDnumDossiersMER</Value>

then add the new property as array

Property numDossiersMER as array of %Integer;

b2) Compile the class

b3) Change the property's collection from array to list

Property numDossiersMER as list of %Integer;

b4) Compile the class

b5) Transfer the list data from the old storage to the new one and optionally delete the old list data

set id=0
for  { set id=$order(^User.TestList.Data.TitreD(id))  quit:'id
    set obj=##class(User.TestList.Data.Titre).%OpenId(id)
    if 'obj { write id,"  ??",!  continue }
    for i=1:1:obj.OLDnumDossiersMER.Count() do obj.numDossiersMER.Insert(obj.OLDnumDossiersMER.GetAt(i))
    // do obj.OLDnumDossiersMER.Clear()
    do obj.%Save()
}

or you use an SQL statement instead of $order(...)

b6) Rebuild the indexes.

c) You want to keep your data and you want to have a new property name too. Again, before proceeding, make a backup of your class globals, then:

c1) Add the new property as an array    

Property numNewDossiersMER As array Of %Integer;

c2) Compile the class

c3) Change the new property collection from array to list    

Property numNewDossiersMER As list Of %Integer;

c4) Compile the class

c5) Transfer the list data from numDossiersMER to numNewDossiersMER according to b5)

It's IMPORTANT to follow the above steps in the given sequence!

Just to keep things complete, the other way around (array items stored as list items) is also possible. You just have to swap the definition sequence: define as list, compile, redefine as array, compile.

Julius Kavay · May 6, 2021 go to post

Both possible structures are considered. Here, I use the  examples from my previous posting:

set obj=##class(DC.Rick.MemberData).%OpenId(1)
do obj.%JSONExport() --> {"members":[{"dob":"1990-07-18","firstName":"Bob","memberId":123956}]}
set obj=##class(DC.Rick.MemberData).%OpenId(2)
do obj.%JSONExport() --> {}

The second example outputs {} only and not {"members":null}; I don't know why. Maybe there is a parameter which controls this behavior, please ask WRC.

From the view of data value, you can consider {} and {"members":null} as equal.

write {"members":null}.%GetTypeOf("members") --> null
write {}.%GetTypeOf("members") ----------------> unassigned

Both representations mean the members property has no value. But, yes, you can philosophize about it...

Julius Kavay · May 5, 2021 go to post

I assume (according to the error message you show) that you are trying to import some JSON-formatted data into an IRIS class. In addition, I recommend reading https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cl…

To achieve this, you must define two IRIS classes:

Class DC.Rick.MemberData Extends (%Persistent, %JSON.Adaptor)
{
Property members As list Of Member;
}
Class DC.Rick.Member Extends (%SerialObject, %JSON.Adaptor)
{
Property dob As %Date;
Property firstName As %String;
Property middleName As %String;
Property nameSuffix As %String;
Property genderCode As %String;
Property lastName As %String;
Property memberId As %Integer;
Property relationship As %String;
}

Furthermore, I assume you have data like this (I shortened your example to keep things simple):

set memb0={"dob":"1990-07-18", "firstName":"Bob", "memberId":123956}
set memb1={"dob":"1990-05-25", "firstName":"Bill", "memberId":12345}
set memb2={"dob":"1990-10-30", "firstName":"Tommy", "memberId":4567}
set data(1)={"members":[(memb0)]}.%ToJSON()         // one member
set data(2)={"members":null}.%ToJSON()              // no member at all
set data(3)={"members":[(memb1),(memb2)]}.%ToJSON() // two members

check the examples:

for i=1:1:3 write data(i),!

the output should be:

{"members":[{"dob":"1990-07-18","firstName":"Bob","memberId":123956}]}
{"members":null}
{"members":[{"dob":"1990-05-25","firstName":"Bill","memberId":12345},{"dob":"1990-10-30","firstName":"Tommy","memberId":4567}]}

now import those data

for i=1:1:3 {
   set oref=##class(DC.Rick.MemberData).%New()
   if oref.%JSONImport(data(i)), oref.%Save() { write "OK",! } else { write "ERR",! }
}

If everything goes well, you should get three "OK"s and your data global looks like this

zwrite ^DC.Rick.MemberDataD
^DC.Rick.MemberDataD=3
^DC.Rick.MemberDataD(1)=$lb("",$lb($lb($lb(54620,"Bob","","","","",123956,""))))
^DC.Rick.MemberDataD(2)=$lb("","")
^DC.Rick.MemberDataD(3)=$lb("",$lb($lb($lb(54566,"Bill","","","","",12345,"")),$lb($lb(54724,"Tommy","","","","",4567,""))))

check member sizes:

for i=1:1:3 set oref=##class(DC.Rick.MemberData).%OpenId(i) write oref.members.Size,!

and the output should be:

1
0
2

I hope this is a good starting point for you...