Question
Robert Cemper · Mar 23, 2022

How to handle a large JSON object?

I have a rather simple JSON object with this structure:

{"id":<someid>,"value":<some string>,"details": 
 [{"id":<someid>,"value":<some string>,"details": 
  [{"id":<someid>,"value":<some string>,"details":  
   [{"id":<someid>,"value"}
   ,{"id":<someid>,"value"} 
   ,{"id":<someid>,"value"} ]
  ,[{"id":<someid>,"value":<some string>,"details":  
     [{"id":<someid>,"value"}
    ,{"id":<someid>,"value"} ] ]
 ,{"id":<someid>,"value":<some string>} ] } 
   

The depth and the number of elements at each depth are variable.
It would be a snap to work through it with %DynamicObject; see the sketch below.
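
For illustration, a recursive walk over such a structure with %DynamicObject might look roughly like this; the class method name and the output format are hypothetical:

ClassMethod Walk(node As %DynamicObject, depth As %Integer = 0)
{
    // print this node's id and value, indented by nesting depth
    write $justify("", depth * 2), node.id, ": ", node.value, !
    // recurse into the optional "details" array
    set details = node.details
    if $isobject(details) {
        set iter = details.%GetIterator()
        while iter.%GetNext(.key, .child) {
            do ..Walk(child, depth + 1)
        }
    }
}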
BUT:
The object I am trying to handle is 78 MB in a file.
%DynamicObject fails with <MAXSTRING> and
%Stream.DynamicCharacter fails with <STORE>.
My attempt with Python also failed due to insufficient memory.

Are there any ideas on how to proceed?
 

Product version: IRIS 2022.1
$ZV: IRIS for Windows (x86-64) 2022.1 (Build 114U) Mon Jan 31 2022 01:21:31 EST
Discussion (3)

Try:

set $ZSTORAGE=-1

I have worked with 12 GB JSON objects and it was all fine.
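
For context, a minimal sketch of how that might be combined with loading the file through a stream, so the whole document never has to pass through a single string; the file path is hypothetical and the stream-based load is an assumption, not part of the original answer:

// raise the per-process memory limit to the maximum allowed
set $ZSTORAGE = -1

// link a character stream to the JSON file on disk (path is hypothetical)
set stream = ##class(%Stream.FileCharacter).%New()
do stream.LinkToFile("/data/large.json")

// %FromJSON accepts a stream, so the payload is not passed as one long string
set obj = {}.%FromJSON(stream)
write obj.id, !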

Does $ZSTORAGE also affect the available memory in Embedded Python?

  • That raised the limit but was not a general solution.
    • Since it failed in my Docker container, I didn't push it to the DemoServer, so as not to break it.
  • Neither Embedded nor (external) Python was able to handle it with the standard approach.
  • I built a customized solution that handles a JSON file of any size.