I do not have the %request object... the call to retrieve the JSON string is not using Caché's %Net.HttpRequest.
At least, that is what I understood you asked me to do. 8-)
How would the snippet of code (using $fromJSON) be modified to incorporate your suggestion?
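In case it helps the discussion, here is a minimal sketch of what I understand the suggestion to be: make the HTTP call with %Net.HttpRequest and feed the response stream straight to $fromJSON, so the JSON never has to fit into a 32K string. The server and path below are placeholders, not the real endpoint.

Set req=##class(%Net.HttpRequest).%New()
Set req.Server="example.com"           ; placeholder server
Set sc=req.Get("/pictures.json")       ; placeholder path
If $$$ISERR(sc) { Write "HTTP request failed",! Quit }
; HttpResponse.Data is a stream, and $fromJSON accepts a stream directly
Set array=[].$fromJSON(req.HttpResponse.Data)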
Pedro
Hello!
I made good progress and the code works with small files (less than 32K characters).
When I try to load the full file I get the following error...
Error. Unable to parse file /tmp/pictures.json
Error type: Escaped hex sequence too large
Error code: 7
Error location: Line 1 Offset 29356
For example, when I use this snippet of code...
New
Read !,"Filename: ",filename
Quit:filename=""
Set stream=##class(%Stream.FileCharacter).%New()
Set sc=stream.LinkToFile(filename)
If ('sc) { Write "Error on linking file "_filename,! Quit }
Try {
    Set array=[].$fromJSON(stream)
    Set iter=array.$getIterator()
    While iter.$getNext(.key,.value) {
        Write !,"key "_key_": "_value."id",!
    }
} Catch ex {
    Write "Unable to parse file "_filename,!
    Write "Error type: "_ex.Name,!
    Write "Error code: "_ex.Code,!
    Write "Error location: "_ex.Location,!
    Set obj=""
}
Quit
It works for small.json but errors out for all.json (Error: Escaped hex sequence too large)...
-rw------- 1 pborges user 1037250 Aug 14 14:08 /tmp/all.json
-rw------- 1 pborges user   28750 Aug 14 14:07 /tmp/small.json
So, the question is... how to parse a really long JSON string? (>1M+ characters)
BTW, the result I get from the REST call is stored in a global, and I am saving it to a "flat" file at the OS level in order to concatenate it into one long string.
What I tried first was to parse the global directly, but there is no clear end to each entry.
For example...
ZW ^result(1)
^result(1)="{""id"":""0107454"",""title"":""John Smith"",""image_uri"":""https://<some URL>"",""image_timestamp"":""1496781334"",""image_url"":""https://<so
ZW ^result(2)
^result(2)="me URL>"",""is_restricted_under_18_only"":false,""is_restricted_adult_only"":false},{""id"":""01135433"" ...
Because node #1 reached 32,000 characters, a new node gets created to hold the remainder of the result (I have 42 nodes in total).
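One workaround I am considering (a sketch, assuming the chunks in ^result(i) are in order and can simply be concatenated): copy the nodes into an in-process character stream and hand that stream to $fromJSON, so no single string ever hits the 32K limit and no intermediate OS file is needed.

Set stream=##class(%Stream.GlobalCharacter).%New()
Set i=""
For {
    Set i=$Order(^result(i)) Quit:i=""
    Do stream.Write(^result(i))   ; append each <32K chunk to the stream
}
Do stream.Rewind()
; $fromJSON accepts a stream, so the full >1M result parses in one call
Set array=[].$fromJSON(stream)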
Jon -- Yes, there were two entries that had invalid characters... Thanks for the tip!
Eduard -- Thank you so very much for helping me on this!