Just a hint: I would take $ZD($h,2). For today, my development system (and the systems at customer sites) shows:

Write $horolog - $zdate($horolog, 4) + 1 --> 65925.93
Write $zdate($horolog,4) --> 20.07.2021
Write $horolog - $zdate($horolog, 2) + 1  --> 65926 // expected value

Later, this value (65925.93), used as a $zdate() argument, gives you an <ILLEGAL VALUE> error.

For $zdate($horolog,4), the link you provided says:

4 DD/MM/[YY]YY (01/07/97 or 27/03/2002) — European numeric format. You must specify the correct dateseparator character (/ or .) for the current locale.
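Just to illustrate where the fraction comes from (for the example date above, 20.07.2021): in numeric context the format-4 string becomes 20.07, while the format-2 string becomes just 20.

Write +$zdate($horolog,4) --> 20.07  // "20.07.2021" in numeric context
Write +$zdate($horolog,2) --> 20     // "20 Jul 2021" in numeric context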

Obviously, response.Data does not contain valid JSON. You can simply check the received data by putting it aside in a temporary global, something like this:

do request.HttpResponse.Data.Rewind()
set ^temp.debug($j,"size")=request.HttpResponse.Data.Size
set ^("data")=request.HttpResponse.Data.Read(request.HttpResponse.Data.Size)  // or just the first 1000 bytes
zw ^temp.debug

Now you can take a look at the incoming data; maybe there is an encoding problem, or the data does not adhere to the JSON specification.
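If you just want to know whether the received data parses as JSON at all, you can also feed the stream to %FromJSON() inside a try/catch (a minimal sketch; note that it consumes the stream, so Rewind() it again afterwards if you still need it):

do request.HttpResponse.Data.Rewind()
try {
    set obj = ##class(%DynamicAbstractObject).%FromJSON(request.HttpResponse.Data)
    write "valid JSON",!
} catch ex {
    write "not valid JSON: ", ex.DisplayString(), !
}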

According to your code, the variable obx5 contains the base64-encoded TIFF image. There is one thing I do not understand: what are those "\.br\" character sequences, and how did they get into the base64 stream?

Anyway, I suppose they are OK (those "\.br\"s), so put all the pieces together and decode everything at once:

set input = ""
for i=1:1:$L(obx5,"\.br\") { set input = input _ $P(obx5,"\.br\",i)) }

Do obj.Write($system.Encryption.Base64Decode(input))

Now you should have a correctly decoded image, provided the obx5 variable really contains the original base64-encoded TIFF image with randomly inserted "\.br\" sequences (for whatever reason).
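By the way, if those "\.br\" sequences can simply be dropped, a single $replace() does the same job as the loop above (just an alternative, assuming the sequences carry no information of their own):

set input = $replace(obx5, "\.br\", "")
do obj.Write($system.Encryption.Base64Decode(input))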

Somehow I don't get you right. To save obj.%Size() in a variable, just do a simple assignment:

set myVariable = obj.%Size()

but I'm pretty sure this was not your intended question.

I suppose you have JSON-formatted data (a string or a stream) and you want to store that data in a table. Am I right?

If yes, then follow these steps:

1) create a class which describes your JSON objects (strings)

Class DC.SehindeRaji Extends (%Persistent, %JSON.Adaptor)
{
Property byr As %String(%JSONFIELDNAME = "byr:");
Property iyr As %String(%JSONFIELDNAME = "iyr:");
Property eyr As %String(%JSONFIELDNAME = "eyr:");
// do the same for all other fields

ClassMethod Import(data)
{
    set obj=..%New()                    // create a new DC.SehindeRaji object
    set sts=obj.%JSONImport(data,"")    // import the (JSON) data
    
    if sts {
        set sts = obj.%Save()
        if sts {
            write "Saved, ID=",obj.%Id(),!
            quit 1
            
        } else {
            write "Not saved, Err=",$system.Status.GetOneErrorText(sts),!
            quit 0
        }
        
    } else {
        write "Can't import: ",$system.Status.GetOneErrorText(sts),!
        quit 0
    }
}
}

2) You can create some test data (interactively) in a terminal session

set dynObj = {"byr:":"1937", "iyr:":"2017", "eyr:":"2020"}
set data = dynObj.%ToJSON()

or get your data somehow from an input (possibly from a file); the only important thing is that your data should look like this:

write data  -->  {"byr:":"1937","iyr:":"2017","eyr:":"2020"}

3) import those data

write ##class(DC.SehindeRaji).Import(data) --> Saved, ID=1

4) Now open the saved data and check the result

set oref =  ##class(DC.SehindeRaji).%OpenId(1)

write oref.byr  --> 1937
write oref.iyr  --> 2017

write oref.%JSONExportToString(.exported,"") --> 1
write exported  --> {"byr:":"1937","iyr:":"2017","eyr:":"2020"}

zw ^DC.SehindeRajiD
^DC.SehindeRajiD=1
^DC.SehindeRajiD(1)=$lb("","1937","2017","2020")

I hope this is what you want to do...

The facts:
1) According to the error message: "The system cannot find the file specified."
2) Furthermore, the error message shows a mix of slashes and backslashes; mixing is rarely good. Windows uses "\", Unix "/".

What to do (a quick terminal check is sketched after this list):
1) Check the filename you want to send (including the path)
2) Check the existence of the file
3) Under which user account is IRIS/Cache running?
4) Is this user allowed to read the file?
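For points 1) and 2), a quick check in a terminal could look like this (the path is just a placeholder, take the one from your error message; if Exists() returns 0 although you can see the file, it is most likely a path or permission problem of the account from points 3) and 4)):

set file = "C:\temp\scan.pdf"                   // hypothetical path, use yours
write ##class(%File).Exists(file),!             // 1 = the file exists and is visible to the IRIS/Cache process
write ##class(%File).NormalizeFilename(file),!  // shows the name with consistent, OS-correct separators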

If you can call a JavaScript function, then you could do something like this...

<html>
<head><title>Test</title>
<link id="fav" rel="icon" href="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAIAAACQkWg2AAAABnRSTlMAAAAAAABupgeRAAAACXBIWXMAAA7EAAAOxAGVKw4bAAAAF0lEQVQokWP8z0AaYCJR/aiGUQ1DSAMAQC4BH5CRCM8AAAAASUVORK5CYII=">

<script>
    function changeFavicon() {
        var lid=document.getElementById("fav");
        if (lid) {
            lid.setAttribute("href","data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAIAAACQkWg2AAAABnRSTlMAAAAAAABupgeRAAAACXBIWXMAAA7EAAAOxAGVKw4bAAAAGElEQVQokWNk+M9AEmAiTfmohlENQ0kDAD8vAR+xLJsiAAAAAElFTkSuQmCC");
        }
    }
</script>
</head>
<body>
<button onclick="changeFavicon();">Change my favicon</button><br>
</body>
</html>

The (red and green) icons are just a demo example.

A long time ago I did some connections to external databases (MySQL and Postgres).
The essential parts of such a connection are:

1) First, after installing the required DB driver, you have to create the corresponding
   ODBC data source entries (System DSN) in your OS

2) The connection

    set gtwConn=##class(%SQLGatewayConnection).%New(), gtwHandle=0
    
    if gtwConn.Connect(OdbcName, OdbcUser, OdbcPass) {
        if gtwConn.AllocateStatement(.gtwHandle) {
            // check gtwConn.GatewayStatus
            // etc.
        } else { write "Can't Allocate: "_OdbcName }
    } else { write "Can't connect to "_OdbcName }

3) SQL-Commands

    do gtwConn.CloseCursor(gtwHandle)
    if gtwConn.PrepareW(gtwHandle, sqlStatement) {
        if gtwConn.Execute(gtwHandle) {
           ...
           ...
        } else { /* check gtwConn.GatewayStatus */ }
    } else { /* check gtwConn.GatewayStatus */ }

   
4) Finish

    if gtwConn {
        do gtwConn.DropStatement(gtwHandle), gtwConn.Disconnect()
        set gtwConn="", gtwHandle=""
    }

There are two solutions: either you use the property numDossiersMER as an array instead of a list, as suggested by David Hockenbroch, or, in case existing applications use list methods like Insert() and FOR loops to access list elements, you can change this property to a kind of list-table property (see below).

Either of the above gives you the possibility to use queries like:

select Titre->ID, Titre->numTitre, Titre->millesime, Titre->codeProduit, Titre->numDossierMer, numDossiersMER
from User_TestList_Data.Titre_numDossiersMER
where numDossiersMER in (123, 234, 345)

The following guidance is based on the fact that Cache/IRIS uses so-called "schema evolution" in class storage; see also: https://docs.intersystems.com/latest/csp/docbook/Doc.View.cls?KEY=GOBJ_d...

I tend to speak of a "list-table property" if, in the class definition, a property shows up as

Property PropName As list of WhateverDataType;

but the SQL projection is array-like, as if the property were defined as

Property PropName As array Of WhateverDataType;

The steps to create a list-table property depend on the state of your project:

a) You do not yet have any data (or the data you have can be deleted):

a1) Delete the possibly existing data

a2) Delete the storage definition (Studio-->Inspector-->Storage-->RightClick-->Delete)

a3) Change the property definition to array:

Property numDossiersMER As array of %Integer;

a4) Compile the class

a5) Change the property definition to list:

Property numDossiersMER As list Of %Integer;

a6) Compile the class

Voilà, you have a list-table property:

Use do obj.MyProp.Insert(data) to add data items,
and query the property data as if it were a table: select * from class.name_MyProp

b) You want to keep your data and you want to retain the property name numDossiersMER (because you don't want to change existing applications). Before proceeding, make a backup of your class globals, then:

b1) Rename the existing property and then add it again as a new array property:

from: Property numDossiersMER as list of %Integer
to  : Property OLDnumDossiersMER as list of %Integer

change the property name in the storage definition too

from:  <Value>numDossiersMER</Value>
to  :  <Value>OLDnumDossiersMER</Value>

then add the new property as array

Property numDossiersMER as array of %Integer;

b2) Compile the class

b3) Change the property's collection from array to list

Property numDossiersMER as list of %Integer;

b4) Compile the class

b5) Transfer the list data from the old storage to the new one, and optionally delete the old list data

set id=0
for {
    set id=$order(^User.TestList.Data.TitreD(id)) quit:'id
    set obj=##class(User.TestList.Data.Titre).%OpenId(id)
    if 'obj write id,"  ??",! continue
    for i=1:1:obj.OLDnumDossiersMER.Count() do obj.numDossiersMER.Insert(obj.OLDnumDossiersMER.GetAt(i))
    // do obj.OLDnumDossiersMER.Clear()
    do obj.%Save()
}

or you can use an SQL statement instead of $order(...), as sketched below
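A minimal sketch with dynamic SQL (assuming the class projects to the table User_TestList_Data.Titre, as in the query example above):

set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT ID FROM User_TestList_Data.Titre")
while rs.%Next() {
    set obj = ##class(User.TestList.Data.Titre).%OpenId(rs.%Get("ID"))
    if 'obj write rs.%Get("ID"),"  ??",! continue
    for i=1:1:obj.OLDnumDossiersMER.Count() do obj.numDossiersMER.Insert(obj.OLDnumDossiersMER.GetAt(i))
    // do obj.OLDnumDossiersMER.Clear()
    do obj.%Save()
}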

b6) Rebuild the indexes.
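For b6, something like this should do (class name as in the transfer example; %BuildIndices() is the standard %Persistent method):

do ##class(User.TestList.Data.Titre).%BuildIndices()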

c) You want to keep your data and you want to have a new property name too. Again, before proceeding, make a backup of your class globals, then:

c1) Add the new property as an array    

Property numNewDossiersMER As array Of %Integer;

c2) Compile the class

c3) Change the new property collection from array to list    

Property numNewDossiersMER As list Of %Integer;

c4) Compile the class

c5) Transfer the list data from numDossiersMER to numNewDossiersMER according to b5)

It's IMPORTANT to follow the above steps in the given sequence!

Just to keep things complete, the other way around (array items stored as list items) is also possible. You just have to swap the definition sequence: define as list, compile, redefine as array, compile.

Both possible structures are considered. Here, I use the  examples from my previous posting:

set obj=##class(DC.Rick.MemberData).%OpenId(1)
do obj.%JSONExport() --> {"members":[{"dob":"1990-07-18","firstName":"Bob","memberId":123956}]}
set obj=##class(DC.Rick.MemberData).%OpenId(2)
do obj.%JSONExport() --> {}

The second example outputs only {} and not {"members":null}; I don't know why. Maybe there is a parameter which controls this behavior, please ask the WRC.

From the point of view of the data value, you can consider {} and {"members":null} as equal.

write {"members":null}.%GetTypeOf("members") --> null
write {}.%GetTypeOf("members") ----------------> unassigned

Both representations mean that the members property has no value. But, yes, you can philosophize about it...

It depends on...

Who is sitting at the other end? A Cache/IRIS server or a third-party product?

If Cache/IRIS: mirroring and shadowing are the keywords you have to look for. In case of a third-party SQL DB: how fast (how often) do you want to do your updates? Once a day or in (nearly) realtime?

I did something like that several years ago... the procedure is (just as a starting point):

Our application uses objects, so all the involved classes have an %OnAfterSave() method, something like this

Method %OnAfterSave(insert As %Boolean) As %Status
{
   do ..addToTransfer(..%Id())
   quit $$$OK
}

with some smartness, like not adding a record that is already in the transfer queue, etc. If you use SQL instead of objects, triggers are your friend.
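The addToTransfer() part could be as simple as this sketch (^TransferQueue is just a hypothetical global name here; the $data() check is the "do not add twice" smartness):

Method addToTransfer(id As %String) As %Status
{
    // hypothetical queue global: one node per (class, id) waiting for the next transfer run
    if '$data(^TransferQueue($classname(), id)) set ^TransferQueue($classname(), id) = $horolog
    quit $$$OK
}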

We also have a task which creates (based on the class definition) a series of INSERT/UPDATE statements and does the transfer with the help of %SQLGatewayConnection.

The simplest solution was already answered by Robert Cemper in https://community.intersystems.com/post/how-select-random-row-table. I just want to show a more "universal variant" of that solution.

First, create an SQL stored procedure

Class SP.Utils Extends %RegisteredObject
{
ClassMethod Random(number As %Integer, dummy As %String) As %Integer [SqlProc]
{
   quit $random(number) // we do not use dummy but we need it!!
}
}

then make your query as follows:

select top 10 * from whatever.table where SP.Utils_Random(100,ID)<50

This has the following advantages:

1) It works with all classes, i.e. the ID column does not have to be an integer (greater than 0); it can be a compound column too (like part1||part2, etc.)

2) By adjusting the comparison:

Random(1000,ID) < 50   // gives you greater distances between the returned rows than
Random(1000,ID) < 500

For testing edge conditions you can use

Random(1000,ID)<0    // no rows will be returned, or
Random(1000,ID)<1000 // all rows will be returned

With the right side of the comparison you can fine tune the distances between the returned rows.

For the dummy argument in the above function you can use an arbitrary column name; the simplest is to use ID, because the ID column always exists. Its purpose is to force the SQL compiler to call this function for each row (thinking that the result of the Random() function is row-dependent). A comparison like Random(100)<50 is executed just once. Robert's solution works too, because he uses Random(100)<ID, but this works only for tables where ID is an integer (>0). You can verify the "executed just once" behavior by issuing a simple query (with a Random() variant that takes no dummy argument):

select top 10 * from your.table where SP.Utils_Random(100)<50

You will see (by repeatedly executing the above query) either 10 (subsequent) rows or nothing.

If you get data as ISO-8859-1 (aka Latin1) and have a Unicode (IRIS/Cache) installation, then usually there is nothing you have to do (except process the data). What do you mean by "convert the text to UTF-8"? In IRIS/Cache you have (and work with) Unicode codepoints; UTF-8 comes into play only when you export your data, but in your case it will rather be ISO-8859-1. Or do I misunderstand something?

By the way, if you return your data back to your Latin1 source (as Latin1), then you have to take some precautions: because you have a Unicode installation, during data processing you could mix your Latin1 data with true Unicode data from other sources!
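Just to illustrate the boundary conversions (a sketch; "Latin1" and "UTF8" are, as far as I remember, the standard $zconvert I/O table names):

set text = $zconvert(latin1Bytes, "I", "Latin1")  // decode incoming Latin1 bytes to Unicode codepoints
set back = $zconvert(text, "O", "Latin1")         // encode back to Latin1 when returning the data
set utf8 = $zconvert(text, "O", "UTF8")           // only if you really have to produce UTF-8 somewhere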

See: https://unicode.org/charts/

Also, you may download and read:

https://www.unicode.org/versions/Unicode13.0.0/UnicodeStandard-13.0.pdf

Checking status codes is a good starting point...

set str=##class(%Stream.FileCharacter).%New()
write str  --> 4@%Stream.FileCharacter
write $system.OBJ.DisplayError(str.LinkToFile("/root/my_file.txt")) --> 1
write $system.OBJ.DisplayError(str.WriteLine("line-1"))  --> ERROR #5005: Cannot open file '/root/my_file.txt'1

Your %Save() returns with 1 (OK), because there is nothing to save...

Note: on Linux, a standard user (like me right now) is not allowed to write to the '/root' directory.
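A defensive version of the example above could look like this (just a sketch; /tmp is merely an example of a directory a normal user may write to):

set str = ##class(%Stream.FileCharacter).%New()
set sts = str.LinkToFile("/tmp/my_file.txt")  // a writable location, unlike /root for a standard user
if $system.Status.IsError(sts) { do $system.Status.DisplayError(sts) quit }
set sts = str.WriteLine("line-1")
if $system.Status.IsError(sts) { do $system.Status.DisplayError(sts) quit }
do str.%Save()  // now there really is something to save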