The %Library.DynamicAbstractObject, %Library.DynamicArray and %Library.DynamicObject classes with the %ToJSON and %FromJSON methods first appeared in Caché 2016.2.  I believe that 2016.2 also included JSON constructor syntax in ObjectScript.  I.e., you could write SET jsonvar={"number":1.2, "string":"abcd", "n":null, "b":true} as an executable ObjectScript statement.  Any JSON constant array or object is legal.  I am not sure when JSON constructors involving ObjectScript expressions were allowed.  E.g. SET jsonvar={"number":(1+.2+variable1), "string":("abcd"_variable2), "n":null, "b":false}.  Newer versions of Caché have more JSON support and IRIS has even more support, including the %JSON.Adapter class.  Inheriting the %JSON.Adapter class makes it possible to use the %JSONImport/%JSONExport methods, which can read/write JSON representation into/from ordinary class definitions.
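For example, a minimal sketch (the class name Demo.Person and its properties are invented for illustration; check the %JSON.Adapter documentation for your version):

Class Demo.Person Extends (%RegisteredObject, %JSON.Adapter)
{
Property Name As %String;
Property City As %String;
}

   Set p=##class(Demo.Person).%New()
   Set sc=p.%JSONImport({"Name":"Anna","City":"Oslo"})          ; import from a %DynamicObject or a JSON string
   If $SYSTEM.Status.IsOK(sc) Do p.%JSONExport()                ; writes {"Name":"Anna","City":"Oslo"} to the current device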

In particular, an object is garbage collected when there is no way to reach it through oref values starting from an oref stored in a local variable.  Converting an oref to an integer (or to a string) does not count as an object reference.  If you store that integer/string in a PPG and then KILL (or otherwise change) the oref-type values referring to the object, then the object will be deleted despite the fact that a PPG (or any other variable) still contains an integer/string derived from the oref.
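A small sketch of that effect (^||ptr is just an arbitrary process-private global used for illustration):

   Set obj=##class(%DynamicObject).%New()   ; oref held in a local variable
   Set ^||ptr=+obj                          ; integer form of the oref -- not a real reference
   Kill obj                                 ; the last oref-type reference is gone, so the object is destroyed
   ; ^||ptr still holds the integer, but it no longer identifies a live object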

Sorry about that.  My browser only showed this part of the URL:   https://docs.intersystems.com/latest/csp/documatic/%25CSP.Documatic.cls?...  I did not notice the ?..., which indicated the URL was longer than what I was actually seeing.  The shorter URL I thought I was looking at gives the Class Reference page of the InterSystems documentation web site.

The SubclassOf query in ##class(%Dictionary.ClassDefinitionQuery), executed by a SQL SELECT command on a particular instance, does provide a way to look at the subclasses of a class on that instance.  The query is an even better fit for code in a program that needs to do a run-time search for machine-readable information.
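For example, a sketch using the class query interface (assuming the result column is named Name; the subclasses of %Library.Persistent are listed here, but any class name can be passed):

   Set stmt=##class(%SQL.Statement).%New()
   Set sc=stmt.%PrepareClassQuery("%Dictionary.ClassDefinitionQuery","SubclassOf")
   Set rset=stmt.%Execute("%Library.Persistent")
   While rset.%Next() { Write rset.%Get("Name"),! }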

The local Class Reference web page served by a particular Caché instance is probably an easier way to get a human-readable display of the subclasses.  After clicking on a SubClasses Summary link, you will probably want to click further links (especially the nested SubClasses Summary links).
 

The class reference at docs.intersystems.com will tell you about InterSystems-supplied classes, but it will not tell you about user-written classes extended from system-supplied abstract classes, nor about a user class extended from other user classes.

Look at the following with your browser:  http://localhost:57772/csp/documatic/%25CSP.Documatic.cls, where localhost is the system name on which you are running Caché and 57772 is the WebServer port of your Caché instance.  (If you were running IRIS, your WebServer port would look something like 52773.)

Once you are on the Class Reference web page, navigate to the particular class in which you are interested.  Look at the SubClasses Summary section to find the subclasses.  You can click on a subclass name to navigate to the Class Reference information for that class.

Leon has not made a return comment, and his original question would be ambiguous to some people.  I often call the " character the 'double quote' character, while I often call the ' character the 'single quote' character.  Other names I might use for the " character are 'quotation character' or 'ditto mark'.  Other names for the ' character would be 'apostrophe' or 'apostrophe quotation character'.  [[Notice I am using apostrophe quotation characters in this comment.]]  I might describe "" characters as 'doubled quotation characters' or 'doubled double quotes'.

Since Leon asked to 'remove double quotes (") from a string', my first impression would be to remove all " characters from the string.  I.e., every $CHAR(34) should be removed.  If Leon wanted to remove only 'doubled quotation characters' then he probably would have written 'remove double quotes ("") from a string'.
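If that is indeed what was wanted, a one-line sketch (oldstring/newstring are placeholder variable names):

   Set newstring=$TRANSLATE(oldstring,$CHAR(34))             ; removes every " character
   Set newstring=$REPLACE(oldstring,$CHAR(34,34),$CHAR(34))  ; or: collapse each "" pair into a single "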

If your long strings are coming from JSON representation then

   Set DynObj=##class(%DynamicObject).%FromJSON(...)

will create a %Library.DynamicObject or %Library.DynamicArray object in memory containing the JSON array/object elements, where the sizes are limited only by the amount of virtual memory the platform will allocate to your process.  A string element of an object/array can hold many gigabytes of characters (virtual memory permitting), and you can get the value of such a huge string element in the form of an in-memory, read-only %Stream by doing:

   Set StreamVal=DynObj.%Get(key,,"stream")

in cases where DynObj.%Get(key) would get a <MAXSTRING>.

The StreamVal (class %Stream.DynamicBinary or %Stream.DynamicCharacter) is a read-only, random-access %Stream and it shares the same buffer space as the 'key' element of the DynObj (class %Library.DynamicObject) so the in-memory %Stream does not need additional virtual memory.

You can then create a persistent object from a class in the %Stream package (%Stream.GlobalBinary, %Stream.GlobalCharacter, %Stream.FileBinary, %Stream.FileCharacter, or some other appropriate class) and use the CopyFrom method to populate the persistent %Stream from the read-only, in-memory %Stream.DynamicBinary/Character.
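A sketch of that copy (the file name is invented for illustration):

   Set filestream=##class(%Stream.FileCharacter).%New()
   Set filestream.Filename="/tmp/bigvalue.txt"     ; hypothetical output file
   Do filestream.CopyFrom(StreamVal)
   Set sc=filestream.%Save()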

Consider the following:

USER>set num1=1,str1="""1""",num2=2,str2="""2""",num10=10,str10="""10""",abc="abc"

USER>set (x(num1),x(str1),x(num2),x(str2),x(num10),x(str10),x(abc))=""            

USER>set i="" do { set i=$order(x(i))  q:i=""   write i,! } while 1               
1
2
10
"1"
"10"
"2"
abc

The WRITE command treats its expression arguments as ObjectScript string expressions, so any argument expression with a numeric value is converted to a string of characters.  Note that variables str1 and str2 are strings containing 3 characters starting with one double-quote character, ".  The str1 and str2 values sort among the other strings whose first character is a double-quote character.  When you print these subscript strings, the subscripts from variables num1 and num2 print as one-character strings, the subscript from variable num10 prints as a two-character string, the subscripts from variables str1, str2 and abc print as three-character strings, and the subscript from variable str10 prints as a four-character string.

If you use ZWRITE instead of WRITE, then ZWRITE will add extra quote marks, will use $C(int) expression syntax, and will add other markers so that the printed textual representation contains no unprintable characters and you can see trailing white space.
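For example (the trailing tab and spaces are invisible in the WRITE output but visible with ZWRITE):

USER>SET s="abc"_$C(9)_"  "

USER>WRITE s,!
abc

USER>ZWRITE s
s="abc"_$c(9)_"  "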

I wasn't advocating changing a system collation or changing a namespace collation.  I only recommended using ##class(%Library.GlobalEdit) to change the subscript collation of a newly created individual global variable, leaving all other globals unchanged.

You can see the collations loaded into an instance by executing 'DO ^|"%SYS"|COLLATE'.  I believe in this case the user wants to use built-in collation 133, which should be the version of collation 5 that sorts only strings and does not sort numbers.  It looks like 133 is now considered a "legacy collation", as I have trouble finding it in the on-line documentation.

You might consider looking at the ##class(%GlobalEdit).Create(...) method.  It has a 'Collation' argument which allows you to change the collation from the namespace default.  Just choose a collation that does *NOT* sort canonical numeric strings in front of non-numeric strings.
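A sketch, assuming the first three arguments of Create are the namespace, the global name, and the collation number (verify the exact signature against the Class Reference for your version; the global name is invented):

   Set sc=##class(%Library.GlobalEdit).Create("USER","MyStringGlobal",133)
   If 'sc Do $SYSTEM.Status.DisplayError(sc)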

I don't recommend ever changing the default collation of a namespace as many utility routines depend on canonical numeric strings sorting in numeric order and not sorting in string order.  Those utilities may not work in such a namespace.

You might consider using the %Library.DynamicObject and %Library.DynamicArray classes, which are built into ObjectScript.  ObjectScript supports JSON constructors for %DynamicObject, {...}, and for %DynamicArray, [...].  A %DynamicObject/Array can contain both JSON values and ObjectScript values.   There is also a %FromJSON(x) method, which reads JSON objects/arrays when x is a %Stream, a file name or an ObjectScript %String containing JSON syntax.  Here is an example from a recent IRIS release:


USER>zload foo

USER>zprint
foo()    {
            set DynObj = {
            "notanumber":"28001",
            "aboolean":true,
            "anumber":12345,
            "adecimal":1.2,
            "adate":"01/01/2023",
            "adate2":"01-01-2023",
            "anull":null,
            "anull2":null,
            "anull3":null
            }
            write DynObj.%ToJSON(),!,!

            set DynObj.Bool1 = 1    ; Set without type
            do DynObj.%Set("Bool2",1,"boolean") ; Set with type
            set DynObj.old1 = "abc"   ; Set without type
            do DynObj.%Set("new1","abc","null") ; Set with type
            set DynObj.old2 = ""   ; Set without type
            do DynObj.%Set("new2","","null") ; Set with type
            write DynObj.%ToJSON()
}        

USER>do ^foo()
{"notanumber":"28001","aboolean":true,"anumber":12345,"adecimal":1.2,"adate":"01/01/2023","adate2":"01-01-2023","anull":null,"anull2":null,"anull3":null}

{"notanumber":"28001","aboolean":true,"anumber":12345,"adecimal":1.2,"adate":"01/01/2023","adate2":"01-01-2023","anull":null,"anull2":null,"anull3":null,"Bool1":1,"Bool2":true,"old1":"abc","new1":"abc","old2":"","new2":null}

You can use ordinary ObjectScript assignment to assign ObjectScript values to entries in a %DynamicObject/Array.  You can use the %Set(key,value,type) method to assign ObjectScript values into an object, with an optional type parameter controlling the JSON value to which the ObjectScript value is converted.  I.e., type "boolean" converts ObjectScript numeric expressions into JSON false/true values, and type "null" converts an ObjectScript %String into a JSON string *except* that the empty string is converted to the JSON null value.  (Note:  some older implementations of %Set with type "null" only accepted "" as the value, while newer implementations accept any string expression as the value.)

The JSON constructor operators in ObjectScript, besides accepting JSON literals as values, can also accept an ObjectScript expression enclosed in round parentheses. 

E.g.,  SET DynObj2 = { "Pi":3.14159, "2Pi":(2*$ZPI) }

When asking about a 'ROUTINE' you may be asking about the difference between a 'Routine', 'Procedure', 'Subroutine', 'Function', 'Label', 'Class' and 'Method'.

A 'Routine' is a source code file with a name like 'myroutine.mac'.  Source code can also be a 'Method', which is found in a 'Class' file with a name like 'myclass.cls'.

A Routine file can contain Procedures, Subroutines, Functions and Labels.  See https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=GCOS_usercode#GCOS_usercode_overview for some InterSystems documentation.

You can call a subroutine with the statement 'DO MySubroutine^myroutine(arg)'; you can call a function with the expression '$$MyFunction^myroutine(arg)'; you can call the procedure 'MyProcedure^myroutine(arg)' using either the subroutine-call or the function-call syntax, depending on whether you need the value returned by the procedure; and you can jump to the source code following a label with the statement 'GO MyLabel^myroutine'.  If a reference to a subroutine/function/procedure/label appears inside the same Routine file that defines it (e.g., $$MyFunction^myroutine(arg) written inside myroutine.mac), then you do not have to specify the ^Routine name in the call syntax (e.g., $$MyFunction(arg)).
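A sketch of a hypothetical myroutine.mac illustrating those call syntaxes:

MyLabel
 Write "reached with GO MyLabel^myroutine",!
 Quit
MySubroutine(arg)
 Write "called with DO MySubroutine^myroutine(arg): ",arg,!
 Quit
MyFunction(arg)
 Quit arg*2                  ; called as Set x=$$MyFunction^myroutine(arg)
MyProcedure(arg) {           ; a procedure: private variables and private labels
 Set result=arg+1
 Quit result                 ; call with DO ... or $$... depending on whether you need the value
}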

The local variables used inside a subroutine or function are 'public' variables, and those named variables are shared with all other subroutines and functions.  You can use the NEW command inside a subroutine or function to temporarily create new public variables without modifying previously existing public variables with the same names.

The local variables used inside a procedure are private variables.  The private variables are only available inside the procedure.  Those names do not conflict with variables used by any caller of the procedure.  Those names are also not available in any code called by the procedure, although it is possible to pass either the private variable or the value of the private variable as an argument when calling out of a procedure.  The declaration of a procedure can optionally include a list of public variables.  Names in the public list reference the public variables when they are accessed by code in the procedure.  You can use the NEW command with an argument that is a public variable within a procedure.  A label defined inside a procedure is also private and cannot be referenced by any GO command outside that procedure.

Methods defined in a class file default to being a procedure with private variables and private labels.  However, it is possible to specify that a method is compiled as a subroutine or function instead.  Also, a method's procedure declaration can optionally include a list of public variables.

As mentioned in another Developer Community question, a recent version of IRIS would allow you to evaluate object.%Get("pdfKeyName",,"stream"), which would return a %Stream object containing the JSON string in question as raw characters.  Also, %Get in IRIS can support object.%Get("pdfKeyName",,"stream<base64"), which would do Base64 decoding as it creates the %Stream.  However, you said you need to stick with an older version of Caché/Ensemble which predates these %Get features.

It is possible to convert the long JSON string component to a %Stream but it will take some parsing passes.

(1) First use SET json1Obj=[ ].%FromJSON(FileOrStream) to create json1Obj containing all the elements of the original JSON coming from a file or stream.

(2) If your pdf JSON string is nested in json1Obj, then select down to the closest containing %DynamicObject that holds your pdf JSON string.  I.e., Set json2Obj=json1Obj.level1.level2 if your original JSON looks like {"level1":{"level2":{"pdfKeyName":"Very long JSON string containing pdf", ...}, ...}, ...}

(3) Create a new %Stream containing the JSON representation of that closest containing %DynamicObject.  I.e.,

   SET TempStream=##class(%Stream.TmpBinary).%New()
   DO json2Obj.%ToJSON(TempStream)

(4) Read out buffers from TempStream looking for "pdfKeyName":" .  Note that this 14-character string could span a buffer boundary.

(5) Continue reading additional buffers until you find a "-character not preceded by a \-character, or until you find a "-character preceded by an even number of \-characters.

(6) Take the characters read in step (5) and pass them through $ZCVT(chars,"I","JSON",handle) to convert JSON string escape sequences to binary characters, and then write them to a %Stream.FileBinary (or some other binary %Stream of your choosing).

(7) If your JSON "pdfKeyName" element contains Base64 encoding, then you will also need to use $SYSTEM.Encryption.Base64Decode(string).  Unfortunately $SYSTEM.Encryption.Base64Decode(string), unlike $ZCVT(chars,"I",trantable,handle), does not include a 'handle' argument to support the buffer-boundary cases that arise when you have broken a very large string into smaller buffers.  Therefore you should remove the whitespace characters from 'string' before calling $SYSTEM.Encryption.Base64Decode(string), and you must make sure the argument 'string' has a length which is a multiple of 4 so that a group of 4 data characters cannot cross a buffer boundary.
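A sketch of that chunking for step (7), assuming the Base64 text extracted in the earlier steps has been written to a character stream named Base64Stream (the output file name is invented):

   Set OutStream=##class(%Stream.FileBinary).%New()
   Set OutStream.Filename="/tmp/decoded.pdf"                                  ; hypothetical output file
   Set carry=""
   While 'Base64Stream.AtEnd {
       Set chunk=carry_$TRANSLATE(Base64Stream.Read(32000),$CHAR(9,10,13,32)) ; strip whitespace
       Set keep=$LENGTH(chunk)#4                                              ; characters held over for the next chunk
       Set carry=$EXTRACT(chunk,$LENGTH(chunk)-keep+1,*)
       Set:keep chunk=$EXTRACT(chunk,1,$LENGTH(chunk)-keep)
       Do OutStream.Write($SYSTEM.Encryption.Base64Decode(chunk))
   }
   Do OutStream.%Save()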

Again, the more complete support in IRIS can do steps (2) through (7) without worrying about buffer boundaries by executing

   Set BinaryStreamTmp = json1Obj.level1.level2.%Get("pdfKeyName",,"stream<base64")

and you can then copy the BinaryStreamTmp contents to wherever you want the resulting .pdf file.

The latest version of IRIS will have improved %DynamicObject (and %DynamicArray) class objects.  They will support a method call like obj.%Get(key,,"stream"), which will return a %Stream.DynamicCharacter oref, and this %Stream can contain a very large number of characters.  It can be copied into a %Stream.GlobalCharacter or a %Stream.FileCharacter if you want to save those characters in a persistent object.

This new form of %Get will also be able to include encoding/decoding using Base64 representation.  Similar extensions have been added to the %Set method.

You should not follow the recommendation to modify the $ZTIMEZONE system variable.  See this warning in the $ZTIMEZONE documentation:

Note:
Changing the $ZTIMEZONE special variable is a feature designed for some special situations. Changing $ZTIMEZONE is not a consistent way to change the time zone that Caché uses for local date/time operations. The $ZTIMEZONE special variable should not be changed except by those programs that are prepared to handle all the inconsistencies that result.

On some platforms there may be a better way to change time zones than changing the $ZTIMEZONE special variable. If the platform has a process-specific time zone setting (for example, the TZ environment variable on POSIX systems) then making an external system call to change the process-specific time zone may work better than changing $ZTIMEZONE. Changing the process-specific time zone at the operating system level will change both the local time offset from UTC and apply the corresponding algorithm that determines when local time variants are applied. This is especially important if the default system time zone is in the Northern Hemisphere, while the desired process time zone is in the Southern Hemisphere. Changing $ZTIMEZONE changes the local time to a new time zone offset from UTC, but the algorithm that determines when local time variants are applied remains unchanged.

If you change $ZTIMEZONE then the local time will change but local changes in timezone rules (e.g., entering/leaving Daylight Saving Time, DST) will not be changed.  If the system is in the Northern Hemisphere but you want a local time in the Southern Hemisphere (or vice-versa) then DST changes will be backwards.  Also, DST rules near the equator can be very different from the DST rules at latitudes closer to the poles.  If the system local timezone and the modified local timezone are in different countries then the national date/time rules may be incorrect for the modified local timezone.

Using the $SYSTEM.Process.TimeZone(...) method suggested by Jon Willeke is the best way to modify the local timezone used by a Caché/IRIS process.  However, the 'TZ' environment variable modified by the $SYSTEM.Process.TimeZone(...) method requires an argument string that is specific to the Operating System under which Caché/IRIS is running.  Generally the Windows Operating System wants the TZ variable to contain a POSIX format timezone string while Unix/Linux systems want the TZ variable to contain an Olson format timezone string (sometimes called the IANA or ICU format timezone string.)  If you need dates/times using rules from the past then generally the Olson format will work much better than the POSIX format.
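A sketch of such a call (the argument format depends on the host operating system, as described above; the timezone name here is just an example):

   Do $SYSTEM.Process.TimeZone("America/Sao_Paulo")   ; Olson-style name on Unix/Linux
   Write $ZDATETIME($HOROLOG,3),!                     ; local time now reflects the new TZ setting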

Aside from several arithmetic differences, the biggest difference between $INCREMENT and $SEQUENCE occurs when "SET index=$INCREMENT(^global)" versus "SET index=$SEQUENCE(^global)" is being executed by multiple processes.

All the processes evaluating $INCREMENT(^global) on the same ^global variable will see a sequence of increasing integers.  No two processes will see the same integer returned.  The integers are given out in strict increasing time order and no integer value is skipped.

All the processes evaluating $SEQUENCE(^global) on the same ^global variable will see a sequence of increasing integers.  No two processes will see the same integer returned.  Because blocks of integers are assigned to processes, it is possible for one process to receive a larger integer in the sequence before some other process receives a smaller integer in the sequence.  If some process decides to stop processing integer values at some point then the larger integer values assigned to that process will be skipped and not returned as part of the sequence.
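A single-process sketch of the two functions (the interesting differences only show up when several concurrent processes use the same global):

   Kill ^demoInc,^demoSeq
   For i=1:1:5 Write $INCREMENT(^demoInc)," "   ; strictly 1 2 3 4 5: no gaps, strict time order
   Write !
   For i=1:1:5 Write $SEQUENCE(^demoSeq)," "    ; also increasing, but with multiple processes each
                                                ; process reserves a block, so gaps and out-of-order
                                                ; assignment between processes are possible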

The $SEQUENCE function can have less multi-process overhead because integers in the sequence are assigned in blocks, but $SEQUENCE does not guarantee that a sequence of numbered tasks is assigned to processes in strictly increasing order.  Every process must continue processing numbered tasks until that particular process has been assigned a sequence number larger than the highest assigned task.  Just because one process has finished the highest-numbered task does not mean that other processes are done with the earlier tasks (or have even started them), and the sequence is only complete when every process has received a sequence number larger than the number of the final task.

The JSON classes in Ens.Util.JSON, %ZEN.Auxiliary.json* and %JSON.* all contain methods that convert JSON representation to/from ObjectScript classes.  Once you have an ordinary ObjectScript class, you are using ObjectScript data types for Property values.  The JSON null is usually converted to "" (the null string).  Also, ordinary Property variables of an ObjectScript class are never undefined but are automatically initialized to "" (the null string).  [[ An exception is [MULTIDIMENSIONAL] Properties, which can be undefined, but by default such Property variables do not participate in %Save() and JSON/XML export/import operations. ]]  SQL operations involving class Properties treat "" (the null string) as the SQL NULL value, and SQL assumes a class Property containing the ObjectScript string $C(0) is the empty string.

[[ Although the original question involved Caché and not IRIS, IRIS has significantly more complete support for the %DynamicAbstractObject classes, so my examples will use IRIS.  If possible, I recommend upgrading to IRIS. ]]

There is the class %Library.DynamicAbstractObject with its subclasses %DynamicArray and %DynamicObject, whose elements can be either JSON values or ObjectScript values.  The ObjectScript statement:

USER>SET x={"a":null,"b":"","e":1.0,"f":"1.0","g":(00.1),"h":($c(0))}

makes x a %DynamicObject oref where element "a" is a JSON null, element "g" is the ObjectScript number .1, and element "h" is the ObjectScript string $c(0).  Note that when an ObjectScript expression is a %DynamicObject constructor, ObjectScript parses the elements of that constructor using JSON syntax, except for the constructor-syntax extension where an element inside round parentheses is parsed as an ObjectScript run-time expression generating an ObjectScript value.

You can convert a %DynamicObject to JSON string representation and any ObjectScript valued element will be converted to JSON representation.

[[ Note that JSON does not support certain ObjectScript values: $double("NAN"), $double("INFINITY") and orefs that are not a subclass of %DynamicAbstractObject.  A %DynamicAbstractObject containing such an ObjectScript element cannot be converted to JSON. ]]

USER>WRITE x.%ToJSON()
{"a":null,"b":"","e":1.0,"f":"1.0","g":0.1,"h":"\u0000"}

Evaluating a %DynamicObject element in an ObjectScript expression converts that element to an ObjectScript value.

USER>ZWRITE x.a, x.b, x.c, x.e, x.f, x.g, x.h
""
""
""
1
"1.0"
.1
$c(0)

Notice that the undefined element x.c is converted to the ObjectScript null string.  You never get an error evaluating x.%Get(key) for any value of the expression key, as every undefined element in a %DynamicObject has the value of the null string.  Also, x.a, which contains a JSON null, is converted to the ObjectScript null string.  The JSON treatment of undefined elements and the ObjectScript treatment of undefined Properties mean that when we convert an ordinary ObjectScript class to either XML or JSON, we can skip converting a Property with the null string value, since converting the JSON or XML back to an ordinary class object will result in all unrepresented Properties getting the value of the null string.

If you need to know whether a %DynamicObject element is a JSON null, the null string or undefined, then evaluating %GetTypeOf(key) will tell you.

USER>ZWRITE x.%GetTypeOf("a"),x.%GetTypeOf("b"),x.%GetTypeOf("c"),x.%GetTypeOf("e")
"null"
"string"
"unassigned"
"number"

The %FromJSON(stream)/%ToJSON(stream) methods will let you read/write JSON representation from/to a %Stream.

[[ Things that only work in IRIS follows. ]]

The size of the %DynamicArray/%DynamicObject class objects is limited only by the amount of memory available to your process.  A string-valued %DynamicObject element can be significantly longer than the maximum length supported by ObjectScript string values.  If you have such a long string element then you will have to convert that element to an ObjectScript %Stream in order to manipulate it in ObjectScript.

USER>SET stream=x.%Get(key,,"stream")  ;; Note 3 arguments with 2nd argument missing

will generate an in-memory, read-only %Stream that can be copied to a Global or File %Stream, or can be examined by reading that string in small pieces.

In recent IRIS releases you can do

USER>SET binarystream=x.%Get(key,,"stream<base64")

which will convert a base64-encoded element into a read-only binary %Stream.  You can do the reverse conversion by evaluating x.%Set(key,binarystream,"stream>base64").  See the Class Reference documentation pages for the %Library.DynamicAbstractObject class and its %DynamicArray and %DynamicObject subclasses for more details.
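A sketch of the reverse direction (the file name and key are invented for illustration):

   Set pdfstream=##class(%Stream.FileBinary).%New()
   Set pdfstream.Filename="/tmp/report.pdf"           ; hypothetical input file
   Do x.%Set("pdfdata",pdfstream,"stream>base64")     ; stores the file contents Base64-encoded
   Set back=x.%Get("pdfdata",,"stream<base64")        ; recovers a binary %Stream again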