Indirection has its rule: indirection works with PUBLIC variables,
i.e. the variable you address must be a public variable, in your case the <arg> variable.
This is due to compatibility with old applications,
developed before the introduction of the block structure.
  
You have two options.

Instead of using argument indirection (which is what you currently do),
use name indirection for the label and routine name, see method Test1():
@lab and @(rou), because label and routine names are not variables.

If you want to keep argument indirection, just tell your class that certain variables are PUBLIC,
see method Test2().

In your example you got the wrong result because, by chance, the variable <arg> was defined in the terminal session with the same value as in the method code, see method Test3().


ClassMethod Test1()
{
    s arg="argument-1"
    s lab="say", rou="hello"
    d @lab^@(rou)(arg)
}

ClassMethod Test2() [ PublicList = arg ]
{
	s arg = "argument-2"
	s routine = "say^hello(arg)"
	d @routine
}

ClassMethod Test3()
{
	s routine = "say^hello(arg)"
	d @routine
}

Now some tests in a terminal

kill   // we kill all local variables
do ##class(your.class).Test1() ---> argument-1

kill
do ##class(your.class).Test2() ---> argument-2

kill
do ##class(your.class).Test3() ---> <UNDEF> *arg ==> missing arg

kill
set arg="surprise"
do ##class(your.class).Test3() ---> surprise

// you can prove things the other way too

set arg="my-value"  // variables, defined in a terminal session are public
do ##class(your.class).Test1() ---> argument-1
write arg --> my-value  // arg wasn't overwritten!

do ##class(your.class).Test2() ---> argument-2
write arg --> argument-2  // arg IS OVERWRITTEN

Method Test3() shows that indirection works with public variables.

See the example class below 

Class DC.Encoding Extends %RegisteredObject
{

/// Take a raw stream (i.e. unencoded) and
/// output a new, Base64 encoded stream.
/// 
ClassMethod ToBase64(str)
{
	// Base64 encoding means:
	// you take 3*N characters from the source
	// and put  4*N characters into the destination.
	// If the size of the source is not a multiple of 3 then
	// the last one or two bytes will be padded.
	// 
	// If you take an N such that 4*N is less than or equal to 32767
	// (the max size of a short string) then Cache or IRIS
	// can work with short strings, which (usually) perform
	// better than long strings.
	// 
	// N is an integer.
	// 
	// A good value for N is 8190,
	// so you read 24570 bytes from the source and write 32760 bytes to the destination.
	// 
	// Of course, you can take whatever number up to  910286
	// (3 * 910286 = 2730858,  4 * 910286 = 3641144)
	// 
	set len=8190*3
	set flg=1 // this flag instructs $system.Encryption.Base64Encode
			// not to insert linebreaks at every 76 characters
	set new=##class(%Stream.GlobalCharacter).%New()
	do str.Rewind()
	while 'str.AtEnd {
		do new.Write($system.Encryption.Base64Encode(str.Read(len),flg))
	}
	quit new
}

/// Take a Base64 encoded stream
/// and decode it to a new stream
/// 
/// The method itself has no information about the decoded data,
/// hence it assumes binary data; but you, the caller, (hopefully)
/// know more about your data and can provide the correct stream
/// type for the decoder,
/// for example a character stream instead of a binary one.
ClassMethod FromBase64(str, new = 0)
{
	// Base64 decoding means:
	// you take 4*N characters from the source
	// and put  3*N characters into the destination
	// 
	// If you take an N such that 4*N is less than or equal to 32767
	// (the max size of a short string) then Cache or IRIS
	// can work with short strings, which (usually) perform
	// better than long strings.
	// 
	// N is an integer.
	// 
	// A good value for N is 8190,
	// so you read 32760 bytes from the source and write 24570 bytes to the destination.
	// 
	// Of course, you can take whatever number up to  910286
	// (3 * 910286 = 2730858,  4 * 910286 = 3641144)
	// 
	
	set len=8190*4
	set:'new new=##class(%Stream.GlobalBinary).%New()
	do str.Rewind()
	while 'str.AtEnd {
		do new.Write($system.Encryption.Base64Decode(str.Read(len)))
	}
	quit new
}

ClassMethod Test(file)
{
	set str=##class(%Stream.FileBinary).%New()
	do str.LinkToFile(file)
	write str.Size,!
	
	set enc=..ToBase64(str)
	write enc.Size,!
	
	set dec=..FromBase64(enc)
	write dec.Size,!
}

}
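
Since FromBase64() lets the caller supply the target stream, a quick round trip with the class above could look like this in a terminal (just a sketch):

set raw=##class(%Stream.GlobalCharacter).%New()
do raw.Write("Hello, world!")
set enc=##class(DC.Encoding).ToBase64(raw)
set dec=##class(DC.Encoding).FromBase64(enc,##class(%Stream.GlobalCharacter).%New())
write dec.Read(dec.Size) ---> Hello, world!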

In case you are talking about Cache/IRIS classes:

Class Example.Test Extends %Persistent
{
Property BodyText As list Of MyList;
}


Class Example.MyList Extends %SerialObject
{
Property Text As list Of %String;
}

The steps to add data:

set test=##class(Example.Test).%New()

set list1=##class(Example.MyList).%New()
do list1.Text.Insert("red")
do list1.Text.Insert("green")
do list1.Text.Insert("blue")
do test.BodyText.Insert(list1)

set list2=##class(Example.MyList).%New()
do list2.Text.Insert("Joe")
do list2.Text.Insert("Paul")
do list2.Text.Insert("Bob")
do test.BodyText.Insert(list2)

write test.%Save() --> 1

zw ^Example.TestD
^Example.TestD=1
^Example.TestD(1)=$lb("",$lb($lb($lb($lb("red","green","blue"))),$lb($lb($lb("Joe","Paul","Bob")))))


zso test
BodyText(1).Text(1).: red
BodyText(1).Text(2).: green
BodyText(1).Text(3).: blue
BodyText(2).Text(1).: Joe
BodyText(2).Text(2).: Paul
BodyText(2).Text(3).: Bob
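
To read the values back later, something like this works (the object above was saved with ID 1):

set test=##class(Example.Test).%OpenId(1)
write test.BodyText.GetAt(1).Text.GetAt(2) ---> green
write test.BodyText.GetAt(2).Text.Count() ---> 3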

Assuming your input value is an integer, you have, along with the other solutions, one more:

// this works as long as len  < 145
//
set len =  120
set inp = 12345
write $e(1E120_inp,*-len+1,*)

// of course, if the len is shorter than, say 10,
// then you can use smaller constants like
//
set len=10
set inp=9
write $e(1E10_inp,*-len+1,*)

A good (or, depending on your viewpoint, bad) side effect of the above solution: if you get an input value which is LONGER than the given length, it will be truncated to that length.
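
For example:

set len=10
set inp=123456789012345   // 15 digits, longer than len
write $e(1E10_inp,*-len+1,*) ---> 6789012345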

First convert the dynamic array to a Cache List, then the Cache List to a Python list - voila, the job is done.

/// Convert a dynamic array to a Cache List
/// 
ClassMethod ArrayToList(arr)
{
	q:'arr.%Size() $lb()
	s list="", it=arr.%GetIterator()
	while it,it.%GetNext(.key,.val) {
		s typ=arr.%GetTypeOf(key)
		s list=list_$case(typ,"object":$lb(val.%ToJSON()),"array":$lb(..ArrayToList(val)),"null":$lb(),:$lb(val))
	}
	q list
}
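
A quick usage sketch (the class name is just a placeholder for wherever you put the method):

set arr=["red",42,["nested",1]]
set list=##class(DC.Utils).ArrayToList(arr)
write $listlength(list) ---> 3
write $list(list,1) ---> red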

First, I presume Studio lacks such functionality because usually each namespace contains independent data. As an example, for each of my customers (applications) I have a dedicated namespace (of course, you may say, one can always have an exception).

And second, if there is no ready-made functionality, then make your own. Sometimes asking questions or searching the internet takes longer than writing a quick-and-dirty "one-liner", something like this:

// class definitions are stored in ^oddDEF, MAC routines in ^rMAC
// as said above, quick-and-dirty:
// if the SEARCHTERM occurs in %-items (visible in every namespace), then you will get multiple hits
//
// the one-liner version
k n i ##class(%SYS.Namespace).ListAll(.n) s n="" f  s n=$o(n(n)) q:n=""  f s="^|n|rMAC","^|n|oddDEF" f  s s=$q(@s) q:s=""  w:@s_s["SEARCHTERM" s," ",@s,!

// for folks with less experience
//
ClassMethod SearchAllNSP(searchterm)
{
   i ##class(%SYS.Namespace).ListAll(.n) {
      s n=""
      f  {s n=$o(n(n)) q:n=""
          f s="^|n|rMAC","^|n|oddDEF" {
             f  s s=$q(@s) q:s=""  w:@s_s[searchterm s," ",@s,!
          }
      }
   }
}

It's up to you to leave out all those duplicate %-items and to add some formatting...

So the bottom line of my answer is: yes, there is a way to search (whatever you want) in one go
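
Assuming you put the method into a class called, say, DC.Search, the whole search is then a single call:

USER>do ##class(DC.Search).SearchAllNSP("SEARCHTERM")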

Class DC.BigJSON Extends %RegisteredObject
{

ClassMethod Test(filename)
{
	if ..SaveToFile(..MakeJSON(), filename) {
		write "Save OK",!
		write "Size ",##class(%File).GetFileSize(filename),!
		
		set input=##class(%File).%New(filename)
		set sts=input.Open("RS")
		if sts {
			set json={}.%FromJSON(input)
			set iter=json.%GetIterator()
			while iter.%GetNext(.key, .val) {
				write "key=",key," size=",$l(val)," data=",$e(val,1,10)_"...",!
			}
		} else  { write $system.Status.GetOneErrorText(sts),! }
	}
}

ClassMethod MakeJSON()
{
	set obj={}
	set obj.text1=$tr($j("",3600000)," ","a")
	set obj.text2=$tr($j("",3600000)," ","b")
	set obj.text3=$tr($j("",3600000)," ","c")
	quit obj
}

ClassMethod SaveToFile(obj, filename)
{
	set file=##class(%File).%New(filename)
	set sts=file.Open("wnu")
	if sts {
		do obj.%ToJSON(file)
		do file.Rewind()
		use file.Name 
		do file.OutputToDevice()
		do file.Close()
		quit 1
	} else { quit sts }
}

}

The size shouldn't be a problem


USER>do ##class(DC.BigJSON).Test("/tmp/test1.txt")
Save OK
Size 10800034
key=text1 size=3600000 data=aaaaaaaaaa...
key=text2 size=3600000 data=bbbbbbbbbb...
key=text3 size=3600000 data=cccccccccc...

The first part (Base 64 encoding is not able to encode ... unicode (2 byte) characters) is correct. The second part (data-->utf8-->base64 and base64-->utf8-->data) is correct only if there is an agreement between the sender and the receiver about the double encoding (utf8+base64).
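
For illustration, the double encoding (and its reversal) looks something like this; both sides must agree on doing it:

set data="Grüße"  // contains 2-byte (unicode) characters
set b64=$system.Encryption.Base64Encode($zconvert(data,"O","UTF8"))
set back=$zconvert($system.Encryption.Base64Decode(b64),"I","UTF8")
write back ---> Grüße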

If I am told I will get a base64 encoded file, then I expect a file which is only base64 encoded, not a mix of several encodings including base64. A simple way to encode your document could be something like this:

ClassMethod Encode(infile, outfile)
{
    // file-binary reads bytes and not characters
    set str = ##class(%Stream.FileBinary).%New()
    set str.Filename = infile
    set len = 24000 // len # 3 (len modulo 3) must be 0 !
    set nonl = 1    // no-newline: do not insert CR+LF
    do str.Rewind()

    open outfile:"nwu":0
    if $test {
        use outfile
        while 'str.AtEnd { write $system.Encryption.Base64Encode(str.Read(len),nonl) }
        close outfile
    }
    quit $test
}
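
A possible counterpart for decoding could look like this (just a sketch, assuming the input file contains pure base64 without line breaks):

ClassMethod Decode(infile, outfile)
{
    // read the base64 text in chunks whose length is a multiple of 4
    // and write the decoded bytes to the output file
    set str = ##class(%Stream.FileBinary).%New()
    set str.Filename = infile
    set out = ##class(%Stream.FileBinary).%New()
    set out.Filename = outfile
    set len = 32000 // len #4 must be 0 !
    do str.Rewind()
    while 'str.AtEnd { do out.Write($system.Encryption.Base64Decode(str.Read(len))) }
    quit out.%Save()
}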

If your system does not support JSON (i.e. pre 2016.2?) then give this "dirty trick" a try:

- add a zero-width-space character to your numbers
- create the output stream
- remove the zero-width-space characters

Instead of the zero-width space you can use any other character that does not appear in your data (binary data should be base64 encoded).

ClassMethod WithQuotes()
{
	set zwsp = $c(8203) // zero-width-space
	set obj = ##class(%ZEN.proxyObject).%New()
	
	set obj.ID = 1234_zwsp
	set obj.Number=123.45_zwsp
	
	if ##class(%ZEN.Auxiliary.jsonArrayProvider).%WriteJSONStreamFromObject(.tmp,obj) {
  		set json=##class(%Stream.TmpBinary).%New()
		do tmp.Rewind()
		while 'tmp.AtEnd { do json.Write($tr(tmp.Read(32000),zwsp)) }
	}
	
	do json.Rewind()
	write json.Read(json.Size)
}

As you wrote, %XML.TextReader is used to read arbitrary XML documents. A text with a little bit of XML structure sitting in the middle isn't XML!

Maybe there is a Python library for extracting XML from a text. If not, you probably have to read character after character, counting each "<" as +1 and each ">" as -1; when the counter is back at 0, the span between the first "<" and the last ">" is probably a correct XML structure. Oh, and don't forget about <![CDATA[...]]> sequences, which make the reading more challenging.
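
A rough sketch of that counting idea (a hypothetical helper, no CDATA handling):

ClassMethod ExtractXML(text)
{
	// remember the position of the first "<", count "<" as +1 and ">" as -1,
	// and remember the last ">" at which the counter returns to 0;
	// the span in between is the candidate XML fragment
	set first=0, last=0, cnt=0
	for i=1:1:$length(text) {
		set ch=$extract(text,i)
		if ch="<" { set cnt=cnt+1 set:'first first=i }
		if ch=">" { set cnt=cnt-1 set:cnt=0 last=i }
	}
	quit:('first)!('last) ""
	quit $extract(text,first,last)
}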

A few things that just came to my mind:
- Cache-5.0.x is likely a 32-bit version, while Win-11 is 64-bit only (relevant in case the application uses some .dll, .ocx, etc.)
- unlikely that you use it, but as a hint: LAT is not supported anymore
- the user database is now provided by ISC. In case your application maintains its own users, you can still use your own user database, but the login process will require some "adaptation"

As a first step, I would contact your ISC sales rep, because Cache-5.0.x licenses work neither with (the latest) Cache nor with IRIS. Second, a lot has changed between Cache-5.0.x and recent Cache/IRIS versions, so I would check what problems to expect. A customer of mine "upgraded" from Cache-5.0.21 to IRIS some four years ago...

Just in case you are lost in the working memory space and desperately searching for the spot(s) in your program where a specific object is still referenced, here is a small handy method which could help you.

/// find all variables which contain a given object(reference)
/// 
/// I: the OREF you are looking for
/// 
/// O: "" if the spool-device can't be opened
///    [] if no variables contain the given OREF
///    [var1, var2, ... varN] an array of variable names (incl. subscripted and orefs)
///    
ClassMethod FindObject(obj)
{
	set res=[]
	if $d(%)#10,%=obj do res.%Push("%")
	new % set %=obj kill obj
	
	lock +^SPOOL("nextID")			// adapt this lines
	open 2:($o(^SPOOL(""),-1)+1):1	// to your method of
	lock -^SPOOL("nextID")			// creating new spool IDs
	
	if $t {
		use 2
		set spl=$zb
		do $system.OBJ.ShowReferences(.%,1)
		
		for i=1:1:$za-1 {
			set x=$p($zstrip(^SPOOL(spl,i),"<=>w",$c(13,10))," ",3)
			do:x]"%.~" res.%Push(x)
		}
		close 2
		kill ^SPOOL(spl)
		
	} else { set res="" }
	
	quit res
}

Example

USER>kill
USER>set pers=##class(DC.Person).%OpenId(1)
USER>set temp=pers, zz(3)=temp
USER>write ##class(DC.Help).FindObject(pers).%ToJSON()
["pers","temp","zz(3)"]

I'm not sure what you want to achieve, so let me ask a puzzling question: do you want to create a dangling object? "I want to remove the object from memory even if it is still referenced in memory" would, as I understand it, free the memory used by the object but leave the object reference(s) intact, so the references would now point into nirvana. Is that what you want to do? Why? Could you elaborate a bit on what your target, or the background, is?