Maks, I fully support your position.
There are very few languages where you can run code written 40 years ago with no modification!
That's what counts for customers.

You have other languages in parallel, like MV or BASIC, besides all the embedded things like &SQL(), &HTML(), &JS(), ...

But by the same reasoning you might ask: why haven't Java, JavaScript, C, or C# changed?
Because you got Go, Angular, ..... as attractive extensions.
COS has its ZZ* and $ZZ* as extensions. ( ZZ* ! )

Isn't this enough for the thousands of developers that haven't asked for it nor have a need for it?

If you open a BPL in Studio, click here to switch to Class view.

Programmatically, the code is found in class %Dictionary.XDataDefinition.

Its property Data holds your XData as a stream.

http://docs.intersystems.com/latest/csp/documatic/%25CSP.Documatic.cls?P...

To find the ID you can use:
select ID from %Dictionary.XDataDefinition where parent %STARTSWITH '<my class name>'
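A minimal sketch of opening such an XData member programmatically (the class name "My.BP" and XData name "MyXData" are placeholders; the ID of a %Dictionary.XDataDefinition has the form "ClassName||XDataName"):

```objectscript
 // sketch: open an XData member by its ID and dump its content
 // "My.BP" and "MyXData" are placeholder names
 set xdata = ##class(%Dictionary.XDataDefinition).%OpenId("My.BP||MyXData")
 if $isobject(xdata) {
     do xdata.Data.Rewind()
     while 'xdata.Data.AtEnd { write xdata.Data.Read(32000) }
 }
```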

For cases like this, a possible solution could be:

%Stream.Global has a FindAt method that can give you the position of "\u00".

[Find the first occurrence of target in the stream starting the search at position. ]

http://docs.intersystems.com/latest/csp/documatic/%25CSP.Documatic.cls?P...

But: if you are on the decoded stream, all non-printables are just single characters. No issue cutting it into pieces:

  • read your source stream in reasonably sized chunks
  • clean out whatever you need
  • append it to a temporary stream
  • loop on the source until you hit the AtEnd condition
  • finally replace your source either by the CopyFrom method [temp -> source]
    or replace the source stream reference by the temp stream reference

I guess the whole code is shorter than this description.
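A minimal sketch of that loop (assuming src is an already opened %Stream.Global; the cleaning step here just strips $CHAR(0) as an example, replace it with whatever cleaning you need):

```objectscript
 // sketch: rewrite a stream in chunks via a temporary stream
 set temp = ##class(%Stream.TmpCharacter).%New()
 do src.Rewind()
 while 'src.AtEnd {
     set chunk = src.Read(32000)                  // reasonably sized chunk
     set chunk = $replace(chunk, $char(0), "")    // clean out whatever you need
     do temp.Write(chunk)                         // append to the temporary stream
 }
 do src.CopyFrom(temp)                            // temp -> source
 do src.%Save()
```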

I'd suggest not touching the global underneath the source stream.

The default for %String is MAXLEN=50

If you write %String(MAXLEN="") in your definition, and also in method calls, this should be enough:

Query Methode(data1 As %Library.String(MAXLEN=""), data2 As %Library.String(MAXLEN=""), data3 As %Library.String(MAXLEN="")) As %Library.Query(CONTAINID = 1, ROWSPEC = "Result,Par2:%String") [ SqlProc ]

  and so on.

Or you make your own data type, inheriting from %String and overriding Parameter MAXLEN="".
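A minimal sketch of such a data type (the class name is a placeholder):

```objectscript
/// String type without length limit;
/// use as: Property Name As My.LongString;
Class My.LongString Extends %Library.String
{
Parameter MAXLEN = "";
}
```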

Or just use %Library.VarString, which does exactly this (MAXLEN="").
http://docs.intersystems.com/latest/csp/documatic/%25CSP.Documatic.cls?P...

Query Methode(data1 As %Library.VarString, data2 As %Library.VarString, data3 As %Library.VarString) As %Library.Query(CONTAINID = 1, ROWSPEC = "Result,Par2:%VarString") [ SqlProc ]

  and so on.

Of course it makes sense!

But then you know which application is using it and can use the application's cleaning method / routine that takes care of all kinds of dependencies.
I remember well the times when routines used to start with KILL ^CacheTemp*($JOB).
I expect that over time most applications have moved to PPGs (^||myGlobal...) to avoid this, or have a clean-up.
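For illustration: a process-private global is visible only to its own process and is deleted automatically when the process ends, so no explicit KILL is needed:

```objectscript
 // process-private global: visible only to this process,
 // deleted automatically at process end -- no KILL ^CacheTemp*($JOB) required
 set ^||myGlobal("scratch", 1) = "temporary data"
```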

Hi Evgeny,

IF you can afford a short OFFLINE state:

#5)  dismount the DB / copy the CACHE.DAT to a fast local device / remount it;
      move the copy to a secure place: #2, #1

ELSE IF you have to remain online all the time:
#3)  on a fast local device + move the backup to a secure place by #2, #1
 

NEVER #4) a fair chance of massive inconsistency

Robert
[semper fidelis]

Great explanation of the issue. Thanks!
So we have a nice example of what the proleptic Gregorian calendar used for $H calculations means:

write $zd($zdh("1492-10-12",3,,,,,-600000)#7,11)
Wed

And that's definitely not correct, as you demonstrated very precisely.
But it is common usage in most programming and DB systems.
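For comparison: the Julian 1492-10-12 corresponds to 1492-10-21 in the proleptic Gregorian calendar (the Julian/Gregorian offset in the 15th century was 9 days), and the same trick then yields the historically recorded weekday:

```objectscript
write $zd($zdh("1492-10-21",3,,,,,-600000)#7,11)
Fri
```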

But the date as such is questionable for 2 reasons:

  • There is a 5 to 6 hour time gap between Spain and the Caribbean Sea.
  • At the end of the Middle Ages, every kingdom (and smaller realms) typically dated their documents
    by the years their actual king had been in power. A common date as we know it was not at all in place.

So Oct. 12 is most likely a date back-calculated by historians hundreds of years later.
So we should interpret this date as a commonly agreed convention that by luck was a Friday.

Thanks again for the contribution.