I don't understand your problem. In fact, are you actually having a problem, or do you just think there is a problem?

The purpose is to avoid the huge amount of source-target DTL processes while working with several hundreds of Rules 

First, rules do not modify messages; only a DTL may modify a message.
Did you measure and verify this supposed "huge amount", or do you think/guess this is actually happening?

Please note that HL7 storage is complex/advanced and very efficient: only changed segments are actually duplicated in storage, even across different HL7 messages.
Have you actually checked what is stored in your scenario?

I feel and believe that you are looking for a solution to a non-existent problem.

Sorry, but again you are missing important context and details.
With some imagination, fantasy and creativity I can guess the missing info....

What class does your AthenaChangeDataSubscription class extend?
I guess Ens.Request, or %Persistent and %XML.Adaptor, and possibly others; this is crucial info.

Where do you see the ![CDATA[ ? (I asked, you did not answer.)
I guess from the picture you posted that you see it in the Contents viewer in Visual Trace; again, this is crucial info.

Now, to your original question:
What causes the ![CDATA[ in a string field?

The CDATA is not in your string property; it appears only in the portal's visualization of your class/property content.

Here is a sample request that, I imagine, is similar to your class:

Class Community.msg.AthenaChangeDataSubscription Extends Ens.Request
{
Property QueryParameters As %String;
}

Here is what the IRIS Interoperability portal is using to display the content:

EPTEST>Set msg=##class(Community.msg.AthenaChangeDataSubscription).%New()
 
EPTEST>Set msg.QueryParameters="showportalonly=1&leaveunprocessed=0&limit=5000"
 
EPTEST>Set sc=##class(EnsPortal.MessageContents).writewithoutNonXMLChars(,msg)
<AthenaChangeDataSubscription xmlns:s="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><QueryParameters><![CDATA[showportalonly=1&leaveunprocessed=0&limit=5000]]></QueryParameters></AthenaChangeDataSubscription>
EPTEST>w msg.QueryParameters
showportalonly=1&leaveunprocessed=0&limit=5000
EPTEST>

As you can see, the ACTUAL CONTENT of your QueryParameters property does not contain any CDATA; the XML VISUALIZATION of your property content does, because XML requires it to properly display the real/actual content.
The IRIS Interoperability portal displays the XML representation of your class and does so in the proper/correct XML form.

If your property contains simple text that can be properly (correctly) rendered in XML without CDATA, then:

EPTEST>Set msg=##class(Community.msg.AthenaChangeDataSubscription).%New()
 
EPTEST>Set msg.QueryParameters="this is a simple content"
 
EPTEST>Set sc=##class(EnsPortal.MessageContents).writewithoutNonXMLChars(,msg)
<AthenaChangeDataSubscription xmlns:s="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><QueryParameters>this is a simple content</QueryParameters></AthenaChangeDataSubscription>

As you can see there is no CDATA.

Please, please, in the future, provide details of the context.
Imagination, fantasy and creativity should not be a requirement to understand a question.

Why define and reference an outbound adapter when you don't use it in your code?
I'd suggest having a look at the documentation: Using the HTTP Outbound Adapter.

Anyway, given your code, you don't need a stream; it's just a waste of resources and makes your code more complicated for no reason. You can simply:

....
Set httpRequest.ContentType = "application/xml"
Do httpRequest.EntityBody.Write("<?xml version=""1.0"" encoding=""UTF-8""?><getURL>"_pRequest.getURL_"</getURL>")
Set sc = httpRequest.Post("", 2)
....

Note that I assume that <Url1></Url1> is contained in the getURL string property content, as your first post suggests.

Where do you see the ![CDATA[  ?

CDATA is part of an XML document: "a CDATA section is a piece of element content that is marked up to be interpreted literally, as textual data, not as marked-up content" (from Wikipedia).

If it's contained in a string property (is this what you mean by "field"?), then evidently that value was assigned to the string.

But I suspect your question is missing the context......

I'm not sure if this applies to your case, but in the past we found that a very old database (20+ years), upgraded many times over the years, had "uncompacted" bitmap indices. We gained a lot of space and, more importantly, a huge performance improvement by running %SYS.Maint.Bitmap, documented here:

This utility is used to compact bitmap/bitslice indices. Over time in a volatile table (think lots of INSERTs and DELETEs) the storage for a bitmap index may become less efficient. To a lesser extent index value changes, i.e. UPDATES, can also degrade bitmap performance.
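A minimal sketch of running it for a whole namespace; the namespace name is a placeholder and the extra arguments are from memory, so verify the exact signature in the %SYS.Maint.Bitmap class reference for your version:

```objectscript
// Compact all bitmap/bitslice indices in a namespace.
// "YOURNAMESPACE" is a placeholder; the trailing arguments (no-journal,
// display log) may differ per version - check the class reference.
Do ##class(%SYS.Maint.Bitmap).Namespace("YOURNAMESPACE",1,1)
```

Like any maintenance task of this kind, it is best scheduled during a low-activity window.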

I've used ObjectScript to update linked tables projected as IRIS/Caché classes, like in your sample, for a very long time, and it works.

As the error says, your issue is that some property/column cannot be set/modified; I'm pretty sure the same issue arises if you use SQL to update/insert the same column.

Without the table definition it's impossible to guess which field it is and why that column cannot be set.
Maybe some of the fields are part of a primary key that includes other fields that are not set?
Make sure that the table is properly linked; the link table wizard sometimes needs "guidance" to link tables properly, particularly when defining primary keys...

Anyway, if properly linked, you can definitely treat/manage/manipulate linked tables the same way as native IRIS/Caché classes.
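For example, assuming a linked table projected as a class named User.LinkedPatient (a made-up name for illustration), the usual object API applies:

```objectscript
// User.LinkedPatient is a hypothetical linked-table class; use your own.
Set obj = ##class(User.LinkedPatient).%OpenId(42)
Set obj.LastName = "Rossi"
Set sc = obj.%Save()
If $system.Status.IsError(sc) Do $system.Status.DisplayError(sc)
```

If %Save() fails here, the reported status should point at the same column that your error mentions.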

Ciao Pietro,

as said, %DynamicAbstractObject has excellent performance and can easily handle very large JSON streams/files.
Depending on your system settings, for large JSON you may need to increase the per-process memory; fortunately you can adjust it in your code at runtime, so you can write robust code that does not depend on system configuration parameters.
Note that the default value of "Maximum Per-Process Memory" has changed over time, so a new installation and an upgraded installation may have different values.
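A sketch of adjusting the limit at runtime via the $ZSTORAGE special variable (the value is in KB; check the documentation for the allowed range on your version):

```objectscript
// Raise this process's memory limit before parsing a large JSON,
// then restore it afterwards. $ZSTORAGE is expressed in KB.
Set oldLimit = $ZSTORAGE
Set $ZSTORAGE = 1048576
// ... parse the large JSON here ...
Set $ZSTORAGE = oldLimit
```

This affects only the current process, so it is safe to do in a single business host without touching the system-wide configuration.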

IMHO the real question here is: in what side of the JSON processing is your code?

Are you generating the JSON or are you receiving the JSON from a third party?

If you are receiving the JSON, then I don't think there is much you can do about it: just load it and IRIS will handle it.
I'm pretty sure that any attempt to "split" the loading of the JSON stream/file will result in worse performance and more consumed resources.
To split a large JSON you need to parse it anyway....
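Loading straight from a file stream is usually all it takes; a sketch (the file path is made up):

```objectscript
// Link a character stream to the JSON file and parse it in one call.
Set stream = ##class(%Stream.FileCharacter).%New()
Set sc = stream.LinkToFile("/data/large.json")  // hypothetical path
If $system.Status.IsError(sc) Do $system.Status.DisplayError(sc)
Set obj = ##class(%DynamicAbstractObject).%FromJSON(stream)
Write obj.%Size()
```

%FromJSON accepts either a string or a stream, so the same call works regardless of the payload size.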

If you are generating the JSON, then depending on the project and specification constraints, you may split your payload into chunks; for example, in FHIR the server can choose to break a bundle up into "pages".

I'm not sure if your question is about loading the JSON file/stream into a %DynamicAbstractObject, or about processing the large %DynamicAbstractObject once it has been imported from JSON.

What's your problem and what's your goal?