I created a table and used your data:

CREATE TABLE Fehlermeldung (
    Field01 VARCHAR(32000),
    Field02 VARCHAR(32000),
    Field03 VARCHAR(32000),
    Field04 VARCHAR(32000),
    Field05 VARCHAR(32000),
    Field06 VARCHAR(32000),
    Field07 VARCHAR(32000),
    Field08 VARCHAR(32000),
    Field09 VARCHAR(32000),
    Field10 VARCHAR(32000),
    Field11 VARCHAR(32000),
    Field12 VARCHAR(32000)
)


Use the data you provided:

INSERT INTO "Fehlermeldung" VALUES (
1001021,
'qsDataFieldOutOfRange',
'10',
'Der Wert ''<wert>'' des Datenfeldes <feldName> \"<feldBezeichnung>\" ist <artDerAbweichung> als <feldBound>.',
'<params wert=\"\" feldName=\"Modul.name:Bogen.name:Feld.name\" feldBezeichnung=\"Feld.bezeichnung\"  artDerAbweichung=\"Ein Wert aus [größer, kleiner]\" feldBound=\"Minimal- oder Maximalwert des Feldes je nach Wert in artDerAbweichung\" />',
'',
'Leistungserbringer',
'',
'',
'Leistungserbringer',
'Als Template-Meldung kann diese auch auf modifizierte Daten der DAS zutreffen.',
'')


And the write works fine.

The class structure that was created:


/// 
Class User.Fehlermeldung Extends %Persistent [ ClassType = persistent, DdlAllowed, Final, Owner = {desenv}, ProcedureBlock, SqlRowIdPrivate, SqlTableName = Fehlermeldung ]
{

Property Field01 As %Library.String(MAXLEN = 32000) [ SqlColumnNumber = 2 ];
Property Field02 As %Library.String(MAXLEN = 32000) [ SqlColumnNumber = 3 ];
Property Field03 As %Library.String(MAXLEN = 32000) [ SqlColumnNumber = 4 ];
Property Field04 As %Library.String(MAXLEN = 32000) [ SqlColumnNumber = 5 ];
Property Field05 As %Library.String(MAXLEN = 32000) [ SqlColumnNumber = 6 ];
Property Field06 As %Library.String(MAXLEN = 32000) [ SqlColumnNumber = 7 ];
Property Field07 As %Library.String(MAXLEN = 32000) [ SqlColumnNumber = 8 ];
Property Field08 As %Library.String(MAXLEN = 32000) [ SqlColumnNumber = 9 ];
Property Field09 As %Library.String(MAXLEN = 32000) [ SqlColumnNumber = 10 ];
Property Field10 As %Library.String(MAXLEN = 32000) [ SqlColumnNumber = 11 ];
Property Field11 As %Library.String(MAXLEN = 32000) [ SqlColumnNumber = 12 ];
Property Field12 As %Library.String(MAXLEN = 32000) [ SqlColumnNumber = 13 ];

/// Bitmap Extent Index auto-generated by DDL CREATE TABLE statement.  Do not edit the SqlName of this index.
Index DDLBEIndex [ Extent, SqlName = "%%DDLBEIndex", Type = bitmap ];

}

Check the definition of the fifth property to see if there is any unusual restriction.
Or, if you could, post the class definition here.

Hello

Older professionals can confirm it, but this operator-precedence behavior is older than the InterSystems product; it comes from the origins of MUMPS. So I don't think it's something InterSystems can or should change.

When I discovered this feature of the language, I decided that all my code involving numerical calculations should always use "(" to make the precedence explicit, even when the precedence is natural to the language. And I do it that way regardless of the language.

For example, in COS this calculation would be natural:

Write !,10 + 10 * 10
USER> 200

But I would always code like this:

Write !,( 10 + 10 ) * 10
USER> 200

This way the code is readable even for non-MUMPS coders, and this piece of code could even be migrated to other languages without problems.
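For comparison: most other languages would evaluate 10 + 10 * 10 as 110, because * binds tighter than +, while COS evaluates strictly left to right. So migrating the unparenthesized line would silently change the result. A quick demonstration in COS:

Write !,10 + 10 * 10     // left to right: (10 + 10) * 10 = 200
Write !,10 + (10 * 10)   // parentheses restore the conventional result: 110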

Yone, hello.

I can imagine a lot of scenarios for your problem, but I'm missing some details, so let me ask:

- You can only send messages within the informed period, but you collect/prepare messages throughout the day, more or less like this: [SOAP inbound] (24/7) -> [BPL] (24/7) -> [SOAP Operation] (4am to 8am, daily)?

- Your problems with the mirror servers happen because they are in different time zones and the schedule doesn't follow each server's local time?

I will base my answer on the conditions above; if they don't apply, please skip the next few lines.

Try to split the operation into two separate processes:
- A process to collect and store the original messages in a temporary table.
[SOAP inbound] (24/7) -> [BPL] (24/7) [to write temporary table]

- Another process, which will be a 24/7 Ensemble service but which only triggers messages based on a time analysis in UTC, not local time.
[InboundAdapter] (24/7) -> [BPL] (24/7) -> [checks the 4am to 8am UTC condition and sends messages to] -> [SOAP Operation] (24/7)
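A minimal sketch of the UTC check inside that second process (names are illustrative; the key point is that $ZTIMESTAMP is always UTC, so the test behaves identically on every mirror member):

// $ZTIMESTAMP is always UTC, regardless of the server's local timezone
Set utcHour = $Piece($ZTimeStamp, ",", 2) \ 3600  // whole hours since UTC midnight
If (utcHour >= 4) && (utcHour < 8) {
    // inside the 04:00-08:00 UTC window: forward the stored messages to the SOAP operation
}
// outside the window: leave the messages in the temporary table for the next run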

Hope this helps.

Hi.

I imagine you may have one of the following questions:

If it is neither of these cases, please elaborate the question further so that we can better help you.

Hi

Did you check if the disk is full? Or maybe it's an O.S. problem:

"You can put 4,294,967,295 files into a single folder if drive is formatted with NTFS (would be unusual if it were not) as long as you do not exceed 256 terabytes (single file size and space) or all of disk space that was available whichever is less. For older FAT32 drives the limits are 65,534 files in a single folder and max file size of 4 Gigabytes and 2TB of total space or all of disk space that was available or whichever is less."

https://answers.microsoft.com/en-us/windows/forum/all/how-many-files-can....

Depending on what needs to be analyzed, the few solutions built into the product may not meet your needs.

In my project, I created specific metrics tables that hold the data I wanted. To feed these tables, I created asynchronous processes, within the normal operations in Ensemble, that collect this data. Remember that, within the standard Ensemble purge cycle, messages older than 7 days are deleted.

Through these tables, which are independent of the Ensemble cycle, it is possible to analyze months and years of data.

Obviously, this consumes disk space.
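Just as an illustration (class and property names are hypothetical), such a metrics table can be a plain persistent class, so its rows survive the Ensemble message purge:

/// Hypothetical metrics table, independent of the Ensemble purge cycle
Class YourApplication.Metrics.MessageStats Extends %Persistent
{

Property CollectedAt As %TimeStamp [ InitialExpression = {$ZDateTime($ZTimeStamp, 3, 1)} ];

Property SourceConfigName As %String(MAXLEN = 128);

Property MessageCount As %Integer;

Property ElapsedSeconds As %Numeric;

Index IdxCollectedAt On CollectedAt;

}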

Hello,

It is possible, but not on the graphical level or management portal.
You would have to write ObjectScript routines and schedule them via task schedulers. 

The start task script should have something like:

Class YourApplication.ManualProductionStart Extends %SYS.Task.Definition
{

Parameter TaskName = "ManualProductionStart";

Method OnTask() As %Status
{
    Set tSC = ##class(Ens.Director).StartProduction("InsertNameOfYourProduction")
    If ( $System.Status.IsError( tSC ) ) {
        Do ##class(%SYS.System).WriteToConsoleLog( "Error in ManualProductionStart: " _ ##class(%SYSTEM.Status).GetErrorText( tSC ) )
    }
    Quit tSC
}

}

In [System > Task Manager > New Task] create a scheduled task named, for example, ManualProductionStart.
It should run daily, for example at 08:00.
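If you prefer to create the task programmatically instead of through the portal, a sketch would be something like this (namespace and values are assumptions; run it in %SYS):

// All names below are illustrative
Set task = ##class(%SYS.Task).%New()
Set task.Name = "ManualProductionStart"
Set task.NameSpace = "YOURNAMESPACE"     // namespace where the production lives
Set task.TaskClass = "YourApplication.ManualProductionStart"
Set task.TimePeriod = 0                  // 0 = daily
Set task.DailyFrequency = 0              // 0 = run once per day
Set task.DailyStartTime = 8*3600         // 08:00, in seconds after midnight
Set sc = task.%Save()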

The stop script file should have something like:

Class YourApplication.ManualProductionStop Extends %SYS.Task.Definition
{

Parameter TaskName = "ManualProductionStop";

Method OnTask() As %Status
{
    Set pTimeout = 60 // give running jobs 60 seconds to stop all transactions
    Set pForce = 0 // setting it to 1 forces the stop; not ideal, it could break integrity
    Set tSC = ##class(Ens.Director).StopProduction(pTimeout, pForce)
    If ( $System.Status.IsError( tSC ) ) {
        Do ##class(%SYS.System).WriteToConsoleLog( "Error in ManualProductionStop: " _ ##class(%SYSTEM.Status).GetErrorText( tSC ) )
    }
    Quit tSC
}

}

In [System > Task Manager > New Task] create a scheduled task named, for example, ManualProductionStop.
It should run daily, for example at 18:00.

But.

A major question: why the need to start/stop the production?
I fail to see a plausible reason to do so. An integration platform should be a 24/7 application.
It is more common for some business services/operations to have a delimited schedule, not the whole production.
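If only part of the flow needs a time window, a gentler alternative is the Schedule setting on the individual business service/operation. If I recall the format correctly, something like:

START:*-*-*T04:00:00,STOP:*-*-*T08:00:00

This keeps the production itself running 24/7 while only that host is active between 04:00 and 08:00 (server-local time).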

Hello,

I had similar issues with journal size. As in your case, it grew a lot during the day and Ensemble only cleaned it at the end of the day. It could reach over 300 GB; the disk had that amount of space available, but depending on the day it consumed all the disk space, stopping the journal and requiring manual intervention.

And as there were numerous SQL transactions that depend on BeginTransaction/CommitTransaction, in addition to mirroring, removing this data from the journal was not an option.

The solution was to adjust the size of the journal file fractions in System > Configuration > Journal Settings [Start new journal file every (MB)] to something like [YourMaxJournalSizeInMB]/256. For example, with 300 GB (307,200 MB) of journal space, that means starting a new journal file every 1,200 MB.

Create a scheduled task named, for example, ManualPurge.
It should run hourly; in my case during business hours, from 06:00 to 23:59 (so as not to disturb the backup).

The task file should have something like:

/// [2016-12-29] Marcio Dias - Creation.
Class YourApplication.Utilities.JournalTask Extends %SYS.Task.Definition
{

Parameter TaskName = "ManualPurge";

Method OnTask() As %Status
{
    Set $ZTrap = "Error", tSC = ##class(%SYSTEM.Status).OK()

    Set zNS = $NAMESPACE
    ZNSPACE "%SYS"

    Set tSC = ##class(%SYS.Journal.File).PurgeAll()
    If ( $System.Status.IsError( tSC ) ) {
        Do ##class(%SYS.System).WriteToConsoleLog( "Error in ManualPurge: " _ ##class(%SYSTEM.Status).GetErrorText( tSC ) )
    }
    // normal flow falls through to Error, which restores the namespace and returns the status
Error
    ZNSPACE zNS
    Quit tSC
}

}

The purpose of this solution is, every hour, to physically erase the individual journal files in which all open transactions are already closed and already shipped to the mirror system.
##class(%SYS.Journal.File).PurgeAll() checks and guarantees these conditions.

In my case, after deploying this solution, the maximum journal size hardly exceeded 80 GB in total.

A personal note.
I attended a job interview a couple of months ago and was asked to write a small application in Ensemble.
Instead of using BPL in graphical mode, I preferred to code directly in ObjectScript. Not old-fashioned MUMPS golf-case, but modern camelCase, JavaScript-like structured code.
I believe the evaluator didn't like ObjectScript being used that way, and I was not selected.
Still wondering what I did wrong.

Funny. The job position is still open.

Hi.

As the main objective is not to overload the server: any of the solutions presented will put some load on the production server, whether performing a backup, exporting data to a file, etc., especially if it is a repetitive, constant task with a large volume of data.

If your product license allows it, I recommend using a shadow server. Adopting a shadow server redirects the processing effort of whatever solution you adopt to another machine.
https://docs.intersystems.com/latest/csp/docbook/Doc.View.cls?KEY=GCDI_s... 

I had a similar problem on an old project.
Data could be entered as objects as well as via SQL.
In both cases there were exceptions to processing the business rules contained in the triggers. The most common was loading data imported from an external source where some mandatory values were missing, and normally this load ran alongside the normal functioning of the application.
To get around this we created control flags to be able to process this data.

Something like this:
Set ^Flag = $JOB_"|"_"Action:IgnoreBusinessRules"

On the class extending %Library.Persistent:
Class User.Sample Extends %Persistent
{

Method %OnBeforeSave(insert As %Boolean) As %Status [ Private, ServerOnly = 1 ]
{
    // Abort business rules when this job has set the flag
    If ($Get(^Flag)'="") && ($Piece($Get(^Flag),"|",1)=$JOB) && ($Piece($Get(^Flag),":",2)="IgnoreBusinessRules") Quit $$$OK
    // Past this point, the normal business rules run.
    Quit $$$OK
}

Trigger TrgBefInsert [ Event = INSERT, Time = BEFORE ]
{
    // Abort business rules when this job has set the flag
    If ($Get(^Flag)'="") && ($Piece($Get(^Flag),"|",1)=$JOB) && ($Piece($Get(^Flag),":",2)="IgnoreBusinessRules") Quit
    // Past this point, the normal business rules run.
    Quit
}

Trigger TrgBefUpdate [ Event = UPDATE, Time = BEFORE ]
{
    // Abort business rules when this job has set the flag
    If ($Get(^Flag)'="") && ($Piece($Get(^Flag),"|",1)=$JOB) && ($Piece($Get(^Flag),":",2)="IgnoreBusinessRules") Quit
    // Past this point, the normal business rules run.
    Quit
}

}
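Typical usage from the import job (a sketch; the import itself is elided):

// In the bulk-load job: bypass the business rules for this process only
Set ^Flag = $JOB_"|"_"Action:IgnoreBusinessRules"
// ... run the import here ...
Kill ^Flag  // restore normal rule processing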

I believe the best answer is: It depends.

It depends on the data architecture of the original project: whether the data structure is globals, relational SQL, or objects.

For example, in my current project I have data in objects originating from Ensemble, and metrics and monitoring data in relational tables. The native part of Ensemble is never subject to change, but the part concerning the relational tables sometimes needs editing. When that happens, I prefer to use a third-party SQL manager like DBeaver.

In another project, fully specified in Globals, the preferred way of editing large amounts of data was to write scripts in ObjectScript.

Edit globals directly only in very specific situations. And keep in mind that, if the global backs an SQL table, there is always the risk of corrupting an index.
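If you do end up editing a table's data global directly, one mitigation (a sketch, reusing the User.Sample class from the example above) is to rebuild the indices afterwards:

// Rebuild all indices of the class after direct global edits
Set sc = ##class(User.Sample).%BuildIndices()
If $System.Status.IsError(sc) Do $System.Status.DisplayError(sc)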

Hi, Yone

Did you notice that in the command w $system.SQL.DATEADD("yy",65,"1976-22-04") the date is in the format YYYY-DD-MM?
The correct format must be YYYY-MM-DD.
The command w $system.SQL.DATEADD("yy",65,"1976-04-22") returns 2041-04-22 00:00:00.

The command "w $system.SQL.DATEADD("yy",65,"19760422") " result in error, must contain the separator"-".