Thanks for your time and the many suggestions.

I have also been brainstorming what is already in the platform that I could leverage to speed up this functionality.

The approach is to loop over the extent to find the next "set record" that has the most additional numbers not previously selected.

I came up with using bit strings instead of IRIS lists at the language level.

This allows efficient bit operations via the $BITLOGIC "OR" operator.

Storing the bit strings in a computed property on record insert and update avoids recalculating a bit string from the source string each time the extent is iterated, looking for the next best record to use.
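As a minimal sketch of the idea (the values 6, 8, 10 and 12 are made up), the coverage test reduces to a bitwise OR and a count:

    // Record A covers numbers 6, 8, 10; record B covers 8, 12
    set a="",b=""
    for n=6,8,10 { set $Bit(a,n)=1 }
    for n=8,12 { set $Bit(b,n)=1 }
    // OR a candidate into the accumulated coverage and count the set bits
    set combined=$BITLOGIC(a|b)
    write $BITCOUNT(a,1),!         // 3 numbers covered by A alone
    write $BITCOUNT(combined,1),!  // 4, as B adds one new number (12)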

Finally, I wrapped this up in a class query.

Class TOOT.Data.Instrument Extends %Persistent
{

Property NoteList As %String(MAXLEN = 8000, TRUNCATE = 1);

Property NoteListEmbedding As %Embedding(MODEL = "toot-v2-config", SOURCE = "NoteList");

Property NoteListBit As %Binary [ SqlComputed, SqlComputeOnChange = NoteList ];

/// Calculates and stores a bit string during record insert or update
ClassMethod NoteListBitComputation(cols As %Library.PropertyHelper) As %Binary
{
    set bitString=""
    set numlist=cols.getfield("NoteList")
    set numsLen=$Length(numlist,",")
    for i=1:1:numsLen {
        set val=$Piece(numlist,",",i)
        // Skip values below 6, and any duplicates already recorded
        continue:val<6
        continue:$Data(found(val))
        set found(val)=""
        set $Bit(bitString,val)=1
    }
    return bitString
}

/// Callable query for getting the best records based on bit flags
Query BestNoteList(Top As %Integer = 5, Accumulate As %Boolean = 0) As %Query(ROWSPEC = "ID:%String,Count:%Integer") [ SqlProc ]
{
}
ClassMethod BestNoteListExecute(ByRef qHandle As %Binary, Top As %Integer=5, Accumulate As %Boolean=0) As %Status
{
    Set:Top<1 Top=5
    // qHandle carries: $LB(Top, row counter, accumulated bit string)
    Set qHandle=$LB(Top,0,"")
    Quit $$$OK
}

ClassMethod BestNoteListFetch(ByRef qHandle As %Binary, ByRef Row As %List, ByRef AtEnd As %Integer = 0) As %Status 
[ PlaceAfter = BestNoteListExecute ]
{
    Set qHandle=$Get(qHandle)
    If qHandle="" { Set Row="", AtEnd=1 Return $$$OK }
    Set Top=$LI(qHandle,1)
    Set Counter=$LI(qHandle,2)
    Set BitString=$LI(qHandle,3)
    Set Counter=Counter+1
    If (Counter>Top) {
        Set Row=""
        Set AtEnd=1
        quit $$$OK
    }
    Set statement=##class(%SQL.Statement).%New()
    Set tSC=statement.%PrepareClassQuery("TOOT.Data.Instrument","Extent")
    If $$$ISERR(tSC) Return tSC
    Set tResult=statement.%Execute()
    Set MaxCount=$BITCOUNT(BitString,1)
    Set MaxBitStr=""
    Set MaxId=0
    While tResult.%Next() {
        Set tmpId=tResult.%Get("ID")
        Set tmpBit=##class(TOOT.Data.Instrument).%OpenId(tmpId,0).NoteListBit
        // OR the candidate's bits into the accumulated coverage, then count
        Set tmpBit=$BITLOGIC(BitString|tmpBit)
        Set tmpCount=$BITCOUNT(tmpBit,1)
        If tmpCount>MaxCount {
            Set MaxCount=tmpCount
            Set MaxBitStr=tmpBit
            Set MaxId=tmpId
        }
    }
    Do tResult.%Close()
    If (MaxId'=0) {
        Set Row=$LB(MaxId,MaxCount)
        Set AtEnd=0
        Set $LI(qHandle,2)=Counter
        Set $LI(qHandle,3)=MaxBitStr
    } Else {
        Set Row=""
        Set $LI(qHandle,2)=Counter
        Set AtEnd=1
    }
    Return $$$OK
}

ClassMethod BestNoteListClose(ByRef qHandle As %Binary) As %Status [ PlaceAfter = BestNoteListFetch ]
{
    Set qHandle=""
    Quit $$$OK
}
}

Calling from the Management Portal:
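Since the query is tagged SqlProc, it can also be run as a table-valued function in the SQL window, something like:

    SELECT * FROM TOOT_Data.Instrument_BestNoteList(5,0)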

Here ID is the record ID and Count is the increasing coverage of bit flags with each iteration of appending a new record.


I temporarily added logging to the compute method to confirm it is not being called while the query runs.

Hi Julius, thanks for the clarifying questions.

1. Yes, I'm looking to understand how developers may approach finding specific sets, as in your example: sets 1, 2, 4.

2. No, the AllList is informative about the possible numbers available to use in the sets in a specific scenario. Following your question, maybe a generic solution would be flexible for any numbers when searching for the "minimum number of sets to include maximum distinct elements".

I appreciate that for some colleagues there are scenarios where the ideal is not achievable for all code, application config, and infrastructure config, especially where parallel work by multiple organizations operates on the same integration.
These can be operational aspects in addition to well-understood development scenarios.
Differencing can smooth the transitions, for example:
* A deployment or configuration change has occurred and the person responsible is not aware that a mirrored or DR environment also requires a coordinated parallel update. Scheduled tasks come to mind.
* An upgrade may be staggered to minimize the user downtime window, and app / infrastructure config may have planned gaps that need to be followed up and closed.
* There may be more than one organization or team responsible for managing, supporting, and deploying updates. Where communication and documentation have not been usefully shared, cross-system comparison is a good fallback to detect and comprehensively resolve the gap.
* It can help halt incompatible parallel roll-outs.
* A partial configuration deployment can be caught and reported between environments.
* Differencing can be useful between pre-upgrade and post-upgrade environments when applying root cause analysis to new LIVE problems, to quickly eliminate recent changes as suspects for a problem application behavior. This allows investigation to proceed and iterate, and avoids a solution / upgrade rollback.
Just don't feel bad about not achieving an ideal if you have been left responsible for herding cats. There are ways to help that deployment work as well.

I note the original question mentioned web production components. In the case of CSP pages, I use the Ompare facility to always compare the generated class and not the static CSP source file. This will alert to cases where a new CSP page was deployed but manual / automatic compilation did not occur, and the app is still running the old version of the code.

You can check for XML-invalid characters by decoding the encoded payload. For example:

zw $SYSTEM.Encryption.Base64Decode("VXRpbHMJOyBVdGlsaXR5IE1ldGhvZHM7MjAyMy0wOS0xMSAxNjo1Nzo0MiBBTgoJcSAKY2hrQ3RybChmaXg9MCkKCWsgXmdnRwoJcyB0PSJeUkZHIgoJcyBjdHI9MAoJdyAiUkZHIiwhCglmICBzIHQ9")
"Utils"_$c(9)_"; Utility Methods;2023-09-11 16:57:42 AN"_$c(10,9)_"q "_$c(10)_"chkCtrl(fix=0)"_$c(10,9)_"k ^ggG"_$c(10,9)_"s t=""^RFG"""_$c(10,9)_"s ctr=0"_$c(10,9)_"w ""RFG"",!"_$c(10,9)_"f  s t="

Look for a $C(?) where ? is in 0,1,2,3 .. 30,31.

Note: tab ($C(9)) and newline ($C(10) and $C(13,10)) are fine.

Sometimes cut-and-paste from an email / Word document will use $C(22) as a quote character.
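A minimal sketch of a scanner for this (the method name is hypothetical):

ClassMethod FindInvalidXmlChars(payload As %String)
{
    // Report XML-invalid control characters in a decoded payload,
    // allowing tab (9), LF (10) and CR (13)
    for i=1:1:$Length(payload) {
        set code=$Ascii(payload,i)
        if (code<32)&&(code'=9)&&(code'=10)&&(code'=13) {
            write "Found $C(",code,") at position ",i,!
        }
    }
}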

Ompare - Compare side-by-side multiple disconnected IRIS / Caché systems.
https://openexchange.intersystems.com/package/ompare
This was developed in order to compare environments on different networks. It works by profiling code and configuration across one or more namespaces, generating signatures and an optional source file to be imported into the reporting service.

The SQL Profile capability is reused to provide comparison of integration production settings across environments.
It ignores non-functional differences in code, such as blank lines and method / line-label order. This is useful where manual integration has occurred in a different order or with different comments.
It provides reporting to show side-by-side differences of the same namespaces across multiple instances.

It has been useful for assuring environment parity for upgrade sign-off.

Article - Using Ompare to compare CPF configuration and Scheduled Tasks between IRIS instances
https://community.intersystems.com/post/using-ompare-compare-cpf-configuration-and-scheduled-tasks-between-iris-instances

I created a lighter no-install version to compare changes in releases of IRIS versions.
Ompare-V8, see: https://openexchange.intersystems.com/package/Ompare-V8

I feel there could be some options. Direct access restriction can potentially be applied on the service by setting AllowedIPAddresses and / or by enforcing client-side certificates on the SSLConfig. An infrastructure firewall is also a possibility. If offloading authentication and TLS with standard requests, basic authentication at the web server configuration is also viable. REST parameters or HTTP headers could also be validated against the integration Credentials store.
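For the last option, a hedged sketch (the Credentials item name "RestClient" is hypothetical) validating an HTTP Basic auth header against Ens.Config.Credentials:

ClassMethod ValidateBasicAuth(authHeader As %String) As %Boolean
{
    quit:$Piece(authHeader," ",1)'="Basic" 0
    set decoded=$SYSTEM.Encryption.Base64Decode($Piece(authHeader," ",2))
    set user=$Piece(decoded,":",1)
    set pass=$Piece(decoded,":",2,*)
    // "RestClient" is a hypothetical item in the Credentials store
    set cred=##class(Ens.Config.Credentials).%OpenId("RestClient")
    quit:cred="" 0
    quit (cred.Username=user)&&(cred.Password=pass)
}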

Following Gertjan's suggestion: this won't survive a page refresh caused by a compile-button click or other navigation, but it does seem to jam the session door open by pinging the server every minute:

clearTimeout(zenAutoLogoutTimerID);
clearTimeout(zenAutoLogoutWarnTimerID);
var pingServer = setInterval(function() {
    var xhttp = new XMLHttpRequest();
    xhttp.open("GET", "EnsPortal.BPLEditor.zen", true);
    xhttp.send();
}, 60000);

i.e.: after opening the BPL page, launch Developer Tools and run the JavaScript commands in the Console tab.

One approach that might be suitable is to look at overriding the method OnFailureTimeout in the sub-class, which seems intended for this purpose.

I anticipate the ..FailureTimeout value will need to be greater than 0 for OnFailureTimeout to be invoked.
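A hedged sketch of that override (the class names are illustrative, and the return-value semantics should be verified against the Ens.BusinessOperation class reference for your version):

Class Demo.MyOperation Extends EnsLib.HL7.Operation.TCPOperation
{

/// Called when the FailureTimeout expires for a message.
/// Here the message is suspended for later resend rather than failed outright.
Method OnFailureTimeout(pRequest As %Library.Persistent, Output pResponse As %Library.Persistent, ByRef pSC As %Status) As %Boolean
{
    $$$LOGWARNING("FailureTimeout reached for message "_pRequest.%Id())
    Set ..SuspendMessage=1
    Quit 1
}

}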

Another area to review could be overriding OnGetReplyAction to extend the retry vocabulary.

Sometimes a sending application can pause while sending data, and this is mitigated by increasing the read timeout.

This is also indicative of the thread / process at the sending system closing the connection.

I'm curious whether this always happens with a standard configured description of the particular observation, or whether it is free text entered by a particular upstream user (e.g. copy and paste from a Word document, line breaks not escaped in HL7, ASCII 22 being used for a double quote). Is the input content incompatible with the encoding being used to transmit the message, crashing the transmit process? Reviewing the error log on the sending system could be useful.

Hi Rathinakumar,

One reason may be process contention for the same block.

Most application processes work by $Ordering forward from the top of the global down.

When doing table scans, this can result in processes waiting, or running just behind another process.

As an alternative for support and large reporting jobs, you can "$Order up" instead:

s sub=$o(^YYY(sub),-1) q:sub=""
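For example, a complete reverse-traversal loop built from that fragment:

set sub=""
for {
    set sub=$Order(^YYY(sub),-1)
    quit:sub=""
    // ... process ^YYY(sub) here
}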

I'd be interested to hear whether this mitigates any performance issues caused by contention on a busy system.

set basename="Test.Sub.Base"
set file="c:\temp\subclasses.xml"

// Get list of compiled sub-classes
do $SYSTEM.OBJ.GetDependencies(basename,.out,"/subclasses=1")
// If you didn't want the base class
kill out(basename)

// Write the classes to terminal
zwrite out

// Suffix for export
set next="" for {set next=$Order(out(next)) quit:next=""  set toExport(next_".CLS")=""}
// Fire in the hole !!
do $SYSTEM.OBJ.Export(.toExport,file,"/diffexport")

// to display different qualifier flags and values used by various OBJ methods ( /diffexport ):
do $SYSTEM.OBJ.ShowQualifiers()

Another option I have used is stunnel, on Linux variants (SUSE and Red Hat).

Here Caché / IRIS connects to a local proxy, which then connects via TLS to the LDAP service.
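A hypothetical stunnel client section for this (the host and ports are placeholders):

[ldap-tls]
client = yes
accept = 127.0.0.1:1389
connect = ldap.example.com:636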

Note: if running the proxy in a process jail, you may find it can't re-resolve DNS after being started, i.e. the DNS lookup happens only once at startup. One approach is a mini-service script that monitors the DNS-to-IP resolution periodically and auto-restarts the stunnel proxy when it changes. One advantage: if the DNS resolution service is temporarily unavailable, the running proxy carries on using the previously resolved IP address.