How about using the Work Queue Manager framework in conjunction with a BS?

Pick up a message but do not delete it before we start processing, since we are not going to record anywhere permanently that we are working on it.
Check whether the message is already queued in a current-lifetime temp list.
If already queued, skip it.
If not already queued:
    mark it as in progress for the duration of this container lifetime;
    queue it up for the Work Queue Manager to process;
    use callbacks to signal completion and delete the message, and perhaps also hand back the answer.
We would then also get auto-scaling via the WQM framework, if I am not mistaken; a sketch follows below.
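As a minimal sketch, assuming the %SYSTEM.WorkMgr API (Initialize()/QueueCallback()/WaitForComplete()). The class My.Feed, its helper methods GetNextMessageId()/ProcessOne()/OnDone()/DeleteMessage(), and the ^IRIS.Temp.MyFeedInProgress scratch global (lives in IRISTEMP, so it does not survive a restart, matching the "container lifetime" idea) are all hypothetical names for illustration:

Class My.Feed [ Abstract ]
{

ClassMethod PumpFeed() As %Status
{
    Set tSC = $$$OK
    // Create a work queue; the framework decides how many worker jobs to use
    Set tQueue = $system.WorkMgr.Initialize(,.tSC)
    Quit:$$$ISERR(tSC) tSC
    // Peek at the next message WITHOUT deleting it from the feed
    Set tMsgId = ..GetNextMessageId()
    While (tMsgId '= "") {
        // Skip if already queued during this container lifetime
        If '$Data(^IRIS.Temp.MyFeedInProgress(tMsgId)) {
            // Mark as in progress for the duration of this container lifetime
            Set ^IRIS.Temp.MyFeedInProgress(tMsgId) = $Horolog
            // Queue the work; OnDone() is called back with the same argument
            // once ProcessOne() completes
            Set tSC = tQueue.QueueCallback("##class(My.Feed).ProcessOne","##class(My.Feed).OnDone",tMsgId)
            Quit:$$$ISERR(tSC)
        }
        Set tMsgId = ..GetNextMessageId()
    }
    If $$$ISERR(tSC) Quit tSC
    // Wait for all queued work items (and their callbacks) to finish
    Quit tQueue.WaitForComplete()
}

ClassMethod GetNextMessageId() As %String
{
    // Hypothetical helper: return the id of the next not-yet-seen message
    // in the inbound feed, or "" when there is none
    Quit ""
}

ClassMethod ProcessOne(pMsgId As %String) As %Status
{
    // Hypothetical worker: do the actual processing of one message here
    Quit $$$OK
}

ClassMethod OnDone(pMsgId As %String) As %Status
{
    // Callback: processing finished, so only now delete the message from
    // the feed and clear the in-progress marker
    Do ..DeleteMessage(pMsgId)
    Kill ^IRIS.Temp.MyFeedInProgress(pMsgId)
    Quit $$$OK
}

ClassMethod DeleteMessage(pMsgId As %String)
{
    // Hypothetical helper: permanently remove the message from the feed
    Quit
}

}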

It is possible for a BS to have more than one job; the issue becomes how to control multi-process access to the inbound feed in an orderly manner. Hence, as you asked, it is important to know how the BS is pulling data.

Eduard, is the rate at which one BS can process known, or is it variable based on the data unit to be processed? Similarly, is the rate of arrival known or possible to detect? If these two rates can be known, they could be used to control the BS count rather than CPU (a tight-looping BS looking for work could skew a CPU-based test, for example).

Is the design to have all the processing of a data unit in the BS rather than passing it to a BP/BO?

James

NB: there is a variation on the meaning of pool size for a BS using a TCP-based listening adapter: there can only be one listener, but when Job Per Connection is true the pool size setting is used as the maximum allowed number of concurrent connections created by the listener.

Perhaps you can also limit calling Shouldxxx() based on the update-production timeout (for example) rather than checking every request in the collection.

James

Hope this is resolved by now, but one needs to double the quotes within the pattern since it needs to be a single string:

Matches(tAlias, "1(.E1""SITE A"",.E1""SITE B"")")

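To see why the doubling is needed: the pattern itself is an ordinary ObjectScript pattern, and the quotes are only doubled because the whole pattern is passed to Matches() inside a single quoted string. A small illustration in routine/method code, using pattern indirection (tAlias and its value are assumptions):

Set tAlias = "SITE A"
// Pattern written directly after the ? operator: inner quotes are NOT doubled
Write tAlias ? 1(.E1"SITE A",.E1"SITE B")
// The same pattern held in a string (as Matches() receives it): quotes ARE doubled
Set pat = "1(.E1""SITE A"",.E1""SITE B"")"
Write tAlias ? @pat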

James

Hi Grace

I defer to those with experience in the field to offer comparative advice, but concerning the Production Export functionality:

For existing environments it is possible to use Export for deployment from the Production configuration page for one or more items rather than the whole production. This allows changing existing items or adding items to a production when deploying the export file. When changing an item, the deployment code will first disable the item if it is enabled and then re-enable it as necessary after the deployment has finished.

It is possible to remove items using the deployment functionality, but for this one needs to use the Ens.Deployment.Util APIs rather than the GUI to create the removal deployment package.
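As a rough sketch of the programmatic route on the target system, going from memory of the Ens.Deployment class reference; please verify the exact signature for your version, and note that every argument value here (file paths, production name) is an assumption:

// Deploy a previously created export package on the target system
Set tSC = ##class(Ens.Deployment.Deploy).DeployCode("/data/ExportPackage.xml","MyProduction","/data/Rollback.xml","/data/DeployLog.txt")
If $System.Status.IsError(tSC) Write $System.Status.GetErrorText(tSC)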

Concerning settings: if you haven't seen it, perhaps System Default Settings might be appropriate (https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY...).

Some sites use System Default Settings as an alternative way to share the same values without having to enter them per item; hence the option in the Export for deployment dialog to export deployable System Default Settings.

The Export from the Production configuration page attempts to identify linked items/code but might not be complete. This is the reason for being able to add Studio project contents to the export. In version 2017.2.0 we added detection of RecordMap classes to be included in the export.

Best wishes

James
 

See the class method EnsLib.Testing.Service::SendTestRequest(), which allows one to issue a test request from the terminal without having to add another component.

classmethod SendTestRequest(pTarget As %String, pRequest As Ens.Request, ByRef pResponse As Ens.Response, Output pSessionId As %String, pGetReply As %Boolean = 0) as %Status

    Send a test request to the specified target host (BusinessProcess or BusinessOperation). 
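For example, from a terminal in the production's namespace, assuming the production is running. "MyTargetOperation" is a hypothetical configured host name, and Ens.StringRequest is just a convenient stock request class:

Set tReq = ##class(Ens.StringRequest).%New()
Set tReq.StringValue = "ping"
// pGetReply=1 asks for the response to be returned in tResp
Set tSC = ##class(EnsLib.Testing.Service).SendTestRequest("MyTargetOperation", tReq, .tResp, .tSessionId, 1)
If $System.Status.IsError(tSC) Write $System.Status.GetErrorText(tSC)
Write "Session: ", tSessionId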

Hi Amir

This issue is addressed in 2016.1.0:

W x.GetValueAt("MSH.MSH~12.VID~1")
2.5
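For anyone wanting to reproduce this from a terminal, assuming tHL7 holds a raw HL7 v2 message string you have on hand:

Set x = ##class(EnsLib.HL7.Message).ImportFromString(tHL7, .tSC)
If $System.Status.IsError(tSC) Write $System.Status.GetErrorText(tSC)
Write x.GetValueAt("MSH.MSH~12.VID~1")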

Best wishes

James