That's why I use this code in my projects:

    set stream=##class(%Stream.TmpCharacter).%New()
    // serialize the dynamic object into a temporary character stream
    do Result.$toJSON(stream)
    
    // read the stream back in chunks and write it to the current device
    while 'stream.AtEnd {
        write stream.Read()
    }

and write the output this way, instead of the simpler

do stream.OutputToDevice()

because sometimes I got an unreadable response that way; I don't know why, maybe because of gzip.

There are several ways to retrieve this value.

set tSC=##class(EnsPortal.Utils).ItemSettings("Production.Name||Item.Name",.settings,.colNames)

In this case the settings array will contain entries like this one:

$lb("Core","PoolSize",1,1,"",3,"Number of jobs to start for this config item. <br>Default value: <br>0 for Business Processes (i.e. use shared Actor Pool) <br>1 for FIFO message router Business Processes (i.e. use a dedicated job) <br>1 for Business Operations <br>0 for adapterless Business Services <br>1 for others <br>For TCP based Services with JobPerConnection=1, this value is used to limit the number of connection jobs if its value is greater than 1. A value of 0 or 1 places no limit on the number of connection jobs.","%Library.Integer","","0","","","",0,"Pool Size","Additional","Additional Settings","")

Or just open the item and read the property directly:

if ##class(Ens.Config.Item).NameExists("UT.Client.GPK.Production","RS.Transformation",.id) {
  set item=##class(Ens.Config.Item).%OpenId(id)
  write item.PoolSize
}

Caché has a very useful feature called Mappings; in your case you should look at Global Mappings.

Here is how it can help. With a mapping you can split the globals that store your tables.
For example, you have the table Sample.Person, with the globals ^Sample.PersonD for data and ^Sample.PersonI for indices.
You may then create new databases such as PERSONDATA1, PERSONDATA2 and PERSONINDEX.

And create mappings like the ones shown in the picture:

In this case we send the first 10,000 rows to the database PERSONDATA1, and all subsequent data to PERSONDATA2. When you decide to split again, just create a new database and edit the mappings, as in the sketch below.
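The same mappings can also be created programmatically. Here is a rough sketch from the %SYS namespace using the Config.MapGlobals class; the namespace name "USER", the database names, and the exact subscript-range syntax are illustrative assumptions, so check the Config.MapGlobals class reference for your version:

new $namespace
set $namespace="%SYS"

// IDs 1..10000 of the data global go to PERSONDATA1 (subscript-range mapping, assumed syntax)
set props("Database")="PERSONDATA1"
set sc=##class(Config.MapGlobals).Create("USER","Sample.PersonD(1):(10000)",.props)

// the rest of the data global goes to PERSONDATA2
set props("Database")="PERSONDATA2"
set sc=##class(Config.MapGlobals).Create("USER","Sample.PersonD",.props)

// the index global lives in its own database
set props("Database")="PERSONINDEX"
set sc=##class(Config.MapGlobals).Create("USER","Sample.PersonI",.props)

The databases themselves must already exist before the mappings are created.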

And you can place each database wherever you want.

But in future releases we expect to see a new feature: Distributed SQL.

In any case it is possible to check how the global buffers are being used with the very useful ^GLOBUFF tool. If you see that about 80-90% of the buffers are already in use, it means you should increase the global buffer size.

%SYS>d ^GLOBUFF

Find which globals are using the most buffers.

Display the top <25>:

Total buffers: 32768      Buffers in use: 2163     PPG buffers: 23 (1.063%)

Item  Global                             Database          Percentage (Count)
1     rOBJ                               CACHESYS           37.540 (812)
2     rOBJ                               CACHELIB           27.323 (591)
3     oddCOM                             CACHELIB           9.154 (198)
4     oddDEF                             CACHELIB           7.258 (157)
5     %qCacheMsg                         CACHELIB           4.438 (96)
6     oddCOM                             CACHESYS           3.375 (73)
7     SYS                                CACHESYS           2.681 (58)
8     ROUTINE                            CACHESYS           0.693 (15)
9     ISC.Monitor.AlertD                 CACHESYS           0.509 (11)
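
If the buffers really are nearly full, the size can be changed from the %SYS namespace. Here is a sketch using the Config.config class; the globals8kb property (buffer pool in MB for 8 KB databases) is what I would expect here, but check the class reference for your version, and note that a restart is normally needed for the new size to take effect:

%SYS>set sc=##class(Config.config).Get(.props)

%SYS>write props("globals8kb")

%SYS>set props("globals8kb")=1024

%SYS>set sc=##class(Config.config).Modify(.props)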

Thanks for the reply, Luca. Of course, I know that I can write my own script for it, and your ccontainermain is a very good solution. It works quite well, and with some modifications of mine it is more informative now. I hope that I can help improve this tool.
Now I am going to write an article (in Russian for now) about Docker+Caché+HAProxy, not as a microservice but with ECP: my containers will work as ECP applications that register themselves with HAProxy. And on GitHub I have posted some questions that I faced while working on it.

I use code like this in one of my projects:

ClassMethod Test()
{
    set address="One Memorial Drive, Cambridge, MA 02142, USA"
    do ..GetCoords(address, .latitude, .longitude, .partialMatch)
    zw latitude, longitude, partialMatch
}

ClassMethod GetCoords(Address As %String, Output Latitude As %Float = "", Output Longitude As %Float = "", Output PartialMatch As %Boolean) As %Status
{
    set params("address")=Address
    set sc=..CallGoogleAPI("/maps/api/geocode/json", .params, .data)
    if $$$ISERR(sc) quit sc
    
    if data.status="OK" {
        set result=data.results.$get(0)
        set PartialMatch = +result."partial_match"
        set Latitude = result.geometry.location.lat
        set Longitude = result.geometry.location.lng
    }
    quit $$$OK
}

ClassMethod CallGoogleAPI(Url As %String, ByRef Params, Output Data) As %Status
{
    #;set Params("key")="your api key here"
    quit ..CallApi("maps.googleapis.com", 1, Url, .Params, .Data)
}

ClassMethod CallApi(Server As %String, Secure As %Boolean = 1, Url As %String, ByRef Params, Output Data) As %Status
{
    set ht=##class(%Net.HttpRequest).%New()
    set ht.Server="maps.googleapis.com"
    set ht.Https=Secure
    set:Secure ht.SSLConfiguration=..GetSSLCertificate(Server)
    
    set param=""
    for {
        set param=$order(Params(param),1,value)
        quit:param=""
        do ht.SetParam(param, value)
    }
    
    set sc=ht.Get(Url)
    if $$$ISERR(sc) quit sc
    
    set Data={}
    set Data=Data.$fromJSON(ht.HttpResponse.Data)
    
    quit $$$OK
}

ClassMethod GetSSLCertificate(Server As %String) As %String
{
    new $namespace
    znspace "%SYS"
    do {
        quit:##class(Security.SSLConfigs).Exists(Server)
        
        set tSC=##class(Security.SSLConfigs).Create(Server)
        $$$ThrowOnError(tSC)
    } while 0
    quit Server
}

Luca, may I ask what exactly it means that InterSystems supports Docker now? I have not found anything about Docker in the documentation for version 2016.1. Actually, I see that you already used Docker with version 2015, and with 2016 I still have to use ccontainermain. It would be better if we could just use ccontrol for the same functionality that ccontainermain currently provides.