As Alexey said, the issue could be with the code, not ECP (ECP is usually backward compatible).
Make sure that code on the app. servers is not mapped (no routine or package mappings) to a remote DB on the database server - all code should be local to the app. server.

If your 3 app. servers are identical (for HA/LB reasons), and you can disable access to 1 app. server while still working with the other 2, that gives you the option to upgrade the app. servers one by one without downtime.

Another issue to consider is cached queries - on such upgrades it's good to "un-freeze" all cached queries to make sure they are all recompiled.
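As a hedged sketch (assuming you simply want the cached queries rebuilt after the upgrade, rather than managing frozen plans one by one), you can purge the cached queries in a namespace so they are recompiled on first use:

    // Run in the application namespace: purge all cached queries there,
    // so each one is rebuilt and recompiled the first time it runs again.
    DO $SYSTEM.SQL.Purge()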
 

Hello Norman,

1. Build a sync mirror with the server; then you can "set no failover" on the primary and "play" with the backup. You can take IRIS down at any time on the backup, and it will re-sync when it's up again.

2. O/S defragmentation - either with an O/S tool, or by copying the DB to a new, clean disk.

3. Internal defragmentation - there are some options:
A) Copy all data with GBLOCKCOPY to a new DB, with 1 process, to keep the data as sequential as possible (will take a lot of time)
B) Use the internal defragmentation tool (will need 2x the space of that DB), then compact globals

4. In general, it's good practice to split indices into a different DB (and use global mapping), since the data ("D" globals) usually does not get so fragmented over time. This will also help with future defragmentation, compacting globals, DB "shrink" and other maintenance, since there is less volume to work on.
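As a minimal, hedged sketch (the namespace, global and database names below are examples, not from the original post), an index global can be mapped to a separate database programmatically from the %SYS namespace:

    // Map the index global ^MyApp.PersonI of namespace MYAPP to a separate
    // database called MYAPPINDEX (all names here are examples).
    NEW $NAMESPACE
    SET $NAMESPACE = "%SYS"
    SET props("Database") = "MYAPPINDEX"
    SET sc = ##class(Config.MapGlobals).Create("MYAPP", "MyApp.PersonI", .props)
    IF $SYSTEM.Status.IsError(sc) { DO $SYSTEM.Status.DisplayError(sc) }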

Hello Alex,

I would suggest using MONLBL with all measures, to better understand any "bottlenecks" in code, memory issues or global access. In parallel, monitor the VM to see if you hit CPU, disk I/O or memory thresholds there.
It might be that there are some "internal" functions that can be optimized or rewritten differently to speed up the code.
I recommend that you open a WRC case, since they have good tools to analyze almost every aspect of IRIS.
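For reference, the line-by-line monitor is started from a terminal in the namespace where the code runs; the utility itself is interactive and lets you choose which routines and metrics to collect:

    // Start the interactive line-by-line monitor (MONLBL)
    DO ^%SYS.MONLBL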

The InterSystems UDP implementation (e.g. EnsLib.UDP.OutboundAdapter) assumes that any stream is smaller than the MTU size: it uses stream.OutputToDevice(), so it will not work with streams larger than the MTU.

When splitting into parts, consider that packets can arrive in random order, so you should "collect" them at the receiver, then rebuild the incoming message in the correct order once you know you have them all.
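A minimal sketch of the sending side (the method name, chunk size and sequence-number prefix are assumptions for illustration, not part of EnsLib.UDP.OutboundAdapter; place it in a class of yours):

    /// Split a stream into numbered chunks no larger than an assumed MTU payload,
    /// so the receiver can collect and reassemble them in order.
    ClassMethod SplitStream(pStream As %Stream.Object, pChunkSize As %Integer = 1400) As %Status
    {
        Set tSeq = 0
        Do pStream.Rewind()
        While 'pStream.AtEnd {
            Set tSeq = tSeq + 1
            // Prefix each part with its sequence number (adds a few bytes on top of pChunkSize)
            Set tPart = tSeq_"|"_pStream.Read(pChunkSize)
            // ... send tPart with your UDP operation here ...
        }
        Quit $$$OK
    }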

Some additional notes:

1. If you have many changes (variations) in query structure (e.g. using different column names, or different conditions in the WHERE clause), dynamic SQL is the choice. However, if your queries do not change in structure, only in parameters, then embedded SQL will be a better choice (e.g. SELECT ID FROM Ref.Table WHERE Name = 'name').

2. Embedded SQL builds its query ONCE, when you compile; dynamic SQL builds and compiles a cached query every time your SQL structure changes.

3. The speed of dynamic SQL will be identical to embedded SQL (over time), since most of the possible SQL combinations will have cached queries in place - each time you compile a class/table, its cached queries are purged (so expect a slight degradation after releases or changes to a class/table).

4. In cases where you could use embedded SQL, consider giving your client access to a VIEW or an SP instead of running SQL against the original table. This way, changes you make to the class/table will not affect the client.

5. As mentioned, security is very important to consider: if you intend to let the client send and execute (dynamically) any SQL, try to limit this to a specific table and sanitize the input to avoid potential "SQL injection" (using an SP with parameters is a good way to secure your backend; see the sketch below).
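A minimal sketch of parameterized dynamic SQL with %SQL.Statement (the table and column names are examples): passing the client value as a "?" parameter, instead of concatenating it into the SQL text, avoids SQL injection:

    SET stmt = ##class(%SQL.Statement).%New()
    SET sc = stmt.%Prepare("SELECT ID FROM Ref.Table WHERE Name = ?")
    IF $SYSTEM.Status.IsError(sc) { DO $SYSTEM.Status.DisplayError(sc) QUIT }
    SET result = stmt.%Execute(name)   // "name" is the value received from the client
    WHILE result.%Next() {
        WRITE result.%Get("ID"), !
    }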

Sets to a global, with or without a post-conditional (PC), single or multiple assignments on one SET command, or done via XECUTE, are relatively easy to find. The issue comes with indirection (@) - for that I recommend writing code that will do the searches (a real cross-reference). Over time, this code can be improved.

Another option is using Visual Studio Code, where you have the option to search with regex, which will let you find most of the easy places.

I recommend using Visual Studio Code (VS Code), where you can search with regex. Consider also matching 0-n whitespace characters, to account for spaces, tabs, etc.
For example, a reference to a global could be:
set ^global( ... )= 
s ^global( ... )=
s:cond ^global( ... )=

If the set is combined with other assignments on the same command, then you should also search for the comma (,), e.g.
set var=something,^global( ... )=
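Putting these patterns together, a hedged example of a regex for the VS Code search box (assuming the global is literally named ^global, and with case-insensitive matching enabled) that covers SET/S, an optional post-conditional and earlier assignments on the same command:

    \bs(et)?(:\S+)?\s+([^=\n]+=[^,\n]*,)*\s*\^global\s*\(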
 

However, use of indirection is very complex to find... (you need to skip 0-n of any characters, including new lines, between the SET and the use of @).

1. This sounds like a perfect candidate for the InterSystems Ideas portal: being able to do searches inside streams.
2. Another option: you could use request/response messages that store the data in normal properties (e.g. extending Ens.Request or Ens.Response), and convert those properties back to JSON or a (compressed) stream at the BS (Business Service) level (after the response), so all your messages inside Ensemble would be searchable.
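A minimal sketch of such a message class (the class and property names are examples, not from the original post):

    /// Request message with plain, searchable properties instead of a stream
    Class MyApp.Msg.PatientRequest Extends Ens.Request
    {
    Property PatientId As %String;
    Property Payload As %String(MAXLEN = "");
    }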

Hello Caio,

There is no DECLARE @variable (SQL Server) or DECLARE variable (Oracle) in Caché, but there are a few options:

1. Use host variable(s) in embedded SQL:

SET variable = 2000
&SQL(SELECT Column INTO :result FROM Table WHERE ID = :variable)

2. Use the same with Dynamic SQL

SET variable = 2000
SET sql = "SELECT Column FROM Table WHERE ID = ?"
SET result = ##class(%SQL.Statement).%ExecDirect(, sql, variable)
DO result.%Display()

3. Write an SP:

CREATE PROCEDURE Procedure(IN variable INT)
BEGIN
    SELECT Column FROM Table WHERE ID = :variable;
END
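The procedure can then be invoked from SQL (or via ODBC/JDBC), for example:

    CALL Procedure(2000)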