Use args... to supply a variable number of parameters:

ClassMethod DoCleverStuff(args...) As %Status [ CodeMode = objectgenerator ]
{
    do %code.WriteLine("    For i=1:1:args {")
    do %code.WriteLine("        Write args(i)")
    do %code.WriteLine("    }")
    do %code.WriteLine("    Set tSC = $$$OK")
    ...
}

Can I generate a whole method at compile time?

You can use projections to do that. Here's an example.
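A minimal sketch of the idea (class, parameter, and method names are illustrative, not the linked example): a projection class whose CreateProjection hook adds a method definition to a target class and recompiles it whenever the projecting class is compiled.

```objectscript
Class User.AddMethodProjection Extends %Projection.AbstractProjection
{

/// Runs at compile time of any class that declares this projection,
/// e.g.: Projection AddMethod As User.AddMethodProjection(TARGET = "User.Target");
ClassMethod CreateProjection(classname As %String, ByRef parameters As %String, modified As %String, qstruct) As %Status
{
    // Target class to modify is passed as a projection parameter (illustrative)
    set target = $get(parameters("TARGET"), "User.Target")
    set method = ##class(%Dictionary.MethodDefinition).%New()
    set method.parent = ##class(%Dictionary.ClassDefinition).%OpenId(target)
    set method.Name = "GeneratedHello"
    set method.ReturnType = "%String"
    do method.Implementation.WriteLine(" quit ""Hello from a generated method""")
    set sc = method.%Save()
    quit:$$$ISERR(sc) sc
    // Recompile the target so the new method becomes callable
    quit $system.OBJ.Compile(target, "ck")
}

}
```

In practice you would modify a class other than the one being compiled, as above, since the projecting class is mid-compilation when the hook fires.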

Assuming you have a sync mirror established, adding a new database to the mirror is as simple as:

  1. Create DB on Primary.
  2. Run SYS.Mirror:AddDatabase. It returns a %Status; check that it is OK with $$$ISOK(sc), which should equal 1.
  3. Dismount database on Primary (using SYS.Database:DismountDatabase) OR Freeze IRIS (Backup.General:ExternalFreeze).
  4. Copy IRIS.DAT to Backup.
  5. Mount database on Primary (using SYS.Database:MountDatabase) OR Thaw IRIS (Backup.General:ExternalThaw).
  6. Mount database on Backup.
  7. Activate database on Backup (SYS.Mirror:ActivateMirroredDatabase).
  8. Catchup database on Backup (SYS.Mirror:CatchupDB).

Note that some methods accept a database name, most accept a database directory, and others a database SFN, so check the signature of each method you call.
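The steps above can be sketched roughly as follows. This is a hedged outline, not a tested script: the directory and names are illustrative, and the exact argument shapes (name vs. directory vs. SFN) must be verified against the class reference for your version.

```objectscript
// On the Primary, in the %SYS namespace
set sc = ##class(SYS.Mirror).AddDatabase("/db/mydb/")          // arguments assumed; check the class reference
write $$$ISOK(sc),!                                            // 1 means OK
set sc = ##class(SYS.Database).DismountDatabase("/db/mydb/")   // or Backup.General:ExternalFreeze
// ... copy IRIS.DAT to the Backup member ...
set sc = ##class(SYS.Database).MountDatabase("/db/mydb/")      // or Backup.General:ExternalThaw

// On the Backup, in the %SYS namespace
set sc = ##class(SYS.Database).MountDatabase("/db/mydb/")
set sc = ##class(SYS.Mirror).ActivateMirroredDatabase("/db/mydb/")
set sc = ##class(SYS.Mirror).CatchupDB("/db/mydb/")
```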

A database is a physical file containing globals.

A namespace is a logical combination of one or more databases.

By default, a namespace has two databases:

  • GLOBALS - default location for globals
  • ROUTINES - default location for code (classes/routines/includes)

In addition to that, a namespace can have any number of mappings. A mapping exposes specified globals or code from a specified database.

When you try to access a global, the global mappings are searched first; if no mapping matches, the GLOBALS database is used.

When you try to access some code, the package/routine mappings are searched first; if no mapping matches, the ROUTINES database is used.

To split data into a separate DB:

  1. Create target DB.
  2. Make the source DB read-only (for example, by forcefully logging users out and/or marking the database as read-only).
  3. Copy data/code from the source to target DB (for globals use ^GBLOCKCOPY).
  4. Create mapping from the target DB to your namespace.
  5. Confirm that data is present.
  6. Delete data from a source DB.
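Step 4 can be done programmatically with Config.MapGlobals. A hedged sketch (run in %SYS; the namespace, global, and database names are illustrative):

```objectscript
// Map global ^MyData in namespace USER to database TARGETDB
new $namespace
set $namespace = "%SYS"
kill props
set props("Database") = "TARGETDB"
set sc = ##class(Config.MapGlobals).Create("USER", "MyData", .props)
if '$$$ISOK(sc) do $system.Status.DisplayError(sc)
```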

I would not recommend a regexp for that. If you have one place with such a date, you can use a transient/computed property pair:

Class User.JSON Extends (%RegisteredObject, %JSON.Adaptor)
{

Property tsjson As %String(%JSONFIELDNAME = "ts") [ Transient ];

Property ts As %TimeStamp(%JSONINCLUDE = "none") [ SqlComputeCode = {set {*}=$replace({tsjson}," ", "T")_"Z"}, SqlComputed ];

/// d ##class(User.JSON).Test()
ClassMethod Test()
{
	set json = {"ts":"2022-02-02 01:01:34"}
	set obj = ..%New()
	zw obj.%JSONImport(json)
	w "ts:" _ obj.ts
}

}

If you have a lot of JSON properties, use a custom datatype to do the conversion automatically:

Class User.JSONTS Extends %Library.TimeStamp
{

ClassMethod IsValidDT(%val As %RawString) As %Status
{
    // TODO: replace this stub with a real validation check
    q $$$OK
}

/// Converts the Objectscript value to the JSON number value.
ClassMethod JSONToLogical(%val As %Decimal) As %String [ CodeMode = generator, ServerOnly = 1 ]
{
    // generates: $replace(%val," ","T")_"Z"
    If ($$$getClassType(%class)=$$$cCLASSCLASSTYPEDATATYPE) || $$$comMemberKeyGet(%class,$$$cCLASSparameter,"%JSONENABLED",$$$cPARAMdefault) {
        Set %codemode=$$$cMETHCODEMODEEXPRESSION
        Set %code="$replace(%val,"" "", ""T"")_""Z"""
    } Else {
        Set %code=0 
    }
    Quit $$$OK
}

}

And use it instead of the standard timestamp:

Class User.JSON Extends (%RegisteredObject, %JSON.Adaptor)
{

Property ts As User.JSONTS;

/// d ##class(User.JSON).Test()
ClassMethod Test()
{
	set json = {"ts":"2022-02-02 01:01:34"}
	set obj = ..%New()
	zw obj.%JSONImport(json)
	w "ts:" _ obj.ts
}

}

You can do it like this:

1. Get active production:

w ##class(EnsPortal.Utils).GetCurrentProductionName()

2. Get a list of all BSes in an active production:

SELECT i.Name
FROM Ens_Config.Item i
JOIN %Dictionary.ClassDefinition_SubclassOf('Ens.BusinessService') c ON c.Name = i.ClassName
WHERE i.Production = ?
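The query result can be collected into the bhList used in the next step; a hedged sketch (assumes the SQL above):

```objectscript
// Collect the BS names of the active production into a $lb list
set prod = ##class(EnsPortal.Utils).GetCurrentProductionName()
set sql = "SELECT i.Name FROM Ens_Config.Item i " _
          "JOIN %Dictionary.ClassDefinition_SubclassOf('Ens.BusinessService') c " _
          "ON c.Name = i.ClassName WHERE i.Production = ?"
set rs = ##class(%SQL.Statement).%ExecDirect(, sql, prod)
set bhList = ""
while rs.%Next() { set bhList = bhList _ $lb(rs.%Get("Name")) }
```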

3. Disable all BSes (info)

// stop=1 temporarily stops an item, stop=0 restarts it;
// to only disable the BSes, run just the stop=1 pass
for stop = 1, 0 {
  for i=1:1:$ll(bhList) {  // bhList: $lb list of BS names from step 2
    set host = $lg(bhList, i)
    set sc = ##class(Ens.Director).TempStopConfigItem(host, stop, 0)
  }
  set sc = ##class(Ens.Director).UpdateProduction()
}

4. Wait for all queues to empty:

SELECT TOP 1 1 AS "Processing"
FROM Ens.Queue_Enumerate()
WHERE "Count">0
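A simple way to wait in code is to poll that query until it returns no rows. A sketch (the one-second sleep interval is arbitrary):

```objectscript
// Block until all Ensemble queues are empty
for {
    set rs = ##class(%SQL.Statement).%ExecDirect(,
        "SELECT TOP 1 1 AS ""Processing"" FROM Ens.Queue_Enumerate() WHERE ""Count"">0")
    quit:'rs.%Next()  // no rows: every queue is empty
    hang 1            // wait a second and re-check
}
```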

5. Check that there are no active async BPs (extent size of all BPs must be 0 - here's an example)

6. Stop the production.

w ##class(Ens.Director).StopProduction()

After that and assuming deferred sending is not used (docs) it would be guaranteed that there are no in-flight messages.

Globals are cached in the global buffer, and you can take advantage of that.

  1. Create a new database named CACHED with a distinct block size (16, 32, or 64 KB).
  2. In your global buffer settings, set the global buffer for that block size to the amount of memory you want to allocate to the cache.
  3. Map the cache global into the CACHED database.

This will give you an in-memory LRU cache. If you also follow @Dmitry Maslennikov's suggestion and use PPGs (process-private globals), nothing will be persisted. Otherwise you'll need to invalidate the persisted cache manually or with a task.
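With the mapping in place, the cache is just ordinary global access. A minimal sketch (the global name is illustrative, and ComputeValue is a hypothetical expensive helper):

```objectscript
/// ^Cache is mapped to the CACHED database; use ^||Cache (a process-private
/// global) instead if nothing should survive the process.
ClassMethod GetCached(key As %String) As %String
{
    if '$data(^Cache(key)) {
        set ^Cache(key) = ..ComputeValue(key)  // hypothetical expensive computation
    }
    quit ^Cache(key)
}
```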

Here's the code I use (by @Dmitry Zasypkin):

/// Canonicalize XML.
/// in: XML string or stream to canonicalize.
/// out: canonicalized XML is returned in this argument. If it's a string, out must be passed by reference.
/// elementId: Id attribute of the element to canonicalize. If elementId="", the entire document is canonicalized.
/// prefixList: a local array of prefix=namespace pairs to add to the root tag; used only for exclusive canonicalization.
ClassMethod canonicalize(in As %Stream.Object, ByRef out As %Stream.Object, isInclusive As %Boolean = {$$$NO}, keepWhitespace = {$$$YES}, elementId As %String = "", ByRef prefixList As %String = "", writer As %XML.Writer = {##class(%XML.Writer).%New()}) As %Status
{
	#dim sc As %Status = $$$OK
	
	#dim importHandler As %XML.Document = ##class(%XML.Document).%New()
	set importHandler.KeepWhitespace = keepWhitespace
	
	if $isObject(in)
	{
		set sc = ##class(%XML.SAX.Parser).ParseStream(in, importHandler,, $$$SAXFULLDEFAULT-$$$SAXVALIDATIONSCHEMA)
	}
	else
	{
		set sc = ##class(%XML.SAX.Parser).ParseString(in, importHandler,, $$$SAXFULLDEFAULT-$$$SAXVALIDATIONSCHEMA)
	}
	if $$$ISERR(sc) quit sc
	
	if $isObject(in) && $isObject($get(out)) && (in = out) do in.Clear()
	
	if $isObject($get(out))
	{
		set sc = writer.OutputToStream(out)
	}
	else
	{
		set sc = writer.OutputToString()
	}
	if $$$ISERR(sc) quit sc
	
	#dim node As %XML.Node = importHandler.GetDocumentElement()
	if (elementId '= "") set node = importHandler.GetNode(importHandler.GetNodeById(elementId))

	// Main part
	if isInclusive
	{
		set sc = writer.Canonicalize(node, "c14n")
	}
	else
	{
		if (+$data(prefixList) >= 10)
		{
			#dim prefix As %String = ""
			for 
			{
				set prefix = $order(prefixList(prefix))
				if (prefix = "") quit
				do writer.AddNamespace(prefixList(prefix), prefix)
			}	
		}

		set sc = writer.Canonicalize(node)
	}
	if $$$ISERR(sc) quit sc
	
	if '$isObject($get(out))
	{
		set out = writer.GetXMLString(.sc)
		if $$$ISERR(sc) quit sc
	}
	
	do writer.Reset()
	
	quit $$$OK
}

If you have a web application /csp/SomeApp and users need to log in to access it, it is enough to go to the /csp/SomeApp web application configuration page and set Serve Files to Use InterSystems Security to get the effect you want. After making this change, users will not be able to access /csp/SomeApp/image.png without logging into your application first.

That depends on the precision you need.

1. If you need just close enough you can do this:

  • Check how much time, on average, the BS takes to run. Let's say X seconds.
  • Set the Call Interval on your BS to 86400-X.
  • Start the BS at 10:00 AM.
  • Assuming the average runtime stays constant, it should work well enough.

2. If you need to run your BS at exactly 10:00 AM, use this task to achieve that.

Something like this:

#dim results As EnsLib.LDAP.Message.Results
for i=1:1:results.Results.Count() {
	#dim result As EnsLib.LDAP.Message.Result
	set result = results.Results.GetAt(i)
	write "DN: ", result.DN, !
	write "Attributes: ", !
	for j=1:1:result.Attributes.Count() {
		#dim attribute As EnsLib.LDAP.Message.Attribute
		set attribute = result.Attributes.GetAt(j)
		write $$$FormatText("  - Name: %1, Result: %2, Value: %3", attribute.Name, attribute.Result, attribute.Value), !
	}
}

I'll start with the simpler one:

Also, does the use of DISPLAYLIST & VALUELIST bring any advantage vs. defining a standard property (e.g. faster access)? Instead of having to define VALUELIST "H" and DISPLAYLIST "Hot", why not just a standard string property containing "Hot"?

More efficient storage. If you can replace Cold with 1, Medium with 2, and Hot with 3, you save some space. Another reason is that VALUELIST turns a string into an enum, which changes the logic considerably.
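For illustration, a minimal sketch of such a property (the class and value set are illustrative):

```objectscript
Class User.Drink Extends %Persistent
{

/// Stored as "C"/"M"/"H"; displayed to users as Cold/Medium/Hot
Property Temperature As %String(VALUELIST = ",C,M,H", DISPLAYLIST = ",Cold,Medium,Hot");

}
```

Setting Temperature to anything outside the VALUELIST fails validation on %Save, which is part of the enum-like behavior mentioned above.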