As every BP is also a SQL table, you can select active processes older than one day with this query:

SELECT ID, %SessionId, %SuperSession, %TimeCreated, DATEDIFF('day',%TimeCreated, CURRENT_TIMESTAMP) AS DaysOld
FROM <business_process_table>
WHERE DATEDIFF('day', %TimeCreated, CURRENT_TIMESTAMP) > 1
      AND %TimeCompleted IS NULL

As BO messages are also logged, you can check the Ens.MessageHeader table for message processing times.
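For example, a sketch of such a query, assuming the standard Ens.MessageHeader columns (TimeCreated, TimeProcessed, SessionId, SourceConfigName, TargetConfigName); adjust the selection to your needs:

```sql
SELECT TOP 100 ID, SessionId, SourceConfigName, TargetConfigName,
       DATEDIFF('ss', TimeCreated, TimeProcessed) AS SecondsToProcess
FROM Ens.MessageHeader
WHERE TimeProcessed IS NOT NULL
ORDER BY SecondsToProcess DESC
```

This lists the slowest-processed messages first, which is a quick way to spot bottleneck components.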

If that base class has a property that is another class, do I have to add the DSTIME parameters to that secondary class? 

No. DSTIME needs to be defined only on the base class of a cube (the fact table class).

It works like this:

  1. During compilation, if the DSTIME parameter is present, the generated %Save/%Delete methods write the fact class, ID, and action into the ^OBJ.DSTIME global
  2. The cube synchronization job iterates over this global (only over fact classes) and processes the entries accordingly
  3. %PurgeDSTIME from the %DeepSee.Utils class clears all processed records in the ^OBJ.DSTIME global, as well as all records that do not belong to fact classes
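As a sketch, enabling this on a fact class is just a class parameter (the class and property names here are made up for illustration):

```objectscript
Class Demo.Sales Extends %Persistent
{

/// Track inserts/updates/deletes in ^OBJ.DSTIME for cube synchronization
Parameter DSTIME = "AUTO";

Property Amount As %Numeric;

}
```

Nothing needs to be added to classes that are merely referenced as properties of the fact class.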

You can only read into simple local variables, so it should look like this:

set Name=##class(Example.Team1).%New()
read "enter your name :", NameStr
set Name.EmpName = NameStr

There are also utility prompt methods; see the %Library.Prompt class. For example, the GetString method supports object properties:

set Name=##class(Example.Team1).%New()
do ##class(%Library.Prompt).GetString("enter your name", .Name.EmpName)

Here are some thoughts:

  • Write a source control hook that implements the OnAfterSave method and perform the check there (or maybe at some other entry point). There are several sample source control hook classes; check them out.
  • Use the ^rINDEXCLASS global (key: class name in uppercase). It contains basic information such as the modification time (1st position) and a hash (13th position). You can monitor it, and if the time or hash changes, record the new class version.
  • Use the lock table to see which classes are currently being edited
  • Use the $$$defClassKeyGet macros (see the %Dictionary.ClassDefinition/%Dictionary.CompiledClass definitions, which use these macros a lot) to get information about the modification time/hash and the changes themselves
  • Use %Compiler.UDL.TextServices to get the class text

I would have done it like this:

  • A background process monitors the ^rINDEXCLASS global
  • Upon finding changes, get the current class code via the %Compiler.UDL.TextServices class, GetTextAsStream method
  • Write this information into your own class (classname, timestamp, user, classtext, diff, previousversion)
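A minimal sketch of the change check, assuming the ^rINDEXCLASS positions mentioned above (1 for time, 13 for hash); className, lastTime, and lastHash are placeholders that would come from your own tracking class:

```objectscript
// Sketch only: compare the stored time/hash against the live ^rINDEXCLASS entry
set key = $zconvert(className, "U")
set info = $get(^rINDEXCLASS(key))
set modTime = $listget(info, 1)   // modification time (1st position)
set hash = $listget(info, 13)     // hash (13th position)
if (modTime '= lastTime) || (hash '= lastHash) {
    set sc = ##class(%Compiler.UDL.TextServices).GetTextAsStream($namespace, className, .stream)
    // persist className, modTime, $username and the class text stream
    // into your own audit class here
}
```

The diff and previous version can then be computed from the stored class text of the prior record.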

Modifying system classes is not a very good idea:

  • Changes are lost on upgrade
  • Users may not have the access required to install these changes

That being said, the best solution in my opinion is to set up your critical/production systems in such a way that developers do not have direct write access to them. Each developer has their own environment where they can do whatever they like, and then commits their work to a source control system. A continuous integration solution (or one developer, or a script) then uploads the code to the production server after it passes the tests. There are several source control/continuous integration systems available for Caché.

Ideally, a change can only be made via a source control commit; all other changes simply should not exist.

There is a beforeunload browser event which is fired when the window, the document, and its resources are about to be unloaded. Handle this event in your js code and send an HTTP request to the server from there (I usually use GET /webapp/logout), which would do something like this:

ClassMethod logout() As %Status
{
    #dim %session As %CSP.Session
    set st = %session.Logout(1)
    set %session.EndSession = 1
    return st
}
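On the client side, a minimal sketch of the handler might look like this. The /webapp/logout path is the illustrative endpoint mentioned above, and the window object and send function are injected as parameters purely to keep the sketch self-contained; in a real page you would pass window and something like navigator.sendBeacon or fetch with keepalive (note that sendBeacon issues a POST, not a GET):

```javascript
// Hypothetical endpoint; adjust to your web application's path.
const LOGOUT_URL = "/webapp/logout";

// Registers a beforeunload handler that notifies the server.
// In a browser: registerLogoutHandler(window, url => navigator.sendBeacon(url));
function registerLogoutHandler(win, send) {
  win.addEventListener("beforeunload", () => send(LOGOUT_URL));
}
```

Injecting the transport also makes the handler easy to unit-test with a fake window object.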

 I'd also love to see the web application's properties programmatically, if possible (such as the physical files path).

Use the Security.Application class directly, or preferably via a %Installer manifest.

but I honestly don't know why it's choosing that web application over the default of /csp/default-namespace

Uncompile and recompile the ZEN classes after setting the default web application. Or you can try to set the HOMEPAGE parameter in the application class.

My question is, if we go to using Group By ID (which is currently null) in the web application, will this fix the sharing of %session Data? (This is why I have to make sure I'm editing the correct web application!)

If the user opens one web application in two tabs, the tabs share a session (and namespace). Group By ID would only help if you have N different applications and specify N different values for the Group By ID property. In that case, they would use N different sessions.

Finally, we have a web application for each namespace; will we need the Group By ID for each application defined for each namespace, and should it be the same ID?

That depends on whether you want to share %session or not.

Both of these tools are for GitHub and Caché (GitHub specifically, not arbitrary Git servers, though only light customization is needed to support another Git server's HTTP API).

Cache Updater is a simple Caché task. You specify the GitHub information (repository, plus user/pass if it's a private repo) and a namespace. Then you set a schedule, and that's it. Every time the task runs, it downloads (and compiles) the code from GitHub into the Caché namespace.

CacheGitHubCI is a full-fledged CI system. You may specify pre- and post-compile actions, unit tests, etc. The results and timings of each build and action are saved, and DeepSee dashboards are available for them. GitHub webhooks are supported, which allows builds to run immediately after a new commit appears in the repository.

Which one you choose depends on your requirements: Cache Updater is very easy to set up, while CacheGitHubCI offers more features.

Both of them are built atop the GitHub COS API, which provides COS wrappers for the GitHub API.