@Jani Hurskainen providing a top-level answer here:

If you want to truly build your own unit test framework from scratch, you'd need to create a custom resource processor class in IPM. I see from https://github.com/intersystems/ipm/issues/616 that you've already discovered this feature, and I appreciate your tenacity and the deep dive into IPM that you're doing.

Every development team I've worked on within InterSystems (that is, three very different ones) has had things it wanted that %UnitTest.Manager and %UnitTest.TestCase don't *quite* do the way we want, with the API we want, right out of the box. The general approach is to extend %UnitTest.Manager to tweak unit test runner behaviors (IPM does this itself, as you may have noticed); to extend %UnitTest.TestCase to add application-specific utility methods, assertions, and generic On(Before|After)(All|One)Test(s?) implementations (often controlled by class parameters); and potentially to add some mix-in utility classes that you might extend along with %UnitTest.TestCase or your own derived unit test base class.
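To make that concrete, here's a minimal sketch of the TestCase side. The class name, the ROLLBACKAFTEREACHTEST parameter, and the helper method are all hypothetical; only %UnitTest.TestCase, the On*Test* hooks, and the standard $$$Assert* macros come from the product:

```objectscript
/// Hypothetical application-specific base class for unit tests (names are illustrative).
Class MyApp.UnitTest.TestCase Extends %UnitTest.TestCase
{

/// Class parameter controlling a generic OnBefore/OnAfterOneTest behavior:
/// wrap each test in a transaction and roll it back afterward.
Parameter ROLLBACKAFTEREACHTEST As BOOLEAN = 1;

Method OnBeforeOneTest(testname As %String) As %Status
{
    If ..#ROLLBACKAFTEREACHTEST {
        TSTART
    }
    Quit $$$OK
}

Method OnAfterOneTest(testname As %String) As %Status
{
    If ..#ROLLBACKAFTEREACHTEST && ($TLEVEL > 0) {
        TROLLBACK 1
    }
    Quit $$$OK
}

/// Application-specific assertion helper built on the standard macros;
/// logs the full error text when a %Status is not OK.
Method AssertStatusOK(pSC As %Status, pDescription As %String = "") As %Boolean
{
    If $$$ISERR(pSC) {
        Do ..LogMessage($System.Status.GetErrorText(pSC))
    }
    Quit $$$AssertStatusOK(pSC, pDescription)
}

}
```

Actual test classes would then extend MyApp.UnitTest.TestCase (and/or your mix-ins) instead of %UnitTest.TestCase directly.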

For the basic case of "I want to run tests with my own %UnitTest.Manager subclass," we have a flag you can pass to override the unit test manager class: -DUnitTest.ManagerClass=yourclassname. See https://community.intersystems.com/post/unit-tests-and-test-coverage-int... for an example of how to use this (with my team's https://github.com/intersystems/TestCoverage open source package).
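For example, with the TestCoverage package installed (here "my-package" is a placeholder for your own package name), from the IPM shell:

```
zpm:USER>my-package test -DUnitTest.ManagerClass=TestCoverage.Manager
```

or equivalently from an IRIS terminal:

```
zpm "my-package test -DUnitTest.ManagerClass=TestCoverage.Manager"
```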

At the IPM codebase level, there's special treatment of the common "pParams" array passed around everywhere: anything looking at pParams("UnitTest","ManagerClass") will find the value specified via -DUnitTest.ManagerClass in the package manager shell command.
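In other words (this is just a sketch of the convention, not actual IPM source; the method name and default are illustrative):

```objectscript
// Sketch only: how code receiving IPM's pParams array might pick up the value
// supplied as -DUnitTest.ManagerClass=... on the zpm command line.
// A -DSome.Nested.Name=value flag arrives as pParams("Some","Nested","Name")=value.
ClassMethod GetTestManagerClass(ByRef pParams) As %String
{
    Quit $Get(pParams("UnitTest", "ManagerClass"), "%UnitTest.Manager")
}
```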

Hopefully this is helpful!

@Evgeny Shvarov to some extent we already do, via the -DUnitTest.ManagerClass parameter we use to run TestCoverage and through custom resource processors (which I see @Jani Hurskainen is already playing with!).

The common pattern in ObjectScript, if you're going to customize unit test processes, is to write a subclass of %UnitTest.Manager that does things the way you want. (And probably a few subclasses of %UnitTest.TestCase that add to the standard assertion types, maybe some application-specific utility methods / wrappers, etc.) IPM will play well with this model; it's more work (all around!) if you want to write your own entire unit test framework from scratch.
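As a point of reference, a manager subclass is invoked the same way as the stock one; a rough sketch, with placeholder class name, test package, and path:

```objectscript
// Placeholder names/paths. A %UnitTest.Manager subclass is run via the same
// RunTest() entry point as the stock manager (or via -DUnitTest.ManagerClass
// when running through IPM).
Set ^UnitTestRoot = "/opt/myapp/tests/"
Do ##class(MyApp.UnitTest.Manager).RunTest("MyApp.Tests", "/nodelete")
```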

The purpose of IRIS embedded source control features is to keep code changes made in the database synchronized with the server filesystem, to automate any source control provider-specific operations in a way that ensures that synchronization, and to provide concurrency controls for developers working in a shared environment (when relevant). In the days of Studio, all code changes were made in the database first, rather than on any filesystem, so you needed an embedded source control solution to get real source control at all. With client-side editing in VSCode, there are *still* some changes to code that are made "in the database first": specifically, anything edited through the management portal-based graphical editors for interoperability and business intelligence. For such use cases, embedded source control is relevant even when you're developing against a local Docker container (which I'd consider modern best practice and prefer over a remote/shared environment where feasible); otherwise, you need to jump through extra hoops to get your changes onto the client/server filesystem.

In a client-centric mode, it's totally fine to use git-source-control alongside the git command line, built-in VSCode tools, or your preferred Git GUI (GitHub Desktop, GitKraken, etc.). However, this misses an important benefit of git-source-control: when you pull, checkout, etc. through the extension, we can automatically reflect the operation in IRIS by loading added/modified items into the database and deleting items that have been removed. If you make changes on the filesystem through one of these other channels, it's up to you to make sure things are reflected properly in IRIS.

Another benefit of git-source-control for local development is that when working across multiple IPM packages loaded from separate local repos, changes made via isfs folders will automatically be reflected in the correct repository. This is more natural, especially for established ObjectScript developers (e.g., "I just want to edit this class, then this other class in a different package"), than a client-centric multi-root VSCode workspace, which could achieve the same thing but with a bit more overhead.