You might use a GitLab webhook to do this. The first step would be to create an endpoint on each of your IRIS environments that can be called over HTTP to run the Pull method of the git-source-control API. That deploys a change by running "git pull" and loading the changed files into IRIS. A simple way to do that is to create a new web application with a REST handler like this:

Class MyApp.PullHandler Extends %CSP.REST
{

ClassMethod Pull() As %Status
{
    do ##class(SourceControl.Git.API).Pull()
    quit $$$OK
}

XData UrlMap
{
<Routes>
<Route Url="/pull" Method="POST" Call="Pull"/>
</Routes>
}

}

You will also need to make sure that this endpoint is network-accessible from your GitLab environment and authenticated according to your security requirements.
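Before wiring up the webhook, you can sanity-check the endpoint from an IRIS terminal. Here is a minimal sketch, assuming the web application is at /myapp/api on localhost:52773 (the server, port, and path are placeholders for your own setup):

set req = ##class(%Net.HttpRequest).%New()
set req.Server = "localhost"
set req.Port = 52773
// set req.Username and req.Password here if the endpoint requires authentication
do req.Post("/myapp/api/pull")
// a 200 response means the Pull ran; check the source control output for details
write req.HttpResponse.StatusCode,!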

The second step would be to configure a GitLab webhook to call this endpoint whenever a merge request is merged. Here is GitLab's documentation on how to do that: https://docs.gitlab.com/ee/user/project/integrations/webhooks.html

Hello Reuben,

You're correct that a single namespace can only have a single branch checked out at a time with git-source-control. The reason is that the namespace has one Git repository, and a Git repository cannot have multiple branches checked out at the same time.

I would consider this a feature rather than a bug. Imagine that a single namespace were associated with multiple branches, and an item had different changes on each branch. Which branch's version of the item would get loaded into IRIS? Shared development environments require a single source of truth for the state of every item in source control.

For your scenario I might suggest single-user development environments. You could create a separate namespace for each developer on your server. Each namespace would have its own repository, with its own branches, cloning the same remote repository. The downside is that if two users edit the same item in different namespaces, you can end up with merge conflicts that you will need to resolve. I would highly recommend single-user environments if your developers are already comfortable with Git.

We found a way to do the second part, importing the class from a file without overwriting a specific member. The trick (sketched in code after the list) is:

  • open the old class definition with ##class(%Dictionary.ClassDefinition).%OpenId("class name")
  • create a clone with %ConstructClone()
  • load the class from the file without compiling it
  • open the new class definition
  • copy the member you want (in this case the production definition XData) from the clone to the new class definition, and save the new class definition
  • finally, compile the class.
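Here is a rough sketch of those steps in code. The class name My.Production, the file path, and the XData name ProductionDefinition are placeholders for your own, and the collection handling may need adjusting for your version:

// open the old class definition and take a deep clone of it
set oldDef = ##class(%Dictionary.ClassDefinition).%OpenId("My.Production")
set savedDef = oldDef.%ConstructClone(1)
kill oldDef

// load the class from the file WITHOUT compiling (no "c" flag)
$$$ThrowOnError($system.OBJ.Load("C:\temp\My.Production.cls","k"))

// open the new (freshly loaded) class definition
set newDef = ##class(%Dictionary.ClassDefinition).%OpenId("My.Production")

// remove the XData that came from the file, if present
for i=newDef.XDatas.Count():-1:1 {
    if (newDef.XDatas.GetAt(i).Name = "ProductionDefinition") {
        do newDef.XDatas.RemoveAt(i)
    }
}

// copy the preserved XData from the clone onto the new definition
for i=1:1:savedDef.XDatas.Count() {
    set xdata = savedDef.XDatas.GetAt(i)
    if (xdata.Name = "ProductionDefinition") {
        set copy = xdata.%ConstructClone(1)
        set copy.parent = newDef   // attach via the parent relationship
    }
}
$$$ThrowOnError(newDef.%Save())

// finally, compile the class
$$$ThrowOnError($system.OBJ.Compile("My.Production","ck"))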

In older versions, trying to run "do $i(a)" would throw a <SYNTAX> error; you could instead use "if $i(a)" or "set a = $i(a)" to accomplish the same thing. Support for "do $i(a)" was added in IRIS 2019.1 because it reads more cleanly. You can treat the forms as identical, and the only reason to prefer one is if you want code that is backwards-compatible with older Caché / IRIS versions.
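For example, all three of the following lines increment a by 1; only the first requires IRIS 2019.1 or later:

do $increment(a)       // <SYNTAX> error before IRIS 2019.1
if $increment(a)       // works on older versions; the truth value is simply discarded
set a = $increment(a)  // works on older versions; $increment returns the new value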

Hello Alan, we are currently lacking documentation that explains what each of those menu items does. I logged a GitHub issue to add it: https://github.com/intersystems/git-source-control/issues/296

You mention having an existing application with a lot of code already in source control that you would like to migrate to Git. For this situation, I might initialize a new Git repository and copy all the files from your older source control system into it. You can then configure git-source-control to use this new repository, and the "Import All" option will import the files from the repository into IRIS.

In the meantime, here's a quick and dirty explanation of the options you mention:

  • Status: outputs the results of "git status" to the source control output
  • Settings: opens a web page where you can configure git-source-control settings
  • Launch Git UI: opens a web page where you can perform basic Git commands graphically
  • Push to remote branch: equivalent of "git push"
  • Fetch from remote: equivalent of "git fetch"
  • Pull changes from remote branch: equivalent of "git pull", plus a call to the pull event handler
  • Export All: exports all newly changed items in IRIS to the Git repository
  • Export All (Force): exports all items in IRIS to the Git repository, including those with older timestamps
  • Import All: imports all items in the Git repository to IRIS if the version in IRIS is outdated
  • Import All (Force): imports all items in the Git repository to IRIS

This can happen if the routine contains an ASCII control character that cannot be represented as printable text in XML. Here is an example from a routine I created:

> set routine = ##class(%Routine).%OpenId("pbarton.test.MAC")
> zw routine.Read()
"pbarton"_$c(10)_" write ""hello"_$c(7)_""""

You can see the routine contains $c(7), which is a non-printable ASCII character. When I export the routine it looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<Export generator="IRIS" version="26" zv="IRIS for Windows (x86-64) 2023.2.0L (Build 159U)" ts="2023-09-13 11:20:00">
<RoutineBase64 name="pbarton.test" type="MAC" languagemode="0" timestamp="66730,40577.6434199">cGJhcnRvbgogd3JpdGUgImhlbGxvByI=
</RoutineBase64>
</Export>
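If you want to check whether a routine will be exported in Base64 this way, you can scan its source for control characters. Here is a quick sketch using the routine from the example above (XML 1.0 allows tab, line feed, and carriage return, but no other characters below $c(32)):

// assumes the routine fits in one Read(); loop until Read() returns "" for longer routines
set text = ##class(%Routine).%OpenId("pbarton.test.MAC").Read()
for i=1:1:$length(text) {
    set code = $ascii($extract(text,i))
    // tab (9), LF (10), and CR (13) are fine; anything else below 32 forces Base64 export
    if (code < 32) && (code '= 9) && (code '= 10) && (code '= 13) {
        write "control character $c(",code,") at position ",i,!
    }
}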

Here's a simple example I wrote up and tested based on documentation.

A web service class on the server:

/// Sample.MyService
Class Sample.MyService Extends %SOAP.WebService
{

/// Name of the WebService.
Parameter SERVICENAME = "MyService";

/// TODO: change this to actual SOAP namespace.
/// SOAP Namespace for the WebService
Parameter NAMESPACE = "http://tempuri.org";

/// Namespaces of referenced classes will be used in the WSDL.
Parameter USECLASSNAMESPACES = 1;

/// Receive a binary attachment and write it to a file on the server
Method ReceiveFile(attachment As %GlobalBinaryStream) [ WebMethod ]
{
    set filestream = ##class(%Stream.FileBinary).%New()
    // name the file after the calling user and the current timestamp
    $$$ThrowOnError(filestream.LinkToFile("C:\temp\file_"_$username_$zts_".out"))
    do filestream.CopyFrom(attachment)
    $$$ThrowOnError(filestream.%Save())
}

}

A web client class on the client. This was generated with the SOAP wizard in Studio. Only the datatype of the attachment argument to ReceiveFile has been modified.

Class MyService.Client.MyServiceSoap Extends %SOAP.WebClient [ ProcedureBlock ]
{

/// This is the URL used to access the web service.
Parameter LOCATION = "http://localhost:52773/csp/user/Sample.MyService.cls";

/// This is the namespace used by the Service
Parameter NAMESPACE = "http://tempuri.org";

/// Use xsi:type attribute for literal types.
Parameter OUTPUTTYPEATTRIBUTE = 1;

/// Determines handling of Security header.
Parameter SECURITYIN = "ALLOW";

/// This is the name of the Service
Parameter SERVICENAME = "MyService";

/// This is the SOAP version supported by the service.
Parameter SOAPVERSION = "1.1";

Method ReceiveFile(attachment As %GlobalBinaryStream) [ Final, ProcedureBlock = 1, SoapBindingStyle = document, SoapBodyUse = literal, WebMethod ]
{
 Do (..WebMethod("ReceiveFile")).Invoke($this,"http://tempuri.org/Sample.MyService.ReceiveFile",.attachment)
}

}

And some sample code for the client to use this class to send a file:

// get the file
set filestream = ##class(%Stream.FileBinary).%New()
$$$ThrowOnError(filestream.LinkToFile(pFileName))

// create the attachment
set attachment = ##class(%GlobalBinaryStream).%New()
do attachment.CopyFrom(filestream)

// create the client and send the file
set client = ##class(MyService.Client.MyServiceSoap).%New()
set client.Username = "redacted"
set client.Password = "redacted"
do client.ReceiveFile(attachment)

This will include the entire Base64-encoded file in the body of the SOAP message. An even better approach would be to use MTOM attachments for the file. See the documentation for more details on how to do that: https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls...

Very helpful article, thank you for posting. I'm curious whether you see any benefit to using columnar storage in a scenario that also uses InterSystems IRIS Business Intelligence (f.k.a. DeepSee) cubes. Columnar storage lets you run analytical queries with aggregates very efficiently in pure SQL, whereas IRIS BI pre-computes the aggregates in cubes, which you must then query in MDX. I might be totally off base, but they sound like alternatives to each other.

It's possible to do this by using a trigger generator: you can run GetColumns at class compile time and use the result to write out lines of trigger code using the {fieldName*C} syntax. Just as a warning, generators can be tricky because they add a layer of indirection to your code. The best way to debug is to use the "View Other" command in Studio or VS Code and look directly at the generated code.

Here is some sample code for a trigger generator:

Trigger TestTrigger [ CodeMode = objectgenerator, Event = INSERT/UPDATE, Foreach = row/object ]
{
    // look up the table's columns at compile time
    set tableName = %compiledclass.SQLQualifiedNameQ
    set st = ##class(%SYSTEM.SQL).GetColumns(tableName,.byname,.bynum,1)
    $$$ThrowOnError(st)
    // generate one line of trigger code per column;
    // {fieldName*C} is true when that field changed in the triggering statement
    set i = 1
    while $d(bynum(i)) {
        set xColName = bynum(i)
        do %code.WriteLine(" set ^test("""_xColName_" changed"") = {"_xColName_"*C}")
        set i = i + 1
    }
}

Hi Michael, great questions. A lot of this will depend on your own practices for source code management and deployment. In my team's case, we ended up overriding a lot of the %UnitTest behavior to provide reasonable defaults for our process. Hopefully this sparks some more discussion; I'm interested in how other people's answers will differ.

> Are all your unit tests added to .gitignore so they don't get wound up in source code?

No - we want the source code for our unit tests to be in source control, for the same reason all other code is in source control. We make sure that unit tests don't end up on production systems by maintaining different branches for test and production: unit tests live in a separate directory that we never merge from the test branch to the production branch. We use Perforce; there might be a different workflow recommended for Git that would give you the same results.

> Why does the normal RunTestCase() method automatically delete the extracted unit test class files from the folder?  Why is that the default?

If I had to guess, this is a good default for a deployment process where you compile everything, run tests, and then copy the code database over to production. In that case you would always want test classes to delete themselves after running. Since we deploy code differently, we override the RunTest methods to use the "/nodelete" flag by default, as sketched below.
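For example, a hypothetical subclass (the class name is made up; verify the RunTest signature on your version before relying on this):

Class MyApp.TestManager Extends %UnitTest.Manager
{

/// Run tests, but keep the extracted unit test classes afterwards
ClassMethod RunTest(testspec As %String = "", qspec As %String = "", ByRef userparam) As %Status
{
    // force the /nodelete qualifier unless the caller already passed it
    if qspec '[ "/nodelete" {
        set qspec = qspec _ "/nodelete"
    }
    quit ##super(testspec, qspec, .userparam)
}

}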

> When it comes to automated testing (Jenkins specifically) what is the lifecycle?

Our lifecycle for automated testing with Jenkins is very similar to the one you describe:

  1. Jenkins pulls all code from the remote repo.
  2. Run an %Installer class on the build instance that overwrites the code database, so we start from scratch.
  3. Load and compile all code from the local workspace into the build instance, including tests. Report any compilation errors as a build failure.
  4. Run all tests (see the sketch after this list).
  5. Generate JUnit-format test reports and Cobertura test coverage reports.
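For step 4, the build script runs something like the following in the build instance. This is only a sketch: the test directory and package name are placeholders, and "/noload" reflects the fact that step 3 already loaded and compiled the tests.

// point %UnitTest at the workspace's test directory (placeholder path)
set ^UnitTestRoot = "/build/workspace/tests"
// run the already-loaded tests; /noload skips re-loading, /nodelete keeps the classes
do ##class(%UnitTest.Manager).RunTest(":MyApp.Tests","/noload/nodelete")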