Pravin Barton · Aug 20, 2025

> Where in IRIS?

If you have VS Code set up with server-side (isfs) editing, there will be a little source control icon in the upper right corner when you open an item for editing.

The same icon will show up in the System Management Portal when you're editing an item that can be source controlled, for example in the production configuration page or the data transformation builder.

Note that these icons will only show up if server-side source control is enabled for the namespace, which `##class(SourceControl.Git.API).Configure()` will do for you automatically.
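
For reference, enabling it is a one-liner in an IRIS terminal, run in the namespace you want under source control:

```objectscript
// run once per namespace; prompts interactively for settings
do ##class(SourceControl.Git.API).Configure()
```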

> Is there some easy to follow tutorials for both solutions?

This isn't a full tutorial but the best I can recommend for now is this video: https://community.intersystems.com/post/video-selecting-right-source-co…

Pravin Barton · Aug 19, 2025

Hi Paul,
There are a couple different ways of doing source control with IRIS and git.

The first is client-side source control. You will have a git repository on your local machine. You use VS Code to edit files in that local git repository. The ObjectScript extension for VS Code will push those edits to the IRIS database. You can then commit those changes and push them up to GitHub/GitLab/etc. using the git CLI locally or whatever git tool you prefer.

The second is server-side source control using Embedded Git. You will have a git repository on the remote server that IRIS is running on. That git repository can be created with `##class(SourceControl.Git.API).Configure()`. You can use VS Code with ISFS (or Studio, or the Interoperability Portal) to edit code in the IRIS database. Embedded Git will export that code to the git repository for you. You can then commit and push those changes using the source control menus embedded into IRIS.

It sounds like you currently have a mix of both, which can get chaotic.

The advantage of client-side source control is that it's closer to standard industry practices. There are a lot of helpful tools like GitLens and GitHub Desktop that rely on you having a local git repository.

There are a couple of advantages of server-side source control:
- If you have multiple developers editing code in the same namespace (pretty common with legacy IRIS users) it will prevent them from stepping on each other's toes.
- If you are doing most of your work in the IRIS interoperability editors, server-side source control gives you source control actions embedded directly in those editors.
  
Let me know if this is helpful or if you have other questions.

Pravin Barton · Jun 4, 2025

You might be able to use the Size property to keep track of the initial size of the stream, write some more data, then use MoveTo() to rewind back to where you started writing:

set stream = ##class(%Stream.FileCharacter).%New()
do stream.Write("hello world!")
set tmpSize = stream.Size
do stream.Write("goodbye world!")
do stream.MoveTo(tmpSize + 1)
write stream.Read()

Pravin Barton · Feb 13, 2025

Try the following:

set mylist = ""for i = 1:1:5 {
    set item = "item"_i
    set mylist = mylist _ $ListBuild(item)
}

zw mylist

$ListBuild lists have a separator that includes the length of the data in the next element of the list, which is why concatenating two lists with the _ operator produces a valid combined list. This is also why your original code wasn't working: $ListBuild(mylist, item) makes a new two-element list whose first element is the entire original list, rather than appending item to it.
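
To see the difference concretely, here's a small sketch with illustrative values:

```objectscript
set mylist = $ListBuild("item1", "item2")

// nesting: a two-element list whose first element is itself a list
set nested = $ListBuild(mylist, "item3")
write $ListLength(nested)    // 2

// appending: concatenation joins the lists end to end
set appended = mylist _ $ListBuild("item3")
write $ListLength(appended)  // 3
```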

Pravin Barton · Jan 28, 2025

Hi Armin, which version of the Embedded Git plugin is running on your server? If you have IRIS terminal access, the following command will show the version:

zpm "list-installed git-source-control"

Pravin Barton · Jul 29, 2024

You might use a GitLab webhook to do this. The first step would be to create an endpoint on your IRIS environments that can be called over HTTP to run the Pull method of the git-source-control API. That will deploy a change by running "git pull" and loading the changed files into IRIS. A simple way to do that is by creating a new web application with a REST handler like this:

Class MyApp.PullHandler Extends %CSP.REST
{
ClassMethod Pull()
{
    do ##class(SourceControl.Git.API).Pull()
}
XData UrlMap
{
<Routes>
<Route Url="/pull" Method="POST" Call="Pull"/>
</Routes>
}
}

You will also need to make sure that this endpoint is network accessible to your GitLab environment, and authenticated according to your security requirements.

The second step would be to configure a GitLab webhook to call this endpoint on the event of a PR being merged. Here is some documentation from GitLab about how to do that: https://docs.gitlab.com/ee/user/project/integrations/webhooks.html

Pravin Barton · Jul 16, 2024

Hello Reuben,

You're correct that a single namespace can only have a single branch checked out at a time with git-source-control. The reason is that the namespace has one Git repository, and a Git repository cannot have multiple branches checked out at the same time.

I would consider this a feature rather than a bug. Imagine that a single namespace was associated with multiple branches, and an item has different changes on each branch. Which branch's version of the item would get loaded into IRIS? Shared development environments require there to be a single source of truth for the state of every item in source control.

For your scenario I might suggest single-user development environments. You could create a separate namespace for each developer on your server. Each namespace would have its own repository, with its own branches, cloning the same remote repository. The downside is that if two users edit the same item in different namespaces, you can end up with merge conflicts that you will need to resolve. I would highly recommend single-user environments if your developers are already comfortable with Git.

Pravin Barton · Apr 26, 2024

We found a way to do the second part, importing the class from a file without overwriting a specific member. The trick is:

  • open the old class definition with ##class(%Dictionary.ClassDefinition).%OpenId("class name")
  • create a clone with %ConstructClone()
  • load the class from the file without compiling it
  • open the new class definition
  • copy the member you want (in this case production definition XDATA) from the clone to the new class definition, and save the new class definition
  • finally, compile the class.
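
The steps above might be sketched roughly like this. The class name, file path, and XData name are hypothetical, error handling is minimal, and the extra delete step avoids ending up with a duplicate member; treat this as a starting point rather than production code:

```objectscript
// open the old class definition and make a deep clone of it
set old = ##class(%Dictionary.ClassDefinition).%OpenId("MyPkg.MyProduction")
set clone = old.%ConstructClone(1)

// load the class from the file WITHOUT compiling (no "c" qualifier)
$$$ThrowOnError($system.OBJ.Load("/path/to/MyProduction.cls", "k"))

// drop the freshly loaded XData, then open the new class definition
do ##class(%Dictionary.XDataDefinition).%DeleteId("MyPkg.MyProduction||ProductionDefinition")
set new = ##class(%Dictionary.ClassDefinition).%OpenId("MyPkg.MyProduction")

// copy the preserved member from the clone onto the new definition and save
set key = ""
for {
    set x = clone.XDatas.GetNext(.key)
    quit:x=""
    if x.Name = "ProductionDefinition" {
        do new.XDatas.Insert(x.%ConstructClone(1))
    }
}
$$$ThrowOnError(new.%Save())

// finally, compile
$$$ThrowOnError($system.OBJ.Compile("MyPkg.MyProduction", "ck"))
```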

Pravin Barton · Apr 2, 2024

In older versions, trying to run "do $i(a)" would throw a <SYNTAX> error. You could instead use "if $i(a)" or "set a = $i(a)" to do the same thing. Support for "do $i(a)" was added in IRIS 2019.1 because it's nicer-looking syntax. So you can treat them as identical, and the only reason to care either way is if you want to write code that is backwards-compatible with older Caché / IRIS versions.
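
All three forms increment the same variable; a quick sketch:

```objectscript
set a = 0
if $increment(a)       // works on older versions
set a = $increment(a)  // works on older versions
do $increment(a)       // IRIS 2019.1 and later
write a                // 3
```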

Pravin Barton · Feb 26, 2024

This is a very delayed answer to an old question, but there is now a $zconvert mode in IRIS that will do this for you:

> write $zconvert("Árvore", "A")

Arvore

Pravin Barton · Nov 3, 2023

Hello Alan, we are lacking in documentation that explains what each of those menu items does. I logged a GitHub issue here to add that: https://github.com/intersystems/git-source-control/issues/296

You mention having an existing application with a lot of code already in source control that you would like to migrate to Git. What I might do for this situation is initialize a new Git repository and copy all the files from your older source control system into the repo. You can then configure git-source-control to use this new repository for source control. The "Import All" option will import the files from this new repository into IRIS.

In the meantime, here's a quick and dirty explanation of the options you mention:

  • Status: outputs the results of "git status" to the source control output
  • Settings: opens a web page where you can configure git-source-control settings
  • Launch Git UI: opens a web page where you can perform basic Git commands graphically
  • Push to remote branch: equivalent of "git push"
  • Fetch from remote: equivalent of "git fetch"
  • Pull changes from remote branch: equivalent of "git pull", plus a call to the pull event handler
  • Export All: exports all newly changed items in IRIS to the Git repository
  • Export All (Force): exports all items in IRIS to the Git repository, including those with older timestamps
  • Import All: imports all items in the Git repository to IRIS if the version in IRIS is outdated
  • Import All (Force): imports all items in the Git repository to IRIS

Pravin Barton · Sep 13, 2023

This can happen if the routine contains an ASCII character that cannot be printed to XML. Here is an example from a routine I created:

> set routine = ##class(%Routine).%OpenId("pbarton.test.MAC")
> zw routine.Read()
"pbarton"_$c(10)_" write ""hello"_$c(7)_""""

You can see the routine contains $c(7), which is a non-printable ASCII character. When I export the routine it looks like this:

<?xml version="1.0" encoding="UTF-8"?><Exportgenerator="IRIS"version="26"zv="IRIS for Windows (x86-64) 2023.2.0L (Build 159U)"ts="2023-09-13 11:20:00"><RoutineBase64name="pbarton.test"type="MAC"languagemode="0"timestamp="66730,40577.6434199">cGJhcnRvbgogd3JpdGUgImhlbGxvByI=
</RoutineBase64></Export>

Pravin Barton · Sep 12, 2023

If your VS Code workspace has an active connection to an ObjectScript server, you can do this:

  • Open the local XML file in VS Code
  • Right click and choose "Preview XML as UDL"
  • In the preview window, right click and choose "Import and Compile Current File"

Pravin Barton · May 16, 2023

Very helpful tool! This is much better than testing everything manually. I'll be adding it into the build pipeline for my system using IRIS BI.

Pravin Barton · Apr 18, 2023

Here's a simple example I wrote up and tested based on documentation.

A web service class on the server:

/// Sample.MyService
Class Sample.MyService Extends %SOAP.WebService
{

/// Name of the WebService.
Parameter SERVICENAME = "MyService";

/// TODO: change this to actual SOAP namespace.
/// SOAP Namespace for the WebService
Parameter NAMESPACE = "http://tempuri.org";

/// Namespaces of referenced classes will be used in the WSDL.
Parameter USECLASSNAMESPACES = 1;

Method ReceiveFile(attachment As %GlobalBinaryStream) [ WebMethod ]
{
	set filestream = ##class(%Stream.FileBinary).%New()
	$$$ThrowOnError(filestream.LinkToFile("C:\temp\file_"_$username_$zts_".out"))
	do filestream.CopyFrom(attachment)
	$$$ThrowOnError(filestream.%Save())
}

}

A web client class on the client. This was generated with the SOAP wizard in Studio. Only the datatype of the attachment argument to ReceiveFile has been modified.

Class MyService.Client.MyServiceSoap Extends %SOAP.WebClient [ ProcedureBlock ]
{

/// This is the URL used to access the web service.
Parameter LOCATION = "http://localhost:52773/csp/user/Sample.MyService.cls";

/// This is the namespace used by the Service
Parameter NAMESPACE = "http://tempuri.org";

/// Use xsi:type attribute for literal types.
Parameter OUTPUTTYPEATTRIBUTE = 1;

/// Determines handling of Security header.
Parameter SECURITYIN = "ALLOW";

/// This is the name of the Service
Parameter SERVICENAME = "MyService";

/// This is the SOAP version supported by the service.
Parameter SOAPVERSION = 1.1;

Method ReceiveFile(attachment As %GlobalBinaryStream) [ Final, ProcedureBlock = 1, SoapBindingStyle = document, SoapBodyUse = literal, WebMethod ]
{
 Do (..WebMethod("ReceiveFile")).Invoke($this,"http://tempuri.org/Sample.MyService.ReceiveFile",.attachment)
}

}

And some sample code for the client to use this class to send a file:

// get the file
set filestream = ##class(%Stream.FileBinary).%New()
$$$ThrowOnError(filestream.LinkToFile(pFileName))

// create the attachment
set attachment = ##class(%GlobalBinaryStream).%New()
do attachment.CopyFrom(filestream)

// create the client and send the file
set client = ##class(MyService.Client.MyServiceSoap).%New()
set client.Username = "redacted"
set client.Password = "redacted"
do client.ReceiveFile(attachment)

This will include the entire base-64-encoded file in the body of the SOAP message. An even better way would be to use MTOM attachments for the file. See the documentation here for more details about how to do that: https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=GSOAP_mtom

Pravin Barton · Feb 16, 2023

Very helpful article, thank you for posting. I'm curious if you see any benefit to using columnar storage in a scenario that is also using InterSystems IRIS Business Intelligence (f.k.a. DeepSee) cubes. Columnar storage lets you run analytical queries with aggregates very efficiently in pure SQL. On the other hand IRIS BI pre-computes the aggregates in cubes, which you then must query in MDX. I might be totally off base but they sound like alternatives to each other.

Pravin Barton · Aug 11, 2022

Thank you Alex! $SYSTEM.Process.ClientIPAddress($J) does it, that gives me the IP of the SOAP client.

Pravin Barton · Jul 14, 2022

It's possible to do this by using a trigger generator. Then you can run GetColumns at compile time of the class, and use the result to write out lines of code using the {fieldName*C} syntax. Just as a warning, using generators can be tricky because it adds a layer of indirection to your code. The best way to debug is to use the "View Other" command in Studio or VS Code and look directly at the generated code.

Here is some sample code for a trigger generator:

Trigger TestTrigger [ CodeMode = objectgenerator, Event = INSERT/UPDATE, Foreach = row/object ]
{
    set tableName = %compiledclass.SQLQualifiedNameQ
    set st = ##class(%SYSTEM.SQL).GetColumns(tableName,.byname,.bynum,1)
    $$$ThrowOnError(st)
    set i = 1
    while $d(bynum(i)) {
        set xColName = bynum(i)
        do %code.WriteLine(" set ^test("""_xColName_" changed"") = {"_xColName_"*C}")
        set i = i + 1
    }
}

Pravin Barton · May 18, 2022

Hi Michael, great questions. A lot of this will depend on your own practices for source code management and deployment. In my team's case we ended up overriding a lot of the %UnitTest behavior to provide reasonable defaults for our process. Hopefully this sparks some more discussion. I'm interested in how other people's answers will differ.

> Are all your unit tests added to .gitignore so they don't get wound up in source code?

No - we want source code for our unit tests to be in source control, for the same reason all other code is in source control. We make sure that unit tests don't end up on production systems by maintaining different branches for test and production. Unit tests are in a separate directory that we don't merge from the test branch to the production branch. This is using Perforce. There might be a different workflow recommended for Git that would give you the same results.

> Why does the normal RunTestCase() method automatically delete the extracted unit test class files from the folder?  Why is that the default?

If I had to guess, this is a good default for a deployment process where you compile everything, run tests, and then copy over the code database to production. In that case you would always want test classes to delete themselves after running. In our case we have a different way of deploying code, so we override the RunTest methods to use the "/nodelete" flag by default.
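
For example, to run a suite without deleting the extracted class files afterwards (the suite name here is a placeholder):

```objectscript
// the /nodelete qualifier keeps the test class files after the run
do ##class(%UnitTest.Manager).RunTest("MyPackage:MyTests", "/nodelete")
```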

> When it comes to automated testing (Jenkins specifically) what is the lifecycle?

We use a very similar lifecycle for Jenkins automated testing to the one you describe.

  1. Jenkins pulls all code from the remote repo
  2. Run an %Installer class on the build instance that overwrites the code database so we start from scratch
  3. Load and compile all code from the local workspace into the build instance, including tests. Report any compilation error as a build failure.
  4. Run all tests.
  5. Generate JUnit-format test reports and Cobertura test coverage reports.
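
Step 3 might be sketched like this on the build instance, with the workspace path as a placeholder; the "1" argument makes the load recursive, and a failed status is displayed so Jenkins can flag the build:

```objectscript
// load and compile everything from the Jenkins workspace, recursively
set sc = $system.OBJ.LoadDir("/jenkins/workspace/src", "ck", .errors, 1)
if $$$ISERR(sc) do $system.Status.DisplayError(sc)
```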

Pravin Barton · May 12, 2022

If the character stream has JSON-format contents and you'd like to read it into a dynamic entity in ObjectScript, you can simply pass the stream into the %FromJSON method:

set obj = ##class(%DynamicAbstractObject).%FromJSON(stream)

See the documentation for dynamic entity methods for more details.

Pravin Barton · Apr 22, 2022

Good question - it looks like the VS Code plugins only support password authentication for now. I'd encourage opening an issue against the InterSystems Server Manager GitHub project if you have a need for it. In theory this would be possible with some implementation work on the VS Code plugin. You would also need to enable delegated authentication on the /api/atelier web application in IRIS with a custom ZAUTHENTICATE routine to support OAuth.

Pravin Barton · Apr 7, 2022

Note: if you are on a later IRIS version and you're not finding anything in ^%ISCLOG, the log entries have been moved from the ^%ISCLOG global into the ^ISCLOG global, and they are only accessible in the %SYS namespace. (The ^%ISCLOG switch still controls logging.) The commands to use it look like this:

set ^%ISCLOG = 2
// do something that will generate logs
set ^%ISCLOG = 0
set $namespace = "%SYS"
zwrite ^ISCLOG

The first version with this new log location is IRIS 2018.1.0.

Pravin Barton · Dec 2, 2021

Hi Prashanth,

Based on the error status, this looks like an invalid SSL certificate on the REST endpoint. The certificate name is "*.docmansandpit.com" but the server name is "api.ss1.docmansandpit.com". The certificate would need to have "*.ss1.docmansandpit.com" to cover that domain. I would try calling the same endpoint in another test client (or even a web browser) to see if it gives you the same certificate error. If so you would have to get in touch with the owner of that API to fix their certificate.

If you're only able to replicate the issue in your Caché instance I would contact InterSystems Support.

As a last resort, you can probably disable checking the server certificate by doing:

set httpRequest.SSLCheckServerIdentity = 0

This is not recommended because it's insecure, but it might be useful as a debugging tool.