Question
· Jan 4, 2022

Implement CI/CD with VSTS for an InterSystems IRIS REST API

Hi Team,

I want to implement a build/release pipeline for an InterSystems IRIS REST API with VSTS, without a Docker container or other tools.

Can you please provide a step-by-step guide for this?

Thanks,

Ankur Shah

Product version: IRIS 2019.1
Discussion (6)

Most CI/CD processes these days are container-based, and doing this without Docker makes the process much more complex. It's also not quite clear what you want to achieve.

In any case, this task is quite complex and depends heavily on what kind of application you have, how you build it right now, in some cases on the OS, and even on which other languages and technologies the application uses. You may contact me directly; I can help with this, as I have experience and knowledge in this area.

We are using Angular as the front end and InterSystems IRIS as the backend.
We created a CI/CD pipeline for the Angular project with VSTS, without a Docker container. We want to implement a CI/CD pipeline for InterSystems IRIS the same way.

The goal is to move our IRIS code from the staging server to the production server with the help of a CI/CD pipeline. Moreover, we don't have any experience with Docker and are not sure what additional infrastructure would be required to use Docker containers.

Hi.

I implemented a CI/CD pipeline for IRIS in AWS without a container!
I use CodeCommit, which is a Git service, and CodeDeploy, which is a deployment service.

When source code (.cls files) is pushed to CodeCommit, CodeDeploy pulls the source files from CodeCommit and deploys them to the application server.
The application server has IRIS installed and uses Interoperability to monitor the deployed files.

When Interoperability detects the files, it executes $SYSTEM.OBJ.DeletePackage(path) and $SYSTEM.OBJ.ImportDir(path, "*.cls;*.mac;*.int;*.inc;*.dfi", flag).
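The detection half of that flow (the Interoperability service watching the deploy directory) can be sketched in plain Python for illustration; the actual import is done by the $SYSTEM.OBJ calls above, and the extension list here just mirrors the ImportDir wildcard:

```python
import os

# Extensions matched by the ImportDir wildcard above ("*.cls;*.mac;*.int;*.inc;*.dfi")
SOURCE_EXTENSIONS = (".cls", ".mac", ".int", ".inc", ".dfi")

def find_deployable_files(path):
    """Return source files under `path` that the import step would pick up."""
    matches = []
    for root, _dirs, files in os.walk(path):
        for name in files:
            if name.lower().endswith(SOURCE_EXTENSIONS):
                matches.append(os.path.join(root, name))
    return sorted(matches)
```

In the real setup this polling is handled by an Interoperability file service, which then hands each detected path to DeletePackage/ImportDir.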

Might be a bit late to the party, but we use Studio project exports in one of our projects to create build artifacts, mainly because we work with customers that do not support containers or other methods of deployment.

Here is the snippet:

ClassMethod CreateStudioExport() As %Status
{
    #Dim rSC As %Status
    #Dim tSE As %Exception.StatusException
    #Dim tProject As %Studio.Project
    Try {
        Set tRelevantFiles = ..GetAllRelevantFiles()
        Set tProject = ##class(%Studio.Project).%New()
        Set tProject.Name = "My Studio Export"
        // Add every relevant source item to the project
        Set tIterator = tRelevantFiles.%GetIterator()
        While tIterator.%GetNext(.key, .classToAdd) {
            Write "Adding "_classToAdd_" to project export",!
            $$$ThrowOnError(tProject.AddItem(classToAdd))
        }
        $$$ThrowOnError(tProject.%Save())
        ZWrite tProject
        // Export the whole project to a single XML file
        $$$ThrowOnError(tProject.Export("/opt/app/studio-project.xml", "ck", 0, .errorLog, "UTF-8"))
        Set rSC = $$$OK
    } Catch tSE {
        // errorLog is only populated if Export() ran and reported details
        If $Data(errorLog) { ZWrite errorLog }
        Set rSC = tSE.AsStatus()
    }
    Quit rSC
}

ClassMethod GetAllRelevantFiles() As %DynamicArray
{
    // Call into the embedded Python module (files.py) and parse its JSON result
    Set tt = ##class(%SYS.Python).Import("files")
    Set string = tt."get_all_cls"("/opt/app/src/src")
    Return ##class(%DynamicArray).%FromJSON(string)
}

Here is the Python script:

import os
import json  # Used to gather relevant files during a build-pipeline step

def normalize_file_path(file, directory):
    # Remove the leading source directory so the remainder maps to the item name
    class_name = file[len(directory):].replace("\\", ".").replace("/", ".")
    if class_name.startswith("."):
        class_name = class_name[1:]
    return class_name

def is_relevant_file(file):
    file_lower = file.lower()
    return file_lower.endswith(".cls") \
        or file_lower.endswith(".inc") \
        or file_lower.endswith(".gbl") \
        or file_lower.endswith(".csp") \
        or file_lower.endswith(".lut") \
        or file_lower.endswith(".hl7") 

def get_all_cls(directory):
    # Walk the source tree, keep only deployable files, and hand the
    # normalized item names back to ObjectScript as a JSON array
    all_files = [os.path.join(root, name)
                 for root, _dirs, names in os.walk(directory)
                 for name in names]
    relevant = filter(is_relevant_file, all_files)
    normalized = [normalize_file_path(file, directory) for file in relevant]
    print(normalized)
    return json.dumps(normalized)
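To make the normalization rule concrete, here is a self-contained copy of normalize_file_path with a worked example of the path-to-item-name mapping (the paths are illustrative):

```python
def normalize_file_path(file, directory):
    # Strip the source root, then turn path separators into dots so the
    # result matches the item name IRIS expects.
    class_name = file[len(directory):].replace("\\", ".").replace("/", ".")
    if class_name.startswith("."):
        class_name = class_name[1:]
    return class_name

print(normalize_file_path("/opt/app/src/src/My/Pkg/Thing.cls", "/opt/app/src/src"))
# → My.Pkg.Thing.cls
```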

It is all rather hacky, and you will probably have to use the snippets I provided as a basis and implement the rest yourself.

What we do is:

  1. Spin up a Docker container with Python enabled in the build pipeline, with the source files mounted to /opt/app/src
  2. Execute the CreateStudioExport() method in said docker container
  3. Copy the newly created studio export to the build pipeline host
  4. Tag the studio export as artifact and upload it to a file storage
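For the original VSTS/Azure DevOps question, the four steps above roughly map to a pipeline definition like the following. This is only a hedged sketch: the image name (my-iris-python-image), the class name (Build.Utils), and the iris session invocation are assumptions to replace with your own, and the Export path in CreateStudioExport would need to point into the mounted output directory.

```yaml
# Sketch of an azure-pipelines.yml for the steps above (names are assumptions)
steps:
  # Steps 1 + 2: run the export inside a container with the sources mounted
  - script: >
      docker run --rm
      -v $(Build.SourcesDirectory):/opt/app/src
      -v $(Build.ArtifactStagingDirectory):/opt/app/out
      my-iris-python-image
      iris session IRIS "##class(Build.Utils).CreateStudioExport()"
    displayName: Create Studio export

  # Steps 3 + 4: the export lands in the mounted staging directory;
  # publish it as a build artifact
  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: $(Build.ArtifactStagingDirectory)/studio-project.xml
      ArtifactName: studio-export
```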

Maybe this helps! Let me know if you have questions!