Mine doesn't do that - yet. Still using HealthShare 2020.1. I assume it'll be added with a later version!

^FSLOG(10) = "DispatchRequest^HS.FHIRServer.Service^3245|Msg|Dispatch interaction search-type for MedicationAdministration"
^FSLOG(11) = "BuildIncludeList^HS.FHIRServer.Service^3245|_include|include: "
^FSLOG(12) = "BuildRevIncludeList^HS.FHIRServer.Service^3245|_include|revInclude: "

As for the polling - I guess with a timestamp maybe added in a future version (we're planning to upgrade soon), this may very well be the smartest solution. Thanks!

Might be a bit late to the party, but we use Studio project exports in one of our projects to create build artifacts, mainly because we work with customers that do not support containers or other deployment methods.

Here is the snippet:

ClassMethod CreateStudioExport() As %Status
{
    #Dim rSC As %Status
    #Dim tSE As %Exception.StatusException
    #Dim tProject As %Studio.Project
    Try {
        set tRelevantFiles = ..GetAllRelevantFiles()
        set tProject = ##class(%Studio.Project).%New()
        set tProject.Name = "My Studio Export"
        set tIterator = tRelevantFiles.%GetIterator()
        while tIterator.%GetNext(.key, .classToAdd) {
            write "Adding "_classToAdd_" to project export",!
            // Add each file to the project so it ends up in the export
            $$$ThrowOnError(tProject.AddItem(classToAdd))
        }
        zwrite tProject
        $$$ThrowOnError(tProject.Export("/opt/app/studio-project.xml", "ck", 0, .errorLog, "UTF-8"))
        Set rSC = $$$OK
    } Catch tSE {
        zwrite errorLog
        Set rSC = tSE.AsStatus()
    }
    Quit rSC
}

ClassMethod GetAllRelevantFiles() As %DynamicArray
{
    set tt = ##class(%SYS.Python).Import("files")
    set string = tt."get_all_cls"("/opt/app/src/src")
    return ##class(%DynamicArray).%FromJSON(string)
}

Here is the Python script:

import json
import os

# Used to gather relevant files during a build pipeline step!

def normalize_file_path(file, directory):
    # Strip the directory prefix and turn path separators into dots
    # so the result looks like a class/item name
    class_name = file[len(directory):].replace("\\", ".").replace("/", ".")
    if class_name.startswith("."):
        class_name = class_name[1:]
    return class_name

def is_relevant_file(file):
    file_lower = file.lower()
    return file_lower.endswith((".cls", ".inc", ".gbl", ".csp", ".lut", ".hl7"))

def get_all_cls(directory):
    all_files = [
        os.path.join(root, name)
        for root, _dirs, names in os.walk(directory)
        for name in names
    ]
    relevant = filter(is_relevant_file, all_files)
    normalized = [normalize_file_path(f, directory) for f in relevant]
    return json.dumps(normalized)
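To illustrate what the normalization produces, here is a small self-contained demo (the temp directory and the `My/Pkg/Demo.cls` file are made up for the example):

```python
import os
import tempfile

def normalize_file_path(file, directory):
    # Same logic as in the script above: strip the directory prefix
    # and turn path separators into dots
    class_name = file[len(directory):].replace("\\", ".").replace("/", ".")
    if class_name.startswith("."):
        class_name = class_name[1:]
    return class_name

# Build a throwaway source tree with one .cls file in it
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "My", "Pkg"))
cls_path = os.path.join(root, "My", "Pkg", "Demo.cls")
open(cls_path, "w").close()

print(normalize_file_path(cls_path, root))  # My.Pkg.Demo.cls
```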

It is all rather hacky, and you will probably have to use the snippets I provided as a basis and implement the rest yourself.

What we do is:

  1. Spin up a Docker container with Python enabled in the build pipeline, with the source files mounted to /opt/app/src
  2. Execute the CreateStudioExport() method in said Docker container
  3. Copy the newly created Studio export to the build pipeline host
  4. Tag the Studio export as an artifact and upload it to file storage

Maybe this helps! Let me know if you have questions!

Are you referring to log output in the "Log" panel of the Production?

As far as I am aware, these are written with the $$$LOGSTATUS, $$$LOGINFO, etc. macros, which simply write to Ens.Util.Log,

which you can then access via SQL or via the global. So, if you wanted to delete everything for a specific production item, you could do something like this:

select * from Ens_Util.Log where ConfigName = 'Error Notification';
delete from Ens_Util.Log where ConfigName = 'Error Notification';

where ConfigName is the name of the service, process, or operation.

Hope that helps!

I'll just repost @Dmitry Maslennikov's grep from the community Discord here, which might give you a hint where to look until ISC updates the official statement:

$ grep -ir log4j /usr/irissys/
/usr/irissys/lib/RenderServer/runwithfop.bat:rem set LOGCHOICE=-Dorg.apache.commons.logging.Log=org.apache.commons.logging.impl.Log4JLogger
Binary file /usr/irissys/dev/java/lib/h2o/h2o-core-3.26.0.jar matches
Binary file /usr/irissys/dev/java/lib/uima/uimaj-core-2.10.3.jar matches
Binary file /usr/irissys/dev/java/lib/1.8/intersystems-integratedml-1.0.0.jar matches
Binary file /usr/irissys/dev/java/lib/1.8/intersystems-cloudclient-1.0.0.jar matches
Binary file /usr/irissys/dev/java/lib/1.8/intersystems-cloud-manager-1.2.12.jar matches
Binary file /usr/irissys/dev/java/lib/datarobot/datarobot-ai-java-2.0.8.jar matches
/usr/irissys/fop/fop:# LOGCHOICE=-Dorg.apache.commons.logging.Log=org.apache.commons.logging.impl.Log4JLogger
/usr/irissys/fop/fop.bat:rem set LOGCHOICE=-Dorg.apache.commons.logging.Log=org.apache.commons.logging.impl.Log4JLogger
Binary file /usr/irissys/fop/lib/commons-logging-1.0.4.jar matches
Binary file /usr/irissys/fop/lib/avalon-framework-impl-4.3.1.jar matches
/usr/irissys/fop/lib/README.txt:    (Logging adapter for various logging backends like JDK 1.4 logging or Log4J)
Binary file /usr/irissys/fop/lib/pdfbox-app-2.0.21.jar matches

The Apache HTTP Server is not written in Java (see this StackExchange post).

The security exploit refers to a very popular Java logging implementation, log4j. Log4j is published under the Apache Foundation's name, but it is not to be confused with the Apache HTTP Server (also occasionally called httpd).

That said, you might want to check whether you are using any Java libraries in your InterSystems products via the Java Gateway, and whether they bundle log4j for logging. Also check whether log4j is directly on your Java classpath. What you are looking for is the log4j.jar.

If you want to check a library, you can download its jar and open it with 7-Zip or a similar tool, then take a look and check whether it contains log4j.jar. If it does, you should get in touch with the creator of the library.
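If you would rather script that check, a jar is just a zip archive, so a small Python sketch like the following can scan one for bundled log4j classes or a nested log4j jar (the function name and the simple "log4j in the entry name" heuristic are my own; the demo jars are fabricated for illustration):

```python
import os
import tempfile
import zipfile

def contains_log4j(jar_path):
    # A .jar is a zip archive; look for log4j class files or a
    # nested log4j jar (a shaded/bundled dependency) inside it
    with zipfile.ZipFile(jar_path) as jar:
        for name in jar.namelist():
            lower = name.lower()
            if "log4j" in lower and lower.endswith((".class", ".jar")):
                return True
    return False

# Demo with two tiny fake jars written to a temp directory
tmp = tempfile.mkdtemp()
bundled = os.path.join(tmp, "bundled.jar")
clean = os.path.join(tmp, "clean.jar")
with zipfile.ZipFile(bundled, "w") as z:
    z.writestr("org/apache/logging/log4j/core/Logger.class", b"")
with zipfile.ZipFile(clean, "w") as z:
    z.writestr("com/example/App.class", b"")

print(contains_log4j(bundled))  # True
print(contains_log4j(clean))    # False
```

Note that this only catches jars that keep "log4j" in their entry names; a relocated (repackaged) copy would slip past it, so treat it as a first pass, not proof of absence.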

Disclaimer: I am not part of InterSystems, so this is of course not an official statement. I am just a Java developer who had to deal with this a bit today!


I think the recommended way to expose data nowadays would be via a REST API. The web application itself would use a framework of your choice. There is lots to choose from: Angular, React, Vue, Web Components...you name it!

It should be noted that CSP does exist and is still supported, though its usage is discouraged. See here.

So, the main part to be done would be to write REST APIs for the data you want to access. There is good documentation to be found here.

Once you have that, you can take a look at the REST API template from the intersystems-community.

In terms of security, you might want to check out this post about OAuth: https://community.intersystems.com/post/oauth-authorization-and-intersys....

For general ObjectScript knowledge, you might want to try the learning path about developing server-side applications here.

You can also launch a lab instance (an IRIS instance you can try things out in) from here.

In general, checking out the Open Exchange for example applications is, of course, always a good idea!

I do believe I found a decent solution to my own problem.

  • Rename the Angular index.html to index.csp. In angular.json, set the architect.build.options.index property to src/index.csp
  • Set the AutoCompile property of the Security.Applications web application to 1
  • Set ServeFiles to 3

That way, the index.html effectively serves as a CSP page. No changes are made to the content of the index.html; IRIS seems to be happy with taking a plain HTML file as CSP. AutoCompile makes sure that the index.csp is compiled when needed (useful for local development).
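For reference, the angular.json change from the first bullet would look roughly like this (the project name "my-app" is a placeholder and the surrounding structure is abbreviated; treat it as a sketch):

```json
{
  "projects": {
    "my-app": {
      "architect": {
        "build": {
          "options": {
            "index": "src/index.csp"
          }
        }
      }
    }
  }
}
```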

Yeah I think we'll be utilizing the OS auth for this!

Regarding the use case: Part of our team uses Studio on the "old" develop-on-remote-instance style of development. We built a process that grabs the code from the remote instance to deploy it to the next stage. It's an intermediary solution to building actual docker images instead of just pushing code into docker images.

Basically, a job would execute a git pull, then open an iris session, and execute a manifest updater that will pull the source code from a mounted volume. Right now, this is done manually because of said password prompt.

Hey! I had the same issue. Apparently, this is an ongoing issue with Compose: https://github.com/docker/compose/issues/8186

Someone there posted a workaround for VS Code. So, in addition to your workaround, you could add this to your settings.json in VS Code (same workaround, different environment):

    "terminal.integrated.env.windows": {