Since version 1.93 of VS Code, the “import and compile” process has been very slow (more than an hour). Have you encountered this issue and do you know how to resolve it?
We have installed IRIS on a separate server that we access remotely and copied all code and data to it so we can test IRIS before we do the move. We develop directly on the server.
I don't know why I can't get into the Management Portal. I saw something about a permission issue with the group allowed to start and stop the instance, and I have already changed it to root and Admin, but it still doesn't work.
I'm starting to explore OAuth2 and, as a first step, I want to set up an OAuth2 authentication with Microsoft. I've created a small sample page that attempts to sign in with Microsoft. However, after entering the user credentials, the redirection doesn't work.
I have installed the latest version of IRIS (without a web server) to replace my Community version with an embedded web server. I tried to connect Visual Studio Code to my namespace, but I am unable to do so; I keep receiving the message "Not found." Here is my configuration:
I've recently updated the Python version on a Linux server running Red Hat Enterprise Linux 8.10 (Ootpa). We have a 2023.1 instance running there, and whenever I run $System.Python.Shell() I can see it is still pointing to the old version. From within Linux, the latest one runs (we've changed all the links to the new 3.11, so no scripts are broken).
So I guess the problem comes from the fact that irispython is still compiled against the old Python version. What can I do to force IRIS to use the current version on the server, or to update the irispython file?
I ran into two issues with Cache to XML Export and Cache to JSON Export with regard to array sequences. Before I waste time opening a WRC ticket, I figured I would poll the Developer Community, since there is always so much wonderful feedback and so many suggestions here! Thanks in advance for everyone's input! Go Team!
I am converting an XML message into an HL7 message. The input XML contains a PDF, which is converted to Base64, mapped to OBX:5.5 in the HL7 message, and sent downstream.
In the downstream service I am using the standard HL7 TCP class EnsLib.HL7.Service.TCPService, but the message looks like the example below, and I am not sure why the stream is being treated as another segment in the HL7 message.
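One thing that can produce exactly this symptom is line breaks inside the Base64 text: HL7 framing treats a carriage return as a segment separator, so any CR/LF embedded in the encoded PDF splits the message. A minimal sketch of stripping them before mapping, assuming the encoded text sits in a variable named b64 and the outbound message in target (both names are illustrative, not from the original post):

    // remove CR and LF characters from the Base64 string before it goes into OBX:5.5
    Set b64 = $TRANSLATE(b64, $CHAR(13,10))
    Do target.SetValueAt(b64, "OBX:5.5")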
Hello, I want to create a PDF from an HTML source. I found pandoc and installed it in the IRIS container image. I created an Interoperability production and set up a REST service to receive the HTML file in the request body. From a BPL process I call the pandoc command pandoc -o output.pdf input.html, copy the output.pdf file stream into the response body, and save the response at the source. I get a file named output.pdf, but it does not load in Acrobat. I suspect I am doing something wrong with the headers (Accept-Encoding?), or do I perhaps need to Base64-encode the PDF file to transfer it via REST?
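For illustration only, here is a minimal sketch of returning the generated PDF as raw binary with an explicit Content-Type instead of Base64-encoding it. The method name and file path are placeholders, not the poster's actual code; the idea is simply to link a binary stream to the file and write it to the device from a %CSP.REST dispatch method:

    ClassMethod GetPdf() As %Status
    {
        // serve the generated file as binary with a PDF content type
        Set %response.ContentType = "application/pdf"
        Do %response.SetHeader("Content-Disposition","attachment; filename=""output.pdf""")
        Set stream = ##class(%Stream.FileBinary).%New()
        // placeholder path to the file written by pandoc
        Set sc = stream.LinkToFile("/tmp/output.pdf")
        If $$$ISERR(sc) Quit sc
        Do stream.OutputToDevice()
        Quit $$$OK
    }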
Hello, I am trying to develop a REST interface where I need to interact with legacy MUMPS routines. How can I pass input to a READ without modifying the legacy code?
I think in Linux I can execute command < inputfile to read from a file, but how does that work in ObjectScript?
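As a rough sketch of one technique (file path and routine name are placeholders): READ consumes input from the current device, so opening a file device and issuing USE on it before calling the legacy routine makes its READ commands consume lines from that file. Note that any WRITE in the routine will also go to that device while it is in use:

    Set file = "/tmp/answers.txt"
    Open file:("R"):5
    // timed OPEN sets $TEST; bail out if the file could not be opened
    If '$TEST Quit
    Use file
    // READs inside ^LEGACY now read successive lines from the file
    Do ^LEGACY
    Close file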
I'm a newbie with Docker and containers, and I am trying to install IRIS and VS Code in a container with Docker on Linux.
My understanding is that I first need to install Docker, create a container, and then install the IRIS image. I looked into this documentation (Container How-to | InterSystems IRIS Data Platform 2024.2), but it got me lost, so I guess I need baby steps:
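Purely as an illustration of the very first step, pulling and running a Community Edition container usually looks something like the commands below. The image name and tag are examples (check the InterSystems Container Registry for current tags), and the ports shown are the usual defaults (1972 for the superserver, 52773 for the web server that VS Code and the Management Portal connect to):

    docker pull containers.intersystems.com/intersystems/iris-community:latest-em
    docker run -d --name iris -p 1972:1972 -p 52773:52773 containers.intersystems.com/intersystems/iris-community:latest-em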
How do I add the following to a stored procedure (Cache Studio)? See the sketch after this list.
1. Select from a few tables and insert the result into TABLEA.
2. Then select data from TABLEA, apply some SQL logic, and insert the results into TABLEB.
3. SELECT * FROM TableA UNION ALL SELECT * FROM TableB
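As a rough sketch of the shape this can take (class, source table, and column names are placeholders, not a drop-in solution): steps 1 and 2 fit naturally in a class method projected as a stored procedure using embedded SQL, and step 3 can then be run as an ordinary query, or wrapped in a class query if it also has to be callable as a procedure:

    ClassMethod BuildTables() As %Integer [ SqlProc ]
    {
        // 1. populate TABLEA from the source tables (source query is illustrative)
        &sql(INSERT INTO TABLEA (Col1, Col2)
             SELECT s1.Col1, s2.Col2
             FROM Source1 s1 JOIN Source2 s2 ON s1.ID = s2.ID)
        // 2. populate TABLEB from TABLEA, applying the extra SQL logic
        &sql(INSERT INTO TABLEB (Col1, Col2)
             SELECT Col1, Col2 FROM TABLEA WHERE Col2 > 0)
        // 3. SELECT * FROM TableA UNION ALL SELECT * FROM TableB is then a plain query
        Quit SQLCODE
    }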
My goal is to encrypt a communication using JWT tokens. I am developing on IRIS, and I need to generate a JWT token that will be used on an older version of Cache (so I have to use functions that are compatible with that older Cache version).
I wrote this code in IRIS:
Set username = "user-test123"
Set st = ##class(%OAuth2.Utils).TimeInSeconds($ztimestamp, 0)
Set et = ##class(%OAuth2.Utils).TimeInSeconds($ztimestamp, 60*15)
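Just to sketch where those values usually end up: on the IRIS side the claims are typically assembled into a dynamic object like the one below. The issuer value is made up, and the signing step is deliberately omitted because it depends on which JWT/JWKS support the target Cache version has:

    Set claims = {}
    Set claims.sub = username
    Set claims.iat = st
    Set claims.exp = et
    // hypothetical issuer, replace with your own
    Set claims.iss = "https://my-issuer.example"
    Write claims.%ToJSON()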
When using $zt($P($H,",",2)) from the terminal it gives the correct local server time, but when using it in my Cache code it gives an incorrect time (a 5-hour difference).
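One quick check that sometimes explains a fixed offset of a few hours is comparing the local clock ($HOROLOG) with the UTC clock ($ZTIMESTAMP) in the same environment where the code actually runs, to see whether the code path is really using $H:

    Write "Local: ", $ZTIME($PIECE($HOROLOG,",",2)), !
    Write "UTC:   ", $ZTIME($PIECE($ZTIMESTAMP,",",2)), !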
Currently we are exploring how to allocate additional disk space to our environment, as we have seen significant growth in our database files. We currently have 3 namespaces, each with a single IRIS.dat that contains both the globals and the routines.
Since we started down the route of keeping everything within a single IRIS.dat per namespace, is it practical, as we see growth, to split the current IRIS.dat for each namespace into one IRIS.dat for globals and one IRIS.dat for routines, in a mirrored environment?
I am currently part of a team that is developing an application using Microsoft PowerApps as the front end and IRIS as the back end. Effectively, the frontend screens, which are hosted on an Azure server, call a series of REST interfaces exposed by IRIS from a physical Microsoft server. During development we have not had any security in place, but now we need to secure the application using single sign-on. PowerApps relies on Microsoft Entra for its security, both LDAP and OAuth. Has anyone in the community connected IRIS to Microsoft Entra?
I have an API set up in IRIS which is secured using an IRIS authentication service, so there is a bearer token being passed down in the request header.
Using the FHIR DEMO, I have pieced together how to make a FHIR request using OAuth against an external FHIR repository. When I execute the Patient search (HS.FHIRServer.Interop.Request), I get an HS.FHIRServer.Interop.Response that has a QuickStream ID, which I then use to convert the QuickStream to a JSON dynamic object. If I do a trace on the raw JSON object, I am able to pull out single elements; however, I want to pull the raw JSON into a defined class structure.
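For that last step, a hedged sketch of one approach: open the QuickStream from the response and feed it straight to %JSONImport on a class of your own that extends %JSON.Adaptor and whose properties mirror the elements you need. The target class name here is invented, and response refers to the HS.FHIRServer.Interop.Response from the post:

    Set quickStream = ##class(HS.SDA3.QuickStream).%OpenId(response.QuickStreamId)
    // My.FHIR.Bundle is a hypothetical class extending %JSON.Adaptor
    Set bundle = ##class(My.FHIR.Bundle).%New()
    Set sc = bundle.%JSONImport(quickStream)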
I am trying to establish an HTTPS connection to a server using a %Net.HttpRequest object. I'm able to ping and curl the server via the command line. The issue I am running into is that I am able to establish a connection, but something seems to be going wrong with verification on the server side. For example, if I use the CheckSSLCN method on the server, it returns this error message:
ERROR #6155: Unable to verify SSL/TLS connected to correct system as no SSL certificate present for this socket.
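For reference, a bare-bones sketch of the client-side pieces that feed that check (the host and configuration name are placeholders): the request needs Https=1, an SSL/TLS configuration defined in the Management Portal, and SSLCheckServerIdentity if the server's CN/SAN should be verified:

    Set req = ##class(%Net.HttpRequest).%New()
    Set req.Server = "api.example.com"
    Set req.Port = 443
    Set req.Https = 1
    // name of an SSL/TLS configuration created in the Management Portal
    Set req.SSLConfiguration = "MySSLConfig"
    Set req.SSLCheckServerIdentity = 1
    Set sc = req.Get("/")
    Write $SYSTEM.Status.GetErrorText(sc),!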
My variable `check1` is a string. It is either the empty string (for an invalid/false answer) or a non-empty string for a valid/true input. If it is valid, I want to return it. I wrote this code:
In an HL7 business rule, I want to create a custom function that works on an HL7 message object (EnsLib.HL7.Message). I extended the Ens.Rule.FunctionSet class and have a class method that accepts an EnsLib.HL7.Message object:
ClassMethod myfunction(msg As EnsLib.HL7.Message) As %Boolean, so I can call myfunction(HL7) in the rule's condition.
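For completeness, the whole class usually looks something like the sketch below (the class name and the particular field check are placeholders); once it compiles, myfunction(HL7) appears in the rule editor's list of available functions:

    Class My.Rule.HL7Functions Extends Ens.Rule.FunctionSet
    {

    /// Returns 1 when the message matches some condition; the PID:8 check is only an example
    ClassMethod myfunction(msg As EnsLib.HL7.Message) As %Boolean
    {
        Quit (msg.GetValueAt("PID:8") = "F")
    }

    }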