I need to make changes to OBX-5, but the message shows as immutable.
I have tried ConstructClone, ThrowOnError, and streams, but I can't get the syntax right.
Example:
OBX|1|TX|2000.02^REASON FOR REQUEST^AS4|142|REASON FOR REQUEST: Total Cost: 0.00||||||O
I want to remove "REASON FOR REQUEST: " from OBX-5 and add a CR/LF so downstream reports can be formatted more easily.
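A minimal sketch of one way this is often handled, assuming the code runs in a context where pRequest is the inbound EnsLib.HL7.Message and OBX(1) is the segment in question; the clone is what makes the message editable:

```
// Sketch only: clone the immutable inbound message, then edit OBX-5 on the copy.
// "pRequest" and the OBX(1) index are assumptions about the calling context.
Set tMsg = pRequest.ConstructClone(1)                    // deep clone = a mutable copy
Set tVal = tMsg.GetValueAt("OBX(1):5")
Set tVal = $Replace(tVal, "REASON FOR REQUEST: ", "")    // strip the unwanted prefix
Set tVal = tVal_$Char(13,10)                             // literal CR/LF; note many HL7 consumers expect the \.br\ escape in TX fields instead
Set tSC = tMsg.SetValueAt(tVal, "OBX(1):5")
If $$$ISOK(tSC) Set tSC = tMsg.%Save()
```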
I'm working with routes for a REST service in InterSystems IRIS. I have a working /test path, but I want to create a route that accepts any path passed after /test, for example /test/sdjklsbdk or /test/sdfkjgbskdbf/skjbksdb/ksdjbdks.
I used <Route Url="/test/(.*)" Method="POST" Call="test"/> and <Route Url="/test/:path" Method="POST" Call="test"/>, and in both cases I got a <PARAMETER> error. What could it be?
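For reference, a minimal sketch of a %CSP.REST dispatch class (the class and method names here are placeholders): each capture group in the Route Url, and each :token, is passed to the Call method as an extra argument, so a <PARAMETER> error usually means the target method does not declare that parameter.

```
Class My.REST.Handler Extends %CSP.REST
{

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
  <Route Url="/test/(.*)" Method="POST" Call="test"/>
</Routes>
}

/// The capture group value (everything after /test/) arrives as pPath.
ClassMethod test(pPath As %String = "") As %Status
{
    Write "path = ", pPath
    Quit $$$OK
}

}
```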
Is there a generic process for "walking" the structure of a virtual document, e.g. an HL7 message (EnsLib.HL7.Message) or an XML document (EnsLib.EDI.XML.Document)?
At least we'd want to be able to visit all "nodes" (HL7 fields or sub-fields, XML nodes) in the virtual document and be able to work out/generate the Property Path (so we could call "GetValueAt").
We can just about come up with something generic for HL7, since it only nests down to four levels within each segment, though at that point we're using numeric Property Paths rather than symbolic ones (MSH:1.3 etc.).
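For the HL7 side, a rough sketch of such a walker over numeric property paths (the fixed field and component bounds are arbitrary assumptions; a real walker would read the true counts from the message's DocType schema):

```
ClassMethod WalkHL7(pMsg As EnsLib.HL7.Message)
{
    For i=1:1:pMsg.SegCount {
        Write "Segment ", i, " (", pMsg.GetSegmentAt(i).Name, ")", !
        // 50 fields x 10 components is an arbitrary bound; a real walker would
        // query the message's DocType schema for the actual counts instead.
        For f=1:1:50 {
            For c=1:1:10 {
                Set path = i_":"_f_"."_c
                Set val = pMsg.GetValueAt(path)
                Write:val'="" "  ", path, " = ", val, !
            }
        }
    }
}
```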
I'm trying to create an indexed table with a vector field so I can search by the vector value. I've been investigating and found that to get the vector value from the text (token), you use a Python method like the following:
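Separately from how the embedding is produced in Python, the table and search side might look like this sketch (assuming IRIS 2024.1+ vector support; the class name, property names, and 384-element dimension are placeholders):

```
Class Demo.VectorDoc Extends %Persistent [ DdlAllowed ]
{

Property Text As %String(MAXLEN = "");

Property Embedding As %Vector(DATATYPE = "DOUBLE", LEN = 384);

/// Return the rows closest to a comma-separated embedding string.
ClassMethod Nearest(pEmbedding As %String, pTop As %Integer = 5)
{
    Set sql = "SELECT TOP ? ID, Text FROM Demo.VectorDoc"
    Set sql = sql _ " ORDER BY VECTOR_COSINE(Embedding, TO_VECTOR(?, DOUBLE, 384)) DESC"
    Set rs = ##class(%SQL.Statement).%ExecDirect(, sql, pTop, pEmbedding)
    While rs.%Next() { Write rs.%Get("ID"), ": ", rs.%Get("Text"), ! }
}

}
```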
I have an HL7 DTL in which I'm doing a lookup to a table based on a code value in the IN1:3 field. The incoming code may have a 1-to-1 or a 1-to-many mapping in the table; for 1-to-many entries the lookup values are comma-delimited. If it's 1-to-1, that IN1 segment maps straight across. If it's 1-to-many, I need to create additional IN1 segments: for example, if the incoming code maps to three values, I map the original IN1 segment with one of the mapped codes and create two additional IN1 segments with the other two codes, for a total of three IN1 segments. I'd like to know the best way to do this in the DTL.
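One rough sketch of logic that could sit in a DTL <code> action to build those repetitions (the lookup table name and field paths are assumptions, and only fields 1 and 3 are shown being populated):

```
// "mapped" holds the comma-delimited lookup result for the incoming IN1:3 code
Set mapped = ##class(Ens.Util.FunctionSet).Lookup("InsurancePlanCodes", source.GetValueAt("IN1(1):3"))
If mapped '= "" {
    For k=1:1:$Length(mapped, ",") {
        // one IN1 repetition per mapped code; copy any other IN1 fields the same way
        Do target.SetValueAt($Piece(mapped, ",", k), "IN1("_k_"):3")
        Do target.SetValueAt(source.GetValueAt("IN1(1):1"), "IN1("_k_"):1")
    }
}
```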
I was using VSCode to edit a DTL because it seemed easier to copy and paste code from other parts of the DTL I was editing. I tried to add an <sql> tag and code to call a SELECT statement, but when I compiled I got the following error:
ERROR <Ens>ErrInvalidDTL: Invalid DTL
ERROR #5490: Error running generator for method 'GetSourceDocType:osuwmc.Epic.MFN.DTL.EpicMFN949002Normalization'
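If the <sql> action keeps tripping the compiler when hand-edited, one workaround (a sketch only; the table, column, and HL7 paths are placeholders) is a <code> action whose body runs the SELECT as embedded SQL:

```
// body of a DTL <code> action (wrapped in CDATA in the DTL XML)
Set tCode = source.GetValueAt("MFE:4.1")
&sql(SELECT Description INTO :tDesc FROM osuwmc_Epic.LookupTable WHERE Code = :tCode)
If SQLCODE = 0 Do target.SetValueAt(tDesc, "MFE:4.2")
```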
Perhaps this is an issue that has long been discussed somewhere; I have searched but still could not find a solution. In our environment we send a container of SDA objects from HealthConnect to UCR. To populate the container we run a SQL query on a source system and then populate the appropriate SDA objects. Sometimes I want to update a field and empty the corresponding SDA field, in other words send an empty string. The Alert.toTime field of the object in question is filled with the date '2022-11-29Z13:00:00'.
I want to get rid of this account but there seems to be no way to do this by myself. Whom can I write to, to have my account deleted? Thank you in advance...
I am starting a conversion project for a health system that is currently migrating to Health Connect. They have their DEV and TST environments in one production and their PRD in another. What is the best practice for standing up conversion environments: separate namespaces in the two existing productions, or separate productions? Keep in mind that the interfaces are temporary.
I have two instances of IRIS, one Production and one Staging (both managed by Docker), and I want to set up a full daily restore of the Staging server from a full backup of the Production server. I know how to do this manually using the DBREST utility, and also how to copy the databases by copying the whole durable directory (although that option requires a full stop of the Production instance). What is the best way to automatically restore all databases from a full backup using scripting?
I need to query an external database and write the result set/snapshot to an internal %Persistent [ DdlAllowed ] table that I built. I have built inbound SQL Services before and written the results externally to replace SSIS jobs, but how would querying a database via a Service and writing the data to an internal table work?
Can I just take the inbound query structure and write it to the class file of the internal table in a DTL? If so, what would be the Target? Or does this need to be done within a BPL as a Code block?
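Neither a DTL nor a BPL is strictly required for this. A sketch of the most direct shape (the class, property, and column names are assumptions) is a service using EnsLib.SQL.InboundAdapter that writes each delivered row straight into the persistent class:

```
Class My.App.SQLToTableService Extends Ens.BusinessService
{

Parameter ADAPTER = "EnsLib.SQL.InboundAdapter";

Method OnProcessInput(pInput As EnsLib.SQL.Snapshot, Output pOutput As %RegisteredObject) As %Status
{
    // The inbound SQL adapter typically delivers one already-positioned row per call,
    // so columns can be read with Get("ColumnName") and saved directly.
    Set row = ##class(My.App.StagingRow).%New()     // the %Persistent [ DdlAllowed ] class
    Set row.PatientId = pInput.Get("PatientId")
    Set row.VisitDate = pInput.Get("VisitDate")
    Quit row.%Save()
}

}
```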
We are calling a REST web-service from Ensemble using EnsLib.HTTP.OutboundAdapter and redefining the adapter class to set custom headers as described by @Eduard Lebedyuk here: How to set Content-Type
Is there a way in ObjectScript to return the OS user of the superserver? I know %SYS.ProcessQuery can find this for a given process, but is there a clean way, independent of a specific process ID, to find the OS user used for background jobs?
For IRIS this is usually irisusr and for Caché usually cacheusr, but it may vary based on the installation and upgrade history of an instance. I would find it very useful to determine programmatically whether a process is running as this particular user when the username may vary.
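One hedged workaround, assuming (as the question implies) that background jobs run under the instance's service account: spawn a small background job and have it report its own OSUserName via %SYS.ProcessQuery. The class name and scratch global below are purely illustrative:

```
/// Illustrative helper only, not an official API.
Class My.Util.OSUser
{

ClassMethod BackgroundOSUser() As %String
{
    Kill ^zTmpOSUser
    Job ##class(My.Util.OSUser).Report()
    For i=1:1:50 { Quit:$Data(^zTmpOSUser)  Hang 0.1 }   // crude wait for the child job
    Quit $Get(^zTmpOSUser)
}

/// Runs in the background job and records the OS user it runs as.
ClassMethod Report()
{
    Set p = ##class(%SYS.ProcessQuery).%OpenId($Job)
    Set:$IsObject(p) ^zTmpOSUser = p.OSUserName
}

}
```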
The task is to find all globals that are referenced in certain routines. I could search for ^ using ##class(%Studio.Project).FindInFiles, but that would also find ^ in comments and function calls. I can distinguish a global from a function call visually, but it would be lovely to be able to skip function calls programmatically. Is that possible?
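A heuristic sketch only: scan each routine's source and report caret references that are not preceded by "$$" or by an identifier character, which filters out $$func^rtn and label^rtn calls (carets inside strings or comments, and DO ^rtn style calls, will still slip through):

```
ClassMethod FindGlobals(pRoutineName As %String)
{
    // e.g. pRoutineName = "MyRoutine.MAC"
    Set rtn = ##class(%Routine).%OpenId(pRoutineName)
    Quit:'$IsObject(rtn)
    Set matcher = ##class(%Regex.Matcher).%New("(?<!\$\$)(?<![A-Za-z0-9%])\^[A-Za-z%][A-Za-z0-9.]*")
    While 'rtn.AtEnd {
        Set line = rtn.ReadLine()
        Continue:line=""
        Set matcher.Text = line
        While matcher.Locate() {
            Write matcher.Group, !
        }
    }
}
```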
The query running on my custom SQL Inbound Service has columns that are larger than the typical string length. How do I enlarge the SQL Snapshot column size limitations?
I am building a new container with Docker Compose, using --volume ./:/irisdev/app:rw to mount a host volume (Ubuntu).
If I try to create some data in /irisdev/app, I have no write permission.
If I open a bash session on the host I can see all the mounted files of my ./ directory, but ls -l shows that only UID 100 has write permission. The UID for the "iris owner" is 52773.
Is there a way to set the UID and group to the "iris owner"?
I'm looking for a practical project guide to help me gain hands-on experience with InterSystems Ensemble HL7. Ideally, this guide would walk through building a small project — something that covers key concepts like message routing, transformations, and interoperability.
Looking for a SQL query or any other method to find the unique/distinct message counts for all productions, or at least per production namespace, within a given time frame, e.g. TimeCreated = January 2025 (the whole month).
I have used the following, but it's not restricting the numbers based on the TimeCreated filter. Every time a new message is processed by the system, it is added to the total. I am running the query on today's date.
Select Sum(MsgCount)
From
(Select DISTINCT TargetConfigName, count(DISTINCT SessionID) as MsgCount
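For reference, a hedged sketch of running a date-bounded version of that count from ObjectScript against Ens.MessageHeader (the January 2025 range, and the grouping by TargetConfigName, are assumptions about what the full query intends):

```
// Ens.MessageHeader.TimeCreated is stored in UTC, so the range may need adjusting
Set sql = "SELECT TargetConfigName, COUNT(DISTINCT SessionId) AS MsgCount FROM Ens.MessageHeader"
Set sql = sql _ " WHERE TimeCreated >= ? AND TimeCreated < ? GROUP BY TargetConfigName"
Set rs = ##class(%SQL.Statement).%ExecDirect(, sql, "2025-01-01 00:00:00", "2025-02-01 00:00:00")
While rs.%Next() { Write rs.%Get("TargetConfigName"), ": ", rs.%Get("MsgCount"), ! }
```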
What is the SQL table name for it? How can I obtain it via ObjectScript?
A quick search doesn't show any methods or properties for this. The documentation is a bit "wrong" here in saying that the SQL table name is the same as the class name; it will be at least 'x_y.z'.
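One way (a sketch) to read the projected schema and table name from the class dictionary, assuming the class is compiled; the class name below is a placeholder:

```
Set cls = ##class(%Dictionary.CompiledClass).%OpenId("My.Package.ClassName")
If $IsObject(cls) Write cls.SqlSchemaName, ".", cls.SqlTableName, !   // e.g. My_Package.ClassName
```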
I'm working on a requirement to loop through all encounter streamlets (SDA) to identify specific encounters, based on an encounter extension property, for a patient fetch request. However, this process is time-consuming, and we need to create indexes on that property so we can quickly retrieve the expected results without going through all of a patient's encounter streamlets.
I would appreciate help on how to achieve this, as I couldn't find any documentation explaining how to create indexes on an SDA element.
I need to read a directory on a remote server, which requires the user to be su.
The question is how to correctly read the server's response and then send the su password using the IRIS device I/O API (I'm able to read the output of other commands such as uname, but can't figure out how to switch to su):