I am attempting to set up a security role for our support team so they have read access to the production and its messages.
I have given the role RW rights on the resource associated with the database. However, when I log into Management Portal and select "Ensemble", the "Available Ensemble namespaces" list is empty.
What permissions do I need to set to be able to navigate to the production?
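For reference, this is roughly the equivalent of what I have set up, expressed as terminal commands; a minimal sketch, assuming Security.Roles.Create takes a comma-separated resource:permission list ("%DB_MYAPP" stands in for our database resource, and %Ens_Portal:U is my guess at the missing piece, which is really what I am asking about):

    // Security.Roles lives in %SYS; run from a privileged account
    new $namespace
    set $namespace = "%SYS"
    set sc = ##class(Security.Roles).Create("SupportReadOnly", "Read-only support access", "%DB_MYAPP:RW,%Ens_Portal:U")
    if $system.Status.IsError(sc) do $system.Status.DisplayError(sc)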
Hi, we are implementing an HIE for multiple hospitals, and a question came from the client about how to decide how many edge production ECRs should be created. Are there any guidelines that will help us decide? Also, if we combine two hospitals' data into one ECR, how do we decide which ones to combine together? Any guidelines will be appreciated.
I want to check the patient class in multiple rules. Since the message structures differ in where the patient class value lives, I created a method to get the patient class and put it in a variable that I can use in multiple rules. But somehow the function is not able to read the HL7 message. Any tips?
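For context, this is the shape of the function; a minimal sketch, assuming the rule passes the HL7 message in as Document and that PV1:2 is where the patient class lives in my structures (the class name is just an example):

    /// Custom functions made available to routing rules
    Class Demo.Rule.Functions Extends Ens.Rule.FunctionSet
    {

    /// Return the patient class from PV1:2, or "" if the message cannot be read
    ClassMethod GetPatientClass(pMsg As EnsLib.HL7.Message) As %String
    {
        if '$isobject(pMsg) quit ""
        quit pMsg.GetValueAt("PV1:2")
    }

    }

In the rule expression I call it as GetPatientClass(Document); what I cannot tell is whether my original version fails because something other than the message object reaches the function.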
I have to write a DTL with the Data Transformation Builder to convert messages from HL7 ORU R01 v2.1 to HL7 ORU R01 v2.5. The incoming messages contain text in OBX-5. This text contains LF characters (only LF; the segment separator is CR), so it is not possible to parse the incoming message: while testing the transformation, the OBX segment ends at the first occurrence of LF. Is there a way to replace the LF character before parsing?
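What I have been experimenting with, as a sketch: stripping the LF characters out of the raw message text and only then parsing it, assuming EnsLib.HL7.Message.ImportFromString is an acceptable way to do the parse in a test harness (the method below is illustration only, not my real service code):

    /// Replace bare LF characters with spaces, then parse the cleaned text
    ClassMethod ParseWithoutLF(pRaw As %String) As EnsLib.HL7.Message
    {
        // $char(13) stays as the segment separator; only $char(10) is translated away
        set tClean = $translate(pRaw, $char(10), " ")
        set tMsg = ##class(EnsLib.HL7.Message).ImportFromString(tClean, .tSC)
        if $system.Status.IsError(tSC) {
            do $system.Status.DisplayError(tSC)
            quit ""
        }
        quit tMsg
    }

What I would really like is the equivalent hook at the business service or DTL level, so the replacement happens before the production parses the message.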
DeadJobAlert: Job 'X478915' for config item 'Business Service' was marked as 'dead' under ghost Id 'sys:ZX478915_6117'
This is the error message I get, but can you advise how I can start debugging this? I look at the jobs and I just see "dead"; I can't see any information on the message itself.
We are facing what seems to be a network problem while transferring HL7 messages from Ensemble/HealthShare to a remote target over TCP/IP.
Here is the version of the system, in case it is useful: Cache for Windows (x86-64) 2017.2.1 (Build 801U) Wed Dec 6 2017 09:07:51 EST [HealthShare Modules:Core:14.02.2415 + Linkage Engine:15.03.9901]
I have multiple files with different columns. The first 9 values are fixed, so I want to ignore the first value and combine the next 8 values into one value using the ^ sign.
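For a single input line this is essentially what I am after; a sketch assuming the values are comma-delimited (adjust the delimiter to match the real files):

    // line = one row from the file; piece 1 is ignored, pieces 2-9 are joined with "^"
    set line = "id,a,b,c,d,e,f,g,h,whatever"
    set combined = ""
    for i=2:1:9 {
        set combined = combined_$select(i=2:"",1:"^")_$piece(line,",",i)
    }
    write combined,!   // prints a^b^c^d^e^f^g^h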
I'm trying to get my VS Code instance that is connected to an AWS IRIS instance to edit/save/compile .csp files, but it's failing to work and I'm not sure why. The ".csp" extension is associated with objectscript-csp, and the server is connected, but things just don't act like they are enabled.
Should this work? And if so, what might I have missed in configuring things?
Starting a new project with %JSON.Adaptor, I unexpectedly realized that %JSON.Adaptor does not support export to a native JSON object. %JSONExport just outputs directly to the current device, and there are two more methods, %JSONExportToString and %JSONExportToStream.
This is in conjunction with generating REST from a swagger specification, where every generated method accepts a %DynamicObject as its result, which is good.
I have multiple places in my REST implementation where I have to return JSON for an object, but I have to modify the result a bit, just extend it in some way.
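The workaround I am using for now, as a sketch: export to a string and re-parse it into a %DynamicObject so it can be extended before being returned (the property name below is just an example):

    // obj is an instance of a class that uses %JSON.Adaptor
    set sc = obj.%JSONExportToString(.json)
    if $system.Status.IsError(sc) do $system.Status.DisplayError(sc)
    // re-parse so the result can be modified before returning it
    set dyn = ##class(%DynamicAbstractObject).%FromJSON(json)
    do dyn.%Set("extraField", "extra value")

If there is a more direct way to get a %DynamicObject out of %JSON.Adaptor, that is really what I am asking for.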
A simple question: the View Global Data page always shows globals in ascending subscript order. Very often I need to see the latest page; is there any trick for this?
Update: I mean seeing the last subscripts in the Management Portal, not doing it in code.
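Just to be clear, I know how to do it in code with a reverse $ORDER (sketch below, with ^MyGlobal as a placeholder); the question is about the View Global Data page itself:

    // last top-level subscript of ^MyGlobal, then walk backwards
    set sub = $order(^MyGlobal(""), -1)
    while sub '= "" {
        write sub, " = ", $get(^MyGlobal(sub)), !
        set sub = $order(^MyGlobal(sub), -1)
    }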
Do you know how to create workflow users and roles programmatically? I use Docker for test deployments and I need to set up IRIS Interoperability using the install script.
Or maybe you know how to import/export already existing workflow users and roles?
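For context, this is the shape of what my install script does so far; I am assuming EnsLib.Workflow.UserDefinition / EnsLib.Workflow.RoleDefinition are the right classes and that AddUserToRole has this signature, which is exactly the part I would like confirmed:

    // names are examples; run in the interoperability-enabled namespace
    set user = ##class(EnsLib.Workflow.UserDefinition).%New()
    set user.Name = "test.user"
    do user.%Save()
    set role = ##class(EnsLib.Workflow.RoleDefinition).%New()
    set role.Name = "Demo Role"
    do role.%Save()
    // assumed signature: AddUserToRole(roleName, userName)
    do ##class(EnsLib.Workflow.RoleDefinition).AddUserToRole("Demo Role", "test.user")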
I am trying to pull data from CacheDB and push it into Elasticsearch using Logstash. In the configuration file I am giving the following, but it throws the error "No Suitable Driver Found for jdbc:Cache://ipaddress:port/namespace". Could anyone please help resolve this? I tried both JDK17 and JDK18 but no luck.
I have a project to filter only certain pathology results into a downstream system.
Within an HL7 router and business rule I was planning on using a lookup table and either Exists() or Lookup(), but I am having issues when using them with repeating fields or segments.
For example, if I perform the analysis per stated segment using {} brackets, this works, as each stated repeat is assessed:
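The direction I am leaning in for the repeating cases is a helper function that walks every repeat itself; a sketch, assuming the (*) path suffix returns the repeat count and that Exists() can be reused from the FunctionSet base class (both assumptions I would like checked):

    Class Demo.Rule.RepeatFunctions Extends Ens.Rule.FunctionSet
    {

    /// Return 1 if any repeat of the given field (e.g. "OBR:4") is a key in the lookup table
    ClassMethod AnyRepeatInTable(pMsg As EnsLib.HL7.Message, pPath As %String, pTable As %String) As %Boolean
    {
        set tFound = 0
        set tCount = +pMsg.GetValueAt(pPath_"(*)")   // number of repeats
        for i=1:1:tCount {
            set tValue = pMsg.GetValueAt(pPath_"("_i_")")
            if ..Exists(pTable, tValue) set tFound = 1 quit
        }
        quit tFound
    }

    }

In the rule condition that would become AnyRepeatInTable(Document, "OBR:4", "MyLookupTable"), but I would prefer to stay with plain Exists()/Lookup() if the repeat handling can be expressed there.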
Hello, I need to use IRIS to connect to an MSSQL database. It has to be done via ODBC; I can't use JDBC at this time, per the client's decision.
I am trying to use the Microsoft driver libmsodbcsql-13.1.so.9.2.
But I can't; my attempts result in: Connection failed. SQLState: () NativeError: [11001] Message:
I have done all the DSN configuration, and my configuration is listed in SQL Gateway Connections. I know the DSN itself works, because when I run a test with isql it reports that it connects to the database.
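From the Caché/IRIS side, this is how I understand the gateway can be exercised; a sketch, assuming %SQLGatewayConnection is the right class to test with (the DSN name and credentials are placeholders):

    // quick SQL Gateway connectivity test from a terminal
    set gc = ##class(%SQLGatewayConnection).%New()
    set sc = gc.Connect("MyMSSQLDSN", "user", "password", 0)
    if $system.Status.IsError(sc) {
        do $system.Status.DisplayError(sc)
    } else {
        write "Connected",!
        do gc.Disconnect()
    }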
I am developing a viewer for Crystal Reports using Crystal Reports for Visual Studio (CR13SP26). I have also installed the latest ODBC drivers for Cache, but when I connect some reports to the Cache database using a connection string, I get an error saying it failed to retrieve data from the database, and it reports Database Vendor Code 30. Has anyone used Crystal Reports connecting via a connection string and received this error? If so, how did you correct it?
Languages like Java and C++ allow you to develop a multi-threaded program with two or more parts that run concurrently, each part handling a different task at the same time and making optimal use of the available resources, especially when your computer has multiple CPUs. Is this possible in ObjectScript? If yes, where can I find a good sample or application?
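A sketch of the kind of thing I mean, using the Work Queue Manager ($system.WorkMgr), which I understand is the supported way to fan work out across multiple processes (the JOB command being the lower-level alternative); the worker class and method here are hypothetical, and I am not sure I have the API exactly right:

    // fan 8 chunks of work out to worker processes and wait for all of them
    set queue = $system.WorkMgr.%New()
    for i=1:1:8 {
        // Demo.Work:DoChunk is a hypothetical class method that processes chunk i
        set sc = queue.Queue("##class(Demo.Work).DoChunk", i)
        if $system.Status.IsError(sc) do $system.Status.DisplayError(sc)
    }
    set sc = queue.WaitForComplete()
    if $system.Status.IsError(sc) do $system.Status.DisplayError(sc)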
I am going to develop an ASP.NET Core application. How can I use IRIS with Entity Framework in it? I searched but I couldn't find an IRIS Entity Framework provider for .NET Core. Please kindly help me overcome this issue.
I am looking for any pointers on how InterSystems IRIS for Health can monitor a filesystem/folder into which users or applications drop CSV files via FTP, and load the files into the IRIS database. I understand that I will need to create a record map for the CSV files. I am looking for any configuration references on how to process files using file inbound adapters, with the intent to pick up the CSV files as they are dropped in the target location, pass them to a business process, and ingest them into the IRIS database.
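From the documentation, EnsLib.RecordMap.Service.FileService (configured with the file inbound adapter and the generated record map class) looks like the intended route; as an illustration of the general file-pickup pattern I have in mind, a bare-bones custom service using EnsLib.File.InboundAdapter would look roughly like this (class and target names are examples, not working configuration):

    /// Picks files up from the adapter's FilePath and forwards them as a stream
    Class Demo.CSV.FileService Extends Ens.BusinessService
    {

    Parameter ADAPTER = "EnsLib.File.InboundAdapter";

    Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
    {
        // wrap the file stream so it can be sent as a production message
        set tRequest = ##class(Ens.StreamContainer).%New(pInput)
        // "Demo.CSV.Process" is an example business process name
        quit ..SendRequestAsync("Demo.CSV.Process", tRequest)
    }

    }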