I was at an HL7 Connectathon over the weekend and got into a scramble that headed us in the direction of trying out Preview 4 of IRIS for Health (I4H). We found that the USER namespace, and subsequent namespaces we created, do not have any mappings included with them.
By default the navigator shows the total record count. How can I change that, or have the navigator show only the number of records on the current page? Or is there a property of tablePane that returns the number of records on the current page?
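For illustration, this is roughly what I have in mind on the client side; I am assuming tablePane exposes rowCount, pageSize, and currentPage as client-side properties (those are the names I see in the class reference, so please correct me if they are wrong):
/// Hypothetical client-side helper; "table" is the id of my tablePane
ClientMethod showCurrentPageCount() [ Language = javascript ]
{
    var tp = zen('table');
    // rowCount can come back as a string (sometimes with a trailing "+"), so parse it
    var total = parseInt(tp.rowCount, 10);
    var pageSize = parseInt(tp.pageSize, 10);
    var page = parseInt(tp.currentPage, 10);
    // number of rows actually shown on the current page
    var onThisPage = Math.min(pageSize, total - ((page - 1) * pageSize));
    alert('Rows on this page: ' + onThisPage);
}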
We have recently encountered an issue that requires us to define a new identity field (named xMDA in the attached example) instead of using the default ID field.
We need to run an SQL query which unfortunately overrides the ID field (see attached image), but we still need to be able to access the ID field in that query.
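For what it's worth, this is roughly what we are attempting; my understanding is that InterSystems SQL exposes the row ID through the %ID pseudo-field even when a custom identity field shadows ID, but I am not sure that applies in our case (the table name below is a placeholder for our actual table):
SELECT %ID AS RowId, xMDA  -- %ID should still return the underlying row ID
FROM MySchema.MyTable      -- placeholder table name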
I am trying to do a rewrite using the webgateway-nginx Docker container, but I am getting an "unknown directive 'sub_filter'" message. How can I add additional nginx modules to a webgateway nginx container?
We recently moved from using the Private Web Server to an Apache/Web Gateway setup, and also moved to the built-in LDAP functionality within IRIS. Since then, one user who relies heavily on VS Code (/api/atelier) continues to have issues signing in to IRIS through VS Code and the /api/atelier extension.
I am busy trying out the %UnitTest.TestProduction class to implement some automated production testing.
I have a scenario where I am not sure how to get hold of the final results for assertion. Below is the scenario I am trying to test, with comments on what I have done.
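Just to show the shape of what I have so far (the production, target, and message class names below are placeholders, not my real scenario), I am working from a skeleton along these lines; EnsLib.Testing.Service.SendTestRequest is what I use to push a test message in, although the exact signature may differ between versions:
Class MyApp.Test.ProductionTest Extends %UnitTest.TestProduction
{

/// The production these tests run against (placeholder name)
Parameter PRODUCTION = "MyApp.TestingProduction";

Method TestSendMessage()
{
    // Build a test request and send it synchronously to a business host (placeholder names)
    set request = ##class(MyApp.Msg.Request).%New()
    set request.Value = "test"
    set status = ##class(EnsLib.Testing.Service).SendTestRequest("MyApp.Process", request, .response, .sessionId, 1)
    do $$$AssertStatusOK(status, "Test request sent and a response returned")
    // Assert on whatever the final result should be
    do $$$AssertTrue($IsObject(response), "Received a response object")
}

}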
The documentation for configuring Kerberos authentication for IRIS on Linux servers is sparse, and for Docker I found no docs at all. Assuming I could adapt the Linux requirements to Docker (on a Linux host), I have had no success at all. Has anyone successfully done this?
I have been trying to restore from tape using the D ^DBREST command. I am not able to connect to the tape drive, which was recently replaced. It is configured, and I can see it with IBM's ITDT. I ran a test and it came back with the error below:
Are there any suggestions on how to fix this, or which commands to use to get the drive opened and connected? I do appreciate it.
I have been testing FHIR and IRIS OAuth2, but I have noticed that the callback (redirect URI) I configured in the client config gets called multiple times with the same code.
I restore the “dirty” copy of the IRIS.DAT files and then apply the incremental backup, as documented for Online Backup. As shown in the picture, restarting the service before mounting and rejoining the database to the mirror can restore the mirror database to the point in time of the dirty data. I believe the mount method used here is the default, and I don't want it to perform a mirror catchup. Is there any other way to recover the mirrored database to the point in time of the dirty data?
We are embarking on a project in which we are ingesting raw EDI files (837s to start with) into HealthShare (HS).
We have an inbound X12 adapter to take in the raw *.edi file, and a business process that is mostly pass-through. We have been unable to find any DTL to map X12 to SDA/FHIR (similar to the ones that exist for HL7, CDA, and CCD). If anybody has done anything on this front, we would appreciate any tips.
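To illustrate what we were hoping to find (or will otherwise have to build ourselves), this is the kind of DTL skeleton we have in mind; the loop/segment path and the 837 DocType are hypothetical and would need to match the actual X12 schema in use, and the SDA3 target property is just an example:
Class MyApp.DTL.X12837ToSDA Extends Ens.DataTransformDTL [ DependsOn = (EnsLib.EDI.X12.Document, HS.SDA3.Container) ]
{

XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='EnsLib.EDI.X12.Document' targetClass='HS.SDA3.Container' create='new' language='objectscript' >
<!-- set sourceDocType to the 837 schema in use; the loop/segment path below is hypothetical -->
<assign value='source.{Loop2010BA.NM1:NameLastOrOrganizationName}' property='target.Patient.Name.FamilyName' action='set' />
</transform>
}

}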
We would like to share a question with you, and we would appreciate it if you could read and answer us.
Currently we have a file titled "Imagen PatientID 9358340 PatientName Milagros ReasonForStudy 350290 InstitutionName 350290 StudyDate 20220927.xml", in which we have the following InstitutionName:
Hello, I am using the Power BI Report Builder tool to create a report with dynamic parameters. Every time I try to create and apply a parameter in the Power BI query editor, I get the error message listed below.
FYI: I have created a data source connection using "InterSystems ODBC", version 2015.01.00.429. I am getting data without any issue.
Query: select Payment_History.OCCURENCE_DATE from Payment_History where Payment_History.OCCURENCE_DATE >= @BeginDate
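From what I have read, ODBC-based data sources in Report Builder expect a positional ? placeholder (mapped to the report parameter on the dataset's Parameters tab) rather than an @Name parameter, so the query would look something like the following; I am not sure whether that is the right approach for the InterSystems driver:
select Payment_History.OCCURENCE_DATE
from Payment_History
where Payment_History.OCCURENCE_DATE >= ?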
I'm trying to build a cube based on a linked table, but it seems that IRIS is not able to do it :O
Long story short: I have a linked table in IRIS that sources a Microsoft SQL Server table (using the standard linked-table feature from the portal). It works fine; I can access it using SQL as I have many times before. On top of that, I've created in DeepSee (OK, Analytics) a cube that uses this class as its source. It compiles correctly, with no errors. When I build it with 100 records, all goes well, and using Analyzer I can see results.
First of all, thank you very much in advance for reading and responding.
Also, thank you for any support; it is a real relief and help to have people with more understanding, knowledge, and practice reading and thinking about this question.
There is the following need:
Two circuits are available:
1st: the DICOM circuit of "Studio" (the classic service).
I'm a new user learning to use IRIS and Ensemble. I'm trying to set up a TCP interface to send delimited data from Ensemble to another interface engine. I created a File.PassthroughService to pick up the file and send the data to a TCP.Framed.PassthroughOperation. The framing is MLLP, and an SSL configuration is used. It is able to process small files of around 50 KB. When I drop a larger file, such as 5 MB, the operation does not get the ACK within the 60-second timeout.
I am having an issue where a file is not being deleted from the FTP server, and an error is returned to that effect. The file, however, was fully processed. This causes the file to be processed again.
My question: I seem to recall a flag that can be set to enable more detailed tracing on the FTP service/adapter, but I have not been able to locate it. Does anyone recall such a logging setting, or was I just dreaming?
Sorry, I'm not that familiar with using JavaScript in Zen pages. Basically, we have a printing utility (app) that requires us to add a bit of a JavaScript call to facilitate printing from our Zen pages, as below:
So I added an XData link block and pasted in the required script. Is this the correct way?
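For reference, the two mechanisms I could find in the Zen class documentation are the JSINCLUDES parameter (to pull in an external .js file) and a client-side method, so I assume the intent is something roughly like the sketch below, where the include file and the startPrintJob function are placeholders for our printing utility:
Class MyApp.ZenPrintPage Extends %ZEN.Component.page
{

/// Pull the vendor's script in as an external include (placeholder file name)
Parameter JSINCLUDES = "printutility.js";

/// Client-side wrapper that calls the vendor's print function (hypothetical name)
ClientMethod printCurrentPage() [ Language = javascript ]
{
    if (typeof startPrintJob == 'function') {
        // startPrintJob is the hypothetical function exposed by the printing utility
        startPrintJob(window.document);
    }
}

}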
Thank you for taking the time to read and answer this question.
We need to find out how to display an EnsLib.DICOM.Document in the traces using $$$LOGINFO.
We have tried to use:
// Serialize the DICOM document to a string so it can be logged
set writer=##class(%XML.Writer).%New()
set status=writer.OutputToString()
if $$$ISERR(status) $$$LOGERROR($System.Status.GetErrorText(status))
set status=writer.RootObject(..DocumentFromService)
if $$$ISERR(status) $$$LOGERROR($System.Status.GetErrorText(status))
set xml=writer.GetXMLString()
$$$LOGINFO("..DocumentFromService as XML: "_xml)
I'm getting a lot of hs_err_pid*.mdmp and hs_err_pid*.txt error files in the path where Cache.DAT is located. From what I've googled, these seem to be Java (JVM crash) error files. I'm wondering what this has to do with Ensemble, and whether it is alright to just delete them.