We have messages coming in like the ones below. When a message's lines end with \n, the router in production throws an error, but when the message comes in with \r\n line endings, the router does not throw an error.
I have a process that takes data from a CSV file (actually a record mapper object) and creates a nicely formatted JSON string, which I would love to send along to a RESTful business operation. However, no matter what I try, I keep getting <INVALID OREF> errors when trying to populate the object that extends Ens.Request with the JSON string.
I can add strings, other objects, you name it, but stuffing a JSON-formatted variable/object into another object I want to send somewhere is proving to be impossible.
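For reference, this is roughly what I'm attempting; the class and property names below (Demo.JSONRequest, JSONPayload, My.REST.Operation) are placeholders, not my real code:

Class Demo.JSONRequest Extends Ens.Request
{
Property JSONPayload As %String(MAXLEN = "");
}

And in the business host that builds the JSON:

 // build the JSON string from the record map object (simplified)
 Set json = "{""name"":""Smith"",""dob"":""1970-01-01""}"
 // populate the request message and hand it to the REST operation
 Set request = ##class(Demo.JSONRequest).%New()
 Set request.JSONPayload = json
 Set tSC = ..SendRequestSync("My.REST.Operation", request, .response)

Somewhere in the equivalent of those last few lines is where the <INVALID OREF> shows up.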
I would like to install the Caché Evaluation version for learning purposes. Could you please share the download link? I appreciate any help you all can provide. Thanks.
Have you ever had to convert HL7v2 messages to FHIR (Fast Healthcare Interoperability Resources) and found the process complicated and confusing? InterSystems is rolling out a new cloud-based SaaS offering called InterSystems FHIR Transformation Service, which makes the process easy. We are excited to announce an Early Access Preview Program for our new offering, and we would love to have you kick the tires and let us know what you think! All you need is a free AWS account, with an S3 bucket to drop in your HL7v2 messages and another S3 bucket to receive your FHIR output.
Is calling the Bind method of %SYS.LDAP, with the username, domain, and password of the user who needs to be authenticated, the right way to authenticate him/her?
Also, am I correct in assuming that something like this is independent of (and I don't need to specify settings for) System Security -> LDAP Options?
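For context, this is roughly the call sequence I have in mind (the server name, port, and account format are placeholders, and I may well be misusing the API):

 // connect to the LDAP / AD server (hostname and port are placeholders)
 Set LD = ##class(%SYS.LDAP).Init("ldap.example.com", 389)
 // attempt a simple bind with the credentials the user supplied
 Set Status = ##class(%SYS.LDAP).SimpleBinds(LD, "MYDOMAIN\jsmith", password)
 If Status = 0 {
     // 0 means the bind, and therefore the authentication, succeeded
     Write "Authenticated", !
 } Else {
     Write "Bind failed: ", ##class(%SYS.LDAP).Err2String(Status), !
 }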
I am reading an X12 document into my production that needs to be processed and returned as a CSV file. I have created a record map to support the fields I want to extract, with a batch class to store headers. I have a DTL mapping the data to the appropriate fields in the record map and am sending the record map to an EnsLib.RecordMap.Operation.BatchFileOperation.
Working in support, I often get asked how many days journals should be kept. Should it be two days, or until after two backups? More? Less? Why two?
The correct answer (for most environments) is that you should keep the journals written since the last validated backup. That is, until you have checked that a backup is valid (by restoring the file and running the Integrity utility against it), you cannot be sure you have a good copy of your data, and you cannot purge the journals safely.
I'm facing a database growth issue, which is being generated by a process and an Ensemble behavior.
When the message queue cleanup process runs, Ensemble "preserves" the streams that were part of those messages, deleting only the header and body. As a result, the database (of one of the namespaces) has been growing by around 60 GB per day, which has been maxing out the disk capacity.
InterSystems informed us that this is expected behavior and that it is explained in the documents mentioned below.
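For reference, the queue cleanup we run is essentially the standard Ensemble purge task, along these lines (the retention and options shown are illustrative, not our exact values):

 // run the standard Ensemble message purge programmatically
 Set purge = ##class(Ens.Util.Tasks.Purge).%New()
 Set purge.TypesToPurge = "messages"       // message headers and bodies
 Set purge.BodiesToo = 1                   // delete the message bodies as well
 Set purge.KeepIntegrity = 1               // keep sessions that are still in progress
 Set purge.NumberOfDaysToKeep = 7
 Set tSC = purge.OnTask()

Even with BodiesToo enabled, the streams referenced by those bodies are what survive the purge and account for the growth described above.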
Here is an easy way to write and execute COS code from your unix scripts. This way you don't need to write routines or even open Studio or Atelier. It can be an option for simple and small actions, for instance installation tasks or compiling.
See the sample bash script (compile.sh) below, which compiles classes:
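A minimal sketch of what compile.sh could look like, assuming an instance named CACHE, the USER namespace, and a package called MyApp (all of these names are placeholders):

#!/bin/bash
# compile.sh - feed ObjectScript commands into a Cache terminal session
# so classes can be compiled without opening Studio or Atelier.
# The quoted heredoc delimiter stops bash from trying to expand $system.
csession CACHE -U USER << 'EOF'
 do $system.OBJ.CompilePackage("MyApp","ck")
 halt
EOF

Everything between the heredoc markers is executed exactly as if it had been typed at the terminal prompt, so the same pattern works for installation steps or other small one-off tasks.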
InterSystems has decided to stop further development of the InterSystems IRIS Natural Language Processing, formerly known as iKnow, technology and label it as deprecated as of the 2023.3 release of InterSystems IRIS. InterSystems will continue to support existing customers using the technology, but does not recommend starting new development projects outside of the core text exploration use cases it was originally designed for.
I am in the process of trying to implement version control software with Studio. Does anyone have any recommendations (either Linux-based or Windows-based) as a place to start? I have installed GitLab, and I wanted to know whether anyone has come across any obstacles using it.
I was also wondering whether anyone has developed any hooks for GitLab that work well with Studio, as I would prefer a more integrated solution with Studio. Any help with this would be great.
Hello! We want an integration that moves files from an FTP server in a DMZ zone to another FTP server on our local network. I tried using EnsLib.FTP.PassthroughService (EnsLib.FTP.InboundAdapter, EnsLib.FTP.OutboundAdapter). Using this approach, Ensemble writes data to the database, causing the CACHE.DAT to grow for every file that is moved. It looks like the entire file is written to the database; is this the case? We are not really interested in storing any file content in Ensemble in this particular case.
This article is aimed at developers implementing DICOM productions, specifically for cases with third-party endpoints that cannot handle the DIMSE timeout themselves.
Hello, I am writing to request assistance with an issue I appear to be having when accessing Ensemble. I have it running on a Windows virtual machine on a Mac laptop, and am trying to access it through the emergency ID account. When starting Ensemble from the command line using ccontrol start ENSEMBLE /Em... I get an error and Ensemble does not start. Below is the error message I am getting when checking the logs:
I know the process ID, and I know the global name: ^||testing(index). From the terminal (and therefore a different process ID), how do I view the contents of ^||testing()? Not the list of process-private globals; the contents of this single PPG.
I'm trying to get my VS Code instance that is connected to an AWS IRIS instance to edit/save/compile .csp files, but it's failing to work and I'm not sure why. The ".csp" extension is associated with objectscript-csp code, and the server is connected, but things just don't act like they are enabled.
Should this work? And if so, what might I have missed in configuring things?
I was editing the properties of a persistent ObjectScript class the other day and noticed that the storage definition wasn't updating to reflect my latest changes.
In this case, I deleted a property that was no longer needed from the class definition, saved, recompiled, and still saw it in the storage definition.
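To illustrate with a made-up class (all names here are hypothetical): after deleting a property such as ObsoleteField and recompiling, the storage definition still lists it, roughly like this:

Storage Default
{
<Data name="PatientDefaultData">
<Value name="1">
<Value>%%CLASSNAME</Value>
</Value>
<Value name="2">
<Value>Name</Value>
</Value>
<Value name="3">
<Value>ObsoleteField</Value>
</Value>
</Data>
<DataLocation>^Demo.PatientD</DataLocation>
<DefaultData>PatientDefaultData</DefaultData>
<IdLocation>^Demo.PatientD</IdLocation>
<IndexLocation>^Demo.PatientI</IndexLocation>
<StreamLocation>^Demo.PatientS</StreamLocation>
<Type>%Storage.Persistent</Type>
}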
I'm looking to find out whether there is a datatype conversion equivalent in ObjectScript to the SQL CONVERT function. I have a VARBINARY string coming in from a source application (which is really performing a SQL dump). The source application uses the standard SQL CONVERT function to convert from VARCHAR to VARBINARY on its side.
I know &sql(CONVERT()) should work in ObjectScript, but am wondering if there is a better way of doing this.
Getting data in via a flat file (Record Map), then using a data transform to transpose this data to SDA3.
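For reference, the embedded SQL route I had in mind looks roughly like this (the VARBINARY length and variable names are only examples, and I'm not sure it is the idiomatic approach):

 // convert an incoming character value to VARBINARY via embedded SQL
 Set src = "value received from the source application"
 &sql(SELECT CONVERT(VARBINARY(512), :src) INTO :bin)
 If SQLCODE = 0 {
     Write "converted, length = ", $Length(bin), !
 } Else {
     Write "conversion failed, SQLCODE = ", SQLCODE, !
 }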
Most of my classes are mapped from globals. I want to access Caché classes from a BI tool through an ODBC connection.
'Last update' information does not exist in most of the classes. My question is whether there is a 'last update' timestamp, automatically generated for each row in these classes, that I can extract to external systems.
This earlier article already announced the new iKnow REST APIs included in the 2017.1 release, but since then we've added extensive documentation for those APIs through the OpenAPI Specification (aka Swagger), which you'll find in the current 2017.1 release candidate. Without wanting to repeat much detail on how the APIs are organised, this article will show you how you can consult that elaborate documentation easily with Swagger-UI, an open-source utility that reads OpenAPI specs and uses them to generate a very helpful GUI on top of your API.
Welcome to NewBie's Corner, a weekly or biweekly post covering basic Caché Material.
MUMPS versus Caché: what's the difference?
MUMPS was developed at Massachusetts General Hospital during the 1960s. Through a series of experiences and companies over the years, MUMPS eventually evolved into Caché. Some deny this, but the facts are there. You can read through the various websites, along with Wikipedia, and make up your own mind. The simplest way to explain it is that Caché is a superset of MUMPS.