I'm trying to create a method that will automatically generate something I can save and use later, which will let me automate data migration from one version of a class to the next.
I need to process a very large batch file containing HL7 charges, which we pull using FTP. I need to loop through the file, do some checks, and then split the large batch file into individual messages. These individual messages then get processed through a different module to create an output batch in a different format.
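A minimal sketch of one way to do the splitting, assuming the batch is a plain-text file with one segment per line and that each message starts with an MSH segment; the file path and the ProcessMessage handler are hypothetical:

```objectscript
// Read the batch line by line and cut it into messages at each MSH segment.
// Batch envelope segments (FHS/BHS/BTS/FTS) could be checked in the loop too.
Set file = ##class(%Stream.FileCharacter).%New()
Set sc = file.LinkToFile("/data/charges.hl7")  // hypothetical path
If $$$ISERR(sc) Quit
Set msg = ""
While 'file.AtEnd {
    Set line = file.ReadLine()
    If ($EXTRACT(line,1,3)="MSH") && (msg'="") {
        Do ..ProcessMessage(msg)  // hypothetical handler for one message
        Set msg = ""
    }
    Set msg = msg_line_$CHAR(13)
}
If msg'="" Do ..ProcessMessage(msg)
```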
The following post is a guide to implementing a basic architecture for DeepSee. This implementation includes a database for the DeepSee cache and a database for the DeepSee implementation and settings.
The following post outlines an architectural design of intermediate complexity for DeepSee. As in the previous example, this implementation includes separate databases for storing the DeepSee cache, DeepSee implementation and settings. This post introduces two new databases: the first to store the globals needed for synchronization, the second to store fact tables and indices.
I created a DTL to do HL7 mapping. The test tool runs the DTL perfectly, but when it is used by the rule in my business process, the OBX segments are stripped and the MRN is gone. The assigning authority and ID type are added to the PID, but the actual patient MRN (the 3.1 value) is blank.
I have set up an async reporting mirror member with read-only access. My problem is that if I try to do any SQL reporting against that data, I get errors. I am sure this is because the DB is read-only, but I had assumed that setting up a reporting mirror would handle this.
I am trying to use a data transformation (DTL) to map JSON to SDA. The elements in my source JSON are not one-to-one with the SDA objects, which means I have to add code that loops through these objects to complete the mapping. Can someone share a sample I can look at to create this? I am not very comfortable with the scripting language used in HealthShare. I appreciate your help.
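For the looping part, here is a minimal sketch, assuming the source arrives as a JSON stream containing an "observations" array; the JSON field names and the single value assignment are hypothetical, so adjust them to the actual payload and to the HS.SDA3.Observation properties needed:

```objectscript
// Parse the JSON and loop over the array, creating one SDA Observation per entry.
Set json = ##class(%DynamicObject).%FromJSON(jsonStream)
Set iter = json.observations.%GetIterator()   // "observations" is hypothetical
While iter.%GetNext(.idx, .item) {
    Set obs = ##class(HS.SDA3.Observation).%New()
    Set obs.ObservationValue = item.value     // hypothetical source field
    Do container.Observations.Insert(obs)     // container is an HS.SDA3.Container
}
```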
Is there a way to copy namespace mappings from another namespace to a new namespace programmatically, with a command?
When using the Management Portal, there is an option to choose an existing namespace to copy the mappings from; I'm basically looking for the equivalent from the command line / COS.
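There is no single built-in "copy mappings" call that I know of, but the Config classes can be scripted from %SYS. A minimal sketch, assuming the Config.MapGlobals List query and Create() method (routine and package mappings can be copied the same way via Config.MapRoutines and Config.MapPackages):

```objectscript
// Run in %SYS: copy global mappings from one namespace to another.
New $NAMESPACE
Set $NAMESPACE = "%SYS"
Set src = "SOURCENS", dst = "TARGETNS"       // hypothetical namespace names
Set rs = ##class(%ResultSet).%New("Config.MapGlobals:List")
Do rs.Execute(src)
While rs.Next() {
    Kill props
    Set props("Database") = rs.Get("Database")
    Set sc = ##class(Config.MapGlobals).Create(dst, rs.Get("Name"), .props)
}
```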
I am looking for a mapping from SDA collections to the HS Analytics (HSAA) data model, specifically from HS.SDA3.Container.Observations to the corresponding tables (I couldn't find all the fields in HSAA.Observation). Can someone help? Thanks.
Inspired by the article "Declarative development in Caché" that's still trending on the Developer Community: the OP explored a functional style of iterating over a collection, and a comment today suggested that "Caché would need syntax support for anonymous functions".
With macros you can get a kind of anonymous-function-like syntax using dot notation.
This is not production code, but it does work. First the macros...
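The macros themselves were trimmed from this excerpt. As a rough sketch of the idea (not the OP's original macros): in recent Caché versions, $XECUTE can already act as an inline lambda, and a macro can wrap it:

```objectscript
// A one-argument "lambda": the string is code with a formal parameter list.
#define Lambda(%body) "(x) Quit "_%body

// Usage: apply the anonymous function inline.
Write $XECUTE($$$Lambda("x*x"), 5)   // prints 25
```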
I have a text file that is fixed-width delimited, and I am using a BPL to process this file, ultimately performing a transform from the text file to an HL7 message. I created a DTL mapping from the RecordMap to the HL7. In my BPL I am performing some loops and other logic (all of which works). My issue is what to do when I perform the transform.
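In case it helps to see the call itself: every DTL class exposes a Transform() classmethod, so inside a BPL <code> activity the invocation can be as simple as the sketch below (the class name and context properties are hypothetical); BPL also has a dedicated <transform> activity that does the same thing declaratively.

```objectscript
// Inside a BPL <code> activity: run the DTL against the current record.
Set tSC = ##class(MyPkg.RecordToHL7).Transform(context.Record, .tHL7)
If $$$ISERR(tSC) Quit
Set context.HL7Message = tHL7
```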
The following post outlines a more flexible architectural design for DeepSee. As in the previous example, this implementation includes separate databases for storing the DeepSee cache, the DeepSee implementation and settings, and the synchronization globals. This example introduces one new database to store the DeepSee indices. We will redefine the global mappings so that the DeepSee indices are not mapped together with the fact and dimension tables.
Sometimes a global mapping for the same set of globals can be defined in different ways. For example, I need to define mappings for three globals, ^qAuditC, ^qAuditLog, and ^qAuditLogC, all from the same database named APP-NOJOURN. Which approach is better from a performance point of view?
1) qAudit* => APP-NOJOURN (one record in the global mapping table), or
2) qAuditC => APP-NOJOURN, qAuditLog => APP-NOJOURN, qAuditLogC => APP-NOJOURN (three records in the global mapping table)
I am inserting rows into a table. This table appears in all namespaces because I set up a global mapping.
So when I run the insert command from a method, it inserts the rows. When I run the same insert command from another namespace, it replaces the existing data in the table.
The insert command is the same in every namespace, but the data I am inserting is different.
Is there an out-of-the-box or accepted standard method for loading up mappings between different code sets and then referencing these mappings (in both directions) from DTL? My first thought was the built-in Lookup() and the corresponding data lookup tables, but these only work in one direction (key -> value), not the reverse. Obviously I can build my own classes to support a two-way mapping, but I am wondering whether there is a standard way of achieving this. The mapping should contain the code and display name from each of the code sets and allow mapping based on either code or display name.
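I'm not aware of a bidirectional lookup shipped out of the box; data lookup tables are one-way, as noted. A minimal sketch of a hand-rolled persistent class (all names hypothetical), indexed on both sides so DTL code can resolve by code or display name in either direction:

```objectscript
Class App.CodeSetMap Extends %Persistent
{

/// Code and display name in the source code set
Property SourceCode As %String;
Property SourceDisplay As %String;

/// Code and display name in the target code set
Property TargetCode As %String;
Property TargetDisplay As %String;

Index SourceCodeIdx On SourceCode;
Index SourceDisplayIdx On SourceDisplay;
Index TargetCodeIdx On TargetCode;
Index TargetDisplayIdx On TargetDisplay;

/// Map a source code to the target code; reverse lookups work the same
/// way through the other indices.
ClassMethod ToTarget(code As %String) As %String
{
    &sql(SELECT TargetCode INTO :out FROM App.CodeSetMap WHERE SourceCode = :code)
    Quit $SELECT(SQLCODE=0:out, 1:"")
}

}
```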
Is there a way to map a CSP page residing in namespace ABC into namespace XYZ, so you could access it as if executing from XYZ: http://localhost:57772/csp/xyz/MyPage.csp? Some odd cocktail of web application and package mappings that could make this happen?
The idea is to keep the CSP page in a sort of read-only namespace that contains only code, with the data residing in another namespace. This works for Zen pages, but not for CSP.
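One avenue that might be worth testing (an unverified sketch): web applications are defined in Security.Applications, and an application's execution namespace and physical CSP path are independent settings, so a /csp/xyz application could run in XYZ while serving the page files from ABC's directory:

```objectscript
// Run in %SYS: a sketch (untested assumption) creating a /csp/xyz application
// that executes in XYZ but serves CSP files from ABC's directory.
New $NAMESPACE
Set $NAMESPACE = "%SYS"
Set props("NameSpace") = "XYZ"
Set props("Path") = "C:\InterSystems\CSP\abc\"   // hypothetical ABC CSP path
Set sc = ##class(Security.Applications).Create("/csp/xyz", .props)
```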
Prompted by this post about accessing a global at its original location after you have changed a mapping, here's a tip about one specific dropdown in Portal that's sometimes useful.
How the Tax Service, OpenStreetMap, and InterSystems IRIS could help developers get clean addresses
Pieter Brueghel the Younger, Paying the Tax (The Tax Collector), 1640
In my previous article, we just skimmed the surface of objects. Let's continue our reconnaissance. Today's topic is a tough one. It's not quite BIG DATA, but it's still data that is not easy to work with: we're talking about a fairly large amount of data. It won't all fit into RAM at once, and some of it won't even fit on the drive (not for lack of space, but because there's a lot of junk). The name of our subject is the FIAS DB: the Federal Information Address System database, the database of addresses in Russia. The archive is 5.5 GB, a compressed XML file. After extraction, it will be a full 53 GB (set aside 110 GB for extraction). And when you start to parse and convert it, even those 110 GB won't be enough. There won't be enough RAM either.
I need advice on converting a comma-delimited string container with multiple records into some type of RecordMap that iterates through all the records.
My string container holds several records, and I would like to loop through them and transform each record in the container individually. The number of records will vary, but the number of fields per record is static (28 fields), meaning that after every 28 fields a new record begins. The goal is to convert them into individual delimited flat-file records.
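A minimal sketch of the splitting loop, assuming the container is a single comma-delimited string and every 28 pieces form one record (the variable and handler names are hypothetical):

```objectscript
Set fieldsPerRec = 28
Set total = $LENGTH(container, ",")          // number of comma-delimited pieces
For rec = 1:1:(total\fieldsPerRec) {
    // Four-argument $PIECE keeps the commas inside each 28-field record
    Set record = $PIECE(container, ",", (rec-1)*fieldsPerRec+1, rec*fieldsPerRec)
    Do ..ProcessRecord(record)               // hypothetical per-record handler
}
```

From there, each record string could be fed to a RecordMap or written out as one delimited flat-file line.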
I have a global whose structure is multi-level, and I am trying, through a class and an SQL query, to display a table that includes all the values and levels.
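If it helps as a starting point, $QUERY is the usual way to visit every node of a multi-level global before projecting it to SQL; a minimal sketch (^MyGlobal is a placeholder):

```objectscript
// Depth-first walk over every node that holds data, at any subscript level.
Set node = "^MyGlobal"
For {
    Set node = $QUERY(@node)
    Quit:node=""
    Write node, " = ", @node, !
}
```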
Currently, namespace Alpha is configured to use database AlphaDB as its global database. How would we go about configuring namespace Alpha to keep using database AlphaDB as its global database, except that where global ^Customers(CustomerId) has a CustomerId greater than 10M, we would like it redirected to database BetaDB?
In other words, ^|"AlphaDB"|Customers would contain all customers between 1 and 10,000,000, and ^|"BetaDB"|Customers would contain all customers greater than 10,000,000. Any help would be appreciated.
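This sounds like subscript-level global mapping. A sketch, run in %SYS, assuming the Config.MapGlobals API and the BEGIN/END range keywords (double-check the exact range syntax against the documentation for your version):

```objectscript
// Map ^Customers(10000001) through the end of the global to BetaDB;
// everything else stays in AlphaDB, the namespace's default global database.
New $NAMESPACE
Set $NAMESPACE = "%SYS"
Set props("Database") = "BetaDB"
Set sc = ##class(Config.MapGlobals).Create("ALPHA", "Customers(10000001):(END)", .props)
```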
I'm trying to set up a REST server with CORS support. I have created a class that handles a specific part of the service (printer control) and referenced it in my main REST class by adding <Map Prefix="/print" Forward="myClass.PrintAPI" />.
My main class does have its own <Route>s and handles CORS requests perfectly. But in my subclass, OnHandleCorsRequest() is only run when requesting from the same origin, and it is never run when making a CORS request. I have set Parameter HandleCorsRequest = "true" in both my main REST class and the subclass.
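For comparison, a minimal sketch of how the forwarded class might declare CORS handling; note that the class reference documents the parameter as the integer 1 rather than the string "true", which may be worth checking, and individual routes can opt in via Cors="true" (the route and method below are hypothetical):

```objectscript
Class myClass.PrintAPI Extends %CSP.REST
{

Parameter HandleCorsRequest = 1;

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
  <Route Url="/status" Method="GET" Call="GetStatus" Cors="true"/>
</Routes>
}

ClassMethod GetStatus() As %Status
{
    Write "{""status"":""ok""}"
    Quit $$$OK
}

}
```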