I've found a couple of methods that will tell me whether a package is mapped from another database, but not which database. Is there such a method/routine?
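One candidate, as a sketch: %SYS.Namespace exposes a family of Get*Dest methods that report where an item is mapped from. Assuming GetPackageDest behaves like its GetGlobalDest/GetRoutineDest siblings, something like this should name the source database (the package name here is hypothetical):

    // Ask which database the package in the current namespace is mapped from.
    // The return value encodes the implied database/directory for the package.
    Set dest = ##class(%SYS.Namespace).GetPackageDest($NAMESPACE, "MyMappedPackage")
    Write "Package maps from: ", dest, !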
Is there an out-of-the-box or accepted standard method for loading mappings between different code sets and then referencing those mappings (in both directions) from DTL? My first thought was the built-in Lookup() function and its corresponding data tables, but those only work in one direction (key -> value), not in reverse. Obviously I can build my own classes to support a two-way mapping, but I am wondering whether there is a standard way of achieving this. The mapping should contain the code and display name from each code set and allow mapping on either code or display name.
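Absent a built-in two-way lookup, a minimal sketch of a hand-rolled persistent class (all names hypothetical) that indexes both directions and both display names:

    Class App.CodeMap Extends %Persistent
    {

    /// Name of the code set pair this row belongs to, e.g. "LOINC-to-Local"
    Property MapName As %String [ Required ];

    Property SourceCode As %String;

    Property SourceDisplay As %String;

    Property TargetCode As %String;

    Property TargetDisplay As %String;

    Index BySource On (MapName, SourceCode);

    Index ByTarget On (MapName, TargetCode);

    Index BySourceDisplay On (MapName, SourceDisplay);

    Index ByTargetDisplay On (MapName, TargetDisplay);

    /// Look up the target code from either the source code or its display name.
    ClassMethod ToTarget(map As %String, value As %String) As %String
    {
        &sql(SELECT TargetCode INTO :out FROM App.CodeMap
              WHERE MapName = :map AND (SourceCode = :value OR SourceDisplay = :value))
        Quit $SELECT(SQLCODE=0:out, 1:"")
    }

    /// Reverse direction: target code or display name back to the source code.
    ClassMethod ToSource(map As %String, value As %String) As %String
    {
        &sql(SELECT SourceCode INTO :out FROM App.CodeMap
              WHERE MapName = :map AND (TargetCode = :value OR TargetDisplay = :value))
        Quit $SELECT(SQLCODE=0:out, 1:"")
    }

    }

Both methods are then callable straight from a DTL expression, e.g. ##class(App.CodeMap).ToTarget("LOINC-to-Local", source.Code).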
I created a record map and my DTL works fine. I would like to query one of the values in my input record-mapped class. How would I do that? I've tried request.Field1 and document.Field1 and can't seem to get my rule to work. Any help would be appreciated. Thanks.
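For reference, a sketch of the usual access paths (Field1 and the class names are hypothetical): the record map generates a Record class, and its fields are plain properties on it.

    // In ObjectScript, with request an instance of the generated
    // My.RecordMap.Record class, fields are ordinary properties:
    Set value = request.Field1
    // In a DTL the same value is source.Field1; in a routing rule it is
    // document.Field1, provided the rule's Message Class is set to the
    // generated Record class so the property path resolves.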
I need advice on converting a comma-delimited string container with multiple records into some type of record map that iterates through all of the records.
My string container holds several records, and I would like to loop through them and transform each record in the container individually. The number of records will vary, but the number of fields per record is static (28 fields), meaning a new record begins after every 28 fields. The goal is to convert them into individual delimited flat-file records, as in the sketch below.
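A minimal sketch of the chunking loop, assuming the container is an Ens.StringContainer holding one comma-delimited string:

    // Split a flat comma-delimited string into 28-field records and emit
    // each record as its own delimited line (pRequest is assumed to be an
    // Ens.StringContainer).
    Set data = pRequest.StringValue
    Set fieldsPerRec = 28
    Set total = $LENGTH(data, ",")
    For recStart = 1:fieldsPerRec:total {
        Set record = ""
        For i = 0:1:fieldsPerRec-1 {
            Set field = $PIECE(data, ",", recStart + i)
            Set record = record _ $SELECT(i=0:"", 1:",") _ field
        }
        // Hand each record line to the record map / downstream operation here
        Write record, !
    }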
If you are looking to breathe new life into an old MUMPS application, follow these steps to map your globals to classes and expose all that beautiful data to Objects and SQL.
This example is going to cram in four or five different things beyond what was covered in Part 1.
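To make the mapping concrete before diving in, here is a minimal, hypothetical sketch: a class whose storage is laid over a pre-existing global ^PERSON(id) = name_"^"_dob using %CacheSQLStorage (%Storage.SQL on IRIS). The global layout and all names are illustrative, not taken from a real application:

    Class Legacy.Person Extends %Persistent [ StorageStrategy = SQLStorage ]
    {

    Property Name As %String;

    Property DOB As %Date;

    /// Lays the class over ^PERSON(id) = name_"^"_dob: each property is a
    /// "^"-delimited piece of the node, and the subscript supplies the row ID.
    Storage SQLStorage
    {
    <SQLMap name="DataMap">
    <Data name="Name">
    <Delimiter>"^"</Delimiter>
    <Piece>1</Piece>
    </Data>
    <Data name="DOB">
    <Delimiter>"^"</Delimiter>
    <Piece>2</Piece>
    </Data>
    <Global>^PERSON</Global>
    <Subscript name="1">
    <Expression>{ID}</Expression>
    </Subscript>
    <Type>data</Type>
    </SQLMap>
    <StreamLocation>^Legacy.PersonS</StreamLocation>
    <Type>%CacheSQLStorage</Type>
    }

    }

Once compiled, SELECT ID, Name, DOB FROM Legacy.Person reads straight out of ^PERSON.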
Using Interoperability, I can't figure out how to create separate XML files from a CSV file using the GUI features Record Maps/Complex Record Mapper -> Data Transformations. I'm familiar with reading/writing the files using a File Service/Operation, but I don't understand the processing steps.
My colleagues' preferred method is to do this without any ObjectScript or Embedded Python coding, but if it can only be done with some coding, that's fine as well.
In my Data Transformation, the Target class needs to create a new list of objects (ListOfObj), depending on conditions in the Source class (Source and Target are completely different classes).
I experimented with lists of 'primitive' data types (ListOfDT), and I could add new %String items (as an example) to a list-of-%String property using the "append" action in the DT.
Does anyone have an example, or guidance on how to create new lists of objects in a data transformation?
For example, a 'container' class like the one sketched below works:
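A hypothetical stand-in with both flavors of list property, plus the Code-action pattern that the object case needs:

    Class Demo.Item Extends (%SerialObject, %XML.Adaptor)
    {

    Property Value As %String;

    }

    Class Demo.Container Extends (%SerialObject, %XML.Adaptor)
    {

    /// List of datatypes: the DTL "append" action on target.Names works directly.
    Property Names As list Of %String;

    /// List of objects: create and append each element in a DTL Code action:
    ///   set item = ##class(Demo.Item).%New()
    ///   set item.Value = source.Something
    ///   do target.Items.Insert(item)
    Property Items As list Of Demo.Item;

    }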
I have a global with a multi-level structure, and I am trying, through a class and an SQL query, to display a table that includes all the values and levels.
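The traversal itself is the standard nested-$ORDER pattern; a sketch for a hypothetical two-level global ^DATA (add one nested loop per additional level):

    // Walk ^DATA(level1, level2) = value and print one row per node.
    Set s1 = ""
    For {
        Set s1 = $ORDER(^DATA(s1))
        Quit:s1=""
        Write:$DATA(^DATA(s1))#2 s1, " -> ", ^DATA(s1), !
        Set s2 = ""
        For {
            Set s2 = $ORDER(^DATA(s1, s2))
            Quit:s2=""
            Write s1, " / ", s2, " -> ", $GET(^DATA(s1, s2)), !
        }
    }

To surface the same rows through SQL, these loops can back a custom class query (a %Query member with hand-written Execute/Fetch/Close methods).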
I'm trying to create a method that will automatically generate something I can save and use later, so that I can automate data migration from one version of a class to the next.
I am inserting rows into a table. This table appears in all namespaces because I set up a global mapping.
When I run the INSERT command from a method, it inserts the rows. When I run the same INSERT from another namespace, it replaces the existing data in the table.
The INSERT command is the same in every namespace, but the data I am inserting is different.
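That is the expected consequence of the mapping: with the global mapped, every namespace reads and writes the same physical global, so there is only one copy of the table's data, and rows that end up with the same ID overwrite each other. A quick check, as a sketch (the global name is hypothetical; take the real one from the class's storage definition):

    // From each namespace, ask where the table's data global physically lives.
    // Identical answers mean the namespaces share one copy of the table.
    Write ##class(%SYS.Namespace).GetGlobalDest($NAMESPACE, "App.MyTableD"), !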
I am migrating my web application from Caché 2013 to Caché 2016. In Caché 2013 I have an integration with a Java application using the Java Gateway, mapping proxy classes and consuming a method whose parameter is an object, and it works perfectly.
But in Caché 2016 this integration doesn't work: I send the parameter as an object, but Caché sends it as a String containing the object reference...
In Caché Studio there are features (dialog boxes) that help map data from a global to class properties. I have used %CacheSQLStorage quite a bit, or have in the past, to map globals to classes.
I haven't been able to find a similar feature in Visual Studio. Do I need to upgrade to IRIS to be able to use Visual Studio to map global properties to classes?
I am trying to use a data transformation (DTL) to map a JSON document to SDA. The elements in my source JSON are not one-to-one with the SDA objects, which means I have to add code that loops through these objects to complete the mapping. Can someone share a sample I could start from? I am not very comfortable with the scripting language used in HealthShare. I appreciate your help.
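A sketch of the looping part only, assuming the JSON has already been parsed into a %DynamicObject and using made-up field names throughout:

    // Loop over a JSON array and append one SDA3 Allergy per element.
    // pJson is a %DynamicObject produced by %FromJSON; "allergies", "code",
    // and "display" are assumed names, not from the real payload.
    Set container = ##class(HS.SDA3.Container).%New()
    Set iter = pJson.%Get("allergies").%GetIterator()
    While iter.%GetNext(.idx, .item) {
        Set allergy = ##class(HS.SDA3.Allergy).%New()
        Set allergy.Allergen = ##class(HS.SDA3.CodeTableDetail.Allergen).%New()
        Set allergy.Allergen.Code = item.%Get("code")
        Set allergy.Allergen.Description = item.%Get("display")
        Do container.Allergies.Insert(allergy)
    }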
My dilemma is that I'm working with a file that has three different data record types plus a header and trailer. The record type is in positions 21-23. The header and trailer have spaces in positions 1-20; the data records have variable data in positions 1-20.
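This is the textbook case for a Complex Record Map with the type field as the record identifier; if the dispatch has to be done in code instead, it reduces to a sketch like:

    // Pull the record type from fixed positions 21-23 and branch on it;
    // header/trailer carry spaces in 1-20, data records carry real data there.
    Set type = $EXTRACT(line, 21, 23)
    If $EXTRACT(line, 1, 20) = $JUSTIFY("", 20) {
        // header or trailer record
    }
    Else {
        // one of the three data record layouts, keyed by type
    }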
I'm VERY much a novice on all things "OpenAM", and beyond knowing that Caché supports working with OpenAM, I have nothing else to go on.
The documentation doesn't go very deep on how this works, beyond a single paragraph saying it's supported for Single Sign-On (SSO).
Hi, we are a veterinary lab and we use both the LAB and FIN systems of Antrim. Now we are looking to expose the data in an SQL/Object-compatible way, so we were wondering whether the same or similar things have already been done by other community members. If so, could you please share your approach, experience, and gotchas with us? We are all ears. I can be reached at yang.jiao@antechmail.com. Thank you!
I have a text file that is fixed-width delimited, and I am using a BPL to process it, ultimately performing a transform from the text file to an HL7 message. I created a DTL mapping from the record map to the HL7 message. In my BPL I am performing some loops and other logic (which all works). My issue is what to do when I perform the transform.
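Two routes, sketched: inside BPL, the <transform> activity runs a DTL declaratively; from a <code> activity (or anywhere in ObjectScript), every compiled DTL also exposes a Transform class method. Class and variable names below are hypothetical:

    // Invoke the generated DTL directly: source in, target out by reference.
    Set tSC = ##class(My.RecordToHL7DTL).Transform(request, .tHL7)
    If $$$ISERR(tSC) Quit tSC
    // tHL7 is now an EnsLib.HL7.Message ready to hand to the HL7 operation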
We are receiving XML documents and storing them. When we click into the clinician portal, and again to view a patient, we can see the documents, but none of the data is mapped to its respective bucket, i.e. allergies or medications.
I am thinking that I need to build an XSLT transform and change the format to SDA3. Is this an appropriate approach, or would using a Data Transformation (Ensemble -> Build -> Data Transformation) be a better idea? Lastly, if the XSLT idea is preferred, where would I call it within the stack?
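If the XSLT route wins, one place to run it is a business process ahead of the SDA3 hand-off, using %XML.XSLT.Transformer; a sketch with assumed stream variables:

    // Apply an XSLT (held in xslStream) to the inbound document (inStream),
    // producing SDA3 XML in outStream for the rest of the HealthShare stack.
    Set tSC = ##class(%XML.XSLT.Transformer).TransformStream(inStream, xslStream, .outStream)
    If $$$ISERR(tSC) Quit tSC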
The following post outlines a more flexible architectural design for DeepSee. As in the previous example, this implementation includes separate databases for storing the DeepSee cache, the DeepSee implementation and settings, and the synchronization globals. This example introduces one new database to store the DeepSee indices. We will redefine the global mappings so that the DeepSee indices are not mapped together with the fact and dimension tables, as sketched below.
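A sketch of the remapping step via the Config API (run in %SYS; the namespace, database, and global names are placeholders, since the exact index globals to move depend on your cube implementation):

    // In %SYS: map a DeepSee index global of namespace APP into its own
    // database so it no longer lives with the fact and dimension tables.
    New $NAMESPACE
    Set $NAMESPACE = "%SYS"
    Set props("Database") = "APP-INDICES"
    Set tSC = ##class(Config.MapGlobals).Create("APP", "DeepSee.Index", .props)
    If $$$ISERR(tSC) Write $SYSTEM.Status.GetErrorText(tSC), !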
I'd like to know whether the message needs to be in a file in order to process a Record Map.
I am working with an Interoperability Production that processes files/messages using Record Maps. My team was asked to redesign the solution for deployment on AWS. We use containers, and we had problems with multiple containers processing files from the same directory, so we are considering Amazon Simple Queue Service instead of keeping files on a shared file system.