I have the class ConfigUtils.ConfigSettingsTable, which is a persistent object. I know I need to map packages from the original namespace. In this case, I have mapped ConfigUtils.ConfigSettingsTable from the originating namespace (IRISTST database) across all other namespaces.
When developing productions, there are many places where we enter endpoint information (IP/port), especially when there are dozens of operations going to the same environment. What happens later is that the IP changes, and we then have to go into each operation and update it.
Let's say I have an InterSystems IRIS instance with 6 Namespaces:
Foo1
Foo2
Foo3
Foo4
Foo5
Bar
And the number of Foo# namespaces can increase at any time for a number of reasons. I need to ensure that they all have identical configuration globals stored in a DB called CONFIG, so I do the following in my configuration file:
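The configuration-file excerpt isn't reproduced here, but as a rough sketch (the global name AppConfig is illustrative), the CPF global mappings look something like this:

[Map.Foo1]
Global_AppConfig=CONFIG
[Map.Foo2]
Global_AppConfig=CONFIG
[Map.Foo3]
Global_AppConfig=CONFIG

and likewise for Foo4 and Foo5. Every new Foo# namespace needs the same [Map.FooN] block added, which is exactly the maintenance burden in question; alternatively, a mapping placed under [Map.%ALL] applies to every namespace on the instance.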
Is there a sensible approach to having a lookup table in Namespace A, and then accessing this from Namespaces B, C, D (etc)?
I'm trying to avoid creating a Global mapping of the lookup table global (^Ens.LookupTable) as I fear that it would then link all other lookups in that global and lead to some unexpected behaviour, but would be open to trying something in this realm if it's the best option.
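For context, a global mapping can also be scoped to a single subscript, so that only one lookup table is shared rather than the whole of ^Ens.LookupTable. A rough CPF-style sketch, assuming the shared table is called "MyTable" and Namespace A's data sits in a database named ADATA (both names are illustrative):

[Map.B]
Global_Ens.LookupTable("MyTable")=ADATA
[Map.C]
Global_Ens.LookupTable("MyTable")=ADATA

Only that subscript is redirected, so any other lookup tables stored in ^Ens.LookupTable remain local to each namespace.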
In Caché Studio there are features (dialog boxes) that help map data from a global to class properties. I have used %CacheSQLStorage quite a bit, or have in the past, to map globals to classes.
I haven't been able to find a similar feature in Visual Studio. Do I need to upgrade to IRIS to be able to use Visual Studio to map global properties to classes?
I'm having trouble defining the mapping needed to take the very large base64 string in OBX:5.5 and map it to an XML virtual document property that supports %Stream.GlobalCharacter. I know that from within the DTL you have to use custom code to manage the segment due to its size.
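A minimal sketch of the custom-code half, assuming EnsLib.HL7.Message's GetFieldStreamRaw() and an illustrative stream-typed target property (the exact field/component addressing for OBX:5.5 may need adjusting):

// pull the large field as a stream instead of a string, avoiding the long-string limit
Do source.GetFieldStreamRaw(.tB64Stream, "OBX:5")
// tB64Stream now wraps the base64 content and can be copied into a
// %Stream.GlobalCharacter property on the target, e.g.:
Set tSC = target.SomeStreamProperty.CopyFrom(tB64Stream)

How the stream is then attached to the XML virtual document side depends on the target document class, so this only covers reading the segment safely.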
For a particular problem we were trying to parse a relatively large *.csv file with a record map. We are doing this from a BPL, where we start with a REST call to acquire the file. This file needs a slight transformation, which we tried in a DTL. However, DTLs seem to be incapable of parsing larger files.
I have a global with a multi-level structure, and I am trying, through a class and an SQL query, to display a table that includes all the values and levels.
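For illustration, a small sketch that walks a two-level global with $Order and prints one row per node (the global name ^MyData is hypothetical, and leaf nodes are assumed to hold the values):

// walk ^MyData(level1,level2)=value and emit one row per node
Set l1 = ""
For {
    Set l1 = $Order(^MyData(l1))
    Quit:l1=""
    Set l2 = ""
    For {
        Set l2 = $Order(^MyData(l1,l2),1,val)
        Quit:l2=""
        Write l1," | ",l2," | ",val,!
    }
}

Exposing the same structure through SQL then comes down to describing those subscript levels in a %Storage.SQL map on the class.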
I found the thread that discusses object mapping, in particular mapping a common global among more than one namespace. The example given is a simple one of the form ^global(sub1), ^global(sub2), etc. However, I'm having trouble getting this to compile/work when the global has a fixed subscript amongst variable ones.
I have this global in namespaces LAB and ARK in the following format:
^CB(1,sub1)=....
^CB(1,sub2)=...
^CB(1,sub3)=...
Here is what I have for this. In its current state it throws tons of errors:
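The class itself isn't reproduced here; as a purely hypothetical sketch of the pattern (a %Storage.SQL map whose first subscript is the constant 1 and whose second subscript is a property used as the IDKEY), it could look roughly like this, with the delimiter and property names as assumptions:

Class LAB.CB Extends %Persistent [ StorageStrategy = CBStorage ]
{

Property Sub As %String;

Property Value As %String;

/// the second (variable) subscript of ^CB identifies the row
Index IDKeyIndex On Sub [ IdKey ];

Storage CBStorage
{
<SQLMap name="DataMap">
<Data name="Value">
<Delimiter>"^"</Delimiter>
<Piece>1</Piece>
</Data>
<Global>^CB</Global>
<Subscript name="1">
<Expression>1</Expression>
</Subscript>
<Subscript name="2">
<Expression>{Sub}</Expression>
</Subscript>
<Type>data</Type>
</SQLMap>
<StreamLocation>^LAB.CBS</StreamLocation>
<Type>%Storage.SQL</Type>
}

}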
I'm building a REST service. One method takes as its body parameter a JSON object corresponding to a class A.
In my production I have class A, and I retrieve the parameters using a dynamic object, like this:
Set body = ##class(%DynamicObject).%FromJSON(%request.Content)
Set myObjectA = ##class(A).%New()
Set myObjectA.Id = body.Id
Set myObjectA.Name = body.Name
Set myObjectA.Date = body.Date
Set myObjectA.Salary = body.Salary
I would like to know if I can avoid doing the manual mapping by doing a cast, since I am sure that %FromJSON will return a class A. Something like this:
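The intended snippet isn't reproduced here; for reference, a common way to avoid the field-by-field mapping, assuming class A can extend %JSON.Adaptor (the error handling is illustrative), is %JSONImport():

// if class A extends %JSON.Adaptor, the request body can be imported in one call
Set myObjectA = ##class(A).%New()
Set tSC = myObjectA.%JSONImport(%request.Content)
If $$$ISERR(tSC) {
    // invalid JSON or mismatched properties end up here
}

%JSONImport() accepts a string, a stream such as %request.Content, or a %DynamicObject, so the existing %FromJSON() result could also feed it directly.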
Using Interoperability, I can't figure out how to create separate XML files from a CSV file using the GUI features Record Maps / Complex Record Mapper -> Data Transformations. I'm familiar with reading/writing the files using a File Service/Operation, but I don't understand the processing steps.
My colleagues' preferred approach is to do this without any ObjectScript or Embedded Python coding, but if it can only be done with some coding, that's fine as well.
I need to split existing tables from a database and put some parts of them into a new namespace. I don't know where to start, other than the installer.cls file. If you can provide clear instructions, I would be grateful.
Example:
I have NAMESPACE=NEWTEST and a DB.
Then I need to take tables from that DB, pull specific data from them, and bind it to NEWTEST.
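As a rough starting point, a minimal %Installer manifest sketch that creates the NEWTEST namespace and database (the directory and class name are illustrative):

Class App.NewTestInstaller
{

XData setup [ XMLNamespace = INSTALLER ]
{
<Manifest>
  <Namespace Name="NEWTEST" Create="yes" Code="NEWTEST" Data="NEWTEST">
    <Configuration>
      <Database Name="NEWTEST" Dir="/opt/iris/db/newtest" Create="yes"/>
    </Configuration>
  </Namespace>
</Manifest>
}

ClassMethod setup(ByRef pVars, pLogLevel As %Integer = 3, pInstaller As %Installer.Installer, pLogger As %Installer.AbstractLogger) As %Status [ CodeMode = objectgenerator, Internal ]
{
  Quit ##class(%Installer.Manifest).%Generate(%compiledclass, %code, "setup")
}

}

Moving the selected rows afterwards could be done with SQL (e.g. INSERT ... SELECT across the two namespaces) or an export/import, depending on how the tables are split.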
In terms of general throughput design and long-term support, I'm considering what would be the "best approach" for creating multiple batch files in a few different layouts from the same data sets.
I'm VERY much a novice on all things "OpenAM", and beyond knowing that Caché supports working with OpenAM, I have nothing else to go on.
The documentation doesn't seem to be very deep on the nature of how this works beyond a single paragraph saying it's supported for Single Sign On (SSO).
I've found a couple of methods that will tell me whether a package is mapped from another database, but not which database. Is there such a method/routine?
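For what it's worth, one possible sketch, assuming the Config.MapPackages configuration class available in the %SYS namespace (the namespace and package names are illustrative):

// run with %SYS as the current namespace
// Get() fills an array of the mapping's properties; "Database" names the source DB
Set sc = ##class(Config.MapPackages).Get("USER", "ConfigUtils", .props)
If sc {
    Write "ConfigUtils is mapped from: ", $Get(props("Database")), !
}
Else {
    Write "No package mapping found for that namespace", !
}

The same pattern applies to Config.MapGlobals and Config.MapRoutines for global and routine mappings.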
My dilemma is that I'm working with a file that has three different data record types plus a header and trailer. The record type is in positions 21-23. The header and trailer have spaces in positions 1-20. The data records have variable data in positions 1-20.
I'm trying to set up a REST server with CORS support. I have created a class that handles a specific part of the service (printer control), and I have referenced this class in my main REST class by adding <Map Prefix="/print" Forward="myClass.PrintAPI" />.
My main class has its own <Route>s and handles CORS requests perfectly. But in my subclass, OnHandleCorsRequest() is only run when requesting from the same origin, and never when making a CORS request. I have set Parameter HandleCorsRequest = "true" in both my main REST class and the subclass.
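For reference, a minimal sketch of the main dispatch class described above (the route, method, and class names are illustrative; the forwarded class would likewise extend %CSP.REST, declare its own UrlMap, and set the same parameter):

Class My.MainAPI Extends %CSP.REST
{

Parameter HandleCorsRequest = 1;

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
  <!-- everything under /print is dispatched to the printer-control class -->
  <Map Prefix="/print" Forward="myClass.PrintAPI"/>
  <Route Url="/ping" Method="GET" Call="Ping"/>
</Routes>
}

ClassMethod Ping() As %Status
{
  Write "{""status"":""ok""}"
  Quit $$$OK
}

}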
I would like to know whether we need to have the message in a file in order to process a Record Map.
I am working with an Interoperability production that processes files/messages using Record Maps. My team was asked to redesign the solution for deployment in AWS. We use containers. We had problems with multiple containers processing files from the same directory. We are considering Amazon Simple Queue Service instead of having files on a shared file system.
We are wondering how to create a translation profile for HL7 to SDA in HealthShare. Can you please help us with the class definition containing the mapping for that (so that we can import it directly and understand the mapping)?
I am putting together a new interface to take in a CSV file and output an HL7 message.
I have used the record mapper to create the source class and I am putting together the transform, but when trying to test it I am not seeing the data populate the HL7 as intended.
When I start a freshly installed IRIS instance or a container, I always find Interoperability (aka Ensemble) mapped into namespace USER.
Is there any utility to remove this mapping with a click? Unmapping it global by global, routine by routine, package by package is just a boring exercise.
To be clear: I'm looking for a utility inside IRIS.
The external approach is obvious: open iris.cpf in Notepad (or any other text editor), clean out the mappings, and restart IRIS. It's fast and efficient, but it's really hardcore.