This topic collects posts describing approaches, tools, and solutions for importing and exporting data into and out of InterSystems IRIS and other InterSystems data platform products: CSV, JSON, SQL, flat files, binary files, globals, streams, etc.
Hi! I am trying to export Cache data to my SQL database, and I would like to export the schema as well.
The Data Export wizard in the Management Portal only lets me export the data, not the schema. Is there any way to get the schema exported as well?
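One possible starting point, sketched below, is to generate the DDL from a terminal instead of the wizard. This assumes your version exposes %SYSTEM.SQL.Schema.ExportDDL(); the method name and parameter order are assumptions here, so verify them against the class reference for your release (schema and file names are hypothetical):

    // hedged sketch: verify ExportDDL() and its parameter order in the
    // %SYSTEM.SQL.Schema class reference for your version
    Set sc = $SYSTEM.SQL.Schema.ExportDDL("MySchema", "*", "/tmp/export.log", "/tmp/schema.sql")
    If 'sc Do $SYSTEM.Status.DisplayError(sc)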
As the title suggests, I would like to programmatically export each HL7 schema category as XML, either as a stream object or to a file. How would I go about doing this?
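For reference, HL7 schema categories are Studio documents with the .HL7 extension, so $system.OBJ.Export() and ExportToStream() can handle them. A minimal sketch, assuming a custom category named MyCategory (the name and path are hypothetical):

    // export a schema category to a file
    Set sc = $system.OBJ.Export("MyCategory.HL7", "/tmp/MyCategory.xml")
    // or to a stream object
    Set sc = $system.OBJ.ExportToStream("MyCategory.HL7", .stream)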
As the title suggests, I would like to programmatically export each individual data lookup table as XML, either as a stream object or to a file. How would I go about doing this?
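Similarly, data lookup tables are Studio documents with the .LUT extension. A sketch that enumerates them with the %RoutineMgr:StudioDocumentList query and exports each one (the output directory is hypothetical; verify the query is available in your version):

    // enumerate all lookup tables in the namespace and export each
    Set rs = ##class(%ResultSet).%New("%RoutineMgr:StudioDocumentList")
    Do rs.Execute("*.LUT")
    While rs.Next() {
        Set name = rs.Get("Name")                          // e.g. "MyTable.LUT"
        Set sc = $system.OBJ.Export(name, "/tmp/"_name_".xml")
    }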
It can sometimes be useful to list or export all of the subclasses that are derived, directly or indirectly, from a given class. In Studio, the Class -> Derived Classes menu option will show such a list, but I'm not aware of a built-in API for programmatically exporting their source code.
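One way to assemble such an export is the SubclassOf query of %Dictionary.ClassDefinitionQuery, feeding the collected names to $system.OBJ.Export(); a sketch, with the class and file names hypothetical:

    // collect all subclasses of Some.Class, then export their source
    Set rs = ##class(%ResultSet).%New("%Dictionary.ClassDefinitionQuery:SubclassOf")
    Do rs.Execute("Some.Class")
    Set list = ""
    While rs.Next() {
        Set list = list_$ListBuild(rs.Get("Name")_".cls")
    }
    Set sc = $system.OBJ.Export($ListToString(list), "/tmp/subclasses.xml")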
I am trying to populate a table using the SQL Data Import Wizard. The input file is a tab-delimited text file, but the import keeps failing with a 104 error: validation fails for the columns that use the %Library.TimeStamp and %Boolean datatypes. Yet when I insert values into the table with a SQL INSERT command, the values are saved correctly.
For the TimeStamp format in the wizard form I am choosing YYYY-MM-DD-HH:MI:SS, because there was no option for the format the file actually uses: YYYY-MM-DD HH:MM:SS.
Hi everyone. I'm new to Cache, and I was looking for a command that reads a .txt file and stores the information in variables. I found the EnsLib.SQL.Snapshot class in the documentation and, even though I'm not sure it's what I need, when I run the code it says the class doesn't exist; I also couldn't find the right superclass to extend.
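For what it's worth, EnsLib.SQL.Snapshot holds SQL result sets, not file contents; plain file reading is usually done with %Stream.FileCharacter. A minimal sketch (the path is hypothetical):

    // read a text file line by line
    Set stream = ##class(%Stream.FileCharacter).%New()
    Set sc = stream.LinkToFile("/tmp/input.txt")
    While 'stream.AtEnd {
        Set line = stream.ReadLine()
        // parse line into variables here, e.g. with $Piece()
    }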
If I export a class definition to an XML file, is there a way to programmatically import it so that I could schedule a task to look in a given location for XML files once a day and import class definitions?
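A sketch of the import side, assuming the XML files sit in a fixed directory (the path is hypothetical): $system.OBJ.LoadDir() loads and compiles everything it finds when given the "ck" qualifiers, and the call could be wrapped in a %SYS.Task.Definition subclass for daily scheduling:

    // load and compile every class export found in the directory (recursively)
    Set sc = $system.OBJ.LoadDir("/data/classes/", "ck", .errors, 1)
    If 'sc Do $SYSTEM.Status.DisplayError(sc)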
We are migrating from AIX to Linux, and part of our testing is figuring out the best method to migrate the code. I am trying to export an entire Production; however, I keep running into an error:
Error generating export list for production osuwmc.TestClin and all items may not be listed. ERROR #5002: Cache error: <CLASS DOES NOT EXIST>zgetRecordandComplexMapClasses+34^Ens.Config.Production.1 *(No name)
I went through all the Complex Record Maps and recompiled them, but I am still getting the same error.
I want to move a development IRIS for Health database to another server. I will do this manually, for specific reasons. If I simply copy the /mgr folder along with all the files (.DAT, .GBK, etc.) and configure it on the new server, will it work?
Is it planned that LOAD DATA will take several DATE/DATETIME formats into account, with, for example, a parameter indicating the format used in the source data?
For example:

LOAD DATA .../...
USING
{
  "from": {
    "file": {
      "dateformat": "DD/MM/YYYY"
    }
  }
}
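In the meantime, one possible workaround is to load the date column as plain text into a staging table and convert it afterwards with TO_DATE(); a sketch, with all table and column names hypothetical:

    LOAD DATA FROM FILE '/tmp/orders.csv' INTO Staging.Orders

    INSERT INTO Sales.Orders (OrderDate)
    SELECT TO_DATE(OrderDateText, 'DD/MM/YYYY') FROM Staging.Orders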
We are trying to migrate our whole production to new IRIS servers. To test that everything works, and to be able to script the process, we want to import the data into the new IRIS servers using a backup file (created with ^BACKUP). But we've found that IRIS doesn't recognize Ensemble backups, so we can't import one using ^DBREST :-O
Does anyone know how to import an Ensemble backup file into IRIS?
I'm trying to export a project from version '2012.5' and import it into '2018.1.4', but it returns ERROR #7602. Could you please help me? How can I do this? Below are the steps I followed:
It is happening for all the namespaces and looks like some permission issue. The same problem occurs with the Data Export wizard. Any help resolving this would be appreciated.
I am using Cache for Windows (x86-64) 2017.2.2 (Build 865_0_18763U).
Is there a way to read JSON and transform it in a DTL using a VDOC representation of the JSON (without transforming it into an internal message), the way I can with HL7 or XML?
If that is possible, I assume I would need a schema for the JSON, so the second question is: how do I build a schema for JSON and load it into IRIS?
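As far as I know there is no built-in JSON virtual document class comparable to the HL7 and XML ones, but %Library.DynamicObject gives schemaless access to JSON that can play a similar role inside DTL or ObjectScript code. A minimal sketch (the property names are hypothetical):

    // parse JSON without a predefined class and reach into it directly
    Set obj = ##class(%DynamicObject).%FromJSON(jsonString)
    Write obj.patient.name, !
    // nested values can be reached with dot syntax or %Get()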
We use the %Net.FtpSession class to perform FTPS file transfers. This has worked fine with implicit FTPS servers but crashes when trying to connect to an explicit FTPS server. Explicit FTPS uses a slightly different negotiation in the initial connection and different ports.
Has anyone gotten explicit FTPS working with the Cache classes?
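Newer class references describe an SSLUseSessionSecurity property on %Net.FtpSession for the explicit (AUTH TLS) handshake; that property name and its availability in your version are assumptions worth verifying before relying on this sketch (server, credentials, and the SSL configuration name are hypothetical):

    // explicit FTPS: connect in clear text, then upgrade via AUTH TLS
    Set ftp = ##class(%Net.FtpSession).%New()
    Set ftp.SSLConfiguration = "MySSLConfig"      // hypothetical SSL/TLS config name
    Set ftp.SSLUseSessionSecurity = 1             // assumption: explicit-mode switch
    If 'ftp.Connect("ftps.example.com", "user", "password", 21) {
        Write ftp.ReturnMessage, !
    }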
So I've been reviewing a lot of questions posted in the InterSystems community regarding NULL properties in JSON. I've also been reviewing the JSON documentation. None of these has helped me so far (a possible fallback is sketched after the list):
1. We don't seem to have the %JSON.Adaptor class available for us to use in our system.
2. I'm not really confident enough to create JSON Type classes or backporting code, etc.
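Even without %JSON.Adaptor, %Library.DynamicObject (available in recent Caché versions, 2016.2 and later) can distinguish a null value from a missing property via %GetTypeOf(); a minimal sketch:

    Set obj = ##class(%DynamicObject).%FromJSON("{""name"":null}")
    Write obj.%GetTypeOf("name"), !     // prints "null"
    Write obj.%GetTypeOf("other"), !    // prints "unassigned" for absent keys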
Hello. When you export and then import a table of data, is the import smart enough to figure out whether a row already exists in the new namespace and, if so, update the row rather than just adding it?
For example, we have a table in DEV and the same table in QA. The DEV table has more fields than QA. When we moved up the class, the field definitions went with it into QA, so now the table definition is the same in both.
Our team is looking for a way to export all of our Cache SQL tables into Microsoft SQL Server. I have only found a method to export one table at a time into an ASCII file. We have over 170 tables, so this would be very tedious and time-consuming. Is there a way to export directly from Cache to SQL Server? Alternatively, is it possible to export the entire database in a single shot, or at least multiple tables to text files?
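A starting point for scripting the per-table loop is to enumerate the tables from INFORMATION_SCHEMA (present in recent Caché and IRIS versions; verify in yours) and drive whatever single-table export you already have from that list; a sketch:

    // list every user table in the namespace
    Set stmt = ##class(%SQL.Statement).%New()
    Set sc = stmt.%Prepare("SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'")
    Set rs = stmt.%Execute()
    While rs.%Next() {
        Write rs.%Get("TABLE_SCHEMA"), ".", rs.%Get("TABLE_NAME"), !
        // call the single-table export here for each name
    }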
Is there a command that will loop through the flat files in a given Linux/Unix folder? I can write the code to open and read each file, but the file names are unknown. I am looking for a way to access each file in a named Linux folder. The files have differing structures, so a record map will not work.
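The %File:FileSet class query does exactly this kind of directory listing; a sketch, with the folder name hypothetical:

    // iterate over the files in a directory
    Set rs = ##class(%ResultSet).%New("%File:FileSet")
    Do rs.Execute("/data/inbox/", "*")
    While rs.Next() {
        If rs.Get("Type") = "F" {
            Set file = rs.Get("Name")    // full path to the file
            // open and read the file here
        }
    }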
Thank you for reading and thank you even more for answering!