Question
· May 25, 2022

Data import/export

I am creating "models" that contain rows in several class tables called Model, Path, Node.

So, model 19 includes 1 row in the Model table, 11 rows in the Path table, and 10 rows in the Node table. 

I'd like to write some utilities to "move" model 19 from one instance to another.

I found some old (maybe deprecated?) functionality, %SYNC.Transporter, which seems to work. I can write a SQL query to select the rows of interest and export those records to a .gof file. I can create three of them (one file per class), and then on the target instance I can import from each .gof into the same, preexisting class.

Is there a different or better way to do this?  

Discussion (4)

If you're working on Caché 2016.2 or higher (or any version of IRIS), you have access to %DynamicAbstractObject (DAO) and its subclasses %DynamicObject and %DynamicArray.  While most people know these classes only as our "in-memory model for JSON", they are actually much more powerful than that.  DAO was originally designed to be the basic building block for all forms of abstract data modeling; a staging area, if you will, for sanitizing, merging, annotating, redacting and transcoding arbitrary data.

The original vision was that data could be procured from a variety of sources (URLs, result sets, classes, globals, external files, etc.) in a variety of formats (JSON, XML, CSV, YAML, HTML, etc.) to populate an Abstract Entity Tree (AET): a directed, but schema-less, graph that focuses on the semantic meaning and core relationships of the data, rather than syntactic artifacts introduced by its serialization.  Under the AET model, data is organized into nested trees of _objects_ (unordered collections of named items) and _arrays_ (ordered collections of unnamed items).  Individual nodes may be added, edited or removed using the DAO API.  Finally, the finished AET (or subtrees thereof) may be published (serialized in a variety of formats) to a stream, or persisted to class instances (late-binding schema) or directly to globals (no formal schema required).

DAO currently has kernel-level support for serializing to and from JSON.  It also provides an API for AET node manipulation, and ObjectScript enhancements for initializing nodes directly from JSON-like expressions.
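For example, here's a minimal ObjectScript sketch of that API (the field names and values are just illustrative):

	// build an AET node directly from a JSON-like expression
	set model = {"id": 19, "name": "demo", "paths": []}

	// manipulate nodes through the DAO API
	do model.%Set("owner", "me")
	do model.paths.%Push({"from": 1, "to": 2})
	write model.%Get("id"),!                  // 19

	// round-trip through JSON
	set json = model.%ToJSON()
	set copy = ##class(%DynamicAbstractObject).%FromJSON(json)
	set firstPath = copy.paths.%Get(0)
	write firstPath.%Get("to"),!              // 2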

Using DAO, you could build up a single abstract data "document", complete with both metadata and data, to populate a variety of classes and/or globals; serialize it to JSON; and import and unpack the payload on the target system, either by importing a file or by building a small REST service. JSON travels very well over REST without the overhead of building up a full SOAP/WSDL enterprise, which can be especially useful if you're working on a prototype, a one-off, or an "actual requirements still in flux" situation.
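As a rough sketch (not a drop-in utility) of what such a packing step could look like for the original question, here is one way to pull one model's rows from several tables into a single JSON document. The table names (Model, Path, Node) and the ModelId foreign-key column are assumptions based on the question, and the schema is assumed to be SQLUser:

	ClassMethod ExportModel(modelId As %Integer, file As %String) As %Status
	{
		set doc = {"modelId": (modelId), "exported": ($zdatetime($horolog,3)), "tables": {}}
		for table = "Model","Path","Node" {
			set rows = []
			set rset = ##class(%SQL.Statement).%ExecDirect(, "select * from SQLUser."_table_" where ModelId = ?", modelId)
			while rset.%Next() {
				// one dynamic object per row; columns become named items
				set row = {}
				set meta = rset.%GetMetadata()
				for i = 1:1:meta.columns.Count() {
					do row.%Set(meta.columns.GetAt(i).colName, rset.%GetData(i))
				}
				do rows.%Push(row)
			}
			do doc.tables.%Set(table, rows)
		}
		// serialize the whole document straight to a file stream
		set stream = ##class(%Stream.FileCharacter).%New()
		do stream.LinkToFile(file)
		do doc.%ToJSON(stream)
		quit stream.%Save()
	}

On the target instance, the reverse is a file stream fed to %FromJSON(), plus an iterator (%GetIterator()) over doc.tables to insert the rows again via SQL or objects.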

Out of the box, unfortunately, we do not currently ship many of the other nice-to-have supporting routines (ResultSetToAET(), XMLToAET(), AETToGlobal(), AETToRegisteredObject(), ValidateToSchema(), etc.), but many developers have made such contributions on an ad hoc basis (and some are posted elsewhere on this site).

Also, if you have specific use cases in mind, feel free to contact me. I've prototyped quite a number of DAO type-conversion and serialization methods over the years, and I may have something that can jumpstart your efforts.

[Note: DAO and its support for %ToJSON() and %FromJSON() is a different beast from %JSON.Adaptor.  The latter works like our XML adaptor and generates methods at class compilation time that write JSON syntax snippets based on the static schema of a %Persistent class.  DAO processes OREFs dynamically at run time and has no requirement that classes be "JSON enabled" at compile time; and, as mentioned above, with the right glue code, DAO can also be used to serialize/initialize raw globals, result sets, %ZEN.proxyObjects, streams, %RegisteredObjects, $Lists and ObjectScript multidimensionals.]
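To make the contrast concrete, here is a small placeholder class (IRIS only, since %JSON.Adaptor isn't available in Caché; all names are made up) showing the two approaches side by side:

	/// Placeholder class illustrating the difference between the two approaches
	Class MyApp.Sample Extends (%Persistent, %JSON.Adaptor)
	{

	Property Name As %String;

	/// %JSON.Adaptor: projection methods are generated at compile time
	/// from this class's static schema
	ClassMethod StaticExport() As %Status
	{
		set obj = ..%New()
		set obj.Name = "model 19"
		quit obj.%JSONExportToString(.json)    // json = {"Name":"model 19"}
	}

	/// DAO: nothing generated at compile time; the shape is decided at run time
	ClassMethod DynamicExport() As %String
	{
		set dyn = {"Name": "model 19", "anything": ["else", 42]}
		quit dyn.%ToJSON()
	}

	}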

Hope this helps.

I think XML would be the best way, especially if you want to do it programmatically. This snippet will export your table to the specified folder:

ClassMethod ExportTable(table As %String, path As %String)
{
	// derive the class name from the SQL table name (e.g. SQLUser.My_Table -> User.MyTable)
	set class = $replace($replace(table,"_",""),"SQLUser.","User.")
	// build an OID ($listbuild(id, classname)) for every row in the table
	set sqlQuery = "select $listbuild(%Id,'"_class_"') as oid from "_table
	
	set rset = ##class(%SQL.Statement).%ExecDirect(,sqlQuery)
	while rset.%Next() {
		set elements(rset.%Get("oid"))=""
	}
	
	// export the collected objects to <path>/<table>.xml
	do ##class(XML.Element).Export(.elements, path_"/"_table_".xml", "r")
}
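For the model tables from the question, you'd call it once per class (assuming the method lives in a utility class, here called User.Util, and the default SQLUser schema):

	do ##class(User.Util).ExportTable("SQLUser.Model", "/tmp/export")
	do ##class(User.Util).ExportTable("SQLUser.Path", "/tmp/export")
	do ##class(User.Util).ExportTable("SQLUser.Node", "/tmp/export")

Note that as written it exports every row in each table; to move only model 19 you'd add a WHERE clause to sqlQuery.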

And then you import it on the other instance with:

##class(XML.Element).Load(file,"/overwriteguid/notransaction/changecontrolskipevents")

The advantage of this solution is that it uses %GUIDs as identifiers, so you can add/edit data on your target system and it won't be overwritten when importing the file.

An alternative that doesn't require direct access to the file system is to use a JDBC client with export functionality. I use DBeaver, where you can right-click on your query/table -> Execute -> Export from query.