· Jun 8, 2021

displaying %GlobalCharacterStream

Hi Guys,

I'm wondering how I can get fields defined as %GlobalCharacterStream displayed in an SQL query, or even via objects?

I have some data that I need to convert over to a new system. The old system uses the %GlobalCharacterStream data type for one of its fields, and I'm unable to display it or extract it to a spreadsheet. Any help please?




You can try this query -

SELECT parent, Name
FROM %Dictionary.PropertyDefinition
WHERE Type = '%GlobalCharacterStream'

Here's a sample run (in %SYS) -

You can find more info about %Dictionary.PropertyDefinition here:

And in general about using the %Dictionary classes:

Thanks, but that wouldn't help in my case.

Basically, I'm looking to export the data of a certain class, with the record ID and Full_Description fields from e.g. RMI.myclass, into a spreadsheet, and then import it into another class; either by using an SQL query or, perhaps better, using objects (e.g. S rec=##class(RMI.myclass).%OpenId(id)), looping through the records, and saving to a .csv file.
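The object-based loop described here could be sketched roughly as below. This is a hypothetical example: the class name RMI.myclass is taken from the question, but the method name, the FullDescription property name (underscores aren't valid in ObjectScript property names, so the SQL field Full_Description would map to something like this), and the file path are all assumptions; it also reads each stream fully into a local variable, which assumes the streams fit within the maximum string length.

 ClassMethod ExportToCSV(pFile As %String = "/tmp/export.csv") As %Status
 {
     // Open an output file stream for the CSV
     set file = ##class(%Stream.FileCharacter).%New()
     set sc = file.LinkToFile(pFile)
     quit:$$$ISERR(sc) sc
     // Iterate over all rows of the source class
     set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT ID FROM RMI.myclass")
     while rs.%Next() {
         set rec = ##class(RMI.myclass).%OpenId(rs.ID)
         continue:'$isobject(rec)
         // Read the whole stream into a local variable (fine for modest sizes)
         do rec.FullDescription.Rewind()
         set text = ""
         while 'rec.FullDescription.AtEnd { set text = text_rec.FullDescription.Read(32000) }
         // Naive CSV quoting: wrap in quotes and double any embedded quotes
         do file.WriteLine(rec.%Id()_","_""""_$replace(text,"""","""""")_"""")
     }
     quit file.%Save()
 }

Note this naive quoting does not handle embedded newlines in the stream; a real export would need a proper CSV writer for that.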


Indeed I see now I misunderstood your question...

Note that with InterSystems IRIS I could use the SQL Data Export and Import Wizard.

I performed a quick test -

One class:

Class Test.StreamExport Extends %Persistent
{

Property SomeString As %String;

Property SomeStream As %GlobalCharacterStream;

}


And a 2nd class:

Class Test.StreamImport Extends %Persistent
{

Property SomeString As %String;

Property SomeStream As %Stream.GlobalCharacter;

}


I had a stream with 1000 lines in SomeStream in the 1st class, exported it to a file, and imported it into the 2nd, and it worked OK.

Note an SQL query would display the text in the stream and not just the OID.

This is what the data looked like in the table to be exported -

And this is the data that was imported to the 2nd class -

Checking the size of the Stream property, it is identical.
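For reference, that kind of size check can be done in the Terminal, e.g. (assuming a row with ID 1 exists in both test classes):

 set src = ##class(Test.StreamExport).%OpenId(1)
 set dst = ##class(Test.StreamImport).%OpenId(1)
 write src.SomeStream.Size, " vs ", dst.SomeStream.Size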

Note this was tested on InterSystems IRIS v2021.1.

In Ensemble v2018.1.x you would see something like this (per Eduard's example):

Thanks, but I'm looking for a more flexible way, where I can manage the export and import process using objects or a query (though I prefer objects) in a class method or routine. I'm exporting data from one class on Server 1 to a different class on Server 2 (the class name has changed), so I'd use my own class method or routine to export only the data I need into a CSV file, then import it into Class 2 on Server 2 in a specific way using an import routine. So the Wizard may not be suitable in this case? Well, if the Export Wizard can give me a CSV file that would be good, but I'm using Ensemble 2014?


Of course you could do this writing your own code in a routine (or method) and accessing the data via objects.
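The import side of such a routine on Server 2 could be sketched like this. This is a hypothetical example: Target.NewClass and its property names are made-up, and it assumes a simple one-line-per-record CSV without embedded commas or newlines in the description field.

 ClassMethod ImportFromCSV(pFile As %String) As %Status
 {
     set sc = $$$OK
     set file = ##class(%Stream.FileCharacter).%New()
     set sc = file.LinkToFile(pFile)
     quit:$$$ISERR(sc) sc
     while 'file.AtEnd {
         set line = file.ReadLine()
         continue:line=""
         set obj = ##class(Target.NewClass).%New()
         // First piece is the old record ID, the rest is the description
         set obj.OldId = $piece(line, ",", 1)
         do obj.FullDescription.Write($piece(line, ",", 2, *))
         set sc = obj.%Save()
         quit:$$$ISERR(sc)
     }
     quit sc
 }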

But the Wizard is flexible in the sense that it allows you to choose the table you are exporting from, the fields you want to include in the export (and some other related settings), and on the import side which table you want to import into. Of course the classes can be different, and you don't need to include all of the fields.

As for the product/version, that's apparently an issue. I don't know if it is worth the hassle (vs. implementing your own mechanism), but you could in theory consider exporting the relevant classes and globals from Ensemble into InterSystems IRIS, performing the SQL Data Export and Import there, and then going the other way - exporting the classes and globals from InterSystems IRIS and importing them back into Ensemble. Again, not sure this makes sense for you... You could run a small test to (a) check that it indeed works as you desire, and (b) estimate how much work this approach entails.

I actually see now that the Wizard also works (as per the above) in Ensemble (I tested on v2018.1.x).

The fact that an SQL query just shows the OID doesn't prevent the Wizard from doing its job.

[If you take a peek at the code you can see it generates special stream handling -

 do rtn.WriteLine("STREAMOUT(oref) {")
 do rtn.WriteLine(" while (oref.AtEnd = 0) {")
 do rtn.WriteLine(" set len = 32000")
 do rtn.WriteLine(" set val=oref.Read(.len)")
 do rtn.WriteLine(" write val")
 do rtn.WriteLine(" }")
 ...]


It's not clear to me what you want to do.

A property like

Property MyData as %(Global-or-File)Stream;

means that the size of MyData can be anything between 0 and the free space on your (hard) drive.
That's the reason why MyData is defined as a stream and not as a %String.

On the other hand, an Excel cell can hold no more than 32767 characters, hence the plan to extract that data to a spreadsheet will only work if the MyData properties do not contain more than 32767 characters.

Nevertheless, you could use the following stored procedure to extract the first 32767 characters of that stream data:

Class Your.Table Extends %Persistent
{

Property StreamData As %GlobalCharacterStream;

// other properties

ClassMethod StreamDataAsText(ID) As %String [ SqlProc ]
{
    set obj = ..%OpenId(ID, 0), text = ""
    if obj { do obj.StreamData.Rewind() set text = obj.StreamData.Read(32767) }
    quit text
}

}

Now you can get, besides the other data, the first 32767 characters of that stream data too:

select Your.Table_StreamDataAsText(ID), * from Your.Table


About a year ago I was experimenting with IRIS for Health 2019.1 to see what advantages it would give us with the Enterprise Master Patient Index (EMPI) application that HST has been developing. HST had started developing this before IRIS for Health had been released, so they created FHIR classes based on the FHIR STU3 specification. They created the FHIR classes as %Persistent classes, and we use DTLs to convert a FHIR Patient JSON into the HST FHIR Patient class.

The one issue we ran into was the Patient's picture, which is a binary stream. The developer who was writing the UI in OutSystems was using ODBC to retrieve data from the FHIR classes. She was able to specify 'Picture' in her SQL queries and then pass the binary data through a render utility to get it to display correctly on the UI form; however, there is a restriction in ODBC that the maximum size for any column is 4000 bytes.

In SQL Server the maximum size of a row of data is 8000 bytes, and BLOBs are stored in structures separate from the main table (similar to our 'S' global in Caché default storage), so although BLOBs can be up to 2GB in size, you are still restricted if you intend to use ODBC to retrieve or update data in a table. The ODBC restriction is specific to the version of ODBC: earlier versions had a limit of 2000 bytes. This is irrespective of whether you are working with the SQL Server, Oracle or IRIS ODBC drivers.

In the documentation I was reading on SQL Server there are performance issues as well, and they basically recommend keeping your binary or character stream data in files. They do support a couple of techniques that allow their users to execute queries that include BLOB data, but the SQL statements are hectic.
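One server-side workaround for the 4000-byte ODBC column limit described above is to expose the stream in fixed-size pieces through an SqlProc, so the client fetches chunk 1, chunk 2, ... of the picture and reassembles them. A hypothetical sketch (the Picture property name is taken from the description above; the method name and chunk size are assumptions):

 ClassMethod PictureChunk(ID As %String, pChunk As %Integer) As %Binary [ SqlProc ]
 {
     // Return the pChunk-th 4000-byte piece of the Picture stream ("" past the end)
     set obj = ..%OpenId(ID, 0)
     quit:'$isobject(obj) ""
     do obj.Picture.Rewind()
     // Skip over the preceding chunks
     for i=1:1:pChunk-1 {
         do obj.Picture.Read(4000)
         quit:obj.Picture.AtEnd
     }
     quit obj.Picture.Read(4000)
 }

The client would then issue something like SELECT Schema.Table_PictureChunk(ID, 1) ... repeatedly with an increasing chunk number until an empty result comes back, keeping each column value under the ODBC limit.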