Hello,

We are trying to migrate all our production systems to new IRIS servers. To verify that everything works, and to be able to script the process, we want to import the data into the new IRIS servers from a backup file (created with ^BACKUP). But we've found that IRIS doesn't recognize Ensemble backups, so we can't import it using ^DBREST :-O

Does anyone know how to import an Ensemble backup file into IRIS?

Thanks a lot,
David
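One possible workaround, assuming the data can be moved at the global level rather than by binary restore, is to export the globals to XML on the Ensemble side and load them on the IRIS side with $system.OBJ.Export and $system.OBJ.Load. This is a hedged sketch, not a confirmed migration path; the global name and file paths are placeholders:

    ; On the Ensemble instance: export one or more globals to XML.
    ; "MyAppData" is a placeholder global name -- substitute your own.
    Set sc = $system.OBJ.Export("MyAppData.GBL", "/tmp/myappdata.xml")
    If 'sc Write $system.Status.GetErrorText(sc)

    ; On the IRIS instance: load the XML file into the target namespace.
    Set sc = $system.OBJ.Load("/tmp/myappdata.xml")
    If 'sc Write $system.Status.GetErrorText(sc)

Note that XML export is much slower and larger than a binary backup, so this may only be practical for modest data volumes.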

Hello,

When I click the menu option to run the Data Import Wizard from the Management Portal, I receive the following CSP error:

<UNDEFINED>zOnPageHEAD+229^%cspapp.exp.utilsqleximwizardcontent.1 *schemaname : CSP Error

It happens in all namespaces and looks like some permission issue. The same thing happens with the Data Export Wizard. Any help resolving this would be appreciated.

I am using

Caché for Windows (x86-64) 2017.2.2 (Build 865_0_18763U)

Thanks,
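If it is indeed a permission issue, one quick check from a terminal is $SYSTEM.Security.Check, which reports whether the current user holds a given resource; the %Development:USE pair below is only an assumption about what the wizard requires, not a documented requirement:

    ; Check whether the logged-in user holds a resource:permission pair.
    ; %Development:USE is an assumed requirement for the SQL wizards.
    Write $SYSTEM.Security.Check("%Development", "USE")
    ; Prints 1 if the privilege is held, 0 otherwise.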

Hi everyone. I'm new to Caché, and I was looking for a command that reads a .txt file and stores the information in variables. I found the EnsLib.SQL.Snapshot class in the documentation, and, even though I'm not sure it's what I need, when I run the code it says the class doesn't exist. I also couldn't find the right superclass to extend.
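A likely explanation for the "class doesn't exist" error is that EnsLib.* classes are only available in Ensemble (interoperability-enabled) namespaces. Plain file reading doesn't need Ensemble at all, though; a minimal sketch using %Stream.FileCharacter, assuming the file lives at /tmp/input.txt (placeholder path):

    ; Read a text file line by line into a local variable.
    Set stream = ##class(%Stream.FileCharacter).%New()
    Set sc = stream.LinkToFile("/tmp/input.txt")
    If 'sc { Write $system.Status.GetErrorText(sc) Quit }
    While 'stream.AtEnd {
        Set line = stream.ReadLine()
        ; ... parse "line" into variables as needed ...
        Write line, !
    }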

Hi Developers!

Here are the technology bonuses for the InterSystems IRIS Datasets Contest 2021 that will earn you extra points in the voting:

  • Dataset Usage Demo Repository - 4
  • LOAD DATA Usage - 3
  • Questionnaire - 2
  • Unique Real Dataset - 4
  • Docker container usage - 2
  • ZPM Package deployment - 3
  • Online Demo - 2
  • Code Quality pass - 1
  • First Article on Developer Community - 2
  • Second Article On DC - 1
  • Video on YouTube - 3

See the details below.

Is there a command that will loop through the flat files in a given Linux/Unix folder? I can write the code to open and read each file, but the file names are not known in advance. I am looking for a way to access each file in a named Linux folder. The files have differing structures, so a record map will not work.

Thank you for reading and thank you even more for answering!
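One idiomatic option is the FileSet class query on %File, which returns one row per directory entry. A sketch, with /data/incoming as a placeholder directory:

    ; Iterate over the entries of a directory via the %File:FileSet query.
    Set rs = ##class(%ResultSet).%New("%File:FileSet")
    Set sc = rs.Execute("/data/incoming", "*")
    While rs.Next() {
        ; Type "F" = ordinary file, "D" = subdirectory.
        If rs.Get("Type") = "F" {
            Set filename = rs.Get("Name")  ; full path to the file
            Write filename, !
            ; open and read each file here
        }
    }

The second Execute argument is a wildcard pattern, so something like "*.txt" would restrict the loop to matching files.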
