Hi,

I exported selected globals from a Caché 2017 database into a single 4 GB GOF file. Now I have tried to import from this file via the Management Portal on a different machine. Only about half of the globals were imported, and my attempts to select additional globals led to nothing; no new globals were imported. Well, obviously I am mildly curious about what's going on and how I can see the corresponding error, which did not appear in the Import window, but I can also shrug it off and consider what to do next.
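In case it helps to reproduce this: the same import can be run from a Terminal session, where the returned %Status can be printed instead of disappearing behind the Import window. A minimal sketch, assuming the %Library.Global Import() classmethod is available in this Caché version (the file path and global names below are placeholders):

    // Run the GOF import programmatically and show any error text.
    SET globals = "MyGlobalA.gbl,MyGlobalB.gbl"
    SET sc = ##class(%Library.Global).Import($NAMESPACE, globals, "C:\export\globals.gof")
    WRITE $SYSTEM.Status.GetErrorText(sc), !   ; empty output means no error was reported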


After linking in an Oracle table whose columns have the data type NUMBER, my updates to these fields result in data being rounded to 2 decimal places. I insert a record with 1234.1234 and 1234.12 is stored.

It appears Caché xDBC might be manipulating my values prior to sending them to Oracle. Is there a setting or system parameter that controls this? If so, is there a way to prevent it so that the values I send are stored in Oracle unchanged?
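One thing worth checking (an assumption, not a confirmed diagnosis): the class generated by the Link Table wizard may have mapped the NUMBER columns to %Numeric with SCALE = 2, which would round the value on the Caché side before the update is ever sent over xDBC. Roughly what to look for in the generated class (AMOUNT is a placeholder column name):

    // If the generated property looks like this, SCALE = 2 rounds 1234.1234
    // to 1234.12 before the update reaches Oracle:
    Property AMOUNT As %Numeric(SCALE = 2);

    // Raising (or removing) SCALE preserves the precision that is inserted:
    Property AMOUNT As %Numeric(SCALE = 4);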


I have a CSV data file with date values like "4/10/2021" for April 10, 2021. I defined a table with this property: Property TranDate As %Library.Date.

I get this error:

[SQLCODE: <-104>:<Field validation failed in INSERT, or value failed to convert in DisplayToLogical or OdbcToLogical>] [Location: <ServerLoop>] [%msg: <Field 'dc_data_finance.transact.TranDate' (value '4/10/2021') failed validation Field ...

I do not really want to change TranDate to %String. How can I import "4/10/2021" into a %Date property?
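One way to handle this (a sketch, not the only option) is to convert the MM/DD/YYYY display string to the logical $HOROLOG value that %Library.Date stores before inserting it, either in ObjectScript with $ZDATEH or in the SQL itself with TO_DATE:

    // Convert the display string to the internal day count that %Library.Date stores.
    SET disp = "4/10/2021"
    SET logical = $ZDATEH(disp, 1)    ; dformat 1 = MM/DD/[YY]YY
    &sql(INSERT INTO dc_data_finance.transact (TranDate) VALUES (:logical))

    // Or keep the conversion in SQL:
    // INSERT INTO dc_data_finance.transact (TranDate)
    //        VALUES (TO_DATE('4/10/2021', 'MM/DD/YYYY'))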


Hi,

I am brand new to Caché and have been tasked with extracting data from the database. I see that there are a bunch of tools that can be used to extract data via SQL statements, DataGrip and DBeaver to name two. I've gone through some documentation and it doesn't look as if SSMS can connect directly. Is there a preferred extraction tool that anyone can recommend? Does the Caché database have anything built in that lets me access the data while seeing a visual of the tables/schema?

Thanks for any advice,


Hi,

I have a process that is importing lots of data from an external file, but every now and then I come across a STORE error.

I know that the STORE error is occurring in the %Save method of the class, but I know very little else; I think it's something to do with a large number of related objects.

Is there a method of finding out more information regarding this error, i.e. which actual object was being saved, etc.?
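For reference, one way to capture more context at the point of failure (a sketch; obj and rowNumber stand in for whatever the import loop actually uses): wrap the save in TRY/CATCH, log which source row was in flight, and check the process memory limit, since <STORE> is raised when a process exhausts its private memory ($ZSTORAGE), which a large graph of related objects can do.

    TRY {
        SET sc = obj.%Save()
        IF 'sc WRITE $SYSTEM.Status.GetErrorText(sc), !
    } CATCH ex {
        // ex.DisplayString() includes the <STORE> error; rowNumber identifies
        // which input line / object graph was being saved at the time.
        WRITE "Save failed on row ", rowNumber, ": ", ex.DisplayString(), !
        WRITE "Process memory limit ($ZSTORAGE, KB): ", $ZSTORAGE, !
    }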

Thanks

Jim


Hey there,

I'm writing an import routine to read files into a global. The code is working fine except for the 'Delete' command: the files are being imported and copied, but not deleted. Maybe someone has an idea of what is happening.

I get a low-level return value of -32, but I couldn't find anything that tells me what that actually means. And my Caché version doesn't support the $ZU command.
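Not the original routine, just a diagnostic sketch: if this is Windows, OS error 32 is typically a sharing violation (the file is still open, by this process or another), so it is worth making sure the device is closed before the delete and capturing the OS-level code that %File.Delete hands back:

    SET file = "C:\import\data.csv"      ; placeholder path
    // ... file has been read and merged into the global ...
    CLOSE file                           ; release the device before deleting
    IF '##class(%File).Delete(file, .oscode) {
        WRITE "Delete failed, OS-level return code: ", oscode, !
    }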

Here's the Code


Hello Developers!

Have you ever had to convert HL7v2 messages to FHIR (Fast Healthcare Interoperability Resources) and found the process complicated and confusing? InterSystems is rolling out a new cloud-based SaaS offering called HealthShare Message Transformation Services, which makes the process easy. We are excited to announce an Early Access Preview Program for our new offering, and we would love to have you kick the tires and let us know what you think! All you need is a free AWS account, with an S3 bucket to drop in your HL7v2 messages and another S3 bucket to get your FHIR output.


How can I find a global's original namespace, when it is potentially mapped from a different namespace?

I have a global ^Custom that exists in multiple namespaces, but it could be mapped from namespace Drone (A) to Launch (B).

Without access to the Caché Management Portal, how can I find where my global is located using Caché code?

For example, is ^Custom == ^[Drone]Custom?
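One possible approach (a sketch, assuming the %SYS.Namespace class method GetGlobalDest() is available in this Caché version):

    // Where does ^Custom actually live in the current namespace?
    // GetGlobalDest returns "system^directory" for the database it resolves to.
    WRITE ##class(%SYS.Namespace).GetGlobalDest($NAMESPACE, "Custom"), !

    // Compare the data seen locally with the same global addressed through an
    // extended reference to the DRONE namespace:
    WRITE $DATA(^Custom), " ", $DATA(^|"DRONE"|Custom), !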


Hi Community,

Join the upcoming Kick-off Webinar dedicated to the InterSystems Interoperability Contest.

In this webinar, you'll get an overview of the interoperability capabilities of InterSystems IRIS, and we'll show you how to use PEX. Also, we'll discuss and answer your questions on how to build interoperability solutions using InterSystems IRIS and IRIS for Health.

Date & Time: Monday, October 4 — 12:00 AM EDT

Speakers:  
🗣 @Stefan Wittmann, InterSystems Product Manager 
🗣 @Bob Kuszewski, InterSystems Product Manager 
🗣 @Evgeny Shvarov, InterSystems Developer Ecosystem Manager

Article
Oleh Dontsov · Jun 4, 2020 1m read
Easy data import into IRIS

Sometimes you need to import data into IRIS quickly and easily. For this purpose, an IRIS import manager has been developed.

This application allows you to import JSON data and also provides a really simple interface for transferring data from MongoDB collections to IRIS globals. It has never been so easy.

Let's look at examples.

Import JSON


Hi Developers!

Here are the technology bonuses for the InterSystems IRIS Datasets Contest 2021 that will give you extra points in the voting:

  • Dataset Usage Demo Repository - 4
  • LOAD DATA Usage - 3
  • Questionnaire  - 2
  • Unique Real Dataset - 4
  • Docker container usage - 2 
  • ZPM Package deployment - 3
  • Online Demo - 2
  • Code Quality pass - 1
  • First Article on Developer Community - 2
  • Second Article On DC - 1
  • Video on YouTube - 3

See the details below.


I am trying to populate a table using the SQL Data Import Wizard. The input file is a tab-delimited text file. But the import keeps failing with a 104 error showing that validation is failing for the columns that use the %Library.TimeStamp and %Boolean datatypes. Yet when I insert values into the table through a SQL INSERT command, the values get saved correctly in the table.

For the TimeStamp format in the wizard form, I am choosing YYYY-MM-DD-HH:MI:SS because  there was no option for this format: YYYY-MM-DD HH:MM:SS. 
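One way to narrow down which values the wizard is rejecting (a sketch; the sample strings are guesses about what the file might contain, not taken from it): feed candidate values straight to the datatypes' own IsValid() checks and see which ones come back clean.

    // Prints 1 when the datatype would accept the value, 0 when it would not.
    WRITE $SYSTEM.Status.IsOK(##class(%Library.TimeStamp).IsValid("2021-04-10 13:45:00")), !
    WRITE $SYSTEM.Status.IsOK(##class(%Library.TimeStamp).IsValid("2021-04-10-13:45:00")), !
    WRITE $SYSTEM.Status.IsOK(##class(%Library.Boolean).IsValid("TRUE")), !
    WRITE $SYSTEM.Status.IsOK(##class(%Library.Boolean).IsValid(1)), !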


Hey community! How are you doing?

I hope to find everyone well, and a happy 2022 to all of you!

Over the years, I've been working on a lot of different projects, and I've been able to find a lot of interesting data.

But most of the time, the datasets I worked with were customer data. When I started joining the contests over the past couple of years, I began to look for specific web datasets.

I've curated a few datasets myself, but I kept wondering, "Is this dataset enough to help others?"
