Databases


I am trying to convert a string to a date but cannot get it to work. I have a function that I would like to take in a date string and convert it to a date object.

Here is the example so far; I cannot get it to work. Any help is appreciated.

 

set p="12/03/2019"
 
w $System.SQL.TODATE(p,"YYYY-MM-DD")
 
<ILLEGAL VALUE>todate+32^%qarfunc

If I try this, I still get the wrong value returned:

set p="12/03/2019" 
w $ZDATE(p,3)
1841-01-12
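For what it's worth, here is a minimal sketch of one way this is usually handled, assuming the string is meant as MM/DD/YYYY (swap the dformat if it is DD/MM/YYYY). $ZDATE expects an internal $HOROLOG date, so the string "12/03/2019" above is treated as the number 12, which is why 1841-01-12 comes back; converting the external string to $HOROLOG first with $ZDATEH avoids that, and the TO_DATE mask has to match the shape of the input string:

set p="12/03/2019"
set h=$ZDATEH(p,1)   ; dformat 1 = MM/DD/YYYY, returns the internal $HOROLOG date
w $ZDATE(h,3)        ; dformat 3 = ODBC format, prints 2019-12-03
w $System.SQL.TODATE(p,"MM/DD/YYYY")   ; mask matches the string, unlike "YYYY-MM-DD" above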

 

Last answer 5 days ago · Last comment 12 March 2019 · 0 6 · 109 views · 0 rating

Hi guys

I've added a new mirrored failover member to an existing one.

They are both Arbiter controlled.

When I do "Add database to mirror" on the primary, I always get an error: Missing Mirrored Database.

I followed everything in the documentation and always get the same error.

I've copied the CACHE.DAT file after dismounting the database on both servers, then mounted them again.

Could you help me out?
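For context, a minimal sketch of the sequence the documentation describes for adding an existing database to a mirror (a sketch only; menu labels can differ between versions): after the database is added to the mirror on the primary, the copied CACHE.DAT has to be mounted on the backup and then activated and caught up there, e.g. from the %SYS namespace:

ZN "%SYS"
DO ^MIRROR   ; Mirror Management -> Activate or Catchup mirrored databases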

 

Kind Regards

Last answer 4 March 2019 · 0 1 · 0 comments · 39 views · 0 rating

Good afternoon,

We have a very old version of Ensemble with one of our clients, and they have no desire to upgrade anytime soon. We have gotten the all-clear to purge really old messages from the database, changing the days kept from 60 to 30. The option to Compact/Truncate is displayed in this version of Ensemble, but it does not execute, as it reports not actually being present in this version.

There is an option in ^DATABASE that returns unused space; however, this does not return nearly as much free space as the newer Compact/Truncate procedure.
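For reference, a minimal sketch of invoking that utility from the terminal (option wording varies by version):

ZN "%SYS"
DO ^DATABASE   ; choose the option to return unused space for the database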

My question is: is it possible to Compact/Truncate properly in Ensemble 2013?

Many thanks and kind regards, Alexi Demetriou

Last comment 28 February 2019 · 0 2 · 42 views · 0 rating

Is there a way, or can it be done, to use conditional logic in SQL, like so:

    

Query Q1(formal As %String) As %SQLQuery [ Final ]
{
    SELECT patientnumber, ID,
        CASE
            WHEN ID = 50 THEN 'This is 50'
            WHEN ID = 30 THEN 'This is 30'
            ELSE 'The quantity is under 30'
        END
    FROM Audit.Table
    WHERE ID = :formal AND EndDate IS NULL
}

}
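A minimal sketch of calling a class query like this from ObjectScript, assuming it lives in a class named Audit.MyClass (illustrative). Note that in Caché SQL, string literals take single quotes, while double quotes delimit identifiers, which is why the CASE branches above use single quotes:

    set stmt = ##class(%SQL.Statement).%New()
    set sc = stmt.%PrepareClassQuery("Audit.MyClass", "Q1")   ; Audit.MyClass is hypothetical
    if $SYSTEM.Status.IsError(sc) { do $SYSTEM.Status.DisplayError(sc) quit }
    set rs = stmt.%Execute(50)                                ; binds :formal
    while rs.%Next() {
        write !, rs.%GetData(1), " ", rs.%GetData(2), " ", rs.%GetData(3)
    }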

Last answer 14 February 2019 · Last comment 14 February 2019 · 0 4 · 103 views · 0 rating

Edit:

I may have found the issue but not the solution.

"SELECT * FROM wmhISTORYdETAIL" runs as a passthrough without asking for the DSN,

but

'SELECT Count([wmhISTORYdETAIL].[HistHMNumber]) AS CountOfHistHMNumber FROM [wmhISTORYdETAIL] WHERE ((([wmhISTORYdETAIL].[HistMovType])='Receipt') AND (([wmhISTORYdETAIL].[HistMovDate])>=Date()-1) AND (([wmhISTORYdETAIL].[HistMovDate])<Date()));'

asks for the DSN, even though both are linked to a table that has the password saved.

Any Ideas please?

Rob

Hi

I have created an MS Access database with a passthrough query to our InterSystems Caché WMS system. If I use "SELECT * FROM thetable" as the passthrough query, I can use VB.NET to query the passthrough and it works fine, but this dataset is getting rather large, so I changed it to

"SELECT field1, field2, field3 FROM thetable", but the passthrough no longer works as it did: it works in MS Access but not from the VB.NET app.

0 0 · 0 answers · 0 comments · 46 views · 0 rating

I have this query that I am trying to use in my class. When testing in the terminal, I expect to get the results printed, but I am only getting zero printed. Please can anyone out there advise on what I am doing wrong?

Method PatientInfo(ID As %String) As %Status
{
  #dim status As %Status = $$$OK
  // GUID is a string, so the value must be quoted in the SQL (or bound as a parameter)
  SET myquery = "SELECT GUID, IDType, IDValue FROM MergeHyland.TypeTwoDimesionCollection WHERE GUID = '"_ID_"'"
  SET rset = ##class(%ResultSet.SQL).%Prepare(myquery, .err, "")
  IF $ISOBJECT(err) { WRITE !, "Prepare failed"  QUIT $$$ERROR($$$GeneralError, "SQL prepare failed") }
  WHILE rset.%Next() {
    // concatenate with _ or commas; & is logical AND in ObjectScript, which is why 0 was printed
    WRITE !, rset.GUID, ":", rset.IDType, ":", rset.IDValue
  }
  WRITE !, "End of data"
  RETURN status
}
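A minimal alternative sketch using a parameterized %SQL.Statement, which avoids building the WHERE clause by string concatenation (table and column names taken from the post; ID is the method argument):

  SET stmt = ##class(%SQL.Statement).%New()
  SET sc = stmt.%Prepare("SELECT GUID, IDType, IDValue FROM MergeHyland.TypeTwoDimesionCollection WHERE GUID = ?")
  IF $SYSTEM.Status.IsError(sc) { DO $SYSTEM.Status.DisplayError(sc)  QUIT sc }
  SET rs = stmt.%Execute(ID)      ; ID is bound to the ? parameter
  WHILE rs.%Next() {
      WRITE !, rs.GUID, ":", rs.IDType, ":", rs.IDValue
  }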
Last answer 22 January 2019 · 0 2 · 0 comments · 103 views · 0 rating

Astronomers’ tools

5 years ago, on December 19, 2013, the ESA launched an orbital telescope called Gaia. Learn more about the Gaia mission on the official website of the European Space Agency or in the article by Vitaly Egorov (Billion pixels for a billion stars).

However, few people know what technology the agency chose for storing and processing the data collected by Gaia. Two years before the launch, in 2011, the developers were considering a number of candidates (see “Astrostatistics and Data Mining” by Luis Manuel Sarro, Laurent Eyer, William O’Mullane, Joris De Ridder, pp. 111-112):

Comparing the technologies side-by-side produced the following results (source):

Technology      Time
DB2             13min 55s
PostgreSQL 8    14min 50s
PostgreSQL 9    6min 50s
Hadoop          3min 37s
Cassandra       3min 37s
Caché           2min 25s

The first four will probably sound familiar even to schoolchildren. But what is Caché XEP?

Last comment 9 January 2019 · 1 3 · 451 views · +8 rating

We are currently using Ensemble on AIX. We are on 2015.2.2. If I install the Field Test on a Windows desktop, is it possible to import the CACHE.DAT from my AIX server so I can do some proof-of-concept development?

Thanks

Scott

Last answer 7 January 2019 · Last comment 7 January 2019 · 2 3 · 121 views · 0 rating

The following post outlines an architectural design of intermediate complexity for DeepSee. As in the previous example, this implementation includes separate databases for storing the DeepSee cache, DeepSee implementation and settings. This post introduces two new databases: the first to store the globals needed for synchronization, the second to store fact tables and indices.

Last comment 2 January 2019 · 0 4 · 284 views · +4 rating

Mirroring 101

Caché mirroring is a reliable, inexpensive, and easy to implement high availability and disaster recovery solution for Caché and Ensemble-based applications. Mirroring provides automatic failover under a broad range of planned and unplanned outage scenarios, with application recovery time typically limited to seconds. Logical data replication eliminates storage as a single point of failure and a source of data corruption. Upgrades can be executed with little or no downtime.

Deploying a Caché mirror does, however, require significant planning and involves a number of different procedures. And like any other critical infrastructure component, the operating mirror needs ongoing monitoring and maintenance.

Last comment 8 December 2018 · 2 10 · 2829 views · +7 rating

Let's say you have about 100TB of data in multiple CACHE.DAT files. The biggest one is about 30TB, but most are more than 1TB. You have limited time for maintenance during the day, only a few hours at night. You have to check integrity as often as possible, and of course back it all up.

How would you do it?
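On the backup side, a minimal sketch of the external freeze/thaw approach that is commonly paired with storage-level snapshots at this scale (a sketch only; the snapshot step itself is whatever the SAN or VM layer provides):

  ZN "%SYS"
  SET sc = ##class(Backup.General).ExternalFreeze()   ; pause database writes
  IF $SYSTEM.Status.IsError(sc) { DO $SYSTEM.Status.DisplayError(sc)  QUIT }
  ; ... take the storage snapshot of the volumes holding the CACHE.DAT files ...
  SET sc = ##class(Backup.General).ExternalThaw()     ; resume normal write activity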

Last answer 5 December 2018 · 0 5 · 0 comments · 200 views · 0 rating

Hi

I am totally new to IRIS and Caché.

I am trying to evaluate it and work out how we could use it as a standard application database; object or relational etc. does not matter.

The issue is ObjectScript.

So:

1) Can we develop, maintain and use an IRIS database and never use ObjectScript, i.e. use only the Java, Python, C++ interfaces etc. (exactly which one does not matter)? Would that make designing and using the IRIS database more prone to inefficiency and error?

2) Can we import an existing Caché database into IRIS and convert its ObjectScript code into Java, Python or whatever? Is that a big, difficult, error-prone job?

If the answers are no, that may not be a showstopper, but I would like to know now.

A lot of training will be involved in any case, I know, and Oracle has PL/SQL, but ObjectScript developers are rare.

Apologies if the answers are in the documentation; I have read some of it but need some indication about the above urgently.

Last answer 14 November 2018 · Last comment 14 November 2018 · 0 2 · 223 views · +1 rating

Hi All,

I'm developing a few RESTful APIs. I want to create authentication tokens and return them from my login RESTful API. If I'm using the CSP session ID, how can I validate the session IDs in the subsequent RESTful API calls? Or is there another approach to handle this task?

My primary goal is to integrate 2 different front-end applications: one is the Zen framework, the other is web pages from Python.

Any lead would be appreciated.

Thanks,

Arun Kumar Durairaj. 
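A minimal sketch of one possible approach using a %CSP.REST dispatch class with CSP sessions enabled, so the session ID handed out by the login call can be checked on later calls (the class name, routes, and the "user" key are illustrative):

Class Demo.API Extends %CSP.REST
{

Parameter UseSession As Integer = 1;

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
  <Route Url="/login" Method="POST" Call="Login"/>
  <Route Url="/data"  Method="GET"  Call="GetData"/>
</Routes>
}

ClassMethod Login() As %Status
{
    // authenticate the caller here, then return the CSP session ID as the token
    SET %session.Data("user") = "someuser"
    WRITE "{""token"":"""_%session.SessionId_"""}"
    QUIT $$$OK
}

ClassMethod GetData() As %Status
{
    // with UseSession = 1 the same %session object is available on later calls,
    // so whatever was stored at login can be validated here
    IF $GET(%session.Data("user")) = "" {
        SET %response.Status = "401 Unauthorized"
        QUIT $$$OK
    }
    WRITE "{""ok"":1}"
    QUIT $$$OK
}

}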

0 1 · 0 answers · 0 comments · 98 views · 0 rating

Hello everyone,

I already asked this question in another post (https://community.intersystems.com/post/how-can-i-import-my-json-formatt...) and I'm sorry for creating a new post, but I still haven't received an answer, so I'm trying again. I now know how to import JSON formatted data from my API into my DocDB, but the problem is that the JSON data from my API is imported only into the %Doc column and not into the columns I created with my properties. I'm not entirely sure, but I think the document database only puts the information from my API into the right columns if my JSON is in a JSON array format. But the format of my API is a JSON object and looks like this

Last answer 11 October 2018 · Last comment 5 October 2018 · 0 2 · 411 views · 0 rating

Hello everyone,

I want to create an IRIS document database with Atelier, with some properties, so that I can import JSON formatted data from an API into the database I created. Right now I know how to import my local JSON formatted data into my created database:

Class User.Classtest
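A minimal sketch of the server-side %DocDB calls this usually maps to (the database name and property are illustrative; %CreateProperty is the same call that appears in a later post on this page):

  // create a document database and define a searchable property
  SET db = ##class(%DocDB.Database).%CreateDatabase("User.FitnessData")
  DO db.%CreateProperty("TotalSteps", "%Integer", "$.TotalSteps")
  // save one JSON document; the TotalSteps column is filled from the $.TotalSteps path
  SET doc = db.%SaveDocument({"TotalSteps": 4200})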

Last answer 4 October 2018 · Last comment 7 October 2018 · 0 2 · 259 views · 0 rating

Hello,

I have a question about creating properties with curl.

I already did create properties in Java with the following command.

<DO db.%CreateProperty("TotalSteps","%Integer","$.TotalSteps")>

It created the property TotalSteps with the type %Integer and the data path $.TotalSteps (since the header of my data source is also TotalSteps).

Now I would like to create the same property in curl with the following command

<curl -i -X POST -H "Content-Type: application/json" http://localhost:53774/api/docdb/v1/namespaceName/prop/databaseName/ propertyName?type= propertyType& path= propertyPath& unique=propertyUnique>

Port: 52773

namespaceName: fitnessnamespace

databaseName: teststream

propertyName: TotalSteps

propertyType: %Integer

propertyPath: ??

Last answer 4 October 2018 · Last comment 4 October 2018 · 0 0 · 81 views · 0 rating

We have started to see Journal Daemon inactive and DBLatency warnings in the console log of our HealthShare server. The OS is Windows Server 2008 running in a VM. See below:

10/03/18-00:46:39:344 (3840) 1 Journal Daemon has been inactive with I/O pending for 10 seconds:
gjrnoff=6642068,iocomplete=6637348,filecnt=771,fail=0
10/03/18-10:08:47:620 (6064) 1 [SYSTEM MONITOR] DBLatency(c:\intersystems\healthshare\mgr\cachetemp\) Warning: DBLatency = 2300 ( Warnvalue is 1000).
10/03/18-10:08:47:755 (6064) 1 [SYSTEM MONITOR] DBLatency(d:\databases\adtfeed\) Warning: DBLatency = 1251 ( Warnvalue is 1000).
10/03/18-10:08:47:756 (6064) 1 [SYSTEM MONITOR] DBLatency(d:\databases\bloodbank\) Warning: DBLatency = 1426 ( Warnvalue is 1000).
10/03/18-10:12:08:636 (6064) 1 [SYSTEM MONITOR] DBLatency(c:\intersystems\healthshare\mgr\) Warning: DBLatency = 1813 ( Warnvalue is 1000).

Has anyone else seen this type of warning?

MikeD

Last answer 3 October 2018 · 0 1 · 0 comments · 91 views · 0 rating

InterSystems Caché globals provide very convenient features for developers. But why are globals so fast and efficient?

Theory

Basically, a Caché database is a directory with the same name as the database, containing the CACHE.DAT file. On Unix systems, the database can also be an ordinary disk partition.
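By way of a tiny illustration of the data model being discussed (names are arbitrary), a global is a sparse on-disk tree addressed by subscripts:

  SET ^Person(1, "Name") = "Smith,John"   ; store a node under subscripts (1,"Name")
  SET ^Person(1, "City") = "Boston"
  WRITE ^Person(1, "Name"), !             ; read it straight back from the database
  ; iterate over the first-level subscripts with $ORDER
  SET id = ""
  FOR { SET id = $ORDER(^Person(id))  QUIT:id=""  WRITE id, ": ", ^Person(id, "Name"), ! }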

Last comment 24 September 2018 · 1 9 · 2266 views · +8 rating

I have already mentioned my project CacheBlocksExplorer recently in two articles:

  1. Internal Structure of Caché Database Blocks, Part 2
  2. Internal Structure of Caché Database Blocks, Part 3

Now I would like to let you know that this project can easily be run with Docker.

Versions for Caché and for IRIS are now publicly available on Docker Hub.

Remember that you need an appropriate license key (for Red Hat Linux) to be able to run this project.

1 1 · 0 comments · 136 views · +3 rating

Hi -

 

I need an example of what I need to "map" to have a common dashboard defined so that it will be visible/usable in multiple namespaces.

 

I have created a dashboard in SAMPLES (the namespace and database), and I would like this dashboard to be accessible/usable from a second namespace, but I'm not having any success with the mappings (global/package/routine/data) needed to let DeepSee see and display the dashboard.

What is the minimum that I need to map?

Last answer 20 August 2018 · Last comment 25 November 2015 · 0 4 · 161 views · 0 rating