Caché


I am trying to convert a string to a date but cannot get it to work. I have a function that should take a date string and convert it to a date object.

Here is the example so far; I cannot get it to work, any help appreciated:

set p="12/03/2019"
w $System.SQL.TODATE(p,"YYYY-MM-DD")

<ILLEGAL VALUE>todate+32^%qarfunc

If I try the following, I still get the wrong value returned:

set p="12/03/2019"
w $ZDATE(p,3)
1841-01-12
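For reference, a minimal sketch of the likely fix (not from the thread): the TO_DATE mask has to describe the input string's actual layout, and $ZDATEH, not $ZDATE, is the function that converts a string into a $HOROLOG date:

```objectscript
set p = "12/03/2019"
; $ZDATEH converts a formatted string to a $HOROLOG date;
; format 4 is the European DD/MM/YYYY layout
set h = $ZDATEH(p, 4)
; $ZDATE goes the other way, $HOROLOG to string (3 = ODBC YYYY-MM-DD)
write $ZDATE(h, 3)
; alternatively, give TODATE a mask that matches the input
write $SYSTEM.SQL.TODATE(p, "DD/MM/YYYY")
```

The original error occurs because the mask "YYYY-MM-DD" does not match "12/03/2019"; the $ZDATE attempt fails because $ZDATE expects a $HOROLOG number, so the string is evaluated as 12, i.e. 12 days after December 31, 1840.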

 

Last answer 5 days ago · Last comment 12 March 2019 · 111 views · 0 rating

Version 2016

Created a business process and ticked "Is component".

Later, when trying to use it as a component from another business process and setting up the target for a call activity, selecting process components does not show anything in the list.

Any idea why?

Last answer 24 January 2019 · Last comment 5 days ago · 63 views · 0 rating

I worked through the Community looking for proposals on how to provide end users, in an easy way, with data formatted as an Excel sheet.

There is a great article, Tips & Tricks - SQL to Excel, with an important message embedded: "Excel can interpret HTML tables and display them as usual".

Nice!
But the result is on your server, and you are left alone to get it out to your remote user somewhere.
This could be a subject to be solved using REST or web services.
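As a sketch of that REST idea (class, query, and file names here are assumptions, not from the article): a handler method can stream the query result as an HTML table with an Excel content type, so the remote user's browser hands it straight to Excel:

```objectscript
/// Hypothetical handler method: stream a query result as an HTML
/// table that Excel opens as a spreadsheet (names are assumptions)
ClassMethod ExportToExcel() As %Status
{
    ; tell the client to treat the response as an Excel sheet
    Set %response.ContentType = "application/vnd.ms-excel"
    Do %response.SetHeader("Content-Disposition", "attachment; filename=export.xls")
    Set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT Name, DOB FROM Sample.Person")
    Write "<table><tr><th>Name</th><th>DOB</th></tr>"
    While rs.%Next() {
        Write "<tr><td>", rs.%Get("Name"), "</td><td>", rs.%Get("DOB"), "</td></tr>"
    }
    Write "</table>"
    Quit $$$OK
}
```

The key point carried over from the article is that Excel accepts the HTML table as-is; only the content type and disposition headers are needed to get it off the server and to the user.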

Last comment 6 days ago · 602 views · +13 rating

This is a continuation of my story about the development of my project isc-tar, started in the first part.

Just having tests is not enough; it does not mean that you will run them after every change. Running tests should be automated, and when you cover all your functionality with tests, everything should work well after any change in any place. Continuous Integration (CI) helps to keep the code and deployment procedure with as few bugs as possible, and automates routine procedures like publishing releases.

I use GitHub to store the source code. Some time ago GitHub started to work on its own CI/CD platform, named GitHub Actions. It is not widely available yet; you have to be signed up as a beta tester for this feature, as I did. GitHub Actions takes quite a different approach to build workflows. What is important is that GitHub Actions lets you use Docker, and it is quite easy to customize the available actions. Interestingly, GitHub Actions is really much bigger than any classic CI such as Travis, Circle, or GitLab CI. You can find more in the official documentation.

0 comments · 41 views · +1 rating

Hello everyone,

 

I am in the process of changing our authentication method so we can integrate our AD authentication into our programs. At the moment I am using the %SYS.LDAP class and trying to use the Bind() method with the user information to authenticate. This seems to work without issues, but here the problems start.

When I flag a user with 'Change password on next logon' in our Active Directory, the Bind fails with a status error: "Invalid Credentials". To make sure the user who logged in is in fact the user who should change the password, I still need to check whether this user entered the correct current login information.

Checking the fields 'badPwdCount' or 'badPasswordTime' does not help, since it seems they are not filled after a failed Bind().

Does anyone have experience with this issue and know how to work around the change-password problem?

Thank you guys in advance!

 

Thomas

0 answers · 0 comments · 25 views · 0 rating

Hello all,

I am having some issues creating a Docker image with a fresh Caché installation. The error I am getting right now is that gzip is required for a Caché installation but was not found on my system, even though it is shown as installed in the base CentOS 7 image. My host machine is Windows 10 running the latest Docker version.

Here is my Dockerfile; it's simple:

FROM centos:latest

RUN yum update -y && yum -y upgrade

COPY ./cache-2017.1.3.317.0.18571-lnxrhx64.tar.gz .

RUN tar -zxf cache-2017.1.3.317.0.18571-lnxrhx64.tar.gz

RUN ISC_PACKAGE_INSTANCENAME="MyDatabase" \
    ISC_PACKAGE_INSTALLDIR="/usr/cachesys/" \
    ISC_PACKAGE_UNICODE="Y" \
    ISC_PACKAGE_CLIENT_COMPONENTS="" \
    ISC_PACKAGE_INITIAL_SECURITY="Minimal" \
    /cache-2017.1.3.317.0.18571-lnxrhx64/cinstall_silent

EXPOSE 57772 22 1972

ENTRYPOINT ["ccontrol", "start cache"]

Last answer 6 days ago · Last comment 6 days ago · 48 views · 0 rating

Hi! We have received a request to create a new rule in CachéQuality to identify when a developer uses double quotes (" ") within a SQL statement.

We have been asked many times about SQL validation rules, and we would like to open a debate so everyone can discuss what you would like to have checked in a SQL statement.

Current examples cover basic situations:

  • Using the %SQL.Statement class:

Set stmt = ##CLASS(%SQL.Statement).%New()
Set query = "Select Val1, Val2 FROM Table WHERE Val1=""SomeCondition"""

  • Using embedded SQL

&SQL(SELECT Val1, Val2
     INTO :val1, :val2
     FROM Table
     WHERE Val1="SomeCondition")
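For contrast, a hedged sketch of what such a rule would presumably consider correct: in standard Caché SQL, single quotes delimit string literals, while double quotes delimit identifiers, so the condition above should read:

```objectscript
Set stmt = ##class(%SQL.Statement).%New()
; single quotes for the string literal; doubled double quotes would be
; parsed as a delimited identifier rather than a string value
Set query = "SELECT Val1, Val2 FROM Table WHERE Val1 = 'SomeCondition'"
```

This is exactly the ambiguity the proposed rule would flag.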

Last answer 6 days ago · Last comment 6 days ago · 73 views · 0 rating

Starting with 2017.1, InterSystems is adding Ubuntu (64-bit) as a third Linux server platform. Prior to 2017.1, Ubuntu was already available as a development platform, and customers could use InterSystems distributions built for SUSE to run on Ubuntu. As a result, there are a few license key implications for 64-bit Linux versions starting with Caché and Ensemble 2017.1:

a) Customers using Red Hat will observe no changes

b) Customers using InterSystems products(1) for SUSE on SUSE will need new license keys (no charge)

c) Customers using InterSystems products(1) for SUSE on Ubuntu will need new license keys (free of charge for 12 months(2))

(1) Caché, Ensemble, HealthShare, and TrakCare

(2) Starting with the release of 2017.1

802 views · +9 rating

Hi Community!

I think everyone keeps the source code of their projects in a repository nowadays: GitHub, GitLab, Bitbucket, etc. The same goes for InterSystems IRIS projects; check any on Open Exchange.

What do we do every time we start or continue working with a certain repository on InterSystems Data Platform?

We need a local InterSystems IRIS machine, the environment for the project set up, and the source code imported.

So every developer performs the following:

  1. Check out the code from repo
  2. Install/Run local IRIS installation
  3. Create a new namespace/database for a project
  4. Import the code into this new namespace
  5. Setup all the rest environment
  6. Start/continue coding the project 

If you dockerize your repository, this list can be shortened to three steps:

  1. Check out the code from repo
  2. Run docker-compose build 
  3. Start/continue coding the project 

Profit: no hands-on work for steps 3-5, which could take minutes and sometimes bring a headache.

You can dockerize (almost) any of your InterSystems repos with the few following steps. Let's go!

Last comment 6 days ago · 93 views · +2 rating

This question is about calling AWS REST APIs. Based on:

http://docs.aws.amazon.com/general/latest/gr/sigv4_signing.html

AWS requires REST clients to call its APIs using Signature Version 4, which, in case you don't know what I am talking about, is a pain in the neck. Here comes the question:

Has anybody, by any chance, implemented the v4 signing algorithm in COS? If yes, would he or she have the kind heart to share it?

Thanks,

Chris

 

 

Last answer 23 March 2017 · Last comment 7 days ago · 262 views · 0 rating

InterSystems states that Caché supports at least three data models: relational, object, and hierarchical (globals). One can work with data presented in the relational model from a program written in C# the same way one works with any other relational DB. To work with data presented by the object model in C#, one needs to use the .NET Managed Provider or some kind of ORM. And starting with version 2012.2, one can work directly with globals (i.e., use direct access to hierarchical data) via Caché eXTreme for .NET.

Last comment 7 days ago · 410 views · +4 rating

I'm attempting to use the .NET Entity Framework provider that is provided by InterSystems (see: https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=GBMP_eframe).  In our environment, the "Support Delimited Identifiers" option is turned off and we are not allowed to turn it on without significant testing effort.  When this setting is off, the SQL that is generated by the Entity Framework provider is not considered valid and therefore the provider doesn't work (it DOES work, however, when this setting is turned ON).

Do you know of any way to work around this issue? Again, we don't have the option of turning the "Support Delimited Identifiers" option ON unfortunately.

0 answers · 0 comments · 18 views · 0 rating

Is it possible to mimic what Selenium does, like navigating to a site, logging in, filling out a form, and then logging out, in COS? I am trying to do that using the %Net.HttpRequest class, or should I be using a different class? The idea is to be able to call a web app, log into it, fill out a form, and log out.
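Not an answer from the thread, just a minimal sketch of the %Net.HttpRequest approach (server, paths, and field names are made up): reusing one request object keeps session cookies between the login POST and later calls, which is the main thing needed for a login/form/logout flow:

```objectscript
Set req = ##class(%Net.HttpRequest).%New()
Set req.Server = "www.example.com"
Set req.Https = 1
Do req.InsertFormData("username", "myuser")
Do req.InsertFormData("password", "secret")
; POST the login form; session cookies from the response stay on req
Set sc = req.Post("/app/login")
If $$$ISOK(sc) {
    ; subsequent calls on the same object send the stored cookies
    Do req.InsertFormData("field1", "value1")
    Set sc = req.Post("/app/form")
    Set sc = req.Get("/app/logout")
}
```

Unlike Selenium, this works at the HTTP level, so any JavaScript-driven behavior of the site would not be reproduced.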

Last answer 8 days ago · Last comment 11 December 2018 · 142 views · 0 rating

Database systems have very specific backup requirements that, in enterprise deployments, require forethought and planning. For database systems, the operational goal of a backup solution is to create a copy of the data in a state that is equivalent to when the application is shut down gracefully. Application-consistent backups meet these requirements, and Caché provides a set of APIs that facilitate integration with external solutions to achieve this level of backup consistency.
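For illustration, a minimal sketch of that API pair (the snapshot step stands in for whatever your external backup tool does):

```objectscript
; suspend physical writes so the snapshot is application-consistent
Set sc = ##class(Backup.General).ExternalFreeze()
If $$$ISOK(sc) {
    ; ... trigger the storage/VM-level snapshot here ...
    ; then resume normal write daemon activity
    Set sc = ##class(Backup.General).ExternalThaw()
}
```

The freeze/thaw window should be kept short, since writes to the databases are suspended for its duration.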

Last comment 8 days ago · 1229 views · +1 rating

I'm sure most of you are already familiar with the possibility of using GZIP in InterSystems products. But the problem is that GZIP works only with one file or stream; it does not support folders. On Unix systems there is a way to solve this, using the tar compression tool which comes with every Linux system out of the box. But what do you do if you have to work on Windows as well, which does not have it?

I am pleased to offer you my new project isc-tar, which will help you stop caring about the operating system and deal with tar files anywhere.
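To illustrate the one-file limitation: the built-in gzip support works at the stream level, as in this hedged sketch with %Stream.FileBinaryGzip (file paths are made up):

```objectscript
; a plain binary stream over the source file
Set src = ##class(%Stream.FileBinary).%New()
Do src.LinkToFile("/tmp/data.bin")
; a gzip-compressing stream over the target file
Set gz = ##class(%Stream.FileBinaryGzip).%New()
Do gz.LinkToFile("/tmp/data.bin.gz")
; one stream in, one compressed file out; no notion of folders
Do gz.CopyFromAndSave(src)
```

Bundling a whole directory first requires an archive format such as tar, which is the gap isc-tar fills.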

Last comment 8 days ago · 115 views · +5 rating

I just recently announced my project isc-tar. But sometimes what is behind the scenes is no less interesting: how it was built, how it works, and what happens around the project. Here is the story:

  • How to develop this project
  • How to test it
  • How to release new versions for publishing
  • And finally, how to automate all of the above
  • Continuous integration

So, I would like to tell you all about it.

0 comments · 79 views · +1 rating

Hello All,

I have been associated with InterSystems technologies for over a decade, working on Caché, Zen, Ensemble, etc.

This is a very niche field and a lovely community. I wanted to extend my hand to connect with people who are in the same field or related to it.

Here is my LinkedIn profile. Please feel free to send me an invite or drop me a message:

https://www.linkedin.com/in/vneerav/

Last comment 8 days ago · 76 views · 0 rating

I have a DR mirror with a WIJ that is 5 times as large as the primary failover member's. My read-write reporting mirror WIJs are the same size as the primary's. I don't know why the DR WIJ is so large, and I would like to shrink it to the same size as the others. Any suggestions are welcome. Thanks!

Last answer 10 days ago · 0 comments · 72 views · 0 rating

Hi All,
How do I create an HTTPS request with Negotiate/NTLM authentication in Caché using the %Net.HttpRequest package?
I tried basic authentication, and it throws: 401 - Unauthorized: Access is denied due to invalid credentials.

Can anyone please guide me?

Many thanks!!
Vicky

Last answer 11 days ago · Last comment 11 days ago · 49 views · 0 rating

 

Keywords: Jupyter Notebook, TensorFlow GPU, Keras, Deep Learning, MLP, and HealthShare

 

1. Purpose and Objectives

In the previous "Part I" we set up a deep learning demo environment. In this "Part II" we will test what we can do with it.

Many people my age started with the classic MLP (Multi-Layer Perceptron) model. It is intuitive, hence conceptually easier to start with.

So let's try a Keras "deep learning MLP" with the standard demo data that everybody in the AI/NN community has been using. It is a kind of so-called "supervised learning". We will see how simple it is to run it at the Keras level.

We could later touch on its history and on why "deep learning" became the buzzword: what has actually evolved over the recent 20 years.

0 comments · 58 views · +1 rating