I have an existing application on HealthShare 2015 and have decided to move it to HealthShare 2018 to make use of Atelier support. I am using Eclipse Photon with the Atelier plugin 1.3.
Most things work better in Atelier compared with the built-in Studio of HealthShare. However, when I tried debugging a CSP file with Atelier I encountered two problems:
Hi, I want to use Single Sign-On (SSO) with SAML tokens over SOAP. I need an X.509 certificate. Where can I get such a certificate from a trusted source? Is there any other way to do SSO in InterSystems Caché 2017? Can anyone please help me with this? Thanks.
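For testing, a self-signed certificate (generated with OpenSSL, for example) is enough; for production SAML exchanges the certificate normally comes from the identity provider or a commercial/organizational CA. Once you have the certificate and key files, they can be registered so SOAP classes can reference them by alias. A minimal sketch, assuming files at hypothetical paths and the alias "SamlSigning":

 // run in %SYS; paths and alias are placeholders
 new $namespace set $namespace = "%SYS"
 set cred = ##class(%SYS.X509Credentials).%New()
 set cred.Alias = "SamlSigning"
 do cred.LoadCertificate("c:\certs\saml-signing.cer")
 do cred.LoadPrivateKey("c:\certs\saml-signing.key")
 set sc = cred.Save()
 if $system.Status.IsError(sc) do $system.Status.DisplayError(sc)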
I want to "copy" one file to another, appending if the target file already exists. The code below works, except that it overwrites the target file rather than appending. The documentation says CopyFile will append.
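If CopyFile does not append for you, one workaround is to do the copy through file streams and position the target stream at its end before copying. A minimal sketch, assuming both files already exist and contain character data; the MoveToEnd() call is what makes the stream append instead of replacing the file:

ClassMethod AppendFile(src As %String, dst As %String) As %Status
{
    set in = ##class(%Stream.FileCharacter).%New()
    set sc = in.LinkToFile(src)
    quit:$$$ISERR(sc) sc
    set out = ##class(%Stream.FileCharacter).%New()
    set sc = out.LinkToFile(dst)
    quit:$$$ISERR(sc) sc
    do out.MoveToEnd()           // keep the existing contents and write after them
    set sc = out.CopyFrom(in)    // copy the whole source stream
    quit:$$$ISERR(sc) sc
    quit out.%Save()
}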
I have noticed an issue in some new code I wrote for one of our productions. I think it is leading to two problems.
I have a file; let's call it 1111111111_2300.pdf.
We make the filename unique to avoid an issue. In the working directory this filename gets the session ID added to it before the .pdf extension; so, for session ID 9, it would be 1111111111_2300#SID9.pdf (a sketch of this renaming follows this passage).
Further operations are performed on this document before it is sent. For various errors, an email is sent back to the service users.
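For reference, the renaming itself is just inserting "#SID" plus the session ID in front of the extension; a small sketch, where the session ID value is a placeholder for whatever the production supplies:

 set name = "1111111111_2300.pdf"
 set sid = 9                                   // placeholder; comes from the actual session
 set pieces = $length(name, ".")               // number of "."-delimited pieces
 set unique = $piece(name, ".", 1, pieces-1)_"#SID"_sid_"."_$piece(name, ".", pieces)
 write unique                                  // 1111111111_2300#SID9.pdf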
I created a CSP page that captures image data from an HTML canvas element. I added a button element that saves the contents as a JPEG data URL string and makes a server call.
The problem is that the string is too long.
I would appreciate any suggestions on saving the data on the server; currently it just goes into a global.
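One way around the length limit is to avoid holding the whole data URL in a single string or global node and stream it instead: have the browser send the base64 payload in chunks (each chunk a multiple of 4 characters so it decodes on its own) and append each chunk to a %Stream.GlobalBinary, which has no practical size limit. A minimal sketch of the server side, with the class, property, and method names as assumptions:

Class Demo.CanvasImage Extends %Persistent
{
Property Image As %Stream.GlobalBinary;

/// Append one base64 chunk sent by the browser. Pass id="" for the first chunk;
/// the method returns the object id to use for the remaining chunks.
ClassMethod AppendChunk(id As %String, chunk As %String) As %String
{
    set obj = $select(id="":..%New(),1:..%OpenId(id))
    do obj.Image.Write($system.Encryption.Base64Decode(chunk))
    do obj.%Save()
    quit obj.%Id()
}
}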
During some consulting activity, I found a CACHEAUDIT database of more than 100 GB at a client's site. The reason was simple: several processes produced a great number of %System/%System/OSCommand audit records due to frequent external calls ($zf(-100,...)). As is well known, those events can easily be disabled system-wide, although that can hardly be considered secure enough. Reducing the number of days before audit cleanup from the default 62 to some more reasonable figure (e.g. 15) seems to be a better solution, but...
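For reference, disabling just that one event programmatically looks roughly like the sketch below; it assumes it is run in the %SYS namespace and that Security.Events exposes the event's Enabled flag through its Properties array:

 new $namespace set $namespace = "%SYS"
 set props("Enabled") = 0
 set sc = ##class(Security.Events).Modify("%System", "%System", "OSCommand", .props)
 if $system.Status.IsError(sc) do $system.Status.DisplayError(sc)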
We have an interface that needs to be disabled and then re-enabled when it starts to queue up. I wrote the following code to do this in a process. It works in our development domain, but in production it fails to disable the job; it only shuts down the interface without updating the production or starting the interface back up. Error message: "Failed to stop job '36831290' within 60 seconds. Status '<unknown>'"
Is there something wrong with how I'm trying to do this?
set tSC = ##class(Ens.Director).EnableConfigItem(itemname,0,0)  // second argument 0 = disable the named config item
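For comparison, here is a minimal disable/re-enable cycle using only the first two arguments (so the method's defaults apply for the rest); the item name, the pause length, and the assumption that the production is running are all placeholders:

 set tSC = ##class(Ens.Director).EnableConfigItem(itemname, 0)    // disable
 if $system.Status.IsError(tSC) { do $system.Status.DisplayError(tSC) quit }
 hang 10                                                          // arbitrary pause before re-enabling
 set tSC = ##class(Ens.Director).EnableConfigItem(itemname, 1)    // re-enable
 if $system.Status.IsError(tSC) { do $system.Status.DisplayError(tSC) }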
I am trying to use Dynamic SQL because I need to supply data at runtime. The generated query returns 0 rows for some reason. If I copy/paste the query into Monitor, it works correctly. I suspect it has something to do with the dates being in the wrong format (I am supplying them in 'YYYY-MM-DD' format). Is that the cause? And if so, how do I supply dates in the correct format?
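That is very likely the cause: dynamic SQL typically expects date values in logical ($HOROLOG) form, so an ODBC-style 'YYYY-MM-DD' value matches nothing. One fix is to convert the parameter explicitly inside the query; a minimal sketch, with the table and column names as assumptions:

 set stmt = ##class(%SQL.Statement).%New()
 set sc = stmt.%Prepare("SELECT ID, VisitDate FROM MyApp.Visit WHERE VisitDate >= TO_DATE(?, 'YYYY-MM-DD')")
 if $system.Status.IsError(sc) { do $system.Status.DisplayError(sc) quit }
 set rs = stmt.%Execute("2021-01-01")
 while rs.%Next() { write rs.%Get("ID"), " ", rs.%Get("VisitDate"), ! }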
I usually save the path in the database, like "C:\folder\picture.png", but now I want to save the photo itself in the IRIS or Caché database. Which way is better for recovering the image while maintaining the original quality?
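Storing the image in a binary stream property keeps the file byte for byte, so quality is not affected; the trade-off compared with storing only the path is database size and backup volume. A minimal sketch, with the class and property names as assumptions:

Class Demo.Photo Extends %Persistent
{
Property Name As %String;
Property Image As %Stream.GlobalBinary;   // the picture itself, stored in the database
}

 // loading an existing file into the stream (paths are only examples)
 set photo = ##class(Demo.Photo).%New()
 set photo.Name = "picture.png"
 set file = ##class(%Stream.FileBinary).%New()
 do file.LinkToFile("C:\folder\picture.png")
 do photo.Image.CopyFrom(file)
 set sc = photo.%Save()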
I am using MDX2JSON to display data; it uses a CSP REST application to retrieve data and uses password authentication. I enabled LDAP authentication for this application, but it does not work.
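One thing worth checking programmatically is whether the /MDX2JSON web application really has the LDAP flag set among its enabled authentication methods, in addition to password. A sketch only, assuming the application path and that the $$$AutheLDAP macro from the %syAuthen include file is available:

#include %syAuthen
 new $namespace set $namespace = "%SYS"
 set sc = ##class(Security.Applications).Get("/MDX2JSON", .props)
 // add LDAP to whatever authentication methods are already enabled
 set props("AutheEnabled") = $zboolean(props("AutheEnabled"), $$$AutheLDAP, 7)   // 7 = bitwise OR
 set sc = ##class(Security.Applications).Modify("/MDX2JSON", .props)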
We use Rose to provide HA with Caché 2016.2. The database sits on a hard disk array, so it is essentially a single instance; on a Rose switchover the same array disk is mounted on the other machine, and another machine is set up as a mirror.
Is there any problem with this architecture for disaster recovery and backup? I would appreciate any advice. Thanks!
I have a DR member, and on this Caché server the database CACHETEMP started getting bigger for no apparent reason (50 GB, which was all the free disk space we had).
On the other mirror members CACHETEMP is fine and its size is 31 MB.
I restarted the server because I read that the CACHETEMP database is purged on restart, but that didn't happen.
Any recommendations for cleaning this database? Can I just delete the CACHE.DAT of this database?
I have a client that no longer wants to use SFTP to transmit their data file to me. Instead, they want me to pick it up via a web service.
Email from client:
Here is the Postman collection and mocking service to start your development. The API has only one URI parameter, {id}, for which you need to pass a UniqueIdentifier (we will let you know the exact value later).
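For picking the file up from Caché/Ensemble, a plain HTTP GET with %Net.HttpRequest is usually enough to get started against the mock. A minimal sketch where the host, path, SSL configuration name, and {id} value are all placeholders until the client supplies the real ones:

 set req = ##class(%Net.HttpRequest).%New()
 set req.Server = "example.com"
 set req.Https = 1
 set req.SSLConfiguration = "DefaultSSL"          // an existing SSL/TLS configuration is assumed
 set id = "REPLACE-WITH-UNIQUEIDENTIFIER"
 set sc = req.Get("/api/files/"_id)
 if $system.Status.IsError(sc) { do $system.Status.DisplayError(sc) quit }
 write req.HttpResponse.StatusCode, !
 set body = req.HttpResponse.Data                 // stream holding the payload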
I work on an ERP system and am trying to set up a dashboard according to a customer request. It's a pivot table with a few controls and filters, nothing too difficult, but I'm having some issues with drill down.
We have an application whose login goes through AWS SSO. We are passing the username and password from AWS SSO to our application to validate the user. What we need is to log in to the application without using the password. Is there any way to do this?
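If the application can trust the identity asserted by AWS SSO, the usual Caché mechanism for password-less logins is delegated authentication: enable Delegated authentication for the service/web application and supply a ZAUTHENTICATE routine that validates the externally supplied identity itself. A minimal sketch, with the actual token/assertion check and the role name left as assumptions:

#include %occErrors
ZAUTHENTICATE(ServiceName, Namespace, Username, Password, Credentials, Properties) PUBLIC {
    // instead of checking a password, validate whatever AWS SSO supplied
    // (for example a signed token passed in place of the password) - placeholder logic
    if Username = "" quit $SYSTEM.Status.Error($$$AccessDenied)
    set Properties("FullName") = Username
    set Properties("Roles") = "MyAppRole"          // hypothetical role to grant
    quit $SYSTEM.Status.OK()
}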
I have a terminal script that queries SYS.Process and then parses the results, writing them to a log file. As part of processing the results, each row of the result set should include an element from the static string, but I am not clear on the scoping of that static string. The write statement sees the static string as undefined. What I want is something to this effect:
I have a workstation with a Caché instance up and running.
On that same workstation there is also an instance of IRIS (fresh install). I would like to manually migrate the Caché database to IRIS (ideally all globals, routines, and classes).
What I tried is to copy C:\InterSystems\Cache\mgr\CACHE.DAT to C:\InterSystems\IRIS\mgr\IRIS.DAT (after shutting down both instances), but it does not work.
I got the following message: (112) The service for the IRIS instance did not start.
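Note that C:\InterSystems\Cache\mgr\CACHE.DAT is the system database (CACHESYS), and overwriting C:\InterSystems\IRIS\mgr\IRIS.DAT replaces IRISSYS, which is why the IRIS service no longer starts; application data belongs in its own non-system database/namespace. For a manual migration, exporting from Caché and importing into IRIS is the safer route. A sketch only, assuming the application lives in the USER namespace, that c:\temp exists on both sides, and that the %Library.Global Export/Import calls accept (namespace, global list, file) as shown:

 // Caché side: export classes and routines to XML
 do $system.OBJ.Export("*.cls,*.mac", "c:\temp\code.xml")
 // Caché side: export globals (the interactive ^%GOF utility is an alternative)
 do ##class(%Library.Global).Export("USER", "*.GBL", "c:\temp\globals.gof")

 // IRIS side: load the code and the globals into the target namespace
 do $system.OBJ.Load("c:\temp\code.xml", "ck")
 do ##class(%Library.Global).Import("USER", "*.GBL", "c:\temp\globals.gof")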
Our team is looking for a way to export all of our Caché SQL tables into Microsoft SQL Server. I have only found a method to export one table at a time into an ASCII file. We have over 170 tables, so this would be very tedious and time-consuming. Is there a way to export directly from Caché to SQL Server? Alternatively, is it possible to export the entire database in a single shot, or even multiple tables to text files?
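A more direct route is to let SQL Server pull the data over the Caché ODBC driver (for example via a linked server or its import tools). If a file hand-off is acceptable, looping over the SQL catalog and writing one delimited file per table avoids doing all 170 tables by hand. A sketch only: the schema name and output folder are assumptions, it presumes a Caché version that exposes INFORMATION_SCHEMA, and the CSV quoting is deliberately naive (no escaping of delimiters inside values):

ClassMethod ExportSchemaToCsv(schema As %String = "SQLUser", dir As %String = "c:\temp\export\") As %Status
{
    set tables = ##class(%SQL.Statement).%ExecDirect(, "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_SCHEMA = ?", schema)
    while tables.%Next() {
        set tbl = tables.%Get("TABLE_SCHEMA")_"."_tables.%Get("TABLE_NAME")
        set rows = ##class(%SQL.Statement).%ExecDirect(, "SELECT * FROM "_tbl)
        set cols = rows.%GetMetadata().columns.Count()
        set file = ##class(%Stream.FileCharacter).%New()
        do file.LinkToFile(dir_$translate(tbl, ".", "_")_".csv")
        while rows.%Next() {
            set line = ""
            for i=1:1:cols { set line = line_$select(i=1:"",1:",")_rows.%GetData(i) }
            do file.WriteLine(line)
        }
        do file.%Save()
    }
    quit $$$OK
}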