I'm dealing with a situation that calls into question my understanding of CACHE.WIJ (C:\Roche\CobasInfinity\HealthShare\mgr). Journal files store records that have already been written to the database, while CACHE.WIJ holds records not yet written to it. In theory, the data contained in CACHE.WIJ is temporary: it stays there only until the record is written to the database (which in turn generates journal entries).
We have several clients in different productions, all accessing a web service. We are trying to add another client in a new production, and it's not working. The messages between the relevant Process and the SOAP web-client Operation are the same in the working and non-working productions, but the web service is reporting an XML parsing error to the non-working production. Here's the error as reported in the SOAP web-client Operation...
AWS requires REST clients to call its APIs using Signature Version 4, which, in case you don't know what I am talking about, is a pain in the neck. Here comes the question:
Has anybody, by any chance, implemented the v4 signing algorithm in COS? If so, would he or she be kind enough to share it?
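For reference, here is a minimal sketch of the key-derivation and signing step in COS, assuming $SYSTEM.Encryption.HMACSHA(bits, data, key) is available on your version (it is on recent Caché/IRIS releases); building the canonical request and the string to sign is omitted, and Demo.AWSSigV4 is just a placeholder class name:

Class Demo.AWSSigV4 [ Abstract ]
{

/// Convert a binary digest to a lowercase hex string.
ClassMethod Hex(bin As %Binary) As %String
{
    Set hex = ""
    For i = 1:1:$Length(bin) {
        Set byte = $ZHex($Ascii(bin, i))
        If $Length(byte) = 1 Set byte = "0"_byte
        Set hex = hex _ byte
    }
    Quit $ZConvert(hex, "L")
}

/// Derive the SigV4 signing key by chaining HMAC-SHA256, then sign stringToSign.
/// date is YYYYMMDD; region/service are e.g. "us-east-1" / "s3".
ClassMethod Signature(secret As %String, date As %String, region As %String, service As %String, stringToSign As %String) As %String
{
    Set kDate    = $SYSTEM.Encryption.HMACSHA(256, date,           "AWS4"_secret)
    Set kRegion  = $SYSTEM.Encryption.HMACSHA(256, region,         kDate)
    Set kService = $SYSTEM.Encryption.HMACSHA(256, service,        kRegion)
    Set kSigning = $SYSTEM.Encryption.HMACSHA(256, "aws4_request", kService)
    Quit ..Hex($SYSTEM.Encryption.HMACSHA(256, stringToSign, kSigning))
}

}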
Is it possible to have a single MUMPS variable representing all subscripts for a global? For example, if I have an ^ABC global and the ^ABC(1,2) node exists, is it possible to have a variable TEMP so that ^ABC(TEMP) would represent ^ABC(1,2)? TEMP="1,2" obviously does not work, since it is interpreted as a single subscript, not two subscripts.
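For reference, the usual way to do this in M/COS is with indirection rather than a literal ^ABC(TEMP) reference; a minimal sketch:

    // Keep the subscript list in a variable and build the reference with indirection
    Set ^ABC(1,2) = "hello"
    Set TEMP = "1,2"
    Write @("^ABC("_TEMP_")"), !    // prints "hello"

    // Or capture the complete reference with $NAME and reuse it
    Set REF = $Name(^ABC(1,2))
    Write @REF, !                   // prints "hello"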
We are retiring a hosted application for an electronic health care records (EHR) system which stored its data on Cache for UNIX (Red Hat Enterprise Linux for x86-64) 2017.2.2 (Build 867_4_20245) Thu Oct 8 2020 16:58:40 EDT. The hosting company is providing me with a single CBK file. I need to install a database system to restore the database and provide occasional SQL access for reports when necessary. I'll need to maintain access to the data for an approximately 10 year retention period. I'm not sure how to approach restoring a database this old and eventually upgrading it to a newer release.
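For context, my understanding is that a .cbk produced by Caché online backup is restored with the interactive ^DBREST utility from the %SYS namespace of a Caché instance of the same or a later version; roughly like this (paths and prompts vary by version):

    // From a Caché terminal on the target instance
    ZNspace "%SYS"      // switch to the system namespace
    Do ^DBREST
    // ^DBREST then prompts for the backup file (the provided .cbk)
    // and for the directories to restore each database into.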
I need to ensure that the task created/scheduled by the system user is created in the routine database and not remotely on the ECP server it is connected to. How can I guarantee the creation/scheduling of this task?
Here is a suggestion for creating the routine in both environments:
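A rough sketch of creating a scheduled task through the %SYS.Task API (Demo.Tasks.MyTask is a hypothetical %SYS.Task.Definition subclass, and the property names are worth double-checking against your version's class reference):

    Set task = ##class(%SYS.Task).%New()
    Set task.Name = "MyLocalTask"              // display name in the Task Manager
    Set task.NameSpace = "USER"                // namespace the task runs in
    Set task.TaskClass = "Demo.Tasks.MyTask"   // hypothetical %SYS.Task.Definition subclass
    Set task.TimePeriod = 0                    // 0 = daily
    Set task.DailyFrequency = 0                // 0 = once per day
    Set sc = task.%Save()
    If 'sc Write $SYSTEM.Status.GetErrorText(sc), !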
I'm trying to configure a specific process which dynamically sends messages to different endpoints based on data lookup keys; I have that part configured. What I'd like is to be able to see these connections visually without hardcoding them, so is there a way to link them dynamically? I'll share what I tried below.
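One hedged option, not confirmed for this exact setup: Ens.Host exposes OnGetConnections(), which the production diagram uses to draw connection lines, so the process class can report the targets read from the lookup table. A sketch, assuming the targets are the values of a lookup table named "RoutingTargets" (placeholder name):

    /// Report dynamic targets so the production diagram draws the connections.
    ClassMethod OnGetConnections(Output pArray As %String, pItem As Ens.Config.Item)
    {
        // Keep whatever connections the superclass already reports
        Do ##super(.pArray, pItem)
        // Add one connection per distinct target stored in the lookup table
        Set key = ""
        For {
            Set key = $Order(^Ens.LookupTable("RoutingTargets", key), 1, target)
            Quit:key=""
            Set:target'="" pArray(target) = ""
        }
    }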
I am dealing with a very old code base (some routines date back to 1985 and were running on an M system 😉). It is rather huge and currently contains around 5000 compilable *.int routines.
My goal is to export all routine code as individual *.int files in UDL format and set up a Git repository containing all the routines.
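A sketch of the export loop, assuming $SYSTEM.OBJ.ExportUDL() exists on the instance (Caché 2016.2+ / IRIS; older versions need a different route) and using C:\export\ as a placeholder output directory:

    Set stmt = ##class(%SQL.Statement).%New()
    Set sc = stmt.%PrepareClassQuery("%Library.RoutineMgr", "StudioOpenDialog")
    Set rs = stmt.%Execute("*.int", , , , 1)       // flat list of every .int routine
    While rs.%Next() {
        Set name = rs.%Get("Name")                 // e.g. "MYROUTINE.int"
        Set sc = $SYSTEM.OBJ.ExportUDL(name, "C:\export\"_name, "-d")
        If 'sc Write "Failed: ", name, " ", $SYSTEM.Status.GetErrorText(sc), !
    }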
I hope you are all doing well. I am currently facing an issue while trying to set up the SNMP subagent functionality for my InterSystems Cache installation.
I am using InterSystems Cache for Windows (AMD64) version 5.2.4 (Build 809_0_9006U). The SNMP subagent functionality requires the iscsnmp.dll dynamic library, which I have been unable to locate in my installation directory.
Currently, SQL privileges (SELECT, INSERT, UPDATE, DELETE) are managed at the table level, which can be very tedious when you have to administer many roles in an organization and keep them in sync with a constantly evolving data model. Managing privileges at the schema level would make it possible to grant SELECT and other DML privileges on *all* or *several* schemas to a role or user, removing the need to manually synchronize new tables and views with the roles.
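For illustration, this is the kind of per-table loop the table-level model forces today (table and role names are placeholders); the request is to replace it with a single schema-level grant:

    // Today: one explicit GRANT per table or view, repeated whenever the model grows
    For table = "MyApp.Patient", "MyApp.Encounter", "MyApp.Result" {
        Set rs = ##class(%SQL.Statement).%ExecDirect(, "GRANT SELECT ON "_table_" TO ReportingRole")
        If rs.%SQLCODE < 0 Write "GRANT failed for ", table, ": ", rs.%Message, !
    }
    // Proposed: a single schema-level statement, e.g.
    //   GRANT SELECT ON SCHEMA MyApp TO ReportingRole   (illustrative syntax only)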
We currently have a CSP application that runs on 2 servers (usually the primary), and every month the servers reboot for patching: SERVER1 (primary) in the morning and SERVER2 (backup) at night.
Whenever SERVER1 reboots, SERVER2 takes over as primary, and when SERVER1 comes back up it acts as the backup server.
First Patching:
So, when SERVER1 is down, I need to start the httpd service on SERVER2 and stop the httpd service on SERVER1 (which is now the backup server).
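One option worth checking (assuming the two servers are a Caché mirror): the user-written ^ZMIRROR routine in %SYS is called on mirror state changes, so its NotifyBecomePrimary entry point can start httpd locally. A sketch, where the systemctl command and the use of $ZF(-1) are assumptions for a Linux host:

ZMIRROR ; user-defined mirror event hooks (routine lives in %SYS)
NotifyBecomePrimary() PUBLIC {
    // Called when this instance becomes the mirror primary:
    // bring up the local web server so it starts taking traffic.
    // ($ZF(-1) shells out; on IRIS use $ZF(-100) instead.)
    Do $ZF(-1, "systemctl start httpd")
    Quit
}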
Hi all, I am trying to execute a query like the code below:
set statement = ##class(%ResultSet).%New("some_class:query_method")  // the query method body is empty, with a ROWSPEC declaring some column names
do statement.Execute(param1)
I want to fetch the data type of each column value returned from the above, e.g. Name - VARCHAR, Amount - INTEGER, etc. How can I get it? Or, if that is not possible directly, is there another way to validate or get the data type of the returned values, like type() in Python 3?
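If the query's ROWSPEC declares the column types, %ResultSet exposes them through its column metadata; a minimal sketch (the numeric codes returned by GetColumnType() map to types such as VARCHAR and INTEGER and are listed in the %Library.ResultSet class reference):

    Set statement = ##class(%ResultSet).%New("some_class:query_method")
    Do statement.Execute(param1)
    For i = 1:1:statement.GetColumnCount() {
        Write statement.GetColumnName(i), " -> type code ", statement.GetColumnType(i), !
    }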
The program below works perfectly when I call it directly from the Terminal; however, when I call it from within a CSP page it does not work (it does not execute the SELECT).
In the USER namespace the program works both in the Terminal and in the CSP page, but in another namespace it only works when called directly from the Terminal.
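For context, a small diagnostic that can be added to the routine to compare the two environments, since a CSP request usually runs under a different security context than the Terminal session:

    // Log who is running the code and where; CSP requests often carry the web
    // application's user/roles rather than the Terminal user's.
    Write "User: ", $Username, !
    Write "Roles: ", $Roles, !
    Write "Namespace: ", $Namespace, !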
Testing Ghostscript version 9.52 (the ps2pdf command) from Cache Studio, and it takes about 1 minute to complete. The size of the EQ110823BG1001A.ps file is about 11 MB.
Running the same command from the Ensemble command line completes in about 1 second.
I'm not sure why it takes so much longer to distill from Cache Studio.
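For reference, a sketch of how the shell-out can be timed from a routine run in Studio, assuming $ZF(-1) is used for the call and with placeholder file paths:

    // Time the external ps2pdf call; $ZF(-1) spawns a shell and waits for completion.
    Set start = $ZHorolog
    Set rc = $ZF(-1, "ps2pdf /tmp/EQ110823BG1001A.ps /tmp/EQ110823BG1001A.pdf")
    Write "ps2pdf exit code: ", rc, ", elapsed: ", $ZHorolog - start, " seconds", !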
I am getting the error below while connecting to Caché from VS Code:
Authorization error: Check your credentials in Settings, and that you have sufficient privileges on the /api/atelier web application on 127.0.0.1:52773[USER]
For login in a CSP application, I am displaying a custom login page rendered from the subclass CSS.CSP.Login, which extends %CSP.Login; I also have IBA.CSP.Page, which extends %CSP.Page with an overridden OnPreHTTP() method. This setup is working perfectly for normal login.
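A simplified sketch of that kind of setup (the method body is illustrative only, not the actual application code):

Class IBA.CSP.Page Extends %CSP.Page
{

/// Runs before any HTTP output is written; return 0 to abort page processing.
ClassMethod OnPreHTTP() As %Boolean
{
    // Illustrative only: e.g. decide here whether to redirect to the
    // custom login page (CSS.CSP.Login) before the page renders.
    Quit 1
}

}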
Hello everybody. I have a slightly strange problem. There is a task in Caché that by default runs every day at 4:00:00 and, with my settings, should delete all audit logs older than 70 days. The problem is that every day this task finishes without an error message (status "Success"), yet no data is purged; the same happens if I run this particular task from the "Task Schedule" screen. I'll put a screenshot here of the message shown after executing the task from the "Task Schedule":