Yaron Munz · Jan 28
Hello Caio,
There is no DECLARE @variable (SQL Server) or DECLARE variable (Oracle) in Caché, but there are a few options:
1. Use host variables in embedded SQL (note that an embedded SELECT needs an INTO clause to receive the value):
 SET variable = 2000
 &SQL(SELECT Column INTO :result FROM Table WHERE ID = :variable)
2. Do the same with dynamic SQL (keep the result set that %ExecDirect returns):
 SET variable = 2000
 SET sql = "SELECT Column FROM Table WHERE ID = " _ variable
 SET rs = ##class(%SQL.Statement).%ExecDirect(, sql)
3. Write a stored procedure:
 CREATE PROCEDURE Procedure(IN variable INT)
 AS
 BEGIN
 SELECT Column FROM Table WHERE ID = variable;
 END
Yaron Munz · Jan 28
Hi Frank,
There is no out-of-the-box tool that combines ^%SS with an Ensemble component dashboard to measure performance (GREFs, memory, CPU, etc.). It may be worth checking whether someone has already developed something similar on Open Exchange.
Yaron Munz · Jan 27
Hello Frank,
You can correlate the pid at the O/S level with the pid of a Caché process (in the Management Portal). There you can see which namespace that pid is running in. You can also get more information about that process: the last line of source code executed, the last global reference, and any locks held, which might give you a good clue about what the component is doing. If you are using Ensemble, you can match the same pid to an Ensemble job.
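The same correlation can be done programmatically via SQL against the system class %SYS.ProcessQuery. A minimal sketch (column names taken from that class; verify them against your version's class reference, and note the pid value "12345" is just a placeholder):

```objectscript
// Find the Caché/IRIS process matching an O/S pid seen in top / Task Manager
SET stmt = ##class(%SQL.Statement).%New()
SET sc = stmt.%Prepare("SELECT Pid, NameSpace, Routine, State FROM %SYS.ProcessQuery WHERE Pid = ?")
SET rs = stmt.%Execute("12345")
WHILE rs.%Next() {
    WRITE rs.Pid, " ", rs.NameSpace, " ", rs.Routine, " ", rs.State, !
}
```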
Yaron Munz · Jan 17
Hello Gabriel,
It seems that updates to the other database (PostgreSQL) need to be "close to real-time," though a slight delay is acceptable, and what matters most to you is ensuring stability and preventing any loss of updates. I would consider the following:
1. Use the "SQL Gateway Connection" capability to connect the remote tables directly to Caché. The benefit is that all logic stays on the Caché side (a remote REST API would also need remote logic to report the status of the operation when a local update fails).
2. Loosely couple the local updates (Caché) with the remote updates:
a. Create a "staging area" (which could be a global or a class/table) to hold all updates destined for the remote table. Entries are written by a trigger, so updating the local object/table in Caché is never delayed by updates to the remote database. The staging area deletes its entries only on a successful remote update (on failure they are kept), so you might want a monitoring process to alert when the staging area is not draining (e.g. the remote DB is down, or there are network issues).
b. Use a separate (dependent) component to handle the updates. If you have interoperability (Ensemble), this might be easier to implement. However, it's not mandatory; you could also use a background job (or a task) that periodically scans the staging area and performs the updates.
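A minimal sketch of the trigger in step 2a. The class name MyApp.Customer, the property, and the global ^StagingArea are all invented for illustration; check the trigger-keyword and field-reference syntax against your version's documentation:

```objectscript
/// Hypothetical persistent class; the trigger queues every insert/update
/// into a staging global that a background job later drains.
Class MyApp.Customer Extends %Persistent
{
Property Name As %String;

Trigger QueueRemoteUpdate [ Event = INSERT/UPDATE, Time = AFTER, Foreach = row/object ]
{
    // {%%ID} is the row-id pseudo-field, {Name} the new field value;
    // $INCREMENT gives an ordered key so the drain job processes in order
    SET seq = $INCREMENT(^StagingArea)
    SET ^StagingArea(seq) = $LISTBUILD({%%ID}, {Name})
}
}
```

A background task then walks ^StagingArea with $ORDER, pushes each entry to PostgreSQL, and kills the node only when the remote update succeeds.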
Yaron Munz · Jan 15
According to the documentation for Embedded SQL: "A host variable cannot be used to specify an SQL identifier, such as a schema name, table name, field name, or cursor name. A host variable cannot be used to specify an SQL keyword." (Using Embedded SQL | Using InterSystems SQL | InterSystems IRIS Data Platform 2023.1)
You will need to use dynamic SQL.
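A sketch of the dynamic-SQL alternative: the identifier (table name) is concatenated into the query text, since identifiers cannot be parameterized, while the value is passed as a ? parameter. Table and column names are placeholders, and tableName is assumed to come from trusted code, not user input:

```objectscript
// Build the identifier into the SQL text; pass values as ? parameters
SET tableName = "MyApp.Person"
SET sql = "SELECT Name FROM " _ tableName _ " WHERE ID = ?"
SET rs = ##class(%SQL.Statement).%ExecDirect(, sql, 2000)
IF rs.%SQLCODE < 0 { WRITE "error: ", rs.%Message, ! }
WHILE rs.%Next() { WRITE rs.Name, ! }
```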
Yaron Munz · Jan 14
To be able to handle an endStr of any length, you should have:
 S value = $E(text, start, end - $L(endStr) - 1)
Yaron Munz · Jan 8
Yes, SMTPTRACE is the flag that enables SMTP logging (a pity they did not expose it as an external parameter). The database you need to mount as RW is IRISLIB.
Yaron Munz · Jan 7
I would try to increase the ODBC timeout on the client side (PDA). If that does not help, opening a WRC issue might be worthwhile, since they can help you analyze your IRIS instance's load and configuration (high CPU, cache buffers, memory) and make the instance more robust against those errors.
Yaron Munz · Jan 7
VS Code will not delete the class on the server. You need either to do that from Studio, or run D $System.OBJ.Delete(classname) in a terminal.
Yaron Munz · Dec 16, 2024
Hi @Ashok Kumar, you are correct, I was mistaken. The correct item is "%System/%Login/Terminate" (Auditing | InterSystems IRIS Data Platform 2024.3).
Yaron Munz · Dec 16, 2024
Hello, if you have auditing enabled, and the item "%System/%Login/JobEnd" is enabled, you can find out which user killed a process.
Yaron Munz · Dec 13, 2024
Hello, you may use an existing library for SSO with a Microsoft or Google account: the "Microsoft Authentication Library (MSAL)" (Overview of the Microsoft Authentication Library (MSAL) - Microsoft identity platform | Microsoft Learn). For CSP pages, we used the JavaScript library MSAL.js.
Yaron Munz · Oct 29, 2024
Does your class have a relationship property to another class? In that case you might want to consider using the "CompileAfter" or "DependsOn" keywords (these help the compiler determine the correct order when compiling).
Yaron Munz · Oct 28, 2024
Maybe there are dependencies for that class (e.g. the class depends on another class or classes that need to be compiled first)? Try adding the "r" (recursive) flag, so your flags look like "ckr".
Yaron Munz · Oct 28, 2024
You have two other options:
1. With SQL against the %SYS.Task class/table: DELETE FROM %SYS.Task WHERE ID = taskID
2. SET sc = ##class(%SYS.Task).%DeleteId(taskID)
Yaron Munz · Oct 28, 2024
You are correct, but what I suggested is also a way to check that interoperability is running on that specific server.
Yaron Munz · Oct 28, 2024
The SMP "About" page has an option to choose the language. However, this persists for the current session only (in the %session object). I would try the solution proposed by @Raj Singh: use a browser add-on that can modify HTTP headers (e.g. the HTTP_ACCEPT_LANGUAGE CGI variable). InterSystems could consider adding a user-defined language, but not on the user profile, since non-local users (e.g. LDAP) are not persistent; a global like ^ISC.someName(user) = Language could be the "best" way. We don't want to (or can't) modify classes for the portal (some of them don't have sources). This is a good candidate for InterSystems Ideas.
Yaron Munz · Oct 28, 2024
If those 3000 classes are divided into different packages, I would try to do the load in "segments" (using the work queue manager to run them in parallel). This might speed things up a bit.
Yaron Munz · Oct 17, 2024
You can't do that directly in a BPL, since %DynamicArray doesn't have persistence methods. You may convert your %DynamicArray into an ObjectScript array, or serialize your data into JSON that can be passed to the BPL as a string.
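A sketch of the serialization route, using the documented %ToJSON()/%FromJSON() methods of %DynamicArray (the array contents here are just sample data):

```objectscript
// Serialize a %DynamicArray to a string before handing it to the BPL...
SET array = ["a", "b", "c"]   // %DynamicArray literal
SET json = array.%ToJSON()

// ...and rebuild the array on the other side
SET copy = ##class(%DynamicArray).%FromJSON(json)
WRITE copy.%Get(1)            // indexes are 0-based, so this writes "b"
```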
Yaron Munz · Oct 17, 2024
The CSP gateway has a "mirror aware" option that will always point you to the primary in a failover pair. This works most of the time, but in rare cases it keeps a connection disabled after a primary switch. Another option is to use an external load balancer that has some kind of "health probe". You could then expose a simple REST API call (invoked by that health probe) that returns 200 on the primary and 404 (or 500) on the backup. That way, going through the LB will always point you to the primary.
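A minimal sketch of such a probe endpoint as a %CSP.REST subclass, using $SYSTEM.Mirror.IsPrimary() to detect the member's role. The class name and route are invented; verify the mirror API on your version:

```objectscript
/// Hypothetical REST dispatch class for a load-balancer health probe
Class MyApp.MirrorProbe Extends %CSP.REST
{
XData UrlMap
{
<Routes>
  <Route Url="/probe" Method="GET" Call="Probe"/>
</Routes>
}

/// Returns 200 on the primary and 404 on the backup, so the LB
/// only routes traffic to the member that is currently primary
ClassMethod Probe() As %Status
{
    IF $SYSTEM.Mirror.IsPrimary() {
        SET %response.Status = ..#HTTP200OK
        WRITE "primary"
    } ELSE {
        SET %response.Status = ..#HTTP404NOTFOUND
        WRITE "not primary"
    }
    QUIT $$$OK
}
}
```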