go to post Yaron Munz · Aug 22, 2022 First create a class:

Class MyClass Extends %SYS.Monitor.SAM.Abstract

Add a parameter that will indicate the prefix name for all your user-defined metrics:

Parameter PRODUCT = "Prefix";

Create a wrapper method GetSensors() for all your user-defined sensors (which can be ClassMethods):

Method GetSensors() As %Status
{
    Try {
        Do ..SetSensor("sensor1", ..Sensor1())
        Do ..SetSensor("sensor2", ..Sensor2())
    } Catch e {
        ; call your error-logging function, e.g. ##class(anyClass).StoreError($classname(), e.DisplayString())
    }
    Quit $$$OK
}

ClassMethod Sensor1() As %Integer
{
    ; do any calculation
    Quit value
}

ClassMethod Sensor2() As %Integer
{
    ; do any calculation
    Quit value
}

Now the REST API call to /api/metrics will return your "user defined" sensors under the names Prefix_sensor1 and Prefix_sensor2.

Remarks:
- Make sure that GetSensors() and all your "user defined" sensors (ClassMethods) have proper error handling so they are fail-safe (you may use Try/Catch or any other error trap such as $ZT="something").
- Make sure all your "user defined" sensors perform fast. This will enable the SAM metrics REST API call to get the data quickly, without delays. If some calculations are "heavy", it is better to have a separate process (via the Task Manager) do those calculations and store the results in a global for fast retrieval by the sensor.
go to post Yaron Munz · Aug 19, 2022 When you install SAM, it is usually installed as a container. (We use Docker, so I don't have experience with Podman.) We run it on a separate machine (Linux) while our IRIS servers are Windows, but I don't see any limitation (except memory & CPU, i.e. performance) to running the SAM container on the same machine as IRIS. Grafana and Prometheus are part of the SAM "bundle" (container), so you do not need to install them separately.
go to post Yaron Munz · Aug 17, 2022 Hello, I have done a similar thing in the past with Cache. As long as IRIS ends up on a new partition with the same drive letter, all the registry keys created during the install will still be valid. The risk in the procedure you mentioned is minimal, and everything should work as expected.
go to post Yaron Munz · Aug 15, 2022 Get the mirror name, then check the member status list:

ZN "%SYS"
Set mirrorName = $SYSTEM.Mirror.GetMirrorNames()
Set result = ##class(%ResultSet).%New("SYS.Mirror:MemberStatusList")
Set sc = result.Execute(mirrorName)
While result.Next() {
    Set transfer = result.GetData(6)
    ; you may filter the check for a specific machine on GetData(1)
    ; do any check on "transfer" to see if it is behind and calculate the threshold time, e.g.:
    ; For i=1:1:$L(transfer," ") {
    ;     If $F($P(transfer," ",i),"hour") { Write !,"hour(s) behind" }
    ;     ElseIf $F($P(transfer," ",i),"minute") { Set minutes=$P(transfer," ",i-1) Write !,minutes_" minutes behind" }
    ; }
}
go to post Yaron Munz · Aug 11, 2022 To get any component's status, you may use:

SELECT Name, Enabled FROM Ens_Config.Item WHERE Name ['Yourname'

To check queues you may use the following SQL:

SELECT Name, PoolSize FROM Ens_Config.Item WHERE Production = 'YourProductionName'

Then iterate over the result set and get each queue depth with:

Set QueueCount = ##class(Ens.Queue).GetCount(Name)

To check the latest activity on a component, I would query:

SELECT * FROM Ens.MessageHeader WHERE TargetQueueName = 'yourComponentName'

and then check the TimeProcessed column.
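The query and the GetCount() call above can be combined into one short loop; a sketch, where "YourProductionName" is a placeholder:

```objectscript
// List the queue depth of every item in a production (production name is a placeholder)
Set rs = ##class(%SQL.Statement).%ExecDirect(,
    "SELECT Name FROM Ens_Config.Item WHERE Production = ?", "YourProductionName")
While rs.%Next() {
    Write !, rs.Name, " queue depth: ", ##class(Ens.Queue).GetCount(rs.Name)
}
```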
go to post Yaron Munz · Aug 9, 2022 There is an option to get a long-lived token (refresh token); if tokens are "cached" locally, you may have a scheduled task to refresh them. Another approach I would try here is to use Embedded Python with this library: https://pypi.org/project/O365/#authentication
go to post Yaron Munz · Aug 9, 2022 On IRIS 2022.1 the method (%Net.POP3).Connect(...) has a 4th parameter: AccessToken. I did not try it, but maybe it will allow a connection with OAuth.
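A minimal, untested sketch of that call, assuming you have already obtained an OAuth access token (the server name and the accessToken variable are placeholders):

```objectscript
// Untested: the 4th argument is the AccessToken parameter mentioned above
Set pop3 = ##class(%Net.POP3).%New()
Set sc = pop3.Connect("outlook.office365.com", "user@example.com", "", accessToken)
```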
go to post Yaron Munz · Aug 5, 2022 You may do it programmatically:

Set mirrorName = $LG(##class(%SYSTEM.Mirror).GetMirrorNames(), 1)
ZN "%SYS"
Set result = ##class(%ResultSet).%New("SYS.Mirror:MemberStatusList")
Set sc = result.Execute(mirrorName)
While result.Next() {
    Set status = result.GetData(9)
    ; get all other info you need from GetData(nnn)
}
go to post Yaron Munz · Aug 4, 2022 You have to distinguish between "journal" and "mirror-journal" files. The first ensure any instance's DB integrity (against DB corruption) in case of a failure. The second ensure proper mirror failover and feed any async members. When LIVETC01 (as backup) is caught up, it is a good source for copying .DAT files to LIVEDR. It is also safe to delete its mirror-journals. The steps you did to catch up LIVEDR are correct. (I assume you did "activate" & "catch up" after that on LIVEDR.) After the IRIS.DAT copy (of all mirrored DBs) from LIVETC01 to LIVEDR, with both caught up, it is safe to delete mirror-journals up to the point of the copy from your primary LIVETC02.
go to post Yaron Munz · Aug 3, 2022 Hello, I would try to contact the author of the repository you mentioned to find out which version of Cache he used and get some help. It looks like the repository was updated on Apr 10, 2022, which might indicate that he is active. I saw that he is using a Unix driver "libcacheodbcur6435.so", but I'm not sure which version that is.
go to post Yaron Munz · Jul 14, 2022 Hello, The best way to do it is to use the dictionary to loop over the properties of the original class and create a new class which is identical but has a different storage. The cloning is done with %ConstructClone. Usually the new backup class does not need methods, indices, or triggers, so those can be "cleaned" before saving it. Get the original and destination class objects:

Set OrigCls = ##class(%Dictionary.CompiledClass).%OpenId(Class)
Set DestCls = OrigCls.%ConstructClone(1)

You should give the destination class a name and type:

Set DestCls.Name = "BCK."_Class, DestCls.Super = "%Persistent"

Usually the destination class needs nothing but the properties, so if there are methods, triggers, or indices to remove from it, you may do (iterating backwards, so RemoveAt does not shift the items still to be visited):

For i = DestCls.Methods.Count():-1:1 Do DestCls.Methods.RemoveAt(i)   ; clear methods/classmethods
For i = DestCls.Triggers.Count():-1:1 Do DestCls.Triggers.RemoveAt(i) ; clear triggers
For i = DestCls.Indices.Count():-1:1 Do DestCls.Indices.RemoveAt(i)   ; clear indices

Setting the new class storage:

Set StoreGlo = $E(OrigCls.Storages.GetAt(1).DataLocation, 2, *)
Set StoreBCK = "^BCK."_$S($L(StoreGlo)>27: $P(StoreGlo,".",2,*), 1: StoreGlo)
Set DestCls.Storages.GetAt(1).DataLocation = StoreBCK
Set DestCls.Storages.GetAt(1).IdLocation = StoreBCK
Set DestCls.Storages.GetAt(1).IndexLocation = $E(StoreBCK,1,*-1)_"I"
Set DestCls.Storages.GetAt(1).StreamLocation = $E(StoreBCK,1,*-1)_"S"
Set DestCls.Storages.GetAt(1).DefaultData = $P(Class,".",*)_"DefaultData"

Then just save DestCls:

Set sc = DestCls.%Save()
go to post Yaron Munz · Jul 8, 2022 Usually GREF is the total number of global references (per second). A given process can do a limited number of I/O operations per second (this is due to the CPU clock speed). When there are bottlenecks, there are tools that can tell you which part of your system (or code) can be improved. Monitoring with SAM or other tools can give you some numbers to work with; there is also %SYS.MONLBL, which can help you improve your code. Storage is also a consideration: sometimes a DB can be optimized to store data in a more compact way and save I/O (especially in the cloud, where disks are somewhat slower than on premises). One easy improvement is to run the "heavy" parts of your system (e.g. reports, massive data manipulations, etc.) in parallel. This can be done using the built-in Work Queue Manager, or with the %PARALLEL keyword for SQL queries. A more complex way to go is to scale the system vertically or horizontally, or even use sharding.
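As a sketch of the "in parallel" idea, the Work Queue Manager can spread segments of a heavy job across worker processes (the class and method names here are hypothetical):

```objectscript
// Queue four hypothetical report segments and wait for all workers to finish
Set queue = $SYSTEM.WorkMgr.%New()
For segment = 1:1:4 {
    Set sc = queue.Queue("##class(MyApp.Report).BuildSegment", segment)
    If $$$ISERR(sc) Quit
}
Set sc = queue.WaitForComplete()
```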
go to post Yaron Munz · Jul 8, 2022 Hello, We are using this on an async mirror member that we want to keep "behind" by X minutes (a parameter). Here is a sample code:

Set sc = 1
Try {
    If ##class(%SYSTEM.Mirror).GetMemberType() '= "Failover" {
        Set diff = ..GetDiff(debug)
        If diff < minutes {
            If $ZCVT(##class(SYS.Mirror).AsyncDejournalStatus(),"U") = "RUNNING" {
                Do ##class(SYS.Mirror).AsyncDejournalStop()
            }
        } Else {
            If $ZCVT(##class(SYS.Mirror).AsyncDejournalStatus(),"U") '= "RUNNING" {
                Do ##class(SYS.Mirror).AsyncDejournalStart()
            }
        }
    }
} Catch e {
    ; any error trap you want
}
Quit sc
go to post Yaron Munz · Feb 28, 2022 I also suggest that you try %PARALLEL; sometimes it helps. We have found that for some very heavy queries, a good option in some cases is a stored procedure (SP) that takes the query parameters and runs the query in parallel "segments" using the built-in Work Queue Manager.
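For reference, %PARALLEL is a hint placed in the FROM clause; a sketch against a hypothetical MyApp.Orders table:

```objectscript
// %PARALLEL lets the optimizer split the table scan across processes
Set rs = ##class(%SQL.Statement).%ExecDirect(,
    "SELECT Customer, SUM(Amount) FROM MyApp.Orders %PARALLEL GROUP BY Customer")
```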
go to post Yaron Munz · Jul 29, 2021 Usually you do not mix your "own" persistent (class) data with Interoperability (aka Ensemble) request & response messages (which are also persistent). When you purge Ensemble data, the request/response messages are also purged! A best practice is to give each component its own request/response messages, e.g. PROD.BO.ComponentName.Msg.Request & PROD.BO.ComponentName.Msg.Response, where BO = business operation (so you could have BS for business service and BP for business process) and ComponentName is the name of your component. (Sometimes a few components can share the same request & response messages, which is totally fine!)
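Following that naming convention, a request message might look like this (the class name and property are illustrative):

```objectscript
Class PROD.BO.ComponentName.Msg.Request Extends Ens.Request
{
Property PatientId As %String;
}
```

with a matching PROD.BO.ComponentName.Msg.Response extending Ens.Response.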
go to post Yaron Munz · Jul 14, 2021 I agree with Eduard. However, there is an option to have global mapping (by subscript) to different databases that might be located on multiple disks for getting better I/O https://docs.intersystems.com/irislatest/csp/docbook/Doc.View.cls?KEY=GG...
go to post Yaron Munz · Jul 1, 2021 Hello Subramaniyan, If you can have downtime for the DB, then you could write a script that dismounts the DB, FTPs it to another server, and mounts it again. This of course depends on the DB size and your network speed. If downtime is not possible, I would recommend doing a "hot backup", then copying (or FTPing) it to another server and restoring it. Another option is to use an "external backup" with "freeze" & "thaw" to ensure data integrity. Further information: https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls...
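The freeze/thaw flow can be sketched as follows; the snapshot step itself is OS/storage specific and shown only as a comment:

```objectscript
// Pause database writes, take the snapshot, then resume
Set sc = ##class(Backup.General).ExternalFreeze()
If $$$ISOK(sc) {
    ; ... trigger the storage/VM snapshot here ...
    Set sc = ##class(Backup.General).ExternalThaw()
}
```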
go to post Yaron Munz · Jun 2, 2021 Hello, I assume you are looking for something like CDC (change data capture). The basic idea is to programmatically read the journal files record by record and analyze the SET/KILL ones (according to a dictionary you build to determine which globals or classes need the CDC capability). I have done something similar using ^JRNUTIL https://docs.intersystems.com/irislatest/csp/docbook/Doc.View.cls?KEY=GC...
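A rough, untested sketch of that idea using the %SYS.Journal classes (the file path is a placeholder, and the property names should be checked against the class reference for your version):

```objectscript
// Walk one journal file and print its SET/KILL records
Set jrnFile = ##class(%SYS.Journal.File).%OpenId("/path/to/journal/file")
Set rec = jrnFile.FirstRecord
While $IsObject(rec) {
    If (rec.TypeName = "SET") || (rec.TypeName = "KILL") {
        Write !, rec.TypeName, " ", rec.GlobalNode
    }
    Set rec = rec.Next
}
```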
go to post Yaron Munz · Oct 14, 2019 Hello, your method returns %ArrayOfObjects, so you need to create and populate it within your code. Your code should look like the following (the relevant changes are creating the %ArrayOfObjects collection and the books.SetAt call):

set booksRS = ##class(%ResultSet).%New("Library.Book:BooksLoaned")
set rsStatus = booksRS.Execute()
set books = ##class(%ArrayOfObjects).%New()
if rsStatus = $$$OK {
    while booksRS.Next() {
        set book = ##class(Library.Book).%New()
        set book.Title = booksRS.Get("Title")
        set book.Author = booksRS.Get("Author")
        set book.Genre = booksRS.Get("Genre")
        set dbFriend = ##class(Library.Person).%OpenId(booksRS.Get("Friend"))
        set book.Friend = dbFriend
        set sc = books.SetAt(book, $Increment(i))
    }
} else {
    write !, "Error fetching books in GetLoanedBooks()"
}
do booksRS.Close()
return books
go to post Yaron Munz · Aug 27, 2019 Hello, You may find documentation on how to work with streams here: https://docs.intersystems.com/iris20191/csp/docbook/DocBook.UI.Page.cls?KEY=GOBJ_propstream In BPL you can add a "code" element where you create and populate your stream with data, or you can add a "call" element to call a ClassMethod where you implement the code. For example, to create a stream and write some data into it (if you need this stream data available to other components of the BPL, it is best to use a %context property for it):

set stream = ##class(%Stream.GlobalCharacter).%New()
do stream.Write("some text")
do stream.Write(%request.anyproperty)
do stream.Write(%context.anyproperty)

In your request/response messages passed to the BO you will have to add a stream property:

Property MyProp As %Stream.GlobalCharacter;