go to post David Underhill · Nov 7, 2019 Could you tell us the OS and the open command you are using? It might help with suggestions.
go to post David Underhill · Nov 6, 2019 Open devices for a process can be seen in the Processes section of the portal. The initial process list shows the principal or current device, and the individual process page shows all open devices. The same information can also be found using the OpenDevices in %SYS.ProcessQuery, so you could write code that goes through all processes checking each one. What these show may depend on how you opened the device in the first place.
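Something along these lines could be a starting point for going through all processes (untested, and the exact type returned by OpenDevices may vary by version, so adjust the formatting to suit):
 set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT Pid FROM %SYS.ProcessQuery")
 while rs.%Next() {
     set proc = ##class(%SYS.ProcessQuery).%OpenId(rs.Pid)
     continue:'$isobject(proc)
     set devs = proc.OpenDevices
     // show as a readable string if it comes back as a $LIST
     set:$listvalid(devs) devs = $listtostring(devs, ", ")
     write proc.Pid, ": ", devs, !
 }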
go to post David Underhill · Nov 1, 2019 I know Jeff and Raj are busy people, but there never was a response. InterSystems simply need to commit to an answer; at this point the actual answer isn't really important any more, just getting one so that commitments to any change can be made.
go to post David Underhill · Oct 28, 2019 That looks better than the code the WRC supplied when I first noticed the binary rather than hex values being returned.
 set text = "message digest"
 set md5hash = $system.Encryption.MD5Hash(text)
 set md5HashHex = ""
 for i=1:1:$Length(md5hash) {
     set md5HashHex = md5HashHex _ $Translate($Justify($Zhex($Ascii($Extract(md5hash,i))),2)," ","0")
 }
 write md5HashHex, !
(The $Justify/$Translate padding keeps the leading zero on byte values below 16, which plain $Zhex would drop.)
go to post David Underhill · Oct 28, 2019 I am in the same position as you, only selectively using full commands or function names, being too much in the habit of MUMPS abbreviations. Like you, though, I tend to spell things out when iterating or using something less usual in the code, and I have also found curly braces to be much better for readability than the old block structure. I also find it faster to read and write abbreviated code for things like sets and do's, and especially for functions such as $p and $g. Hopefully I can get into the habit of abbreviating less, as I do notice the new programmers finding this harder. Probably the biggest habit I have broken is also shown in the bad examples given, and shows how long I have been using MUMPS: yes, it's entering everything in UPPERCASE. I finally use camel case and also more meaningful/longer routine and variable names.
go to post David Underhill · Oct 22, 2019 Thanks for adding that. In the case of a preview it might be worth adding this to the "Notes" section, to save cross-referencing with release notifications or the usual downloads page.
go to post David Underhill · Oct 21, 2019 The answers given should help with the $ZU's you have listed but if you have further queries or new entries then the Cache 2008 documentation available at the link below should help. https://cedocs.intersystems.com/ens20082/csp/docbook/DocBook.UI.Page.cls?KEY=RCOS_C24792
go to post David Underhill · Oct 18, 2019 That looks ideal for what we need, not sure how I missed this in the past.
go to post David Underhill · Oct 17, 2019 Thanks for the clarification. The fact that the current download is slightly different from the norm due to a Global Summit launch does highlight the need for a brief explanation of the version you are downloading though.
go to post David Underhill · Oct 14, 2019 I second having the containers available in the same place. Also, could you show the version number before downloading, and say whether these are production releases or previews?
go to post David Underhill · Oct 3, 2019 We had a similar issue at a site and it turned out the routine index was corrupt; recompiling all routines solved it.
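For reference, this is roughly what we ran from a terminal in the affected namespace; the names are from memory, so treat it as a pointer rather than gospel and check the utilities available on your version:
 do ^%RCOMPIL                ; interactive utility to recompile routines
 do $system.OBJ.CompileAll() ; recompile the classes in the namespace as well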
go to post David Underhill · Oct 3, 2019 Just a quick note. I found that when creating a new database it was best to initially use SYS.Database so you can specify max size etc.:
 s db=##class(SYS.Database).%New()
 s db.Directory=directory
 s db.Size=initialSize
 s db.MaxSize=maxSize
 s db.GlobalJournalState=3
 s Status=db.%Save()
Then finalise with Config.Databases:
 s Properties("Directory")=directory
 s Status=##Class(Config.Databases).Create(name,.Properties)
 s Obj=##Class(Config.Databases).Open(name)
 s Obj.MountRequired=1
 s Status=Obj.%Save()
This might not be the best way to do it, I'm open to improvements.
go to post David Underhill · Oct 3, 2019 Hello Dmitry, I have used the extension and am very impressed, but obviously any change in toolset has to be agreed across the business and will require changes to the development and version-control process, hence a reluctance until we know what InterSystems have planned. Apologies if I have missed a feature post, but does the extension replicate the Studio Add-Ins such as the SOAP Wizard? These are very useful to us. Regards David
go to post David Underhill · Oct 1, 2019 Hello Evgeny, Thanks for the response. Of course I am aware of the various plugins, hence my question about relying on 3rd party tools. That old post does not really answer the question of what InterSystems' future plans for development tools are; it just confirms effective end of life for the current ones. Also, only fixing critical issues means that, since the tools are pretty stable, reported problems will generally not be fixed any more, something I have already experienced. I don't have a particular problem with moving to VSCode supported by 3rd parties, but some confirmation of this from InterSystems, so developers can plan to move in that direction, would be appreciated; no one seems to want to commit to any answers. It also raises the question of how InterSystems will provide the ability for these 3rd party tools to replicate and extend the functionality already available, and again there are no answers. Regards David
go to post David Underhill · Oct 1, 2019 Hello Evgeny, Slightly off-topic, but it is interesting that you are using VSCode. I have asked at various times, including at the last symposium, what the plans are for development tools given that both Studio and Atelier have essentially been end of life for quite a while now, well over a year, but the response is always the same: there is a plan but we cannot announce it yet. Is VSCode the way InterSystems is going? Does this mean we are now reliant on 3rd party development tools? Are there plans to create some way to replicate functionality such as the SOAP Wizard in VSCode? I was hoping there might be some details out of the summit, but I haven't seen anything yet. Regards David
go to post David Underhill · Aug 21, 2019 Yes, when Cache was up we had a class that would run in Cache Task Manager which would alert on issues and also log metrics. In Windows we had a script that ran in Scheduled Tasks which would alert if the Cache status was invalid (i.e. not "running" or "down"), using "ccontrol list nodisplay > outputfile", or if alerts.log existed.
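For what it's worth, the Cache side was just a small custom task class, something along these lines (the class name and the placeholder logic here are illustrative, not our actual code):
Class Site.MonitorTask Extends %SYS.Task.Definition
{

Parameter TaskName = "Site Monitor";

Method OnTask() As %Status
{
    // gather metrics and raise alerts on problems; as a placeholder this
    // just drops a line into the console log for external tooling to pick up
    do ##class(%SYS.System).WriteToConsoleLog("Site monitor ran")
    quit $$$OK
}

}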
go to post David Underhill · Aug 21, 2019 We coded all the journal and data file monitoring in Cache itself and just had simple checks on the running status and alerts.log externally, which keeps it fairly platform-independent. On Windows we used VBScript to check service status and log contents.
go to post David Underhill · Aug 20, 2019 Hello Gagan, Have you checked the value in sc from ftp.Store? At that point you can also check the values in ftp.ReturnCode and ftp.ReturnMessage for more details on the failure. This is assuming that the status from ftp.Connect is OK and ftp.Connected is true.
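As an illustration (variable names assumed from your snippet), the checks would look something like:
 set ftp = ##class(%Net.FtpSession).%New()
 if 'ftp.Connect(server, username, password) {
     write "Connect failed: ", ftp.ReturnCode, " ", ftp.ReturnMessage, !
     quit
 }
 set sc = ftp.Store(remoteFile, stream)
 if 'sc {
     write "Store failed: ", ftp.ReturnCode, " ", ftp.ReturnMessage, !
 }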
go to post David Underhill · Jul 12, 2019 No, this works independently. It could be a lack of understanding, but the problem we had is that the monitor is just that: it provides warnings and alerts based on parameters or out-of-norm measurements. We wanted an easy way to report on usage so we could spot trends/growth over time and possible future issues.
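To give an idea, the kind of snapshot we log on a schedule looks roughly like this; the global name and directory are made up, and GetDatabaseFreeSpace may differ slightly between versions:
 new $namespace
 set $namespace = "%SYS"
 set dir = "/cachesys/mgr/user/"    ; example database directory
 set db = ##class(SYS.Database).%OpenId(dir)
 if $isobject(db) {
     set sc = ##class(%SYS.DatabaseQuery).GetDatabaseFreeSpace(dir, .freeMB)
     set ^UsageMetrics($zdate($horolog, 8), dir) = db.Size _ "^" _ $get(freeMB)
 }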
go to post David Underhill · May 10, 2019 A valid point, but it can depend on how the command string is formed in the first place, unless you write a parser to break a command string down into a command plus arguments. I agree that you may as well use $zf(-1, but as the documentation will point you to use $zf(-100 then it can be valid. It's also useful general knowledge that you can use brackets in this way.
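For example, with $zf(-100 the command and its arguments are passed individually, so there is no single command string to parse (the command and path here are just for illustration):
 set rc = $zf(-100, "/SHELL", "ls", "-l", "/tmp")
 write "exit status: ", rc, !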