I get the error below when purging record map batches and was wondering if anyone out there has ever experienced this. If you have, any advice would be appreciated.
Failed to purge body for header 9747192, BodyClassname='******.Batch':ERROR #5823: Cannot delete object, referenced by '*****.Record.%ParentBatch'
I needed to know programmatically whether the last unit test run failed or not.
After some exploring, here's the code:
ClassMethod isLastTestOk() As %Boolean
{
    // ^UnitTest.Result holds the id of the most recent test run
    set in = ##class(%UnitTest.Result.TestInstance).%OpenId(^UnitTest.Result)
    for i=1:1:in.TestSuites.Count() {
        #dim suite As %UnitTest.Result.TestSuite
        set suite = in.TestSuites.GetAt(i)
        // Status = 0 means the suite failed
        return:suite.Status=0 $$$NO
    }
    quit $$$YES
}
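For example, a call from the terminal might look like this (the package and class name are simply wherever you put the method):

write ##class(MyApp.TestUtils).isLastTestOk()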
I am trying to get the time difference between two timestamps: one was recorded earlier and the other is happening now. The problem is that SQL expects a string, while I have the earlier one stored in a variable, and if I do the following I get errors. Any help, please?
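One way around it (a minimal sketch; the variable names, sample value, and the 'ss' datepart are just illustrative) is to pass the stored timestamp as a query parameter instead of concatenating it into the SQL text:

// previously recorded timestamp held in a local variable
set earlier = "2018-03-12 08:00:00"
// the ? placeholder is bound to the variable when the statement is executed
set sql = "SELECT DATEDIFF('ss', ?, CURRENT_TIMESTAMP)"
set rs = ##class(%SQL.Statement).%ExecDirect(, sql, earlier)
if rs.%Next() { write "Seconds elapsed: ", rs.%GetData(1), ! }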
How Tax Service, OpenStreetMap, and InterSystems IRIS could help developers get clean addresses
Pieter Brueghel the Younger, Paying the Tax (The Tax Collector), 1640
In my previous article, we just skimmed the surface of objects. Let's continue our reconnaissance. Today's topic is a tough one. It's not quite BIG DATA, but it's still data that isn't easy to work with: we're talking about fairly large amounts of it. It won't all fit into RAM at once, and some of it won't even fit on the drive (not due to lack of space, but because there's a lot of junk). The name of our subject is the FIAS DB: the Federal Information Address System database - the database of addresses in Russia. The archive is 5.5 GB, and it's a compressed XML file. After extraction, it will be a full 53 GB (set aside 110 GB for extraction), and when you start to parse and convert it, that 110 GB won't be enough. There won't be enough RAM either.
First time post, also a new Cache developer, hence the <Beginner> tag.
If our data has predefined terms in a dictionary, and a user can add terms on their own, can the terms exist in different tables?
Let's call the tables "Terms" and the user data "UserTerms".
If a third class definition has a property of type "Term", can it then be either Terms or UserTerms?
I'm leaning towards a subclass strategy where the pseudo "parent" (forgive me) is Dictionary.Term and the child is something along the lines of Dictionary.Term.User.
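For what it's worth, object-valued properties in Caché are polymorphic, so a property typed as the superclass can hold an instance of any subclass. A minimal sketch (the class and property names below are made up) of the approach described above:

Class Dictionary.Term Extends %Persistent
{

/// Definition shared by predefined and user-added terms
Property Name As %String;

}

Class Dictionary.Term.User Extends Dictionary.Term
{

/// Extra information that only user-added terms carry
Property CreatedBy As %String;

}

Class MyApp.Document Extends %Persistent
{

/// Can hold either a Dictionary.Term or a Dictionary.Term.User instance
Property Term As Dictionary.Term;

}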
Is it possible to mimic what Selenium does, like navigating to a site, logging in, filling out a form, and then logging out, in COS? I am trying to do that in COS using the %Net.HttpRequest class, or should I be using a different class? The idea is to be able to call a web app, log into it, fill out a form, and log out.
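%Net.HttpRequest can cover the HTTP side of that, but it only sends requests; there is no JavaScript engine, so anything the site does in the browser has to be reproduced by hand, and you may need to deal with session cookies or CSRF tokens yourself. A minimal sketch (the host, path, form field names, and SSL configuration name are made up; a real login form will differ):

set req = ##class(%Net.HttpRequest).%New()
set req.Server = "www.example.com"
set req.Https = 1
set req.SSLConfiguration = "MySSLConfig"
// Post the login form fields the way the browser would submit them
do req.InsertFormData("username", "myuser")
do req.InsertFormData("password", "mypassword")
set sc = req.Post("/login")
if $$$ISERR(sc) do $system.Status.DisplayError(sc)
if $$$ISOK(sc) write "Login response status: ", req.HttpResponse.StatusCode, !

Reusing the same request object for the follow-up form post and the logout call should carry the session cookie forward, but check how your version handles cookies before relying on that.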
I am planning to implement Business Intelligence based on the data in my instances. What is the best way to set up my databases and environment to use DeepSee?
In part 1 we started working on a security model for DeepSee and created a user type having privileges typical of end users. In this part we are going to create a second user type with the ability to edit and create DeepSee pivot tables and dashboards.
This is my first post, I have only been using Healthshare for a year.
We support multiple Healthshare test and development environments. We are trying to come up with the best solution for building an environment from scratch, as well as for applying incremental updates. I am interested in hearing the pros and cons of using the Ensemble -> Export Production feature versus creating custom classes to do the install and setup.
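For context (just a sketch; the file path and class name are made up), the "custom classes" route typically boils down to exporting the components on the source instance and loading them on the target:

// On the source instance: export the production class (and any other items) to a file
do $system.OBJ.Export("MyPkg.MyProduction.cls", "/tmp/production.xml")

// On the target instance: load and compile the exported file
do $system.OBJ.Load("/tmp/production.xml", "ck")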
InterSystems Data Platform includes utilities and tools for system monitoring and alerting; however, System Administrators new to solutions built on the InterSystems Data Platform (a.k.a. Caché) need to know where to start and what to configure.
This guide shows the path to a minimum monitoring and alerting solution, using references from the online documentation and developer community posts to show you how to enable and configure the following:
Caché Monitor: Scans the console log and sends email alerts.
System Monitor: Monitors system status and resources, generating notifications (alerts and warnings) based on fixed parameters and also tracks overall system health.
Health Monitor: Samples key system and user-defined metrics and compares them to user-configurable parameters and established normal values, generating notifications when samples exceed applicable or learned thresholds.
History Monitor: Maintains a historical database of performance and system usage metrics.
pButtons: Operating system and Caché metrics collection scheduled daily.
Remember, this guide is a minimum configuration; the included tools are flexible and extensible, so more functionality is available when needed. This guide skips through the documentation to get you up and going. You will need to dive deeper into the documentation to get the most out of the monitoring tools; in the meantime, think of this as a set of cheat sheets to get up and running.
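As a quick orientation (the exact menu options vary by version, and the pButtons profile name is just an example), the tools above are reached from a terminal in the %SYS namespace:

do ^MONMGR                    ; Caché Monitor: console log scanning and email alerts
do ^%SYSMONMGR                ; System Monitor and Health Monitor configuration menus
do run^pButtons("24hours")    ; pButtons: start the 24-hour collection programmatically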
In this series of articles, I'd like to present and discuss several possible approaches toward software development with InterSystems technologies and GitLab. I will cover such topics as:
Git 101
Git flow (development process)
GitLab installation
GitLab Workflow
Continuous Delivery
GitLab installation and configuration
GitLab CI/CD
In the previous article, we covered Git basics, why a high-level understanding of Git concepts is important for modern software development, and how Git can be used to develop software. Still, our focus there was on the implementation part of software development; this part presents:
GitLab Workflow - a complete software life cycle process - from idea to user feedback
Continuous Delivery - a software engineering approach in which teams produce software in short cycles, ensuring that the software can be reliably released at any time. It aims at building, testing, and releasing software faster and more frequently.
I have a few cubes and numerous dashboards, and I am ready to deploy them to our end users and administrators. How do I configure DeepSee so that users don't disrupt each other's areas and are restricted from using functionality specific to developers?
As you can see, the timezone is lost. The docs for $zdth with timeopt 5 state: "Specify time in the form "hh:mm:ss+/-hh:mm" (24-hour clock). The time is specified as local time. The following optional suffix may be supplied, but is ignored: a plus (+) or minus (–) suffix followed by the offset of local time from Coordinated Universal Time (UTC)."
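For illustration (the timestamp value is arbitrary), the offset suffix is accepted but discarded:

set ts = "2018-03-12 14:30:00+03:00"
// tformat 5 parses the +03:00 suffix but ignores it,
// so this returns the $HOROLOG value for 14:30:00 local time
write $zdatetimeh(ts, 3, 5)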
This is a quick tutorial on how to install and use TFS in Atelier. It is based on my own experience and some tricks that I've noted.
If you are used to Visual Studio, you may feel that it is a bit slow and heavy, but you have the same TFS panel as in Visual Studio, so you don't need any special "training" to use it.
Recently I needed a classmethod that returns an annotation value based on the name of an activity.
As doing it at runtime seemed inefficient, I wrote a compile-time utility that iterates over all business process activities and generates the relevant code.
This code could be used in a variety of situations where you need to iterate over business process activities; just add it as a secondary superclass to your BPL processes.
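To illustrate the idea only (everything below is a rough sketch, not the original utility: the class, method, and variable names are made up, and it assumes the BPL definition sits in the process class's XData block named BPL with single-line annotations):

Class Util.AnnotationMap [ Abstract ]
{

/// Returns the annotation text of a BPL activity, looked up by activity name.
/// The lookup is generated at compile time for each subclassing BPL process.
ClassMethod GetAnnotation(activityName As %String) As %String [ CodeMode = objectgenerator ]
{
    // Do not generate anything for this abstract helper class itself
    if %compiledclass.Name = "Util.AnnotationMap" {
        do %code.WriteLine(" quit """"")
        quit $$$OK
    }
    // BPL processes keep their definition in an XData block named BPL
    set xdata = ##class(%Dictionary.CompiledXData).%OpenId(%compiledclass.Name _ "||BPL")
    if '$isobject(xdata) {
        do %code.WriteLine(" quit """"")
        quit $$$OK
    }
    set sc = ##class(%XML.TextReader).ParseStream(xdata.Data, .reader)
    quit:$$$ISERR(sc) sc
    set activity = ""
    while reader.Read() {
        if reader.NodeType = "element" {
            // Remember the name of the most recently seen named element (activity)
            if reader.MoveToAttributeName("name") {
                set activity = reader.Value
                do reader.MoveToElement()
            }
            // An <annotation> element carries the text for that activity
            if reader.Name = "annotation" {
                do reader.Read()
                if ((reader.NodeType = "chars") || (reader.NodeType = "cdata")) && (activity '= "") {
                    set ann(activity) = $translate(reader.Value, $char(13) _ $char(10), "  ")
                }
            }
        }
    }
    // Emit one line per annotated activity: quit:activityName="..." "..."
    set activity = ""
    for {
        set activity = $order(ann(activity), 1, text)
        quit:activity=""
        do %code.WriteLine(" quit:activityName=""" _ $replace(activity, """", """""") _ """ """ _ $replace(text, """", """""") _ """")
    }
    do %code.WriteLine(" quit """"")
    quit $$$OK
}

}

A process class would then extend both Ens.BusinessProcessBPL and Util.AnnotationMap and call ..GetAnnotation("SomeActivityName") wherever the annotation text is needed.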
How can we reduce the size of the CACHE.DAT file? Even after deleting the globals of a particular database from the Management Portal, the size of its CACHE.DAT file is not reduced.
I have an in-memory list of items and I want to check which items match my pattern string.
The pattern string is a comma-separated list of items with special symbols like '*' and maybe '?'.
There's something similar in $system.OBJ.Compile: it accepts patterns such as "*.data.*,Sample.*", which would compile the 'Sample' package and all 'data' packages.
For example:
set list=$lb("abc", "c", "aaa", "bbb")
set result = ..match(list, "a*,*b")
zw result
result=$lb("abc","aaa","bbb")
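One possible approach (a minimal sketch; the class placement and handling of regex metacharacters other than '.' are left out) is to turn each comma-separated pattern into a regular expression and keep the items that match at least one of them:

ClassMethod match(list As %List, patterns As %String) As %List
{
    // Build one regular expression per comma-separated pattern:
    // '.' is escaped, '*' becomes '.*', '?' becomes '.'
    set regexes = ""
    for p = 1:1:$length(patterns, ",") {
        set pat = $piece(patterns, ",", p)
        continue:pat=""
        set regexes = regexes _ $listbuild("^" _ $replace($replace($replace(pat, ".", "\."), "*", ".*"), "?", ".") _ "$")
    }
    // Keep every item that matches any of the regular expressions
    set result = "", ptr = 0
    while $listnext(list, ptr, item) {
        set ptr2 = 0
        while $listnext(regexes, ptr2, regex) {
            if $match(item, regex) {
                set result = result _ $listbuild(item)
                quit
            }
        }
    }
    quit result
}

With the list above, ..match($lb("abc","c","aaa","bbb"), "a*,*b") produces $lb("abc","aaa","bbb").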
There are situations where we want to provide immediate feedback to an inbound web service that a particular business operation is not running (status <> "running"). We don't even want to queue up the message. We just want the web service to respond with an error stating that the business operation is down.
I have some beginner questions as I am working through the InterSystems Caché learning path:
- Where I work, we use Caché, but we often learn about and train on MUMPS. No one really talks about or mentions MUMPS here, but my understanding is that ObjectScript is basically MUMPS plus whatever new things InterSystems put on top of it. Is that a fair assessment?
The following post outlines an architectural design of intermediate complexity for DeepSee. As in the previous example, this implementation includes separate databases for storing the DeepSee cache, DeepSee implementation and settings. This post introduces two new databases: the first to store the globals needed for synchronization, the second to store fact tables and indices.
Example: I have a list of tags that I need to find, and a string containing these and other tags separated by commas.
What is the optimal way to find the desired tags in the string?
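One simple way (a sketch with made-up sample data) is to index the tags of the string once in a local array, so that each wanted tag becomes a single $data lookup instead of a scan:

set tags = "red,green,blue,small,large"      ; the comma-separated string
set wanted = $listbuild("green", "yellow")   ; the tags to find
// Index the available tags once
kill idx
for i = 1:1:$length(tags, ",") {
    set idx($piece(tags, ",", i)) = ""
}
// Each lookup is now a direct check against the index
set ptr = 0
while $listnext(wanted, ptr, tag) {
    write tag, ": ", $select($data(idx(tag)): "found", 1: "missing"), !
}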