@Michael Davidovich , try this:
Do $System.Process.Terminate($Job,<desired error code>)
See e.g. https://github.com/intersystems-community/zpm/blob/master/src/%25ZPM/Pa…
select 1672185599,CONVERT(TIMESTAMP,CAST(1000000 * 1672185599 + POWER(2,60) As POSIXTIME))
Quite intuitive.
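The same bias arithmetic can be sketched in ObjectScript. This assumes %Library.PosixTime exposes a LogicalToOdbc classmethod for display conversion (the 2**60 offset matches the POWER(2,60) in the SQL above):

```objectscript
set unixts = 1672185599
// Per the SQL above: POSIXTIME logical value = microseconds since the Unix epoch, plus a 2**60 bias
set posix = (1000000 * unixts) + (2 ** 60)
// Assumption: LogicalToOdbc renders the logical value as a readable timestamp
write ##class(%Library.PosixTime).LogicalToOdbc(posix),!
```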
On further investigation, it seems that what I need just isn't in the old version I'm running.
Yes - see Security.Users (class reference) in the %SYS namespace.
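For example, a quick sketch using dynamic SQL (run in the %SYS namespace with appropriate privileges; column names per the Security.Users class reference):

```objectscript
// List users and whether each account is enabled
set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT Name, Enabled FROM Security.Users")
while rs.%Next() {
    write rs.%Get("Name"), ": ", rs.%Get("Enabled"), !
}
```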
Thank you Dan! The index/constraint distinction and SQL standard context are particularly helpful facts for this discussion. :)
There would still need to be some enforcement of the super parent being the only node with a NULL parent (and the point here is that the unique index wouldn't do that). Also, finding all of the top-level nodes (assuming we could have multiple independent trees) would be slightly more complicated.
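To illustrate, here's a hypothetical sketch (class and property names are made up) of enforcing the single-root rule in code, since a unique index can't express it:

```objectscript
Class Demo.TreeNode Extends %Persistent
{

Property Parent As Demo.TreeNode;

/// Reject a second NULL-parent ("root") node. A unique index can't express
/// this because NULLs are not considered equal to each other.
Method %OnValidateObject() As %Status
{
	if ..Parent '= "" quit $$$OK
	set sql = "SELECT TOP 1 ID FROM Demo.TreeNode WHERE Parent IS NULL"
	if ..%Id() '= "" {
		// Updating an existing node: ignore this node's own row
		set rs = ##class(%SQL.Statement).%ExecDirect(, sql _ " AND ID <> ?", ..%Id())
	} else {
		set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
	}
	if rs.%Next() quit $$$ERROR($$$GeneralError, "A root node already exists")
	quit $$$OK
}

}
```

Note that this check isn't concurrency-safe on its own; two simultaneous saves could both pass validation, so a lock around the check would be needed for a bulletproof version.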
Thank you for pointing this out! I saw this in the docs but believe it wouldn't work for object-valued properties.
Hi @Steve Pisani - the same issue was reported via GitHub issues a little while back (https://github.com/intersystems/git-source-control/issues/137) but discussion trailed off and there wasn't any information there on the resolution.
You should always be able to upgrade zpm. I think the <CLASS DOES NOT EXIST> error could be solved by running:
do ##class(%Studio.SourceControl.Interface).SourceControlClassSet("")
then reinstalling, then re-enabling SourceControl.Git.Extension as the source control class for the namespace.
Ultimately something funky is going on with SQL stats collection. Given a bit more info it might be possible to mitigate the issue in the git-source-control package. Happy to connect sometime to discuss/troubleshoot.
From a diagnostic perspective, I think the things to do (which we would do on such a call) would be:
* Force single-process compilation: Do $System.OBJ.SetQualifiers("/nomulticompile")
* Run a fancy zbreak command:
zbreak *%objlasterror:"N":"$d(%objlasterror)#2":"s ^mtempdbg($i(^mtempdbg))=%objlasterror"
* Force single-process load of the package (zpm "install git-source-control -DThreads=0")
Then look at the contents of ^mtempdbg to figure out where our errant %Status is coming from and go from there.
Where these third-party apps are mostly reporting tools, it could make sense to set up a Reporting Async mirror with read-only databases. That would handle the "clients shouldn't be able to insert/update/delete" issue and protect your main instance from rogue queries. (And these clients would only be allowed to connect to the reporting async.)
For the record, actually having a page with the above allows anyone with access to the page to get access to arbitrary files on the server. (see my comment below from May 18)
You need to either do very strict input validation on filepath, or (if the full path is known in the database) use %CSP.StreamServer properly with an encrypted path.
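For the strict-validation route, here's a minimal sketch (the method name and base directory are illustrative, not from the original page):

```objectscript
/// Resolve the requested path against a fixed base directory and refuse
/// anything that escapes it (e.g., via ".." segments).
ClassMethod SafeFilePath(filepath As %String) As %String
{
	set base = ##class(%File).NormalizeDirectory("/var/www/files/")
	// NormalizeFilename canonicalizes the path relative to the base directory
	set full = ##class(%File).NormalizeFilename(filepath, base)
	// Only serve files that are still under the base directory
	if $extract(full, 1, $length(base)) '= base quit ""
	quit full
}
```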
GitHub repo is here - https://github.com/SeanConnelly/CloudStudio
This is WICKED COOL.
@Jonathan Wald I'd recommend this approach, but using %XML.Exchange (in place of XML.Element - that's from TrakCare) and first having your persistent classes all extend %XML.Exchange.Adaptor.
@Michael Davidovich we'd generally set ^UnitTestRoot to the path in the cloned repo and then run tests right from there. Another workflow might be to have unit tests run in their own namespace that has all the same code/supporting data but is distinct from the namespace where you're doing dev work - in that case, it's safe to delete the unit tests from that second namespace. (But practically speaking we just use one namespace and don't delete.)
Re: best practices/shortcuts as an individual developer, I'd honestly just say embrace the automation and iterate in a feature branch with the tests running on commit to feature branches (not just main). It also helps to use the package manager to make a more modular codebase where each module has its own associated tests, so you're not rebuilding and testing everything every time.
In a case where you're (a) using the package manager and (b) being really strict about test coverage, combining the techniques in the article I linked earlier is a good approach - e.g., saying to run specific tests *and* measure test coverage. That helps answer "how well do the tests I'm currently writing cover the code I'm currently adding/changing?"
Of course, you can do this with the normal APIs for running unit tests with test coverage too. (TestCoverage.Manager also has DebugRunTestCase - think of it like %UnitTest.Manager but with test coverage built in.)
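For example, a sketch of running a test package with coverage ("MyPkg.Tests" is a placeholder; TestCoverage.Manager extends %UnitTest.Manager, so RunTest takes the same arguments):

```objectscript
// Assumption: "CoverageDetail" is a userParams key recognized by the TestCoverage package
set userParams("CoverageDetail") = 1
do ##class(TestCoverage.Manager).RunTest("MyPkg.Tests", "/recursive", .userParams)
```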
@Michael Davidovich - @Pravin Barton is from my team, so I have little to add to what he said - other than to also point to https://community.intersystems.com/post/unit-tests-and-test-coverage-ob…. If you're using InterSystems' package manager it makes tests much easier/more natural to run.
WARNING! When you implement this (using %CSP.StreamServer or whatever), make sure that you're not opening up a way for a malicious user to download any arbitrary file on the server.
@Vitaliy Serdtsev thank you for pointing this out. We've fixed the issue; it was actually a separate root cause from before.
This is a bug. We'll look into it. Thank you for bringing it to our attention.
Not a full answer, but OnDrawCell is the place to start: https://docs.intersystems.com/ens201815/csp/docbook/DocBook.UI.Page.cls…
Sure, here's one:
set req = ##class(%Net.HttpRequest).%New()
set req.SSLConfiguration="MBTA" // Set this up in the Management Portal under Security Management > SSL/TLS Configurations; this is the SSL configuration's "Name"
// For HTTP Basic authentication, the simple route is e.g.:
set req.Username = "foo"
set req.Password = "bar"
// For other modes, e.g.:
do req.SetHeader("Authorization","Bearer abcd")
set sc = req.Get("https://api-v3.mbta.com/routes")
// TODO: check sc / $$$ThrowOnError(sc)
// TODO: check req.HttpResponse.StatusCode to make sure it's 200 and error handle appropriately
set obj = {}.%FromJSON(req.HttpResponse.Data)
zw obj // There's your object!

Turns out the problem was *not* the woff files at all - it was the CSS file that used the font. The solution was:
set ^%SYS("CSP","MimeFileClassify","CSS")=$lb("text/css",0,"utf-8")

Update: it looks like we now support local edit/save/compile of CSP application files, provided the web application name starts with "/csp" - see: https://github.com/intersystems-community/vscode-objectscript/pull/622
@Vitaliy Serdtsev thank you for pointing there! I couldn't find that via the new-and-improved (but still in progress) doc search and have provided feedback to that effect.
Side note: I love having the "Feedback" button on the right side in the docs, and it's great to see the amazing effort being put in to improving the docs, search, etc.
Yes. Here's a quick sample:
Class DC.Demo.SerialObject Extends %SerialObject
{
Property foo As %String;
Property bar As %String;
}
Class DC.Demo.IndexOnSerialObject Extends %Persistent
{
Property blah As DC.Demo.SerialObject;
Index blahFooBar On (blah.foo, blah.bar);
ClassMethod RunDemo()
{
Do ..%KillExtent()
Set inst = ..%New()
Set inst.blah.foo = "foo"
Set inst.blah.bar = "bar"
Do inst.%Save()
zw ^DC.Demo.IndexOnSerialObjectD,^DC.Demo.IndexOnSerialObjectI
}
}

Which produces output:
d ##class(DC.Demo.IndexOnSerialObject).RunDemo()
^DC.Demo.IndexOnSerialObjectD=1
^DC.Demo.IndexOnSerialObjectD(1)=$lb("",$lb("foo","bar"))
^DC.Demo.IndexOnSerialObjectI("blahFooBar"," FOO"," BAR",1)=""

Could be a difference of behavior related to select mode - could you try running the management portal query in ODBC and Display mode as well and see if that makes a difference? (This is the most common root cause I see when the same query returns different results in different contexts.)
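To compare select modes programmatically, here's a sketch with %SQL.Statement (Sample.Person is just an illustrative table):

```objectscript
// %SelectMode: 0 = Logical, 1 = ODBC, 2 = Display
set stmt = ##class(%SQL.Statement).%New()
set stmt.%SelectMode = 2
do stmt.%Prepare("SELECT TOP 1 DOB FROM Sample.Person")
set rs = stmt.%Execute()
while rs.%Next() { write rs.%Get("DOB"), ! }
```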
Generally I'd do this as follows:
Class DC.Demo.DefinesIndices Extends %Persistent [ Abstract, NoExtent ]
{
Index TXSBI On TextSearch(KEYS);
Index TXSSI On TextSimilarity(KEYS) [ Data = TextSimilarity(ELEMENTS) ];
Property TextSearch As %Text(LANGUAGECLASS = "%Text.English", MAXLEN = 1000, XMLPROJECTION = "NONE");
Property TextSimilarity As %Text(LANGUAGECLASS = "%Text.English", MAXLEN = 1000, SIMILARITYINDEX = "TXSSI", XMLPROJECTION = "NONE");
}
Class DC.Demo.InheritsIndices Extends DC.Demo.DefinesIndices
{
Storage Default
{
<Data name="InheritsIndicesDefaultData">
<Value name="1">
<Value>%%CLASSNAME</Value>
</Value>
<Value name="2">
<Value>TextSearch</Value>
</Value>
<Value name="3">
<Value>TextSimilarity</Value>
</Value>
</Data>
<DataLocation>^DC.Demo.InheritsIndicesD</DataLocation>
<DefaultData>InheritsIndicesDefaultData</DefaultData>
<IdLocation>^DC.Demo.InheritsIndicesD</IdLocation>
<IndexLocation>^DC.Demo.InheritsIndicesI</IndexLocation>
<StreamLocation>^DC.Demo.InheritsIndicesS</StreamLocation>
<Type>%Storage.Persistent</Type>
}
}

Key points:
Simple solution:
Create a class extending %CSP.Page with:
ClassMethod OnPreHTTP() As %Boolean
{
Set %response.Status = ##class(%CSP.REST).#HTTP403FORBIDDEN
Quit 0
}

Then, from your %CSP.SessionEvents subclass, in OnStartRequest:
set %response.ServerSideRedirect = "<that classname>.cls"
Here's the problem:
The solution here might also be a custom error page.
So fun fact: in JavaScript, the string "0" is truthy (although the number 0 is falsy). That's what you're seeing here.
@Michael Davidovich you could/should definitely do that validation on the client as well. No need to go to the server to compare two dates.
To work as written, the ObjectScript method itself should return a truthy/falsy result (e.g., quit 0 or quit 1) rather than doing that return in JS.
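For instance, a minimal sketch of such a server-side method (the method and argument names are illustrative, and the ZenMethod keyword assumes a Zen page):

```objectscript
/// Returns 1 (truthy) when start is on or before end; quit 1/0 rather
/// than returning from the JavaScript side.
ClassMethod DatesValid(start As %String, end As %String) As %Boolean [ ZenMethod ]
{
	// $zdateh with format 3 converts "yyyy-mm-dd" to a +$HOROLOG day count
	quit $zdateh(start, 3) '> $zdateh(end, 3)
}
```

(Note that $zdateh throws an error on malformed input, so real code would wrap this in a try/catch or pass an erropt argument.)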
If you turn on auditing for Terminate, Login, Logout, and Protect events you may see helpful things about what's happening with the JOB command (e.g., if it hits an error).