How can I code without:
quit:$$$ISERR(sc) sc
It's basically 5% of the code I write.
How about
do:condition ..Method()
Your options are:
Cache is officially supported on Ubuntu 16.04 LTS according to Supported Platforms table.
If you're just starting I recommend using InterSystems IRIS (which is supported on Ubuntu 16.04 LTS and 18.04 LTS among other platforms).
You can download all kits from WRC:


You can try to write to a TCP device with SSL. Doesn't require additional permissions:
ClassMethod Exists(ssl As %String) As %Boolean
{
#dim exists As %Boolean = $$$YES
set host = "google.com"
set port = 443
set timeout = 1
set io = $io
set device = "|TCP|" _ ##class(%PopulateUtils).Integer(5000, 10000)
try {
open device:(host:port:/SSL=ssl):timeout
use device
// real check
write "GET /" _ $c(10),*-3
// real check - end
// should be HTTP/1.0 200 OK but we don't really care
//read response:timeout
//write response
} catch ex {
set exists = $$$NO
}
use io
close device
quit exists
}It's slower than direct global check but if you want to do it rarely, I think it could be okay. Doesn't require additional permissions.
Code to compare times:
ClassMethod ExistGlobal(ssl) [ CodeMode = expression ]
{
$d(^|"%SYS"|SYS("Security","SSLConfigsD",ssl))#10
}
/// do ##class().Compare()
ClassMethod Compare(count = 1, ssl = "GitHub")
{
Write "Iterations: ", count,!
Write "Config exists: ", ..Exists(ssl),!
set start = $zh
for i=1:1:count {
set exists = ..Exists(ssl)
}
set end = $zh
set time = end - start
Write "Device check: ", time,!
set start = $zh
for i=1:1:count {
set exists = ..ExistGlobal(ssl)
}
set end = $zh
set time2 = end - start
write "Global check: ", time2,!
}
Results:
Iterations: 1
Config exists: 1
Device check: .054983
Global check: .000032
Iterations: 1
Config exists: 0
Device check: .017351
Global check: .00001
Iterations: 50
Config exists: 1
Device check: 2.804497
Global check: .000097
Iterations: 50
Config exists: 0
Device check: .906424
Global check: .000078
You can use negative integers to subtract hours. DATEADD also works with timestamps:
write $SYSTEM.SQL.DATEADD("hour", -3, "2019-03-09 10:00:00")
>2019-03-09 07:00:00
I can use $ZTIMESTAMP to get UTC time and then convert it into local time. I am using the below way; is this correct?
SET stamp=$ZTIMESTAMP
w !,stamp
SET localutc=$ZDATETIMEH(stamp,-3)
w $ZDATETIME(localutc,3,1,2)
Yes, sure.
My question is how I can program this task in the below way.
You need to add three hours. Use the DATEADD method for this:
write $SYSTEM.SQL.DATEADD("hour", 3, yourDate)
The ODBC log and maybe the Audit log can contain additional information.
IRIS Text Analytics/iKnow and Analytics/DeepSee are enabled on per-application basis. Interoperability/Ensemble/HealthShare are enabled on a per-namespace basis.
First you need to get default application from the namespace:
set namespace = "USER"
set app = $System.CSP.GetDefaultApp(namespace) _ "/"
And then call one of these methods:
do EnableIKnow^%SYS.cspServer(app)
do EnableDeepSee^%SYS.cspServer(app)
If you want to enable Interoperability/Ensemble, call:
set sc = ##class(%EnsembleMgr).EnableNamespace(namespace,1)
You can do that in Analyzer.
Choose the row/column you want displayed this way, click on its settings, and set an italic header:

You can use CreateDirectoryChain method of %File class to create directory tree instead of several calls to CreateDirectory.
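For example, a minimal sketch (the path here is hypothetical):

set dir = "C:\temp\a\b\c"
// creates a, a\b and a\b\c in one call; returns 1 on success
if ##class(%File).CreateDirectoryChain(dir) {
    write "Directory tree created",!
}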
LinkToFile doesn't require the file to exist, but the containing directory must exist and should be writable by the OS user (cacheusr in your case, probably).
I'd try to write into a temp dir first, where you're sure you have access:
set file = ##class(%File).TempFilename("pdf")
set sc = stream2.LinkToFile(file)
quit:$$$ISERR(sc) sc
Record the value of file somewhere (output to display or store in a global) and check whether the file was created.
%Save method also returns status, you should return it instead of $$$OK:
set sc = stream2.%Save()
quit sc
The code looks good. What error are you getting?
Try replacing stream2's %Stream.FileCharacter with %Stream.FileBinary.
Try replacing stream1's %Stream.FileCharacter with %Stream.TmpBinary.
If you have a message sample less than 3.5 MB in size, try to write a test without an intermediate stream.
You probably should write to a pdf file and not a txt one.
If you can, get a sample original/decoded file. Compare original file and your stream2 file using hex editor to spot differences.
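To illustrate the no-intermediate-stream suggestion, a minimal sketch (file and source are assumed names for your output path and decoded source stream; adjust to your message structure):

set stream2 = ##class(%Stream.FileBinary).%New()
set sc = stream2.LinkToFile(file)
quit:$$$ISERR(sc) sc
// copy the decoded content directly, no temporary stream in between
set sc = stream2.CopyFrom(source)
quit:$$$ISERR(sc) sc
set sc = stream2.%Save()
quit sc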
Check this table of Ens resources. Your role should have some of these resources to give user access to Ensemble pages.
I try to separate SQL from ObjectScript code. So mainly, I'm using queries, for example:
/// Some report. To display in terminal call:
/// do ##class(class).reportFunc().%Display()
Query report(date As %String = {$zd($h-1,8)}) As %SQLQuery(SELECTMODE = "ODBC")
{
SELECT
ID,
Value,
EventDate
FROM myTable
WHERE EventDate>=TO_POSIXTIME(:date,'YYYYMMDD') AND EventDate<TO_POSIXTIME(:date+1,'YYYYMMDD')
ORDER BY EventDate
}
Queries can be called from ObjectScript using the autogenerated Func method:
/// Really %sqlcq.<NAMESPACE>.cls<NUMBER>
#dim rs As %SQL.ISelectResult
set rs = ..reportFunc(date)
//do rs.%Display()
while rs.%Next() {
write rs.ID,!
}
I found this approach improves readability of the codebase. More about queries in this article.
Special case - one value.
Sometimes you don't need a resultset, but one value. In that case:
If you know ID it's possible to use GetStored method:
set value = ##class(test).<PropertyName>GetStored(ID)
If you know a unique indexed value but don't know the ID, it's possible to get the ID with the Exists method:
ClassMethod <IndexName>Exists(val, Output id) As %Boolean
And after that use GetStored method.
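Putting the two together, a hedged sketch (the class test, the Name index, and the Name property are placeholders; substitute your actual names):

// resolve the ID from a unique indexed value, then fetch the stored value
if ##class(test).NameExists(val, .id) {
    set value = ##class(test).NameGetStored(id)
}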
More on auto-generated methods, such as GetStored and Exists in this article.
Finally, if you can't use the above methods, or you need one value but it's an aggregate, use embedded SQL if the SQL is short and a Query if it's long.
Use methods from %CSP.Portal.Utils class:
set pageID = "" //page url, URL encoded via $zconvert(url,"O","URL")
set currentResource = ##class(%CSP.Portal.Utils).%GetCustomResource(pageID)
set sc = ##class(%CSP.Portal.Utils).%SetCustomResource(pageID, newResource)
I'd do it in 2 steps.
You can add to array using %Push method:
do Obj.data.%Push(newItem)
If you want to push at a specific position, use %Set:
do Obj.data.%Set(position, newItem)
That said, your target structure contains data which does not exist in the original structure (text values for projects and sub-projects), so you need to get it from somewhere.
Also, is the project/sub-project hierarchy one level, or could there be an arbitrary number of sub-project levels (i.e. 10 → 10-1 → 10-1-1 → 10-1-1-1)?
Enable the ODBC log. Maybe there will be errors pointing to the root of the issue?
Maybe it's something to do with INFORMATION_SCHEMA.
Okay, even if the classes/globals are the same there's a solution. Let's say you have Sample.Person class in namespaces SAMPLES and USER, each with their own data:
Class Sample.Person Extends %Persistent
{
Property Name;
Storage Default
{
<Data name="PersonDefaultData">
<Value name="1">
<Value>%%CLASSNAME</Value>
</Value>
<Value name="2">
<Value>Name</Value>
</Value>
</Data>
<DataLocation>^Sample.PersonD</DataLocation>
<DefaultData>PersonDefaultData</DefaultData>
<IdLocation>^Sample.PersonD</IdLocation>
<IndexLocation>^Sample.PersonI</IndexLocation>
<StreamLocation>^Sample.PersonS</StreamLocation>
<Type>%Library.CacheStorage</Type>
}
}
And you want to query both from the USER namespace. In that case create a new class extending Sample.Person in the USER namespace and modify storage like this:
Class Utils.SamplesPerson Extends (%Persistent, Sample.Person)
{
Storage Default
{
<Data name="PersonDefaultData">
<Value name="1">
<Value>%%CLASSNAME</Value>
</Value>
<Value name="2">
<Value>Name</Value>
</Value>
</Data>
<DataLocation>^["SAMPLES"]Sample.PersonD</DataLocation>
<DefaultData>PersonDefaultData</DefaultData>
<ExtentSize>200</ExtentSize>
<IdLocation>^["SAMPLES"]Sample.PersonD</IdLocation>
<IndexLocation>^["SAMPLES"]Sample.PersonI</IndexLocation>
<StreamLocation>^["SAMPLES"]Sample.PersonS</StreamLocation>
<Type>%Library.CacheStorage</Type>
}
}
Note that DataLocation, IndexLocation, IdLocation and StreamLocation point to the SAMPLES namespace.
Now query:
SELECT *
FROM Utils.SamplesPerson
would fetch data from the SAMPLES namespace.
Some ideas.
1. The problem is isc.rabbitmq.API class was imported with an error. Delete this class and try this code (post output):
Set class = "isc.rabbitmq.API"
Set classPath = ##class(%ListOfDataTypes).%New()
Do classPath.Insert(PATH-TO-JAR)
Set gateway = ##class(%Net.Remote.Gateway).%New()
Set sc = gateway.%Connect("localhost", PORT, $Namespace, 2, classPath)
Zwrite sc
Set sc = gateway.%Import(class)
Zwrite sc
Write ##class(%Dictionary.CompiledClass).%ExistsId(class)
2. Try recompiling the isc.rabbitmq.API class.
3. Maybe you have 2 amqp jars with the com.rabbitmq.tools.jsonrpc.JsonRpcException class. The only class you should import is isc.rabbitmq.API. It shouldn't pull in many additional ones.
4. What version of AMQP are you using? I'm using amqp-5.0.0. Try the same version?
I think the easier solution would be to map classes and data to a target namespace and just execute the query in the same namespace. Documentation.
You can check out this RabbitMQ adapter. I had the same issue with stream interpretation and it's fixed there.
Great to see more AoC participants!
Also please note that $ListValid returns 1/0 (%Boolean) and not a %Status, so $system.Status methods won't work. Instead of
set status = $LISTVALID(list)
you can write something like this:
set isValid = $listValid(list)
if isValid {
// main code
} else {
write "List is invalid:",!
zwrite list
}
Thank you Robert. That's exactly what I was searching for.
Check out my series of articles, Continuous Delivery of your InterSystems solution using GitLab: it covers many features related to automating these kinds of tasks. In particular, Part VII (CD using containers) talks about programmatic enabling of OS-level authentication.
Abstract class could have this method:
ClassMethod GetNew() As test.Order {
set class = "test.Order"
if ..GetCompileDate($classname())'=..GetCompileDate(class) {
set sc = $system.OBJ.Compile($classname(),"cukb /display=none")
}
quit $classmethod(class, "%New")
}
ClassMethod GetCompileDate(class As %Dictionary.CacheClassname) [CodeMode = expression]
{
$$$comClassKeyGet(class,$$$cCLASStimechanged)
}
Eduard, that's a good suggestion, but one issue I see is that if you are processing millions of documents, that's a lot of database overhead. Using a simple global reference with locks would work better as you would only have n files persisted up to the Pool Size.
The solution I outlined at all times contains exactly up to PoolSize records, so I don't think it's a very big overhead. You can lock filenames I suppose, why not?
My next challenge is to determine a better way for the file service to pick off files from the OS folder without loading all the files into a potentially massive result set which can blow out local memory and in extreme cases quickly blow out CacheTemp DB size.
Ensemble File inbound adapter uses FileSet from %File class. This query uses $zsearch to iterate over files in a directory and populates a ppg with results at once. Calling Next on that result set only moves the ppg key. You can rewrite FileSet query to advance $zsearch on a Next call. Don't know how it would affect performance though.
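A minimal sketch of such lazy iteration with $zsearch (the directory pattern is hypothetical; each subsequent $zsearch("") call advances to the next match without materializing the whole file list):

set file = $zsearch("/tmp/in/*")
while file '= "" {
    // process one file at a time here
    write file,!
    set file = $zsearch("")
}

Note that $zsearch keeps its state per process, so don't interleave other $zsearch calls inside the loop.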
I suppose you can solve this problem by separating your persistent class into abstract definition and persistent "storage only" class.
It could look like this:
Class test.Abstract.Order [ Abstract ]
{
Property a;
Method doStuff()
{
}
}
and persistent class:
Class test.Order Extends (test.Abstract.Order, %Persistent)
{
/*Generated Storage */
}
And only map the test.Abstract package. This way you'll need to:
But in this setup the persistent class can be tuned, etc. Also, you can automate the deployment steps with CI/CD tools.
I should read the docs again. Removing M option helped.
Thank you, @Nick Zhokhov.