Check this article series.
Is there a solution for the latest version?
Spent like half an hour on this puzzle with no results.
What particular question do you have? Please consider providing more information.
You get a call response from a BO, copy it (or copy its id, as you'd want to leave the stream immutable) and create a new call request to another BO.
Alternatively use ResponseClass to save results into persistent objects.
Completely agreed.
Yes. Cube builds don't modify source data.
Cubes copy all source data into an entirely separate set of tables.
Absolutely not.
During SYNC or REBUILD events Data Quality Manager cubes would only read patient index linkage definition data.
There's no way for cubes to modify patient index linkage definition data.
That said, reads, cube data writes, and the additional CPU load during SYNC or REBUILD events might impact the running system, since resources are limited. You might want to schedule SYNCs, and especially REBUILDs, for low-load times (at night or on weekends, depending on your usage patterns).
Using an async reporting mirror for cubes would completely remove the impact on the patient index.
I mean, try executing something else using $zf(-1) to check whether $zf(-1) fails entirely or only for this specific command.
Do you have xargs and zip installed?
Do other $ZF(-1) commands work?
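A quick way to narrow it down (a sketch; the commands and paths are illustrative, adjust for your OS):

```objectscript
// $zf(-1) returns the exit status of the spawned command (0 usually means success)
set rc = $zf(-1, "echo test")
write "echo: ", rc, !
// verify the binaries the failing command needs are on the PATH of the IRIS service user
set rc = $zf(-1, "which xargs zip > /tmp/zfcheck.txt 2>&1")
write "which: ", rc, !
```

If even the trivial echo fails, the problem is with spawning OS processes in general (for example, the user lacks the %System_CallOut:USE privilege), not with your specific command.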
You soft delete by creating a new property, usually a DeletedOn timestamp. If it's empty, the record is not deleted.
Deletion then consists of setting the DeletedOn property to the deletion timestamp.
As an additional precaution you can add a BEFORE DELETE trigger which always errors out, forbidding hard deletions. It would save you from every delete except a global kill.
Additionally, you can add versioning, check out this discussion.
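A minimal sketch of this pattern (class and property names are illustrative):

```objectscript
Class User.Person Extends %Persistent
{

Property Name As %String;

/// Soft delete marker: empty means the record is active
Property DeletedOn As %TimeStamp;

/// Forbid hard deletes for both SQL and object access
Trigger ForbidDelete [ Event = DELETE, Foreach = row/object, Time = BEFORE ]
{
    // setting %ok = 0 aborts the DELETE with the error text from %msg
    set %ok = 0
    set %msg = "Hard deletes are forbidden; set DeletedOn instead"
}

}
```

A soft delete is then just an update, e.g. UPDATE SQLUser.Person SET DeletedOn = CURRENT_TIMESTAMP WHERE ID = ?, and active records are those WHERE DeletedOn IS NULL.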
Create a unified DELETE trigger (for each row/object). It would catch both SQL and object access.
That said I usually advise against hard delete. In virtually all cases soft delete is better.
There seem to be different ways to approach declared IRIS state by codifying things: you can codify the exported objects and import them or, like you mentioned, use the installer method that builds things as code....
Indeed, there is a declarative approach and an imperative code approach. And there are several declarative ways to populate the data. %Installer is excellent for the initial population, but the user must remember to check for the existence of the items they're trying to create. That adds challenges for CI/CD in non-containerized environments.
Ended up with this implementation:
Parameter NOSECTION = "DEFAULT";

/// do ##class().INIToLocal(,.ini)
ClassMethod INIToLocal(file, Output ini) As %Status
{
    #dim sc As %Status = $$$OK
    kill ini
    set stream = ##class(%Stream.FileCharacter).%New()
    do stream.LinkToFile(file)
    set section = ..#NOSECTION
    while 'stream.AtEnd {
        set line = stream.ReadLine()
        set line = $zstrip(line, "<>w")
        // skip comments and lines too short to hold key=value
        continue:($e(line)="#")||($l(line)<3)
        if $e(line)="[" {
            set section = $e(line, 2, *-1)
        } else {
            set key = $zstrip($p(line, "="), "<>w")
            set value = $zstrip($p(line, "=", 2, *), "<>w")
            set ini(section, key) = value
        }
    }
    kill stream
    quit sc
}
/// do ##class().LocalToINI(.ini)
ClassMethod LocalToINI(ByRef ini, file) As %Status
{
    merge iniTemp = ini
    #dim sc As %Status = $$$OK
    set stream = ##class(%Stream.FileCharacter).%New()
    do stream.LinkToFile(file)
    // write the default (sectionless) keys first
    set section = ..#NOSECTION
    set key = $o(iniTemp(section, ""), 1, value)
    while (key '= "") {
        do stream.WriteLine(key _ "=" _ value)
        set key = $o(iniTemp(section, key), 1, value)
    }
    do stream.WriteLine()
    kill iniTemp(section)
    // then write each named [section]
    set section = $o(iniTemp(""))
    while (section '= "") {
        do stream.WriteLine("[" _ section _ "]")
        set key = $o(iniTemp(section, ""), 1, value)
        while (key '= "") {
            do stream.WriteLine(key _ "=" _ value)
            set key = $o(iniTemp(section, key), 1, value)
        }
        set section = $o(iniTemp(section))
        do stream.WriteLine()
    }
    set sc = stream.%Save()
    kill stream, iniTemp
    quit sc
}

Open the /restapi/sql/ web application and confirm that Password authentication is enabled. It might be that only Unauthenticated access is enabled, resulting in this error.
I think you need to init msgstream.
In the same directory you have your custom CSP.ini.
While local access is not possible from the Native API, you can work with production items directly:

prod = iris_native.classMethodObject("Ens.Config.Production", "%OpenId", "ProductionClassName")
items = prod.getObject("Items")
count = items.invokeInteger("Count")
for i in range(1, count + 1):
    item = items.invokeObject("GetAt", i)
    # do something with item here

Alternatively, you can either use SQL access (check the Ens_Config.Item table) or write a proxy method in ObjectScript to marshal locals (probably to dicts).
Calling @Stefan Wittmann
Thank you, @Julius Kavay, got it.
Thanks for noticing this. Fixed the tests.
What about defining a super parent, which is the only node with a NULL parent?
In that case every other root node has this node as a parent, and filtering is easy (WHERE parent IS NULL matches only the super parent). And there's no need for an additional calculated property in that case.
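Sketched in SQL (table and column names are illustrative):

```sql
-- the synthetic super parent is the only row with a NULL parent
SELECT ID FROM Node WHERE Parent IS NULL

-- the real root nodes are simply the children of the super parent
SELECT ID, Name FROM Node WHERE Parent = :superParentId
```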
Great article, Sergei!
If it's a non-production instance you can try running the IRIS Windows service (IRIS controller for IRISHEALTH in your case) under your Windows user account instead of the default system one. That usually helps.
If that's not possible, either give the system account rights to irissession, or create a new user with rights to irissession and run the IRIS service under that account.
Each app should use its own user with a minimal set of roles and permissions.
DTLs work in-proc rather than through queues, so you can avoid the creation of new messages altogether.
To be specific let's say you have a DTL:
<transform sourceClass='Ens.StringRequest' targetClass='Ens.StringResponse' create='new' language='objectscript' >
<assign value='source.StringValue' property='target.StringValue' action='set' />
</transform>

It would be compiled into this (simplified for clarity):
Transform(source,target,aux="") {
Set (tSC,tSCTrans,tSCGet)=1
Set target = ##class(Ens.StringResponse).%New()
Do:$S($D(%Ensemble("DoTrace")):%Ensemble("DoTrace"),1:##class(Ens.Util.Trace).DoTrace()) ##class(Ens.Util.Trace).WriteTrace("xform",$classname(),"Transform","transform from source "_source_$S(source.%Extends("%Persistent"):"/"_source.%Id(),1:"")_" to target "_target_$S(target.%Extends("%Persistent"):"/"_target.%Id(),1:"")_"")
Try { Set zVALz=source.StringValue, zVALz=$S($IsObject(zVALz):zVALz.%ConstructClone(), 1:zVALz) }
Catch ex { If (..#IGNOREMISSINGSOURCE&&($$GetOneStatusText^%apiOBJ(ex.AsStatus())["<INVALID OREF>")) { Set tIgnore=1 } Else { Set tSC=ex.AsStatus() } }
If 'tIgnore { Set target.StringValue=zVALz }
}
As you can see, the only new message is the response; the request and aux objects are not saved anywhere.
The same holds true for BPL invocations. Assuming this process:
<process language='objectscript' request='Ens.StringRequest' response='Ens.StringResponse' height='2000' width='2000' >
<sequence xend='200' yend='350' >
<transform name='dtl' class='dtl.dtl' source='request' target='response' xpos='200' ypos='250' />
</sequence>
</process>

You'll have this S method:
Method S1(process As Ens.BusinessProcess, context As Ens.BP.Context, synctimedout As %Boolean, syncresponses As %ArrayOfObjects(ELEMENTTYPE="%Library.Persistent"), request As %Library.Persistent, response As %Library.Persistent) As %Status [ Language = objectscript, PublicList = (process, context) ]
{
    Set $ZT="Trap",status=$$$OK do {
        Set iscTemp=$G(response)
        Set status=$classmethod("dtl.dtl","Transform",request,.iscTemp,"")
        If $$$ISERR(status) Quit
        Set response=iscTemp
        Do process.ClearAllPendingResponses()
        Set ..%NextState="Stop"
    } while (0)
Exit Quit ..ManageState(status)
Trap Set $ZT="",status=..ManageStatus(status,"S1") Goto Exit
}

This method does nothing except call the Transform method. No queueing is used anywhere in DTL execution.
So, there are several options.
1. Do not create a new message, but rather pass your existing message. DTLs do not modify source or aux messages.
2. Use a registered, rather than a persistent, class to pass values; in that case it won't be saved at all.
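A sketch of option 2 (the class name is illustrative): a registered class has no storage definition, so instances live only in memory and nothing is written to the database.

```objectscript
/// Not %Persistent: instances exist only in memory
Class User.InProcValue Extends %RegisteredObject
{

Property StringValue As %String;

}
```

Note this only applies to in-proc invocations such as DTL transforms; messages sent between business hosts through queues must be persistent.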