go to post Eduard Lebedyuk · Aug 2, 2022 Yes? Cubes need space to store their copy of the facts, and CPU is needed to build it all. As I said above, using an async reporting mirror for cubes would completely remove the impact on the patient index.
go to post Eduard Lebedyuk · Jul 25, 2022 Yeah, the process had 64.5 GB of virtual memory allocated and 40 GB actually used. Not surprised OOM killed that.
go to post Eduard Lebedyuk · Jul 25, 2022 It's a Windows box issue, not an InterSystems IRIS issue. I've seen a few; the easiest solution is to delete the localized DLLs, which makes all text English. Not perfect, but better than the gibberish.
go to post Eduard Lebedyuk · Jul 25, 2022 In your Business Service settings, set ClassName to pacIPM.reqICNARC and ElementName to ADMISSION. I think you set only ClassName and did not set ElementName. In that case interoperability tries to match the root object (ICNARC in your case) to pacIPM.reqICNARC, and it fails because the pacIPM.reqICNARC class does not have an ADMISSION property.
go to post Eduard Lebedyuk · Jul 20, 2022 What particular question do you have? Please consider providing more information. You get a callresponse from a BO, copy it (or copy the id, since you'd leave the stream immutable), and create a new callrequest to another BO. Alternatively, use ResponseClass to save results into persistent objects.
go to post Eduard Lebedyuk · Jul 18, 2022 SAM gives you Grafana/Prometheus out of the box, already connected to IRIS; that would be the easiest solution to deploy.
go to post Eduard Lebedyuk · Jul 16, 2022 Yes. Cube builds don't modify source data. Cubes copy all source data into an entirely separate set of tables.
go to post Eduard Lebedyuk · Jul 15, 2022 Absolutely not. During SYNC or REBUILD events, Data Quality Manager cubes would only read patient index linkage definition data. There's no way for cubes to modify patient index linkage definition data. That said, reads, cube data writes, and additional CPU load during SYNC or REBUILD events might impact the running system, since there's a limited amount of resources available. You might want to schedule SYNCs, and especially REBUILDs, for low-load times (at night or on weekends, depending on your usage patterns). Using an async reporting mirror for cubes would completely remove the impact on the patient index.
go to post Eduard Lebedyuk · Jul 14, 2022 Replace hard delete with soft delete. You soft delete by creating a new property, usually a DeletedOn timestamp. If it's empty, then the record is not deleted. Deletion now consists of setting the DeletedOn property to a soft-deletion timestamp. As an additional precaution, you can add a BEFORE DELETE trigger which always errors out, forbidding hard deletions. It would save you from every delete except a global kill. Additionally, you can add versioning; check out this discussion.
go to post Eduard Lebedyuk · Jul 14, 2022 Create a unified DELETE trigger (foreach row/object). It would catch both SQL and Object access. That said, I usually advise against hard deletes. In virtually all cases soft delete is better. You soft delete by creating a new property, usually a DeletedOn timestamp. If it's empty, then the record is not deleted. Deletion now consists of setting the DeletedOn property to a soft-deletion timestamp. As an additional precaution, you can add a BEFORE DELETE trigger which always errors out, forbidding hard deletions. It would save you from every delete except a global kill.
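A minimal sketch of the soft-delete pattern described above (the class name App.Record and the Name property are illustrative, not from the original posts):

```objectscript
Class App.Record Extends %Persistent
{

Property Name As %String;

/// Empty means the record is live; a value means it was soft-deleted
Property DeletedOn As %TimeStamp;

/// Soft delete: just stamp the record instead of removing it
Method SoftDelete() As %Status
{
    set ..DeletedOn = $zdatetime($ztimestamp, 3, 1, 3)
    quit ..%Save()
}

/// Safety net: a unified trigger that forbids hard deletes
/// via both SQL DELETE and object %DeleteId()
Trigger ForbidDelete [ Event = DELETE, Foreach = row/object ]
{
    set %ok = 0
    set %msg = "Hard deletes are forbidden; use SoftDelete() instead"
}

}
```

Queries then filter live rows with WHERE DeletedOn IS NULL. As noted above, this guards every delete path except a direct global kill.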
go to post Eduard Lebedyuk · Jul 12, 2022 Ended up with this implementation:

Parameter NOSECTION = "DEFAULT";

/// do ##class().INIToLocal(,.ini)
ClassMethod INIToLocal(file, Output ini) As %Status
{
    #dim sc As %Status = $$$OK
    kill ini
    set stream = ##class(%Stream.FileCharacter).%New()
    do stream.LinkToFile(file)
    set section = ..#NOSECTION
    while 'stream.AtEnd {
        set line = stream.ReadLine()
        set line = $zstrip(line, "<>w")
        continue:($e(line)="#")||($l(line)<3)
        if $e(line)="[" {
            set section = $e(line, 2, *-1)
        } else {
            set key = $zstrip($p(line, "="), "<>w")
            set value = $zstrip($p(line, "=", 2, *), "<>w")
            set ini(section, key) = value
        }
    }
    kill stream
    quit sc
}

/// do ##class().LocalToINI(.ini)
ClassMethod LocalToINI(ByRef ini, file) As %Status
{
    merge iniTemp = ini
    #dim sc As %Status = $$$OK
    set stream = ##class(%Stream.FileCharacter).%New()
    do stream.LinkToFile(file)
    set section = ..#NOSECTION
    // write keys from the default (no-section) block first
    set key = $o(iniTemp(section, ""), 1, value)
    while (key '= "") {
        do stream.WriteLine(key _ "=" _ value)
        set key = $o(iniTemp(section, key), 1, value)
    }
    do stream.WriteLine()
    kill iniTemp(section)
    // then write each named section
    set section = $o(iniTemp(""))
    while (section '= "") {
        do stream.WriteLine("[" _ section _ "]")
        set key = $o(iniTemp(section, ""), 1, value)
        while (key '= "") {
            do stream.WriteLine(key _ "=" _ value)
            set key = $o(iniTemp(section, key), 1, value)
        }
        set section = $o(iniTemp(section))
        do stream.WriteLine()
    }
    set sc = stream.%Save()
    kill stream, iniTemp
    quit sc
}
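For reference, a hypothetical round trip with these methods (the class name Utils.INI and the file paths are assumptions for illustration):

```objectscript
// Read an INI file into a local array, change a value, write it back
set sc = ##class(Utils.INI).INIToLocal("/tmp/settings.ini", .ini)
if $$$ISERR(sc) { do $system.Status.DisplayError(sc) quit }
// entries land as ini(section, key) = value
zwrite ini
set ini("Server", "Port") = 52773
set sc = ##class(Utils.INI).LocalToINI(.ini, "/tmp/settings-new.ini")
```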
go to post Eduard Lebedyuk · Jul 10, 2022 Open the /restapi/sql/ web application and confirm that Password authentication is enabled. It might be that only Unauthenticated access is enabled, resulting in this error.
go to post Eduard Lebedyuk · Jun 17, 2022 Use Merge in your Business Service:

Set genericRequest = ##class(Core.API.V1.Msg.GenericRequest).%New()
Merge genericRequest.urlParameters = %request.Data

And in the Business Operation you can use this syntax: genericRequest.urlParameters("key")
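If you need to walk all parameters in the Business Operation rather than read one key, the merged subscripts can be iterated with $ORDER. A small sketch, assuming urlParameters is a multidimensional property and noting that %request.Data stores values as Data(name, index), so repeated parameters sit under a second subscript level:

```objectscript
// Walk every URL parameter merged into the request
set key = ""
for {
    set key = $order(genericRequest.urlParameters(key))
    quit:key=""
    write key, " = ", $get(genericRequest.urlParameters(key, 1)), !
}
```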
go to post Eduard Lebedyuk · Jun 13, 2022 DTLs work in-proc rather than in-queue, so you can avoid creating new messages altogether. To be specific, let's say you have this DTL:

<transform sourceClass='Ens.StringRequest' targetClass='Ens.StringResponse' create='new' language='objectscript' >
  <assign value='source.StringValue' property='target.StringValue' action='set' />
</transform>

It would be compiled into this (simplified for clarity):

Transform(source, target, aux="")
{
    Set (tSC,tSCTrans,tSCGet) = 1
    Set target = ##class(Ens.StringResponse).%New()
    Do:$S($D(%Ensemble("DoTrace")):%Ensemble("DoTrace"),1:##class(Ens.Util.Trace).DoTrace()) ##class(Ens.Util.Trace).WriteTrace("xform",$classname(),"Transform","transform from source "_source_$S(source.%Extends("%Persistent"):"/"_source.%Id(),1:"")_" to target "_target_$S(target.%Extends("%Persistent"):"/"_target.%Id(),1:"")_"")
    Try {
        Set zVALz = source.StringValue, zVALz = $S($IsObject(zVALz):zVALz.%ConstructClone(), 1:zVALz)
    } Catch ex {
        If (..#IGNOREMISSINGSOURCE && ($$GetOneStatusText^%apiOBJ(ex.AsStatus())["<INVALID OREF>")) {
            Set tIgnore = 1
        } Else {
            Set tSC = ex.AsStatus()
        }
    }
    If 'tIgnore {
        Set target.StringValue = zVALz
    }
}

As you can see, the only new message is the response; request/aux are not saved anywhere. The same holds true for BPL invocations.
Assuming this process:

<process language='objectscript' request='Ens.StringRequest' response='Ens.StringResponse' height='2000' width='2000' >
  <sequence xend='200' yend='350' >
    <transform name='dtl' class='dtl.dtl' source='request' target='response' xpos='200' ypos='250' />
  </sequence>
</process>

You'll have this S method:

Method S1(process As Ens.BusinessProcess, context As Ens.BP.Context, synctimedout As %Boolean, syncresponses As %ArrayOfObjects(ELEMENTTYPE="%Library.Persistent"), request As %Library.Persistent, response As %Library.Persistent) As %Status [ Language = objectscript, PublicList = (process, context) ]
{
    Set $ZT="Trap", status=$$$OK
    do {
        Set iscTemp = $G(response)
        Set status = $classmethod("dtl.dtl", "Transform", request, .iscTemp, "")
        If $$$ISERR(status) Quit
        Set response = iscTemp
        Do process.ClearAllPendingResponses()
        Set ..%NextState = "Stop"
    } while (0)
Exit
    Quit ..ManageState(status)
Trap
    Set $ZT="", status=..ManageStatus(status,"S1")
    Goto Exit
}

which does nothing except call a Transform method. No queueing is used throughout DTL usage. So, there are several options:
1. Do not create a new message, but rather pass your existing message. DTLs do not modify source or aux messages.
2. Use a registered, rather than a persistent, class to pass values; in that case it won't be saved at all.
go to post Eduard Lebedyuk · Jun 3, 2022 It picks up a message, maybe (I'm not sure what you mean in regard to jobs)? Check Reply Code Actions.
go to post Eduard Lebedyuk · May 25, 2022 Windows error codes are listed here. ERROR_SHARING_VIOLATION 32 (0x20): The process cannot access the file because it is being used by another process.
go to post Eduard Lebedyuk · May 6, 2022 Try:

Set production = ##class(Ens.Config.Production).%OpenId(productionId)
Set item = ##class(Ens.Config.Item).%OpenId(itemId)
Do production.RemoveItem(item)
Set sc = production.%Save()