Absolutely not.

During SYNC or REBUILD events, Data Quality Manager cubes only read patient index linkage definition data.

There's no way for cubes to modify patient index linkage definition data.

That said, the reads, cube data writes, and additional CPU load during SYNC or REBUILD events might impact the running system, since resources are limited. You might want to schedule SYNCs, and especially REBUILDs, for low-load times (nights or weekends, depending on your usage patterns).
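One way to schedule them is a custom Task Manager task; here's a minimal sketch (the class and cube names are illustrative) using the standard %DeepSee.Utils entry points:

/// Illustrative off-hours cube maintenance task.
/// Schedule it in the Task Manager for nights or weekends.
Class Sample.Task.CubeMaintenance Extends %SYS.Task.Definition
{

/// Name of the cube to maintain (placeholder value)
Property CubeName As %String [ InitialExpression = "PatientCube" ];

/// 1 = full REBUILD, 0 = SYNC
Property FullRebuild As %Boolean [ InitialExpression = 0 ];

Method OnTask() As %Status
{
	quit:..FullRebuild ##class(%DeepSee.Utils).%BuildCube(..CubeName)
	quit ##class(%DeepSee.Utils).%SynchronizeCube(..CubeName)
}

}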

Using an async reporting mirror for the cubes would remove the impact on the patient index entirely.

Replace hard delete with soft delete.

You soft delete by adding a new property, usually a DeletedOn timestamp. If it's empty, the record is not deleted.

Deletion then consists of setting the DeletedOn property to the current timestamp.

As an additional precaution, you can add a BEFORE DELETE trigger which always errors out, forbidding hard deletions. It would save you from every delete except a direct global kill.
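Put together, a minimal sketch (class and property names are illustrative):

Class Sample.Person Extends %Persistent
{

Property Name As %String;

/// Empty means the record is live, a value means it was soft deleted
Property DeletedOn As %TimeStamp;

/// Soft delete: stamp the record instead of removing it
Method SoftDelete() As %Status
{
	set ..DeletedOn = $zdatetime($ztimestamp, 3, 1, 3) // UTC timestamp
	quit ..%Save()
}

/// Forbid hard deletes from both SQL and object access
Trigger ForbidDelete [ Event = DELETE, Foreach = row/object ]
{
	set %ok = 0
	set %msg = "Hard deletes are forbidden, use SoftDelete() instead"
}

}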

Additionally, you can add versioning; check out this discussion.

Create a unified DELETE trigger (Foreach = row/object). It catches both SQL and object access.
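For example, here's a sketch of such a trigger, added to a persistent class like Sample.Person above, that records every deletion in an audit global (the class and global names are illustrative):

Trigger LogDelete [ Event = DELETE, Foreach = row/object, Time = BEFORE ]
{
	// {ID} is available to row/object triggers on both access paths
	set ^Sample.AuditD("Sample.Person", {ID}) = $zdatetime($ztimestamp, 3, 1, 3)
}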

That said, I usually advise against hard deletes; in virtually all cases the soft delete approach described above is better.

Ended up with this implementation:


Parameter NOSECTION = "DEFAULT";

/// do ##class().INIToLocal(,.ini)
ClassMethod INIToLocal(file, Output ini) As %Status
{
	#dim sc As %Status = $$$OK
	kill ini
	set stream = ##class(%Stream.FileCharacter).%New()
	set sc = stream.LinkToFile(file)
	quit:$$$ISERR(sc) sc
	set section = ..#NOSECTION
	while 'stream.AtEnd {
		set line = stream.ReadLine()
		set line = $zstrip(line, "<>w")
		// skip comments and lines too short to hold key=value or [s]
		continue:($e(line)="#")||($l(line)<3)
		if $e(line)="[" {
			// [Section] header: everything between the brackets
			set section = $e(line, 2, *-1)
		} else {
			// key=value pair in the current section
			set key = $zstrip($p(line, "="), "<>w")
			set value = $zstrip($p(line, "=", 2, *), "<>w")
			set ini(section, key) = value
		}
	}
	
	kill stream
	quit sc
}

/// do ##class().LocalToINI(.ini)
ClassMethod LocalToINI(ByRef ini, file) As %Status
{
	merge iniTemp = ini
	#dim sc As %Status = $$$OK
	set stream = ##class(%Stream.FileCharacter).%New()
	set sc = stream.LinkToFile(file)
	quit:$$$ISERR(sc) sc

	// write the sectionless (default) keys first, without a header
	set section = ..#NOSECTION
	set key = $o(iniTemp(section, ""), 1, value)
	while (key '= "") {
		do stream.WriteLine(key _ "=" _ value)
		set key = $o(iniTemp(section, key), 1, value)
	}
	do stream.WriteLine()
	kill iniTemp(section)

	// write each remaining section under its [Section] header
	set section = $o(iniTemp(""))
	while (section '= "") {
		do stream.WriteLine("[" _ section _ "]")

		set key = $o(iniTemp(section, ""), 1, value)
		while (key '= "") {
			do stream.WriteLine(key _ "=" _ value)
			set key = $o(iniTemp(section, key), 1, value)
		}

		set section = $o(iniTemp(section))
		do stream.WriteLine()
	}

	set sc = stream.%Save()
	kill stream, iniTemp
	quit sc
}
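For example, given this INI file (contents are illustrative):

host=localhost
[Server]
port=57772

INIToLocal builds ini("DEFAULT", "host") = "localhost" and ini("Server", "port") = "57772". Round-trip usage (the Util.INI class name and file path are placeholders):

do ##class(Util.INI).INIToLocal("/tmp/app.ini", .ini)
set ini("Server", "port") = 57773
do ##class(Util.INI).LocalToINI(.ini, "/tmp/app.ini")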

DTLs work in-proc rather than through queues, so you can avoid creating new messages altogether.

To be specific, let's say you have this DTL:

<transform sourceClass='Ens.StringRequest' targetClass='Ens.StringResponse' create='new' language='objectscript' >
<assign value='source.StringValue' property='target.StringValue' action='set' />
</transform>

It would be compiled into this (simplified for clarity):

Transform(source,target,aux="") {
	Set (tSC,tSCTrans,tSCGet)=1, tIgnore=0
	Set target = ##class(Ens.StringResponse).%New()
	Do:$S($D(%Ensemble("DoTrace")):%Ensemble("DoTrace"),1:##class(Ens.Util.Trace).DoTrace()) ##class(Ens.Util.Trace).WriteTrace("xform",$classname(),"Transform","transform from source "_source_$S(source.%Extends("%Persistent"):"/"_source.%Id(),1:"")_" to target "_target_$S(target.%Extends("%Persistent"):"/"_target.%Id(),1:"")_"")
	Try { Set zVALz=source.StringValue, zVALz=$S($IsObject(zVALz):zVALz.%ConstructClone(), 1:zVALz) }
	Catch ex { If (..#IGNOREMISSINGSOURCE&&($$GetOneStatusText^%apiOBJ(ex.AsStatus())["<INVALID OREF>")) { Set tIgnore=1 } Else { Set tSC=ex.AsStatus() } }
	If 'tIgnore { Set target.StringValue=zVALz }
}

As you can see, the only new message is the response; the request and aux objects are not saved anywhere.

The same holds true for BPL invocations. Assuming this process:

<process language='objectscript' request='Ens.StringRequest' response='Ens.StringResponse' height='2000' width='2000' >
<sequence xend='200' yend='350' >
<transform name='dtl' class='dtl.dtl' source='request' target='response' xpos='200' ypos='250' />
</sequence>
</process>

You'll get this generated state method:

Method S1(process As Ens.BusinessProcess, context As Ens.BP.Context, synctimedout As %Boolean, syncresponses As %ArrayOfObjects(ELEMENTTYPE="%Library.Persistent"), request As %Library.Persistent, response As %Library.Persistent) As %Status [ Language = objectscript, PublicList = (process, context) ]
{
 Set $ZT="Trap",status=$$$OK do {
 Set iscTemp=$G(response)
 Set status=$classmethod("dtl.dtl","Transform",request,.iscTemp,"")
 If $$$ISERR(status) Quit
 Set response=iscTemp
 Do process.ClearAllPendingResponses()
 Set ..%NextState="Stop"
 } while (0)
Exit Quit ..ManageState(status)
Trap Set $ZT="",status=..ManageStatus(status,"S1") Goto Exit
}

which does nothing except call the Transform method. No queuing is used anywhere in the DTL path.

So, there are several options.

1. Do not create a new message; pass your existing message instead. DTLs do not modify source or aux messages.

2. Use a registered rather than a persistent class to pass values; in that case it won't be saved at all, as sketched below.
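For the second option, a minimal sketch (class names are illustrative, and it assumes a DTL declared over these registered classes, analogous to dtl.dtl above):

Class Sample.Msg.String Extends %RegisteredObject
{

Property StringValue As %String;

}

Calling the generated Transform directly stays in-proc, and since the classes are not persistent, nothing can be saved:

set source = ##class(Sample.Msg.String).%New()
set source.StringValue = "abc"
set sc = ##class(Sample.DTL.String).Transform(source, .target)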