As long as you insert rows only with SQL (without the %NOINDEX keyword) or via objects (%Save), you don't have to rebuild indices. But there are some gotchas to remember:

  • If the table already has data before you add an index, you need to build the index after adding it, as adding an index does not build it for preexisting data (see the sketch after this list)
  • If you previously ran SQL queries that filter on the newly indexed column, they won't automatically take advantage of the index; you need to purge the cached queries associated with the newly indexed table
  • If you use %NOINDEX or direct global access to add rows, the indexes must be built manually later
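
A minimal sketch of both steps, assuming a persistent class Sample.Person that just received a new index (the class name is a placeholder):

// Build the new index for rows that existed before the index was added
// (assumes a persistent class Sample.Person; pass $lb("IndexName") to build one index)
set sc = ##class(Sample.Person).%BuildIndices()

// Purge cached queries referencing the table so new query plans can use the index
do $SYSTEM.SQL.PurgeForTable("Sample.Person")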

Here's the code to create a task to purge messages in all Interoperability namespaces:

set sc = ##class(%SYS.Namespace).ListAll(.result)
kill result("%SYS"), result("HSCUSTOM"), result("HSLIB"), result("HSSYS"), result("REPO")
set ns = ""
while 1 {
    set ns = $order(result(ns))
    quit:ns=""
    continue:$extract(ns,1,2)="^^"  ; skip implied namespaces
    set $namespace = ns
    // skip namespaces without Interoperability enabled
    continue:'##class(%Dictionary.ClassDefinition).%ExistsId("Ens.MessageHeader")
    set task = ##class(%SYS.Task).%New()
    set task.Name = "Purge old Interoperability data in " _ $namespace
    set task.NameSpace = $namespace
    set task.TimePeriod = 0
    set task.TimePeriodEvery = 1
    set task.DailyFrequency = 0
    set task.DailyFrequencyTime = ""
    set task.DailyIncrement = ""
    set task.DailyStartTime = 3600
    set task.DailyEndTime = ""
    set task.StartDate = $piece($horolog, ",", 1) + 1
    set task.Priority = 2
    set task.Expires = 0
    set taskdef = ##class(Ens.Util.Tasks.Purge).%New()
    set taskdef.BodiesToo = 1
    set taskdef.KeepIntegrity = 1
    set taskdef.NumberOfDaysToKeep = 1
    set taskdef.TypesToPurge = "all"
    set sc = task.AssignSettings(taskdef)
    set task.TaskClass = $classname(taskdef)
    set sc = task.%Save()
}

Here's how to fix that.

1. Create an inbound adapter which extends the default inbound adapter and exposes the DeleteFromServer setting:

Class Test.InboundAdapter Extends EnsLib.File.InboundAdapter
{
Parameter SETTINGS = "DeleteFromServer:Basic";
}

2. Create a passthrough service which uses your custom adapter:

Class Test.PassthroughService Extends EnsLib.File.PassthroughService
{
Parameter ADAPTER = "Test.InboundAdapter";
}

3. Use your class from step (2) when you create a new Business Service; it will have the DeleteFromServer setting.

I filed an enhancement request; please use DP-422980 as the identifier if you contact the WRC on this topic.

Now, why isn't $ZOBJREF() in the documentation?

What's the use case for this function?

Here's some (autotranslated) info about these functions.

Also, $zobjref only uses the integer part of an OREF, so you can pass just the part before the @ (whatever follows it is ignored):

set a={}
set b={}
set obj1=$zobjref(1)
set obj2=$zobjref("1@Sample.Person")
zw

Results in:

a=<OBJECT REFERENCE>[1@%Library.DynamicObject]
b=<OBJECT REFERENCE>[2@%Library.DynamicObject]
obj1=<OBJECT REFERENCE>[1@%Library.DynamicObject]
obj2=<OBJECT REFERENCE>[1@%Library.DynamicObject]

There's also no guarantee that the object will be the same one, since OREF numbers are reused after objects are destroyed:

set a={"a":1}
set b={"b":1}
set aoref = ""_ a
kill a
set c={"c":1}
set obja=$zobjref(aoref)
zw obja
> obja={"c":1}  ; <DYNAMIC OBJECT>

I recommend checking this article, but here's a summary:

1. Calculate the list of Business Hosts (BHs) which need a restart (not sure why you'd need a regexp; all BHs are in the Ens_Config.Item table):

SELECT %DLIST(Name) bhList
FROM Ens_Config.Item 
WHERE 1=1
  AND Enabled = 1
  AND Production = :production
  AND ClassName %INLIST :classList -- or some other condition
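
A minimal sketch of gluing this into ObjectScript via embedded SQL, assuming the production and classList host variables are already set:

// Fetch the list of enabled hosts as a $LIST into bhList
&sql(SELECT %DLIST(Name) INTO :bhList
     FROM Ens_Config.Item
     WHERE Enabled = 1
       AND Production = :production
       AND ClassName %INLIST :classList)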

2. Restart them all at once instead of one by one:

// First pass (stop=1) stops every host, second pass (stop=0) starts them again
for stop = 1, 0 {
  for i=1:1:$ll(bhList) {
    set host = $lg(bhList, i)
    set sc = ##class(Ens.Director).TempStopConfigItem(host, stop, 0)
  }
  // Apply all pending changes to the production at once
  set sc = ##class(Ens.Director).UpdateProduction()
}

Here I want to capture the error details in the trace/log that I can see on the production web page.

If you want to quit processing, it's enough to either quit:

 quit:$$$ISERR(sc) sc

Or raise an error (if you're several levels deep for example):

$$$TOE(sc, sc)

If you don't want to interrupt processing, use the $$$LOG macros, for example:

$$$LOGWARNING($System.Status.GetErrorText(sc))

This code creates a new Log entry of the Warning type.
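
Putting it together, here's a minimal sketch of a business process that records the error without interrupting processing (Test.Process and DoSomething are hypothetical stand-ins for your own code):

Class Test.Process Extends Ens.BusinessProcess
{

Method OnRequest(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
{
    set sc = ..DoSomething(pRequest)
    if $$$ISERR(sc) {
        // Appears as a Warning entry on the production's Log tab
        $$$LOGWARNING($System.Status.GetErrorText(sc))
    }
    quit $$$OK
}

/// Hypothetical helper returning a %Status
Method DoSomething(pRequest As Ens.Request) As %Status
{
    quit $$$OK
}

}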

Project the list of geo.model.Point as a separate table:

Class geo.model.Line Extends %Persistent
{
Property points As list Of geo.model.Point(SQLPROJECTION = "table/column");
}

And you can use an SQL query (via iris.sql) to get all the points in a line:

SELECT
    points_latitude, 
    points_longitude
FROM geo_model.Line_points
WHERE Line = ?
ORDER BY element_key

If you have thousands of points, that would likely be the fastest way to transfer them (barring callin/callout shenanigans).
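
If you're querying from ObjectScript rather than Python, here's a minimal equivalent sketch (the line ID 1 is just an example value):

// Run the same parameterized query via dynamic SQL
set stmt = ##class(%SQL.Statement).%New()
set sc = stmt.%Prepare("SELECT points_latitude, points_longitude FROM geo_model.Line_points WHERE Line = ? ORDER BY element_key")
set rs = stmt.%Execute(1)
while rs.%Next() {
    write rs.%Get("points_latitude"), ", ", rs.%Get("points_longitude"), !
}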

I guess you need to flush the buffer so that only Python writes? Something like this should work:

Class Python.App.Dispatch Extends %CSP.REST
{

XData UrlMap [ XMLNamespace = "https://www.intersystems.com/urlmap" ]
{
<Routes>
    <Route Url="/test" Method="GET" Call="Wrapper" />
</Routes>
}

ClassMethod Wrapper() As %Status
{
	write *-3  ; flush the device buffer before Python takes over the output
	do ..Hello()
	quit $$$OK
}

ClassMethod Hello() [ Language = python ]
{
    import iris

    print('Hello World')
}

}

Calling @Bob Kuszewski

I usually follow these steps when I have two similar but distinct codebases:

  1. Create a new repo.
  2. Export everything from the LIVE server into the repo. Commit.
  3. Export everything from the TEST server into the repo. Commit.

The commit from step (3) will contain all the differences between LIVE and TEST. I assume the code on TEST is newer, so that should be the later commit, but if you want to, you can swap the export order.

Before making commit (3) you might want to remove trivial differences such as whitespace. Also, GitLab has a compare mode for commits which automatically ignores whitespace differences.
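
For steps (2) and (3), here's a minimal export sketch, assuming a /repo/src target folder (adjust the path and the class filter to taste):

// Export every non-% class in the current namespace as an individual UDL file
set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT Name FROM %Dictionary.ClassDefinition WHERE NOT Name %STARTSWITH '%'")
while rs.%Next() {
    set class = rs.%Get("Name")
    set sc = $SYSTEM.OBJ.ExportUDL(class _ ".cls", "/repo/src/" _ class _ ".cls")
}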

While testing, I see I can easily set %session.Data to hold data I want to preserve.  

No problem! I thought you were having issues with that part.

How, on my next API call, can I use that session?

You just need to supply the CSPSESSIONID and CSPWSERVERID cookies. With those you'll have the same session. In browsers (and I think in Postman) that's automatic, so you don't have to do anything; it should work out of the box as long as you have UseSession set to 1.
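
For reference, a minimal sketch of enabling sessions on a dispatch class (the class name is a placeholder):

Class Test.API Extends %CSP.REST
{

/// Keep the CSP session (and %session.Data) alive across REST calls
Parameter UseSession As Integer = 1;

}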

The Pythonic way is to use with. In that case, closing the file is automatic as soon as we leave the context:

ClassMethod ReadFileUsingPython(pFile As %String) [ Language = python ]
{
  from datetime import datetime
  import iris
  time1 = datetime.timestamp(datetime.now())
  print(time1)
  if pFile=="":
    raise Exception("filename is required.")

  with open(pFile,"r", encoding="utf-8", errors="ignore") as file:
    log = iris.cls('otw.log.Log')
    for line in file:
      status = log.ImportLine(line)

  time2 = datetime.timestamp(datetime.now())
  print(time2)
  print("Execution time: ",(time2-time1))
}