Zn "%SYS"  ; Config.NLS.Locales lives in the %SYS namespace
Set Ref = ##class(Config.NLS.Locales).OpenCurrent(.Sc)
Write "Locale name: ", Ref.Name, !
Do Ref.GetTables(.Tables)
; Iterate over the translation (XLT) tables of the current locale
Set Key = ""
For { Set Key = $O(Tables("XLT", Key))  Quit:Key=""  Write Key, ! }

What would be the best practice for the data type assigned to the "Password" property, and for securing that password against prying eyes, both when browsing the global directly and via SQL?

Do not store passwords at all. If you need to verify passwords, hash and salt them; there are enough functions in $System.Encryption to get by. The only exception is when you need credentials to authenticate against some external system. In that case:

  1. Encrypt the login/password.
  2. Store them in a separate encrypted-at-disk database protected by a separate resource.
  3. Write an API to switch into this database and fetch the credential; ideally this API returns an authenticated object rather than the password itself.
  4. Check that it fails for an application user.
  5. Use a Privileged Routine Application to allow your app access to the separate database.
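To make the hash-and-salt advice concrete, here is a minimal sketch using two $System.Encryption functions (GenCryptRand, SHAHash). The method names HashPassword/CheckPassword are illustrative, not a real API; for production, a purpose-built KDF such as $System.Encryption.PBKDF2 is preferable to a single SHA-256 pass.

```objectscript
/// Illustrative sketch: store only salt + salted hash, never the cleartext.
ClassMethod HashPassword(password As %String, Output salt As %Binary) As %Binary
{
    // 16 bytes of cryptographically secure random salt
    Set salt = $System.Encryption.GenCryptRand(16)
    Quit $System.Encryption.SHAHash(256, salt _ password)
}

/// Recompute the salted hash and compare with the stored value
ClassMethod CheckPassword(password As %String, salt As %Binary, hash As %Binary) As %Boolean
{
    Quit $System.Encryption.SHAHash(256, salt _ password) = hash
}
```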

Also check out Managed Key Encryption; from the docs:

Data-element encryption for applications, also known simply as data-element encryption — A programmatic interface that allows applications to include code for encrypting and decrypting individual data elements (such as particular class properties) as they are stored to and retrieved from disk.


Better yet, how can I prevent a specific property from being projected to SQL?

The best approach is not to store the password at all. Still, there are several options:

  • Access management - if the user does not have access to the password column/table, they can't read it. InterSystems IRIS supports column-level security (CLS).
  • Internal - hides the property from generated documentation.
  • Private - hides the property from SELECT * results, but it can still be referenced explicitly by name.
  • Custom datatype - redefine the getter to return data only to verified callers; for example, see the Security.Datatype.Password datatype implementation, which returns *** instead of the actual password value in an ODBC context.
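The last three options can be combined in a class definition; a sketch (the class and property names are hypothetical):

```objectscript
Class demo.User Extends %Persistent
{

/// Internal hides the property from generated documentation;
/// Private excludes it from SELECT * (explicit references still work).
Property Password As %String [ Internal, Private ];

/// Alternatively, the system datatype that masks the value over ODBC:
Property MaskedPassword As Security.Datatype.Password;

}
```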

It depends. Essentially, Interoperability Productions take care of:

  • Parallelization
  • Queues / Async
  • Error management, mitigation, and recovery
  • After-error investigation (Visual Trace / Audit)
  • Unified overview of integration flows

For each integration (or part of one) you need to decide whether you need these features, and usually you do. In that case, all you need to do is develop one or more Business Hosts containing the business logic; as long as they conform to the Interoperability Production structure, you automatically get all the benefits mentioned above.

You pay for the convenience with the overhead for messages and queues.

In cases where some (or most) of these conditions are true:

  • the external system (whatever it is) is reliable - downtime is a scheduled rarity
  • the external system rarely throws errors
  • response times are stable and in the synchronous realm
  • interaction with the system is one simple flow
  • the integration is extremely high-load

you can choose to interface with the external system more directly.

Furthermore, it's not a binary Interoperability/no-Interoperability choice, but rather a scale of how much you expose as Interoperability Hosts. In your example, maybe having only a BO is enough and you can forgo the BP?


FROM users.users

Note that dots in a package name become underscores in the corresponding SQL schema name.

I encounter this need fairly often, but I don't want complete class documentation, just documentation of the Interoperability production.

As all the class info is also available via the %Dictionary package, I just query it and generate an XLSX.

Here are some queries to get started (though they usually need to be adjusted on a per-project basis). The queries should also be rewritten to use the SubclassOf procedure instead of the current name matching. I'm also not sure why I don't pass the file stream directly; that needs to be fixed too.
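As a starting point, a sketch of listing production classes via %Dictionary (this assumes the SubclassOf class query on %Dictionary.ClassDefinitionQuery; adjust the superclass name to what you actually want to enumerate):

```objectscript
// List all classes that subclass Ens.Host (i.e. Business Hosts)
Set stmt = ##class(%SQL.Statement).%New()
Set sc = stmt.%PrepareClassQuery("%Dictionary.ClassDefinitionQuery", "SubclassOf")
If $$$ISOK(sc) {
    Set rs = stmt.%Execute("Ens.Host")
    While rs.%Next() {
        Write rs.%Get("Name"), !
    }
}
```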


1. Use Visual Trace to see message processing times. You can also query this information via SQL (the Ens.MessageHeader table).

2. Add $$$TRACE events. The best time counter is $zh. These can also be queried via SQL (the Ens_Util.Log table).
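A sketch of both approaches; the method being timed is hypothetical, and the Ens.MessageHeader column list assumes the standard TimeCreated/TimeProcessed properties:

```objectscript
// Inside a Business Host method: time a step with $zh and record it via $$$TRACE
Set start = $zh
Do ..ProcessSomething()  // hypothetical work being measured
$$$TRACE("ProcessSomething took "_($zh - start)_" seconds")

// Per-message latency via SQL, run in the production's namespace:
//   SELECT ID, SourceConfigName, TargetConfigName, TimeCreated, TimeProcessed
//   FROM Ens.MessageHeader
//   ORDER BY TimeCreated DESC
```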