I see that the ID is not in the export.  However, the import does figure out how to match an imported task to an existing item, since it does not allow you to overwrite existing records.  Otherwise we would see duplicate records under new IDs.

As far as the methods to use, I have to disagree with this recommendation.  A system administrator should not have to write code to maintain the systems under their care.  Documentation is important, but that can be accomplished with an actual document.  Then, failing an export/import or enterprise management function, those changes would be made manually on all affected systems.  Writing code is less clear to everyone and is no less work.

Let me clarify: this has to do with the ExportTasks and ImportTasks methods of the %SYS.Task class.  I need to know which qspec options have an impact on these processes.

As to question 2: the process is that they are setting up a new backup server and want to replicate what they have set up on the current server.  Exporting and importing what is currently present is the best way.  If they are going to write a program, they could just as well compare each existing task and make the changes manually.  There is also an ongoing maintenance side to this, which would likewise be better handled with an export and import.

So, back to the original question: is there any way to tell ImportTasks to override the tasks that already exist?
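For reference, the calls in question look roughly like this.  The class reference for these methods is sparse, so treat this as a sketch: the argument lists may differ on your version, and the qspec string is exactly the part I am asking about.

```objectscript
 // Export task definitions on the source system (sketch; verify the
 // signature in the %SYS.Task class reference for your version).
 set sc = ##class(%SYS.Task).ExportTasks("c:\temp\tasks.xml")
 if $System.Status.IsError(sc) do $System.Status.DisplayError(sc)

 // Import on the target system.  The open question is whether any
 // qspec flag makes this overwrite tasks that already exist.
 set sc = ##class(%SYS.Task).ImportTasks("c:\temp\tasks.xml")
 if $System.Status.IsError(sc) do $System.Status.DisplayError(sc)
```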

Eduard,

OK, I had not noticed that, but you are correct.  I had tried other methods first, as I noted before, and ran into issues loading onto the new systems.  I had obviously skipped the step of verifying the export file when I tried this method.  So I gather that you HAVE to pass in a list of IDs to export; leaving it blank does not export all.  As I mentioned, the documentation is extremely sparse on this API.  I will test this again later.

OK, here is the procedure that worked for me:

Export:

  • merge ^TaskList = ^SYS("Task","TaskD")
  • set sc = $System.OBJ.Export("TaskList.GBL","c:\temp\TaskList.gbl")
    • Note: the .GBL extension is not part of the global name.  It indicates that we want to export a global.
    • The file destination is completely up to you.

Import

  • set sc = $System.OBJ.Load("c:\temp\TaskList.gbl",,.log)
  • merge ^SYS("Task","TaskD") = ^TaskList
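Putting the two halves together with basic status checking (the file path is just an example, and the cleanup at the end is my own addition):

```objectscript
 // --- On the source system: copy the task storage global and export it ---
 merge ^TaskList = ^SYS("Task","TaskD")
 // ".GBL" tells Export this name is a global, not a class or routine
 set sc = $System.OBJ.Export("TaskList.GBL","c:\temp\TaskList.gbl")
 if $System.Status.IsError(sc) do $System.Status.DisplayError(sc)

 // --- On the target system: load the file and merge into task storage ---
 set sc = $System.OBJ.Load("c:\temp\TaskList.gbl",,.log)
 if $System.Status.IsError(sc) do $System.Status.DisplayError(sc)
 merge ^SYS("Task","TaskD") = ^TaskList
 kill ^TaskList  // optional: remove the scratch global when done
```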

First, you can access Ensemble Credentials using the Ens.Config.Credentials class.  To be clear, these are NOT user definitions from the Security module.  They are defined via the Management Portal options under Ensemble -> Configure -> Credentials.
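A minimal sketch of reading a stored credential programmatically — the ID "MyCredentials" is hypothetical, and you should verify the property names against the Ens.Config.Credentials class reference for your version:

```objectscript
 // Open a credential set by its ID (assumed to already be defined
 // under Ensemble -> Configure -> Credentials)
 set cred = ##class(Ens.Config.Credentials).%OpenId("MyCredentials")
 if $isobject(cred) {
     write "User: ", cred.Username, !
     // The password is also available to sufficiently privileged
     // code; handle it carefully and avoid writing it to logs.
 }
```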

That should work for you.  I would still like to better understand what is going on in the application that drives this.  You seem to be indicating that this is a user logging into Ensemble.  If you could detail the workflow that is happening and how it relates to Ensemble Services, we might be able to advise you better.

Finally, I want to make you aware that the LDAP interface in InterSystems technologies has a method for using groups to define everything the security model needs.  In fact, that is the default method in recent versions.

The best path forward is to get your Sales Engineer (SE) involved in what you are trying to achieve.  That person would be best suited to dig into your requirements and advise you.  If, for some reason, you cannot contact your SE or don't know who that is, send me a private message.  I'd be happy to help out more directly.

Ensemble Credentials are normally used to satisfy security for an Ensemble Business Host.  This separates the maintenance of security from the maintenance of the actual interfaces.  The application of the security is handled completely by Ensemble in that scenario.  This does not appear to be how you are attempting to utilize this, so it would help to better understand your use case.  What is the entry path/service that is utilizing delegated authentication?

No, it is not 'necessary'.  However, I do like to have an environment that more closely matches what one might need in production.  This is both for my own experience and to be able to demonstrate InterSystems technology in a manner that might occur for a client.

I do use docker exec, though I choose to go into bash so I have more general access.  I actually wrote a simple .cmd file to do this and added it to a menu on my toolbar.

@echo off
rem List running containers showing name, status, and ports
docker container ls --format "table {{.Names}}\t{{.Status}}\t{{.Ports}}"
echo:
rem Prompt for the container name or ID, then open an interactive bash shell in it
set /P Container=Container ID: 
docker exec -it %Container% bash

Let me add my experience to this comment.  I have been wading into the Docker ocean.  I am on Windows and really did not want to run a Linux VM just to get Docker containers (that seemed a bit redundant to me), so Docker for Windows was the way to go.  So far this has worked extremely well for me.  I am running an Ubuntu container with Ensemble added in.  My dockerfile is a simplified version of the one earlier in these comments.  I am having only one issue, related to getting the SSH daemon to run when the container starts.

I hope to have all my local instances moved into containers soon.

My feeling is that this will be great for demonstrations, local development, and proofs of concept.  I would agree that for any production use, a straight Linux environment with Docker would be a more robust and stable solution.