Rich Taylor · Feb 6, 2019 go to post

You don't indicate which product you are using (IRIS, Caché, or Ensemble), which may affect some details. However, the basic code to load the form data and execute the POST call is the same.

You need an instance of %Net.HttpRequest. If you are in Ensemble you would work through the adapter; for Caché or IRIS, create an instance of this class and configure it for the server endpoint.

Add your form data to the request by calling the InsertFormData method.

e.g. do req.InsertFormData("NameFld", "NameValue")

To perform the request, call the Post method:

e.g. set status = req.Post("/path/to/REST/resource")

If you assign the Server and Port properties of the request, then do not include them in the URL here.

You can access the response via the HttpResponse property of the request. This will be an instance of the %Net.HttpResponse class.

You can read more on the %Net.HttpRequest class in the class reference at https://docs.intersystems.com/latest/csp/documatic/%25CSP.Documatic.cls…
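Putting those steps together, a minimal sketch might look like this (the server name, port, and resource path are placeholders you would replace with your own):

```objectscript
 // Minimal form POST with %Net.HttpRequest (Caché/IRIS, outside Ensemble)
 set req = ##class(%Net.HttpRequest).%New()
 set req.Server = "myserver.example.com"   // placeholder host
 set req.Port = 57772                      // placeholder web server port
 do req.InsertFormData("NameFld", "NameValue")
 // Server/Port are set on the request, so the URL is just the resource path
 set status = req.Post("/myapp/rest/resource")
 if $System.Status.IsOK(status) {
     // the response body is a stream on the HttpResponse object
     write req.HttpResponse.StatusCode,!
     write req.HttpResponse.Data.Read(32000)
 }
```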

Rich Taylor · Dec 28, 2018 go to post

Connor,

This is true for the docker0 bridge, which I set up and noted in the post. The problem is that docker-compose does not use this setting at all. Unless you configure the docker-compose.yml file using one of the methods I mention, you will still get a 172.x.x.x address.

Rich

Rich Taylor · Oct 8, 2018 go to post

You can use the Community license for whatever you want; there are limitations on configuration. Once you exceed the capabilities enabled by this license, you would need to move up to some kind of paid license. Using it for learning the product should not come close to the limits, however.

Have fun!!

Rich Taylor · May 30, 2018 go to post

I see that the ID is not in the export. However, the import does figure out how to match an import to an existing item, since it does not allow you to overwrite existing records. Otherwise we would see a duplication of records under new IDs.

As for which methods to use, I have to disagree with this recommendation. A system administrator should not have to write code to maintain the systems under their care. Documentation is important, but that can be accomplished with a literal document. Then, failing an export/import or enterprise-management function, those changes would be made manually on all affected systems. Writing code is less clear to everyone and is no less work.

Rich Taylor · May 30, 2018 go to post

Let me clarify: this has to do with the ExportTasks and ImportTasks methods of the %SYS.Task class. I need to know the qspec options that have an impact on these processes.

As to question 2: the process is that they are setting up a new server for backup and want to replicate what they have set up on the current server. Exporting and importing what is currently present is the best way. If they were going to write a program, they could just as well compare each existing task and make the changes manually. There is also an ongoing maintenance side to this, which would likewise be better done with an export and import.

So, to the original question: is there any way to tell ImportTasks to override the tasks that already exist?

Rich Taylor · May 30, 2018 go to post

Eduard,

I had some additional questions.

  1. What are the options for qspec beyond the default 'd'?
  2. What if I want to override system defined tasks because I have adjusted things like schedules?  Is there an option to do this? qspec?
Rich Taylor · May 10, 2018 go to post

Eduard,

Ok, I had not noticed that, but you are correct. I had tried other methods first, as I noted before, and ran into issues loading onto the new systems. I had obviously skipped the step of verifying the export file when I tried this method. So I gather that you HAVE to pass in a list of IDs to export; leaving it blank does not export all. As I mentioned, the documentation is extremely sparse on this API. I will test this again later.

Rich Taylor · May 9, 2018 go to post

ERROR #6037: Nothing imported.

Not terribly helpful, I'm afraid. I know that when I attempted to use SQL I was getting many "Field is Read Only" errors, so this may be related to that.

Rich Taylor · May 9, 2018 go to post

I agree except that the import would not work.  There is little documentation on utilizing these tools and I cannot even read the source code.

Rich Taylor · May 9, 2018 go to post

Ok, here is the procedure that worked for me:

Export:

  • merge ^TaskList = ^SYS("Task","TaskD")
  • set sc = $System.OBJ.Export("TaskList.GBL","c:\temp\TaskList.gbl")
    • Note: the .GBL extension is not part of the global name; it indicates that we want to export a global.
    • The file destination is completely up to you.

Import

  • set sc = $System.OBJ.Load("c:\temp\TaskList.gbl",,.log)
  • merge ^SYS("Task","TaskD") = ^TaskList
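The same procedure can be wrapped with status checks and cleanup (the file path is just an example):

```objectscript
 // Export side: copy the task storage global to a scratch global, then export it
 merge ^TaskList = ^SYS("Task","TaskD")
 set sc = $System.OBJ.Export("TaskList.GBL","c:\temp\TaskList.gbl")
 if '$System.Status.IsOK(sc) { do $System.Status.DisplayError(sc) }

 // Import side (on the target system): load the file, then merge into the task global
 set sc = $System.OBJ.Load("c:\temp\TaskList.gbl",,.log)
 if $System.Status.IsOK(sc) {
     merge ^SYS("Task","TaskD") = ^TaskList
     kill ^TaskList  // clean up the scratch global
 }
```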
Rich Taylor · May 9, 2018 go to post

I am clicking on the class, and not the package, in my build, and getting what looks like package documentation. There was obviously an issue in the build I was using. Regardless, the code for export worked, but I can't get the import to function. Looking at exporting globals now.

Rich Taylor · May 9, 2018 go to post

Eduard,

Thanks for the info. I was looking at that but could not find the global reference in the class definition or in searching the global list. Now I know why: it's not in a global by itself. Let me see if this will work.

Rich Taylor · May 9, 2018 go to post

I see this in the latest version of the online docs, which is version 2017.2. Obviously something was amiss in the build I was using.

I have tried this, but the import does not work. I get an error that only states nothing was imported. Not terribly helpful.

Rich Taylor · May 9, 2018 go to post

Interesting. I am looking at Caché version 2017.1.0.792 and these classes have none of this documentation. I will have to see if this exists in the version the customer is using. This is how the class reference for %SYS.Task looks on that version:

Rich Taylor · Feb 26, 2018 go to post

First, you can access Ensemble Credentials using the Ens.Config.Credentials class. To be clear, this is NOT the user definitions from the Security module. These are defined via the Management Portal under Ensemble -> Configure -> Credentials.
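A quick sketch of reading one such entry; the credentials name "MyAppLogin" is just an example, and I am assuming (per the class reference) that the credentials name serves as the row ID:

```objectscript
 // Open a stored Ensemble credentials entry by its name
 set cred = ##class(Ens.Config.Credentials).%OpenId("MyAppLogin",,.sc)
 if $System.Status.IsOK(sc) {
     write cred.Username,!
     // the Password property is also available to privileged code
 }
```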

That should work for you. I would still like to better understand what is going on in the application that drives this. You seem to be indicating that this is a user logging into Ensemble. If you could detail the workflow that is happening and how it relates to Ensemble Services, we might be able to advise you better.

Finally,  I want to make you aware that the LDAP interface in InterSystems technologies has a method for using groups to define everything the security model needs.   In fact that is the default method in recent versions.

The best path forward is to get your Sales Engineer (SE) involved in what you are trying to achieve.  That person would be best suited to dig into your requirements and advise you.  If, for some reason, you cannot contact your SE or don't know who that is send me a private message.  I'd be happy to help out more directly.

Rich Taylor · Feb 26, 2018 go to post

Ensemble Credentials are normally used to satisfy security for an Ensemble business host. This separates the maintenance of security from the maintenance of the actual interfaces; the application of the security is handled completely by Ensemble in that scenario. This does not appear to be how you are attempting to utilize it, so it would help to better understand your use case. What is the entry path/service that is utilizing delegated authentication?

Rich Taylor · Jan 10, 2018 go to post

No, it is not 'necessary'. However, I do like to have an environment that more closely matches what one might need in production. This is both for my own experience and to be able to show InterSystems technology in a manner that might occur for a client.

I do use docker exec, though I choose to go to bash so I have more general access. I actually wrote a simple cmd file to do this and added it to a menu on my toolbar.

@echo off
docker container ls --format "table {{.Names}}\t{{.Status}}\t{{.Ports}}"
echo:
set /P Container=Container ID: 
docker exec -it %Container% bash

Rich Taylor · Jan 9, 2018 go to post

Let me add my experience to this comment. I have been wading into the Docker ocean. I am on Windows and really did not want to run a Linux VM to get Docker containers (that seemed a bit redundant to me), so Docker for Windows was the way to go. So far this has worked extremely well for me. I am running an Ubuntu container with Ensemble added in. My dockerfile is a simplistic version of the one earlier in these comments. I am having only one issue, related to getting the SSH daemon to run when the container starts.

 I hope to have all my local instances moved into containers soon.

My feeling is that this will be great for demonstrations, local development, and proofs of concept.   I would agree that for any production use having a straight Linux environment with Docker would be a more robust and stable solution.

Rich Taylor · Nov 17, 2017 go to post

Marco,

I would suggest contacting InterSystems Support. Go to WRC.InterSystems.com. That would be the quickest way to resolve this particular issue.

Rich Taylor · Oct 27, 2017 go to post

In order to edit anything in Atelier, the code has to be part of a project. To create an Atelier project, right-click and choose New -> Atelier Project. Then you can right-click the class or routine you need to edit and choose "Copy to Project".

Open the copy associated with the project to edit it.

Rich Taylor · Oct 12, 2017 go to post

Below is a class method that returns the details of a Contact. Basically, any output written from your REST class method becomes part of the body of the response, so the WRITE of the output of %ToJSON() near the end puts a JSON object in the response body.

ClassMethod ContactDetail(ContactID As %String) As %Status
{
    #dim %response as %CSP.Response
    #dim %request as %CSP.Request
    #dim ContactItem as WebREST.Data.Contact
    #dim ActionItem as WebREST.Data.Actions
    
    set tSC = $System.Status.OK()
    Try {
        if ContactID {
            set ObjSC = ""
            set ContactItem = ##Class(WebREST.Data.Contact).%OpenId(ContactID,,.ObjSC)
            if $System.Status.IsError(ObjSC) {
                // test for object not found error
                if $System.Status.GetErrorCodes(ObjSC) = "5809" {
                    set %response.Status = ..#HTTP404NOTFOUND
                } else {
                    throw ##class(%Exception.StatusException).CreateFromStatus(ObjSC)
                }
            } else {
                set ResponseObj = {}
                set ResponseObj.ContactID = ContactItem.ContactID
                set ResponseObj.FirstName = ContactItem.FirstName
                set ResponseObj.LastName = ContactItem.LastName
                set ResponseObj.Position = ContactItem.Position
                set ResponseObj.Company = ContactItem.Company.Name
                set ResponseObj.Phone = ContactItem.Phone
                set Address = {}
                set Address.Street = ContactItem.Address.Street
                set Address.City = ContactItem.Address.City
                set Address.State = ContactItem.Address.State
                set Address.PostalCode = ContactItem.Address.PostalCode
                set ResponseObj.Address = Address
                set ResponseObj.Agent = ContactItem.Agent.Name
                
                // now load each action into an array
                set Actions = []
                set ActionKey = ""
                do {
                    set ActionResponse = {}
                    set ActionItem = ContactItem.Actions.GetNext(.ActionKey)
                    if ActionItem '= "" {
                        set ActionResponse.ActionType = ActionItem.ActionType
                        set ActionResponse.DueDate = $zdate(ActionItem.DueDate,3)
                        set ActionResponse.Notes = ActionItem.Notes
                        set ActionResponse.Complete = ActionItem.Complete
                        set ActionResponse.Agent = ActionItem.Agent.Name
                        do Actions.%Push(ActionResponse)
                    }
                } while ActionKey '= ""
                set ResponseObj.Actions = Actions
                Write ResponseObj.%ToJSON()
            }
        } else {
            set %response.Status = ..#HTTP400BADREQUEST
        }
    } catch Exception {
        set tSC = Exception.AsStatus()
    }

    Quit tSC
}
 

Rich Taylor · Sep 26, 2017 go to post

Thomas, I am working on the same problem. I will post a solution if and when I get one, and I'll look for any support here too, of course.

Rich Taylor · Sep 25, 2017 go to post

To add to John's post: that earlier post shows how to convert the timestamp to a matching format, which you can then get out of your current database as follows:

$zdatetime($h,3,1)

Or convert the timestamp into the internal date format with $zdatetimeh.
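For example, round-tripping between $HOROLOG and the ODBC timestamp format:

```objectscript
 // $HOROLOG -> ODBC timestamp string (date format 3, time format 1)
 set ts = $zdatetime($horolog,3,1)   // e.g. "2017-09-25 14:30:05"
 // ...and back to internal $HOROLOG format
 set h = $zdatetimeh(ts,3,1)
```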

Rich Taylor · Sep 18, 2017 go to post

Some questions first.  

  1. When you say 'localhost', are you implying that you are using the private web server contained within Caché?
  2. Is this the only application being run from the web server vs. localhost?
  3. If not, are the other applications still accessible via the external web server?

Some things to check right off the bat.

  • Enable auditing and see which user is getting the error; that will help.
  • Use the CSP Gateway Management pages' HTTP trace capability to see if the request is even making it into Caché. It would seem so from the error, but better to confirm everything.
  • Make sure that the user the CSP Gateway (associated with the web server) uses to communicate with Caché has access to your database. This would be different from the person logging into your application. This can be found in the CSP Gateway Management pages for the server.
Rich Taylor · Sep 15, 2017 go to post

Another method to consider is Delegated Authentication (http://localhost:57775/csp/docbook/DocBook.UI.Page.cls?KEY=GCAS_delegat…). The benefit here is that your custom authentication code runs BEFORE the user is granted any kind of access to the system. The issue with relying on a routine that starts at login is that the user has already "breached the walls" and is in the system (and consuming a license) before you authenticate them. If your code fails for any reason, or if there is a hole that allows them to break out of the code, they will likely have complete access to at least that namespace.

With Delegated Authentication you are hooking your custom code into the authentication flow of Caché. If there is any failure of code in this process, the user gets "Access Denied".

One limitation is that you are only allowed a limited amount of code (2,000 or 4,000 commands, if memory serves) to complete your authentication. After that, delays are inserted into the process.
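For reference, delegated authentication is implemented in a ZAUTHENTICATE routine. A minimal sketch might look like the following; the $$CheckUser^MyAuth call and the "MyAppRole" role are hypothetical stand-ins for your own verification logic and role assignments:

```objectscript
#include %occErrors

ZAUTHENTICATE(ServiceName, Namespace, Username, Password, Credentials, Properties) PUBLIC {
    // Hypothetical external check; replace with your own verification
    if $$CheckUser^MyAuth(Username, Password) {
        set Properties("FullName") = Username
        set Properties("Roles") = "MyAppRole"   // roles granted on success
        quit $SYSTEM.Status.OK()
    }
    // any failure path ends in "Access Denied" for the user
    quit $SYSTEM.Status.Error($$$AccessDenied)
}
```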

Hope that helps.

Rich Taylor · Aug 29, 2017 go to post

Robert,

Great history lesson! I have a question for you, though. As you were there at the beginning, or close to it, perhaps you might have some insight. I came from a background in MultiValue databases (aka PICK, UniVerse, UniData), joining InterSystems in 2008 when they were pushing Caché's ability to migrate those systems. From the beginning I was amazed at the parallel evolution of both platforms. In fact, when I was preparing for my first interviews, having not heard of Caché before, I thought it was some derivative of PICK. Conceptually, both MUMPS and PICK share a lot of commonality, differing in implementation of course. I have long harbored the belief that there had to be some common heritage: some white papers or other IP that influenced both. Would you have any knowledge of how the original developers of MUMPS arrived at the design concepts they embraced? Does the name Don Nelson ring a bell?

Thanks again for the history.