This means your query is taking longer to return a result to your CSP application than the gateway expects.

I'd recommend verifying why it's taking so long and optimizing your query, but you can also ask the CSP gateway to wait longer before it times out.

  // In the OnPreHTTP callback method.
  set %response.Timeout = 900 // makes the application wait up to 900 seconds for the response.

Note that %response.Timeout will change the timeout for the current request only.

EDIT: Now that I've seen this qualifier, I'll run some tests to check the result.
EDIT #2: Looks like it works as I intended, so please ignore the text below.

Yes, this is really painful if you plan on using IRIS to generate artifacts for even relatively recent Caché versions. Caché can't read some of the changes IRIS introduced, such as:

* IRIS moves away from %Library.CacheStorage, which Caché uses, and instead uses %Storage.SQL. It also unifies the storage strategy for %Persistent classes and custom storage, whereas Caché used two distinct storage classes, one of them being %CacheStorage.
* Some XML elements have been modified or wiped out.
* The XML header now prints generator=IRIS instead of generator=Cache.
* Methods that have [ Language = cache ] are now converted to [ Language = objectscript ].
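For illustration, here is roughly how the export header differs (the version numbers below are just examples, not exact values):

  <!-- Header of a class exported from Caché -->
  <Export generator="Cache" version="25">

  <!-- Header of the same class exported from IRIS -->
  <Export generator="IRIS" version="26">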

This is what led us to create a custom Docker image that hosts Caché instead of IRIS for our continuous delivery pipeline.

%CSP.StreamServer is just a helper to cut short some manual labor.

What you need to do is write the file stream back to the device and change three Content-* headers as follows.

Your CSP page has an OnPreHTTP callback, just like every class that extends %CSP.Page. You can use this method to modify the Content-Type, Content-Disposition, and Content-Length headers. In your case (a .csp file), you'll need to use the <script> tag syntax to define the method.
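For reference, a sketch of the <script> tag version (the Content-Type value here is just an illustrative assumption):

  <script language="cache" method="OnPreHTTP" arguments="" returntype="%Boolean">
    // Assumed content type for illustration only; classify your real file type instead.
    do %response.SetHeader("Content-Type", "application/vnd.ms-excel")
    quit 1
  </script>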

The example below assumes you're using a class instead.

NOTE: If you fully customize your %CSP.Page or %CSP.REST subclass, you don't even need the OnPreHTTP method, but I'll use it here for educational purposes.

ClassMethod OnPreHTTP() As %Boolean [ ServerOnly = 1 ]
{
   set file = %request.GetMimeData("FileStream")
   if '$isobject(file) return 0

   do ##class(%CSP.StreamServer).FileClassify("XLS", .contentType, .isBinary, .charset)
   do %response.SetHeader("Content-Type", contentType)
   do %response.SetHeader("Content-Disposition", "attachment; filename="""_file.FileName_"""")

  return 1
}

ClassMethod OnPage() As %Status [ ServerOnly = 1 ]
{
  set iStream = %request.GetMimeData("FileStream")
  $$$QuitOnError(##class(something).method(iStream, .oStream))

  do %response.SetHeader("Content-Length", oStream.Size)

  do oStream.Rewind()
  do oStream.OutputToDevice()

  return $$$OK
}

Check for the following methods:

FindInFiles and FindInFilesRegex.

Optionally, you can limit the search to a single project using the FindInProject method.

All methods belong to the %Studio.Project class.

Here's how Atelier does it:

            Set tSC=##class(%Studio.Project).FindInFiles(
                tSearch,
                doclist,
                system,
                wholeword,
                casesensitive,
                max,
                "GENERATED="_generated, // filter
                wildcards
            )

But you'll need to use device redirection to capture and parse the output, because Studio uses these methods to print results to the Output window.
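A minimal sketch of that capture, assuming you redirect the current device to a temporary file and read it back afterwards (the method name is hypothetical, and FindInFiles is called with its default optional arguments):

ClassMethod CaptureSearchOutput(pattern As %String) As %String
{
   // Redirect the current device to a temporary file.
   set tmpFile = ##class(%File).TempFilename("txt")
   open tmpFile:("NWS"):5
   use tmpFile
   set sc = ##class(%Studio.Project).FindInFiles(pattern)
   close tmpFile
   use $principal

   // Read the captured results back for parsing.
   set stream = ##class(%Stream.FileCharacter).%New()
   do stream.LinkToFile(tmpFile)
   set output = stream.Read()
   do ##class(%File).Delete(tmpFile)
   return output
}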

You need to create an SSL configuration and provide it to your request object via the SSLConfiguration property. After that, you must also tell the request to use a secure connection by enabling its Https property.

Here's how we send a push using OneSignal:

 set client = ##class(%Net.HttpRequest).%New()
 set client.Server = "onesignal.com"

 // You'll need to generate the configuration to be used below, you can decide its name.
 set client.SSLConfiguration = "OneSignal SSL Config"

 set client.Https = 1
 set client.Authorization = $$$FormatText("Basic %1", $get(^App.Envs("ONESIGNAL_KEY")))
 set client.ContentCharset = "utf-8"
 set client.ContentType = "application/json"
 set client.ContentEncoding = "utf-8"
 set client.NoDefaultContentCharset = 0

 set body = {
   "app_id": ($get(^App.Envs("ONESIGNAL_APPID"))),
   "data": (data),
   "contents": {
       "en": (message),
       "pt": (message)
   },
    "filters": (filters)
}

  set json = body.%ToJSON()
  do client.EntityBody.Write(json)

  set sc = client.Post("/api/v1/notifications")
  return sc

You can create that SSL configuration using the Management Portal, or you can use something like this:

ClassMethod CreateSSLConfigurationIfNoneExists(name As %String)
{
   new $namespace
   set $namespace = "%SYS"

   do ##class(Security.SSLConfigs).Get(name, .p)
   if $data(p) quit   

   set p("CipherList")="ALL:!aNULL:!eNULL:!EXP:!SSLv2"
   set p("CAFile")=""
   set p("CAPath")=""
   set p("CRLFile")=""
   set p("CertificateFile")=""
   set p("Description")=""
   set p("Enabled")=1
   set p("PrivateKeyFile")=""
   set p("PrivateKeyPassword")=""
   set p("PrivateKeyType")=2
   set p("Protocols")=24
   set p("SNIName")=""
   set p("Type")=0
   set p("VerifyDepth")=9
   set p("VerifyPeer")=0

   do ##class(Security.SSLConfigs).Create(name, .p)
}
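You could then call it once, before issuing the first request (the class name here is hypothetical):

 do ##class(MyApp.Setup).CreateSSLConfigurationIfNoneExists("OneSignal SSL Config")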

You should never try to filter the list manually (e.g. using $replace). Instead, if you already know which item you want to fetch, you can:

Use $listfind to retrieve the index of the matching item in your list, and $listget to fetch it.

set list = $listbuild("a","b","c")

set whatIWantToFind = "c"
set index = $listfind(list, "c") // returns the index 3.
write $listget(list, index) // returns "c".

If you don't know the content on that list, you can use $listlength to get the size of the list:

 set list = $listbuild("a","b","c")

 for i=1:1:$listlength(list) {
     write $listget(list, i) // writes the item at index i.
 }

If you want maximum performance, I'd suggest going with $listnext:

set pointer = 0
set list = $listbuild("a","b","c")
while $listnext(list, pointer, item) {
   write item // item holds the element at the current position.
}

If you really want the delimited form back, you can use $listtostring, which does the opposite of the $listfromstring you used:

set list = $listbuild("a","b","c")
write $listtostring(list, "|") // "a|b|c"

Finally, you can also compare lists as a whole using $listsame:

set list = $listbuild("a","b","c")
set listB = $listbuild("a", "b", "d")
set listC = $listbuild("a","b","c")

write $listsame(list, listB) // 0. False, because they're not the same.
write $listsame(list, listC) // 1. True, the values match.

You might want to try using Port. It was made exactly for that purpose.

Just install port-prod.xml and restart your Studio. Note that this tool will take over your current source control class.

After that, open your project and go to Source Control->Export. If your language is not supported, you might need to switch your current session to English using ##class(%MessageDictionary).SetSessionLanguage("en").

It actually works, and it can even transport objects using %session.Data. It also seems to run in a separate process from the request, which makes it a non-blocking operation. However, I could only make it work by defining the event class in the web application config, even though there's supposedly a way to define it dynamically via %session.EventClass.

So my only question is: is there a right moment to set this up, like when using OnPreHTTP for %response? I tried setting it in the Page class method after calling the superclass (%CSP.REST), but to no avail. And %CSP.REST doesn't provide OnPreHTTP, because it overwrites the Page method completely.

Now, regarding the data we can retrieve from the default %CSP.REST implementation: there is none. We only have access to %session and %request (I didn't try %response); it lacks all the metadata used by %CSP.REST-based applications (dispatch method, dispatch class, route arguments, stack, etc.), since no object is created for it.

 

That means you still need a way to retrieve this info, which is why I had to transport the object using %session.Data. Here's how I did it:

ClassMethod Page(skipheader As %Boolean = 1) As %Status [ ProcedureBlock = 0 ]
{
  new %frontier
  set %frontier = ##class(Frontier.Context).%New(%session, %request, %response)
  set %session.Data("%frontier") = %frontier
  $$$QuitOnError(##super(skipheader))
  return $$$OK
}

And the event class:

/// Called when we have finished processing this request
ClassMethod OnEndRequest() As %Status
{
  set frontier = %session.Data("%frontier")
  return frontier.ReporterManager.Report()
}

Now, using %CSP.REST, I think the best approach is to create a utility method that is called at the beginning of each dispatch method. This utility would populate %session.Data with context-sensitive info, like the arguments received. An example:

ClassMethod PopulateLogObject(arguments As %List, method, classname)
{
    set log = ##class(MyApp.Log).%New()
    set log.username = $username // <--  This is the only one you can fetch directly when using %CSP.SessionEvents.
    set log.parameters = arguments
    set log.method = method
    set log.classname = classname
    return log.%Save()
}

ClassMethod DoSomethingRequestedOverHTTP(argA, argB, argC)
{
     try {
        $$$ThrowOnError(..PopulateLogObject($lb(argA, argB, argC), "DoSomethingRequestedOverHTTP", $classname()))

     } catch e {
        // render error response.
     }
    // render success response.
}

I don't know of any other way to get this kind of info, unless you override DispatchRequest from %CSP.REST.

No, you need to grant the group/user running Caché permission to access the executable that's being denied.
On Windows, Caché runs under the same user that started the process. Remember that you're trying to access a resource natively from your OS, so your Caché user is NOT the one you should check.

 

If you need to know which user you should be looking for, open a cmd prompt and type "whoami".

Now, you'll need an account with administrator privileges, otherwise you won't be able to edit the executable's permissions. If you're working on a company network, your user probably won't have enough privileges to change that file's permissions, so I'd recommend leaving this task to your IT technician.

 

The technician should know what to do; just ask them to add your user to the list of groups/users allowed to execute the file. Normally you would only need "read and execute", but if the executable also generates files somewhere, also grant your user permission to write.

Implementing IntelliSense seems to require a major overhaul of the current extension, as it requires a language server and a language client that exchange a lot of essential metadata for code diagnostics. I don't know whether InterSystems publicly discloses some of that metadata.

A brief protocol explanation: https://code.visualstudio.com/blogs/2016/06/27/common-language-protocol
The full protocol: https://github.com/Microsoft/language-server-protocol/blob/master/protoc...

The implementation using NodeJS: https://github.com/Microsoft/vscode-languageserver-node

In the last repository, you'll notice that it actually contains the JSON-RPC 2.0 message definitions, the protocol, and finally the base server and client implementations.

Hello Soufiane. Although your thank-you is appreciated, an acceptance check is appreciated even more, because it alerts users with the same issue that a solution has already been provided.

https://community.intersystems.com/post/convert-timestamp-its-correspond...

You'll also help us if you use the same thread whenever you think about making a similar post. Don't create new threads if your subject is the same as the one you already created, or the community will start downvoting you.

Remember that downvoted threads have less chance of getting attention.

Hello Coty.

I noticed that you starred my github repository and I thank you for that. :)


Back to your question: I think you're detecting changes in an unusual way, since you said you can't trigger an action when modifying static files. Just so you know, as long as you're working with Studio's SourceControl API, you should be able to do whatever you want whenever an item is modified; you're even free to decide how to restrict the implementation, regardless of the item you're updating.

Look at this part to understand how it's done.

About your use case: we're actually testing Port with this development format. We have one code base (our development server) and multiple namespaces simulating different customer configurations with their test data.

Even though this model works, our analysis shows it can get pretty frustrating for users coming from distributed version control, because they notice multiple developers interacting with their "repository". Still, it's a step ahead of not versioning at all.

However, the team is expected to migrate all their source to projects, since Port nags the user when trying to save default projects and even detects items already owned by other projects. This forces the whole team to prioritize organizing their code base.