This means your query is taking longer to return its result to your CSP application than the gateway expects.

I'd recommend investigating why it's taking so long and optimizing the query, but you can also ask the CSP Gateway to wait longer before it times out.

  // In the OnPreHTTP callback method.
  set %response.Timeout = 900 // make the application wait up to 900 seconds for the response.

Note that %response.Timeout will change the timeout for the current request only.
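
For context, here's roughly where that line lives; a minimal sketch assuming your page extends %CSP.Page (the class name and page logic are illustrative):

  Class App.SlowReportPage Extends %CSP.Page
  {

  ClassMethod OnPreHTTP() As %Boolean
  {
      // Ask the CSP Gateway to wait up to 900 seconds for this request
      // before it gives up with a timeout error.
      set %response.Timeout = 900
      quit 1 // continue serving the page
  }

  ClassMethod OnPage() As %Status
  {
      // ... run the slow query and write the page output here ...
      quit $$$OK
  }

  }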

Thanks for everyone's suggestions, but we have opted to use CompilePackage, providing its argument dynamically through environment variables.
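
For reference, a minimal sketch of that approach, assuming the package name and compile qualifiers come from environment variables (the variable names are made up):

  // Read the package name and compile qualifiers from the environment,
  // so the same build routine works in every environment.
  set package = $System.Util.GetEnviron("BUILD_PACKAGE")
  set qualifiers = $System.Util.GetEnviron("BUILD_QUALIFIERS")

  // Compile every class inside the package with those qualifiers.
  set sc = $System.OBJ.CompilePackage(package, qualifiers)
  if $System.Status.IsError(sc) do $System.Status.DisplayError(sc)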

EDIT: Now that I have seen this qualifier, I'll run some tests to see the result.
EDIT #2: Looks like it works as I intended, so please ignore the text below.

Yes, this is really painful if you plan on using IRIS to generate artifacts for relatively recent Caché versions. Caché can't read some of the changes that IRIS introduces, such as:

* IRIS moves away from %Library.CacheStorage, which Caché uses, and now uses %Storage.SQL instead. It also unifies the storage strategy for %Persistent classes and custom storage, whereas Caché used two distinct storage classes, one of them being %CacheStorage.
* Some XML elements have been modified or wiped out.
* The XML header now prints generator=IRIS instead of generator=Cache.
* Methods that have [ Language = cache ] are now converted to [ Language = objectscript ].

This is what led us to create a custom Docker image that hosts Caché instead of IRIS and use it for continuous delivery.

Neither alternative. I'd usually archive the repository instead.

I think it is:

write %request.GetCgiEnv("HTTP_APPLICATION_ID")

Although, by convention, custom headers should begin with an X- prefix to avoid possible conflicts with future spec-defined headers.
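
For example (the header name here is just an illustration), a header sent as X-Application-ID shows up in the CGI environment uppercased, with dashes turned into underscores and an HTTP_ prefix:

  write %request.GetCgiEnv("HTTP_X_APPLICATION_ID")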

You need to create an SSL configuration and provide it to your request object through the SSLConfiguration property. After that, you must also tell the request to use a secure connection by enabling the Https property.

Here's how we send a push using OneSignal:

  set client = ##class(%Net.HttpRequest).%New()
  set client.Server = "onesignal.com"

  // You'll need to generate the SSL configuration used below; you can decide its name.
  set client.SSLConfiguration = "OneSignal SSL Config"

  set client.Https = 1
  set client.Authorization = $$$FormatText("Basic %1", $get(^App.Envs("ONESIGNAL_KEY")))
  set client.ContentCharset = "utf-8"
  set client.ContentType = "application/json"
  set client.ContentEncoding = "utf-8"
  set client.NoDefaultContentCharset = 0

  // Build the notification payload as a dynamic object.
  set body = {
    "app_id": ($get(^App.Envs("ONESIGNAL_APPID"))),
    "data": (data),
    "contents": {
      "en": (message),
      "pt": (message)
    },
    "filters": (filters)
  }

  set json = body.%ToJSON()
  do client.EntityBody.Write(json)

  set sc = client.Post("/api/v1/notifications")
  return sc
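
If you also want to inspect what the API returned instead of only passing the status back, the last lines could look something like this (HttpResponse, StatusCode and Data are regular %Net.HttpRequest / %Net.HttpResponse members):

  set sc = client.Post("/api/v1/notifications")
  if $System.Status.IsError(sc) return sc

  // The parsed HTTP response is available once Post() returns.
  write client.HttpResponse.StatusCode, !
  do client.HttpResponse.Data.OutputToDevice()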

You can create that SSL configuration in the Management Portal, or you can use something like this:

ClassMethod CreateSSLConfigurationIfNoneExists(name As %String)
{
   // SSL configurations live in the %SYS namespace.
   new $namespace
   set $namespace = "%SYS"

   // If a configuration with this name already exists, leave it untouched.
   do ##class(Security.SSLConfigs).Get(name, .p)
   if $data(p) quit

   set p("CAFile")=""
   set p("CAPath")=""
   set p("CRLFile")=""
   set p("CertificateFile")=""
   set p("CipherList")="ALL:!aNULL:!eNULL:!EXP:!SSLv2"
   set p("Description")=""
   set p("Enabled")=1
   set p("PrivateKeyFile")=""
   set p("PrivateKeyPassword")=""
   set p("PrivateKeyType")=2
   set p("Protocols")=24
   set p("SNIName")=""
   set p("Type")=0
   set p("VerifyDepth")=9
   set p("VerifyPeer")=0

   do ##class(Security.SSLConfigs).Create(name, .p)
}
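
Then call it once before issuing the request, for instance (the class name below is just wherever you decide to keep the method):

  do ##class(App.Utils).CreateSSLConfigurationIfNoneExists("OneSignal SSL Config")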

If you don't want to hook Caché up to another language using the Caché bindings, you should use PBKDF2 with SHA-256. Otherwise you'll need an external implementation in order to use bcrypt.

write $System.Encryption.PBKDF2("secret", 15000, $System.Encryption.GenCryptRand(64), 64, 256)
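
If it helps, here's a minimal sketch of how that fits into storing and checking passwords. The class and method names are made up; the important part is persisting the random salt together with the hash:

  ClassMethod HashPassword(password As %String, Output salt As %String) As %String
  {
      // Generate a random salt and keep it, so the same derivation
      // can be repeated when the user logs in.
      set salt = $System.Encryption.GenCryptRand(64)
      // 15000 iterations, 64-byte output, HMAC based on SHA-256.
      quit $System.Encryption.PBKDF2(password, 15000, salt, 64, 256)
  }

  ClassMethod VerifyPassword(password As %String, salt As %String, hash As %String) As %Boolean
  {
      quit $System.Encryption.PBKDF2(password, 15000, salt, 64, 256) = hash
  }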

Although InterSystems should really implement bcrypt and Argon2 with native support.

Try sending CreatedDate in this format (ISO 8601), which is what JSON serializers typically use:

"2019-08-26T16:38:26.893Z"

My first idea is to use a captcha service straight from China.

Something like this:

http://www.yinxiangma.com/

Maybe you could use some flag to fall back to this service instead of reCAPTCHA when the consumer originates from mainland China.

The only issue is that you might need a translator for support.

Hello Coty.

I noticed that you starred my GitHub repository and I thank you for that. :)


Back to your question: I think the way you're detecting changes is unusual, since you said you can't trigger an action when a static file is modified. Just so you know, as long as you're working with Studio's SourceControl API, you can run whatever you want whenever an item is modified, and you're free to decide how to restrict the implementation, regardless of the type of item being updated.

Look at this part to understand how it's done.
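
As a rough illustration (the class below is hypothetical, not Port's actual code), a bare-bones Studio source control extension that reacts to any saved item, static/CSP files included, looks more or less like this:

  Class App.SourceControl Extends %Studio.SourceControl.Base
  {

  /// Called by Studio after an item (class, routine, CSP file, ...) is saved.
  Method OnAfterSave(InternalName As %String, Object As %RegisteredObject = "") As %Status
  {
      // InternalName is something like "App.Person.cls" or "csp/app/page.csp",
      // so this is the place to export, log, or restrict whatever you need.
      quit $$$OK
  }

  }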

About your use case: we're actually testing Port with this development format. We have one code base (our development server) and multiple namespaces simulating different customer configurations plus mock data (not really mock, actually their test data).

Even though this model works, from our analysis it can get pretty frustrating for users coming from distributed version control, because they notice multiple developers interacting with their "repository". Still, it's already a step up from not versioning at all.

However, the team is expected to migrate all of their source to projects, since Port nags the user when they try to save the default project and even detects items that are already owned by other projects. This forces the whole team to prioritize organizing their code base.