I don't know what you're extending your class from, but try calling ##super() before you set the timeout.

EDIT: Hold on, try using %response.Timeout instead. Maybe your problem is not the session timeout itself, but the response timeout.

/// Can be set in the OnPreHTTP() method of a page in which case this changes the amount of time the
/// CSP gateway will wait for a response from the server in seconds before it reports the 'Server is not
/// responding' error message. This is useful if you know that this page is doing an expensive SQL
/// query that will take a couple of minutes and you want to set the server response timeout on
/// the CSP gateway to a minute and yet wait three minutes for this page to respond. It will just
/// change the server response timeout for this page only. If not set the CSP gateway uses its
/// default timeout value specified in the CSP gateway configuration.
Property Timeout As %Integer;
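
A rough sketch of how that could look, assuming your page extends %CSP.Page (the 180-second value is just an illustration):

ClassMethod OnPreHTTP() As %Boolean [ ServerOnly = 1 ]
{
  // Keep the default behavior, then raise the gateway response
  // timeout for this page only (value in seconds)
  if '##super() quit 0
  set %response.Timeout = 180
  quit 1
}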

That's quite a complex topic for discussion.

  • Do you use an issue tracking / collaboration system? If so, which one? Any you would recommend or immediately dismiss based on personal experience?

I use GitHub plus repository issues.

  • How do you keep track of large code bases? Thousands of folders named backup1, backup2, ..., SVN, Git?

Git.

  • Do you have a development server to which you commit and test features, or do you rather run a local copy of Caché and implement features locally first, then push to the server?

Locally implemented and tested, then pushed to the server.

  • Bonus question: How do you handle legacy code (and I mean the kind that uses lots of $ZUs)? Leave it untouched and try to implement new features elsewhere? Rewrite the entire thing?

It depends: the more complex the code is, the more I consider creating modern API wrappers instead of rewriting it.

Since no one has answered that yet, maybe there isn't a known way of doing it.
If that's really the case, I suggest you create your own procedure.

This could work as follows:

1 - Routine B watches a global generated from Routine A using a scheduled task.

2 - Process A triggers an exception.

3 - The exception handler in Routine A executes a subroutine or method that uses $stack to capture execution data (see the sketch after this list).

4 - Routine A stores data into a global and quits abnormally.

5 - Routine B watches for new entries in the global, marks it as used/processed.
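
A minimal sketch of steps 3 and 4, assuming ^ErrorCapture is the global shared between both routines (the name and layout are just an illustration):

CaptureStack ; called from Routine A's error trap
  new id, level
  set id = $increment(^ErrorCapture)
  set ^ErrorCapture(id, "error") = $zerror
  set ^ErrorCapture(id, "when") = $zdatetime($horolog, 3)
  // Walk every stack level and record where execution was at the time
  for level = 0:1:$stack(-1) {
    set ^ErrorCapture(id, "stack", level) = $stack(level, "PLACE")
  }
  set ^ErrorCapture(id, "processed") = 0
  quit

Routine B then $orders through ^ErrorCapture and flips the "processed" flag on each entry it handles.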

If you mean testing using the SSL configuration in the Portal, when you click Test, it asks you for your server name and port. This follows the same pattern as %Net.HttpRequest, which means: don't provide the protocol http:// or https://, only your address: fantasyfootballnerd.com

For a use case: we have a service hosted on AWS using HTTPS and it connects successfully, but only if I strip the protocol from the address. Here's the feedback from our service when using the Test button (it's in Portuguese, though...):

Conexão SSL bem sucedida <- Indicates success on connecting.
Protocolo: TLSv1/SSLv3
Ciphersuite:ECDHE-RSA-AES128-GCM-SHA256
Peer:  <- No peer certification defined, so it's empty...
Requisição: GET / HTTP/1.0 <- Request done using HTTP 1.0 protocol 
Resposta: <- Response is an empty body.

HTTP status 301 means Moved Permanently, which happens when you request a resource and the server redirects you to another (usually equivalent) one. Since you said it's asking to use HTTPS, I suppose you haven't configured an SSL configuration on the Caché side.

Create a new SSL Configuration, mark it Enabled, set Type to Client, Peer certificate verification level to None, and Protocols to TLS.

Or simply fill in the Name and Description and leave everything else at its defaults... click Test to see if it works, then Save it.

Now, after instantiating the %Net.HttpRequest, set instance.SSLConfiguration = "Your SSL Configuration Name" and instance.Https = 1.

This is usually the minimum you need to do to have a working SSL-aware client.
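
A minimal sketch, assuming the configuration was saved as "MySSLConfig":

set request = ##class(%Net.HttpRequest).%New()
set request.Server = "fantasyfootballnerd.com"   // no http:// or https:// prefix
set request.Https = 1
set request.SSLConfiguration = "MySSLConfig"
set sc = request.Get("/")
if $system.Status.IsOK(sc) write request.HttpResponse.StatusCode, !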

You can use $query, as Fabian suggested, along with indirection; this will allow you to traverse the global without the need for a recursive call.

You can use $qlength along with $query to discover how deep (how many subscripts) you are in the global.

You can use $qsubscript along with an index to fetch its name.

There's a sample within the $query documentation.

Also, $order uses the latest subscript you provided to find the next one. So, no, you don't need to know how many subscripts you have. But you need to know when to stop.
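
A small traversal sketch, assuming a global named ^Sample (purely illustrative):

set node = "^Sample"
for {
  set node = $query(@node)
  quit:node=""
  // $qlength tells how many subscripts the node has,
  // $qsubscript fetches each one by position
  write node, " = ", @node, !
  for i = 1:1:$qlength(node) {
    write "  subscript ", i, ": ", $qsubscript(node, i), !
  }
}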

I suggest you check [@Sergey Kamenev]'s articles for a deep understanding of globals.

/// Returns a list of the Globals in a Cache NameSpace (used for GUI display)<br>
/// <br>
/// <b>Parameters:</b> <br>
/// NameSpace - a Cache namespace. Default is current namespace. <br>
/// Mask - a mask, or comma-separated list of masks, to select globals. Default is "*" for all.<br>
/// SystemGlobals - boolean flag to include system globals in the results. Default is "0".<br>
/// UnavailableDatabases - a returned local array of any databases not currently accessible, i.e. array(name)=status.<br>
/// Index - Internal use only.<br>
/// IgnoreHasData - For faster list of Globals set this to 1 and the HasData column will always be FALSE.<br>
/// Mapped - Return all mapped global nodes when set to 1, the default value of this parameter is 1.
/// <br>
/// Valid masks are as follows:
/// <br>
/// ABC* - All strings starting with ABC<br>
/// A:D - All strings between A and D<br>
/// A:D,Y* - All strings between A and D, and all strings starting with Y<br>
/// A:D,'C* - All strings between A and D, except those starting with C
Query NameSpaceList(
  NameSpace As %String,
  Mask As %String,
  SystemGlobals As %Boolean,
  ByRef UnavailableDatabases As %String,
  Index As %Integer,
  IgnoreHasData As %Boolean = 0,
  Mapped As %Boolean = 1) As %Query(ROWSPEC = "Name:%String,Location:%String,ResourceName:%String,Permission:%String,Empty:%String,Keep:%String,Collation:%String,PointerBlock:%String,GrowthBlock:%String,HasData:%Boolean,Journal:%String,LockLocation:%String,HasSubscripts:%Boolean") [ SqlProc ]
{
}


set s = ##class(%SQL.Statement).%New()
do s.%PrepareClassQuery("%SYS.GlobalQuery", "NameSpaceList")
set r = s.%Execute("SAMPLE", "*")

// Switch to the target namespace so the kill hits the right globals;
// the query returns names without the leading "^", so add it before the indirection
set $namespace = "SAMPLE"
while r.%Next() { kill @("^" _ r.%Get("Name")) }

Instead of creating multiple repositories, can't you just create a single repository and keep all projects inside it?
This is what the monorepo approach does.

Big companies like Google opted for this approach because they found it too hard to manage issues across multiple repositories, since one would have to clone the affected repository along with a tree of dependencies just to make it testable. And that's only one use case.

A few examples:

https://eng.uber.com/ios-monorepo/
https://medium.com/@pejvan/monorepos-85e608d43b57

https://blog.ghaering.de/post/monorepo-march/

Now you must consider whether that's an option for your company, since using a monorepo seems to be a trend right now and can also lead you into traps.

Although Caché does have a datatype layer for SQL (the %Library datatype classes), the database engine itself is purely based on globals, which are loosely typed.

Thus, what I can say is:

Local variables (memory) are by default limited to around 32 kilobytes per string; with long strings enabled that limit rises to roughly 3.6 million characters.

So, straight from the documentation:
 

Caché supports two maximum string length options:

  • The traditional maximum string length of 32,767 characters.

  • Long Strings maximum string length of 3,641,144 characters.

Globals can go way beyond that, since they are persisted on disk.
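
You can check which limit your instance is running with; a quick sketch using $system.SYS.MaxLocalLength() (verify its availability on your version):

// Returns 32767 with long strings disabled, 3641144 with them enabled
write $system.SYS.MaxLocalLength(), !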

How large can a number get? I suppose you're talking about floating-point precision.

This might help you.

And this explains how Caché manages variables.

Caché also gives you the possibility to redefine the memory allocation size for the current process.
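
If by memory allocation you mean the per-process limit, a small sketch with the $ZSTORAGE special variable (the value is in kilobytes; 262144 is just an example):

write $zstorage, !        // current per-process memory limit, in KB
set $zstorage = 262144    // raise the limit for the current process only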

Here is the procedure sample:

http://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=...

 

ClassMethod CalcAvgScore(firstname As %String,lastname As %String) [sqlproc]
{
  New SQLCODE,%ROWID
  &sql(UPDATE students SET avgscore = 
    (SELECT AVG(sc.score) 
     FROM scores sc, students st
     WHERE sc.student_id=st.student_id 
       AND st.lastname=:lastname
       AND st.firstname=:firstname)
     WHERE students.lastname=:lastname
       AND students.firstname=:firstname)

  IF ($GET(%sqlcontext)'= "") {
    SET %sqlcontext.%SQLCODE = SQLCODE
    SET %sqlcontext.%ROWCOUNT = %ROWCOUNT
  }
  QUIT
}

 

Instead of using SQL DDL to define PROCEDUREs (even though you can), it's easier to create one in your own class: just declare the method with [ SqlProc ] and it'll be available inside SQL. You can define an SQL function the same way, as sketched below.
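
A minimal sketch (class, method, and values are purely illustrative):

Class Sample.Utils Extends %RegisteredObject
{

/// Projected to SQL as the procedure/function Sample.Utils_Tax
ClassMethod Tax(amount As %Numeric) As %Numeric [ SqlProc ]
{
  quit amount * 0.15
}

}

After compiling, something like SELECT Sample.Utils_Tax(1000) should return 150.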

Is the file using a BOM? If so, you can check the file header for the following signature: EF BB BF


This can be described as: $c(239, 187, 191)

Now keep in mind that most editors have abandoned the BOM in favor of detection heuristics, and only as a fallback: many assume you're already working with UTF-8, don't handle some charsets well, and won't output BOM characters unless you explicitly tell them to use the desired charset.
 

You can try checking it against the US-ASCII table, which covers code points 0 to 127; however, that still wouldn't let you assert with 100% certainty that the stream contains UTF-8 characters.
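
A sketch of the BOM check (the file path is hypothetical):

set file = ##class(%Stream.FileBinary).%New()
do file.LinkToFile("/tmp/input.txt")
// The UTF-8 BOM is the 3-byte sequence EF BB BF at the very start of the file
if file.Read(3) = $char(239, 187, 191) {
  write "UTF-8 BOM found", !
} else {
  write "No BOM; fall back to heuristics", !
}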

Use #server if you want to wait for a response, but be warned that JavaScript is single-threaded, and using #server with a left-hand side (LHS) variable blocks the current thread until the server responds.

If you don't specify a LHS, you can keep using #call, which tells the CSP Gateway to execute the request asynchronously.

More details here: http://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=...


If you need something closer to a callback, then do your callback on the server using &js< /* your JavaScript code here */ >. This way the server returns "runtime" JavaScript that executes the remaining operations on the client side.
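
Roughly, in a CSP page it could look like this (method and values are illustrative):

<!-- synchronous: blocks until the server returns a value -->
<input type="button" value="Sync" onclick="var total = #server(..Calculate(10))#; alert(total);">

<!-- asynchronous: no return value, does not block -->
<input type="button" value="Async" onclick="#call(..Calculate(10))#">

And on the server side of the same page class:

ClassMethod Calculate(value As %Integer) As %Integer
{
  // Callback-style: push JavaScript back to the browser when the call returns
  &js<console.log('finished on the server');>
  quit value * 2
}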
 

Try generating an error with code 5001: $$$ERROR($$$GeneralError, "your custom message").
This will give you "ERROR #5001: your custom message"; get the text with GetErrorText and then $replace the "ERROR #5001: " part with "".

Note that if you have multiple errors inside a single status you'll need to fetch and $replace them individually.
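
A minimal sketch of that approach, inside a class method (where the $$$ macros are available):

set sc = $$$ERROR($$$GeneralError, "your custom message")
set text = $system.Status.GetErrorText(sc)
// Strip the generic prefix so only the custom message remains
write $replace(text, "ERROR #5001: ", ""), !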

That's because, as far as I know, you can't register custom error codes. I would delegate the errors to your application layer instead.

By default, Base64Decode and Base64Encode are functions used to decode and encode plain strings.

Since you want to encode a stream, the encoding must continue from the last chunk position instead of assuming a new string each time; otherwise you'll get a corrupted result.

Here's how %XML.Writer outputs encoded binary data.

/// <method>WriteBase64</method> encodes the specified binary bytes as base64 and writes out the resulting text.
/// This method is used to write element content.<br>
/// Argument:<br>
/// - <var>binary</var> The binary data to output. Type of %Binary or %BinaryStream.
Method WriteBase64(binary) As %Status
{
  If '..InRootElement Quit $$$ERROR($$$XMLNotInRootElement)

  If ..OutputDestination'="device" {
    Set io=$io
    Use ..OutputFilename:(/NOXY)
  }

  If ..InTag Write ">" Set ..InTag=0

  If $isObject(binary) {
    Do binary.Rewind() Set len=12000
    While 'binary.AtEnd {
      Write $system.Encryption.Base64Encode(binary.Read(.len),'..Base64LineBreaks)
    }
  } Else {
    Write $system.Encryption.Base64Encode(binary,'..Base64LineBreaks)
  }

  If ..OutputDestination'="device" {
    Use io
  }

  Set ..IndentNext=0

  Quit $$$OK
}
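
Following the same idea outside %XML.Writer, a sketch that encodes a file stream in chunks (file names are illustrative; the chunk size stays a multiple of 3 so no padding is inserted mid-stream):

set source = ##class(%Stream.FileBinary).%New()
do source.LinkToFile("/tmp/picture.jpg")

set target = ##class(%Stream.FileCharacter).%New()
set target.Filename = "/tmp/picture.b64"

do source.Rewind()
while 'source.AtEnd {
  set len = 12000
  do target.Write($system.Encryption.Base64Encode(source.Read(.len), 1))
}
do target.%Save()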

As far as I know, %OnBeforeSave is used to prevent (abort) the save operation and to customize the %Save status.
You can't change any properties in that phase, as the changes won't be reflected: the payload has already been queued to be saved.

%OnAddToSaveSet is executed before the queueing phase, which allows you to overwrite property values.

Unless you need to deal with complex business rules, you should indeed use InitialExpression as Fabian suggested; you can even add ReadOnly to prevent the property's value from being edited.
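
A sketch of both routes (class and property names are just examples):

Class Sample.Order Extends %Persistent
{

/// Simple route: default value at creation; ReadOnly blocks later edits
Property CreatedBy As %String [ InitialExpression = "system", ReadOnly ];

Property UpdatedAt As %TimeStamp;

/// Complex rules belong here: changes made at this point are still
/// reflected in what gets saved (unlike %OnBeforeSave)
Method %OnAddToSaveSet(depth As %Integer = 3, insert As %Integer = 0, callcount As %Integer = 0) As %Status [ Private, ServerOnly = 1 ]
{
  set ..UpdatedAt = $zdatetime($ztimestamp, 3)
  quit $$$OK
}

}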

Unless you're after maximum performance, you could create a method that receives (or creates) a %SQL.Statement for the %File:FileSet query and, whenever "Type" is "D", calls itself recursively, passing that statement instance along.

Here's a use case where I applied that pattern:

Method SearchExtraneousEntries(
  statement As %SQL.Statement = "",
  path As %String,
  ByRef files As %List = "") As %Status
{
  // Create the FileSet statement only once and reuse it across recursive calls
  if statement = "" {
    set statement = ##class(%SQL.Statement).%New()
    $$$QuitOnError(statement.%PrepareClassQuery("%File", "FileSet"))
  }

  set dir = ##class(%File).NormalizeDirectory(path)
  set row = statement.%Execute(dir)
  set sc = $$$OK

  while row.%Next(.sc) {
    if $$$ISERR(sc) quit
    set type = row.%Get("Type")
    set fullPath = row.%Get("Name")

    if ..IsIgnored(fullPath) continue

    // Recurse into subdirectories, reusing the same prepared statement
    if type = "D" {
      set sc = ..SearchExtraneousEntries(statement, fullPath, .files)
      if $$$ISERR(sc) quit
    }

    if '..PathDependencies.IsDefined(fullPath) {
      set length = $case(files, "": 1, : $listlength(files) + 1)
      set $list(files, length) = $listbuild(fullPath, type)
    }
  }
  quit sc
}