Very nice tools, but I disagree with the majority of the reported "fatal problems" (especially in my projects :)). And who said that I cannot write $$$MACRO as a single statement? Where did you get that rule from?
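
Just to illustrate (a minimal sketch; the macro name, the include file and the called method are made up for illustration), nothing prevents a macro from expanding into a complete statement and being used exactly as one:

    ; MyMacros.inc - hypothetical include file
    #define ThrowOnError(%sc) if $$$ISERR(%sc) { throw ##class(%Exception.StatusException).CreateFromStatus(%sc) }

    ; application code - the macro call below is a single, self-contained statement
    set sc = ##class(Some.Package).SomeMethod()  ; hypothetical call returning a %Status
    $$$ThrowOnError(sc)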

So... it would require a lot of fine tuning to work properly according to the tastes of experienced Caché ObjectScript programmers. Java idioms ported as-is to ObjectScript just don't work.

^GLOBUFF is a nice tool for showing the current distribution of cached data (given the current memory limits, past activity and the current caching algorithm), but what is more important is to estimate the maximum working set of your application's own globals. Which might be quite big if you have a transaction-heavy application.

[Worth mentioning that ^GLOBUFF gives you one quite unique piece of information - the % of memory used by process-private globals, which is essential for SQL-heavy applications.]
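
For reference, this is how I usually run it (I believe the routine lives in the %SYS namespace; the exact prompts and output layout vary between versions):

    USER>zn "%SYS"
    %SYS>do ^GLOBUFF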

But for me it is always interesting to know: what amount of memory would be enough to keep all of your globals in memory? Use ^%GSIZE:

                Global Size Display of /xtreme/db/sber/
                             10:57 AM  Jul 10 2015

                     sber.Err     150           sber.data.AccountD 2833662
           sber.data.AccountI 1234223         sber.log.LoadAccount      88
    sber.log.LoadAccount1Proc      88         sber.log.Transaction      11
     sber.log.generateAccount       3    sber.log.generateAccount0       1
 sber.log.generateTransaction       4           sber.log.loadBatch       7
          sber.log.loadBatch0       1             sber.tmp.Account  999784

      TOTAL: 5068022

So we see about 5 million 8KB blocks are necessary to hold the whole application dataset; this may serve as an upper estimate of the memory needed for your application. [I intentionally omit here the needs of the routine buffer, which is usually negligible compared to the database buffers - it is very hard to find application code which, in its OBJ form, would occupy multiple gigabytes, or even a single GB.]
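
The TOTAL above is reported in 8KB database blocks, so a quick back-of-the-envelope conversion right in the terminal (plain arithmetic, nothing version-specific) gives:

    USER>write $justify(5068022 * 8 / 1024 / 1024, 0, 1)," GB"
    38.7 GB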

All in all, for the case shown above we started with 40GB of global buffers for this application, because we had plenty of RAM.

I'd expect it to be supported via this set of share tools,

but apparently it just sends the link to the given address. OK, you need a page scraper then - and the easiest tool for this I know of is uKeeper by Umputun (a very famous figure in Russian IT podcasting, one of the voices of the "Radio-T" podcast).

P.S.

It's quite normal for all "share" buttons to send just a link with a title and not the full content; they are all intended to be used in social networks, where text length limitations are the major factor.

If you need article content delivered to your email inbox, use RSS subscriptions and an RSS-enabled mailer. Like Microsoft Outlook :)

As has already been reported elsewhere, there is an RSS feed for the whole community portal - https://community.intersystems.com/rss

Also, Eduard has created a wider feed which includes both StackOverflow [InterSystems-related tags] and Community Portal content - http://www.rssmix.com/u/8177955/rss.xml (though it has a noticeable delay when exporting Community Portal items).

No, IIRC, the consensus was about a slightly different formula - sorting by Max(PostDate, LastCommentDate) DESC, which is quite different from "ordered by the latest posts, then posts with the latest comments, DESC".

Even a very old post should go to the top of the list in the Newest filtering mode after it gets a new comment. If it's hard to implement such a Max aggregation in Drupal, then I'd try to directly update the post date/time on each new comment...

IMVHO, threading is useful, very useful, even in such a micro-context as the discussion under a proposed answer on StackOverflow [as you introduce in this newer change].

Once you kill threading, you can easily introduce unnecessary confusion among participants, for whom it becomes harder to reply to the relevant comment.

  1. I hope threading will not be killed in the "Article" mode;
  2. I hope threading will be added to the Q&A mode later, because it would simplify discussions.

Yes, the %Projection mechanism is quite a convenient tool to automatically set things up upon a simple recompilation.

Nikita Savchenko (author of the mentioned UMLExplorer) is now working on an article for the HabraHabr site [in Russian] where he explains usage scenarios of the %Projection facilities in more detail. My assumption is that soon afterwards this post will be translated into English.

Here are a few use cases I was involved in lately (a minimal projection skeleton for the second one is sketched after the list):

  • in the package manager I want an easy-to-use facility which allows creating a package definition even for sources loaded directly from GitHub. Something similar to package.json, but handled by $system.OBJ.Load(). So the solution was to create a projection class which loads the package metadata definition at the end of package compilation - https://github.com/cpmteam/CPM/blob/master/CPM/Sample/PackageDefinition.cls.xml
    You inherit your class from CPM.Utils.PackageDefinition, insert the package.json-like definition as an XData block, and at the end of the compilation stage you not only have your classes compiled, but also have them registered in the package manager as part of the package [this will be necessary, for example, for package uninstall].
  • I've discovered lately that, while WebSockets can be made to work without any installation step (i.e. you simply call the appropriate CSP URL), for a REST handler you have to create a web application (https://community.intersystems.com/comment/3571#comment-3571). The easiest solution was to create a special projection which creates a nested web application upon recompilation - https://github.com/intersystems-ru/iknowSocial/blob/master/TWReader/Setup.cls.xml - so regardless of the namespace it was installed to, and the URL of the default CSP application assigned to that namespace, this nested REST handler is still readily available.
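
For the second case the skeleton looks roughly like this (a minimal sketch: the class names and the "/demo/rest" URL are made up, and the Security.Applications calls have to run in the %SYS namespace):

    /// Hypothetical projection that registers a web application for a REST
    /// handler whenever the consuming class is (re)compiled
    Class Demo.WebAppProjection Extends %Projection.AbstractProjection
    {

    ClassMethod CreateProjection(classname As %String, ByRef parameters As %String) As %Status
    {
        // remember the namespace the class is being compiled in
        set ns = $namespace
        new $namespace
        set $namespace = "%SYS"
        quit:##class(Security.Applications).Exists("/demo/rest") $$$OK
        set props("NameSpace") = ns
        set props("DispatchClass") = "Demo.RESTHandler"
        quit ##class(Security.Applications).Create("/demo/rest", .props)
    }

    ClassMethod RemoveProjection(classname As %String, ByRef parameters As %String, recompile As %Boolean) As %Status
    {
        new $namespace
        set $namespace = "%SYS"
        quit:'##class(Security.Applications).Exists("/demo/rest") $$$OK
        quit ##class(Security.Applications).Delete("/demo/rest")
    }

    }

    /// The REST handler just declares the projection: compiling it creates the
    /// web application, deleting or recompiling it removes or refreshes it
    Class Demo.RESTHandler Extends %CSP.REST
    {

    Projection WebApp As Demo.WebAppProjection;

    }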

Some time ago my favorite "lightweight" editor was Atom [1], and we even played with integrating it with Caché. But now, for multiple reasons, that project is stale: the community people who were doing the development are gone, and Atom has broken their API multiple times (and still has not fixed some fundamental issues, like the lack of a debugger API and various editor limitations).

So after all these years I believe that Microsoft's Visual Studio Code is a better, faster, and smarter incarnation of Atom nowadays (its editor is much faster, and it already has refactoring and a debugger API implemented). So, eventually, after the Atelier API is released, there will be better grounds for a resurrection of similar projects, with the server side already taken care of via the REST API...

[1] OK, OK, there is the Sublime Text editor, which was always much, much faster than Atom. But extensions there have to be written in Python (which I, unfortunately, dislike), so JavaScript-based systems (like Atom, Code or Brackets) are much preferable for me. At least from the "hacking" point of view.