My VSCode-ObjectScript extension supports the same versions as Atelier, i.e. Caché/Ensemble versions from 2016.1.

You can not only edit MAC routines and classes, but also compile and use many other features, and it supports IntelliSense as well. CSP support is very simple, only syntax highlighting as HTML. Since CSP pages are already stored as files, you can use the same folder in Caché and in VSCode, and everything should be OK. You can find details about the latest features here; use the arrow buttons to navigate to the previous release notes.

And now we also offer enterprise support, so you will get issues resolved faster and may get some priority in the implementation of new features.

When you have || in an Id, it usually means that you have an additional index on two or more fields, marked as IdKey. Something like this:

Class User.School Extends %Persistent
{

Property RollNo As %String;

Property Marks As %String;

Index RollNoMarks On (RollNo, Marks) [ IdKey, Unique ];

}

Here we have two properties that store the values separately, but both values together are used as the IdKey.

Now it is possible to check whether an object exists with ##class(User.School).%ExistsId(RollNo_"||"_Marks), or to open it by id with ##class(User.School).%OpenId(RollNo_"||"_Marks).

But in addition to these system methods, the compiler also generates new ones, named by combining the index name with "Exists", "Open", and so on. The call is a bit different: you pass each value separately, this way.

##class(User.School).RollNoMarksExists(RollNo, Marks)

##class(User.School).RollNoMarksOpen(RollNo, Marks, , .sc)

You can try any of these ways, and remember the %Status variable, which may help you understand what's happening:

do $system.OBJ.DisplayError(sc)

will decode the status.

Sure, we can discuss it here or on GitHub.

Yes, some IntelliSense options still need a connection to the server for now. For example, it's currently easier to get the list of methods directly from the server than to parse all files locally. But some things, such as commands and system functions, should work without a connection, as does Go to Definition when the file exists locally. It would be possible to load the METADATA info prepared for Atelier and use it as a cache for system classes, but that would still require a connection to a server at least once.

Rubens, thanks for the donation. 

You can just disable the connection with "objectscript.conn.active": false, and all activity with the server, including autosave, will be disabled.

Actually, I had this option previously but removed it recently. If you would like to have it back, just file an issue and I will add it.

It's cool that more and more extensions appear to add more features.

btw, are you going to Global Summit, so we could have a discussion there?

It is not actually a bug; it is mostly expected behavior. I'm not sure about Windows (just too lazy to check it there), but in *nix systems, folders and files act almost exactly the same way; the only difference is one flag, D for directories. And if you would like to check the difference, just look at the DirectoryExists method, also in the %Library.File class.
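As an analogy (in shell, not ObjectScript): the POSIX test command behaves the same way. The -e check succeeds for files and directories alike, and only -d inspects the directory flag, similar to the Exists/DirectoryExists pair:

```shell
# -e (exists) is true for files and directories alike;
# only -d checks the directory flag.
tmp=$(mktemp -d)          # scratch directory
touch "$tmp/plain_file"   # an ordinary file inside it

test -e "$tmp" && echo "dir exists"
test -e "$tmp/plain_file" && echo "file exists"
test -d "$tmp" && echo "dir is a directory"
test -d "$tmp/plain_file" || echo "file is not a directory"

rm -r "$tmp"              # clean up
```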

It's actually a very interesting question, and the answer mostly depends on what you really want to achieve.

First of all, you can redefine the Dockerfile name in docker-compose.yml:

version: '3.7'
services:
  myapp:
    build:
      context: .
      dockerfile: Dockerfile.test

or just in docker build:

docker build -f Dockerfile.test .

But there is another use case where you may need entirely different docker-compose.yml files, for example if you want to split running between production and development environments.

You can just do it this way (note that the -f flag goes before the subcommand):

docker-compose -f docker-compose.prod.yml up --build

docker-compose -f docker-compose.dev.yml up --build

But I would recommend a slightly different way, based on the ability to extend one YAML file with the content of another.

So we need the main docker-compose.yml:

version: '3.7'
services:
  iris: 
    extends:
      file: docker-compose.${MODE:-dev}.yml
      service: iris
    ports:
      - '${WEBPORT:-52773}:52773'

which in fact declares just the common settings that will be the same for dev and prod.

Then look at docker-compose.dev.yml; for example, I'm using the IRIS community edition for development:

version: '3.7'
services:
  iris: 
    image: store/intersystems/iris:2019.3.0.302.0-community
    volumes:
      - ./src:/opt/myapp/src

and docker-compose.prod.yml, with a licensed version for testing the production environment:

version: '3.7'
services:
  iris: 
    image: store/intersystems/iris:2019.3.0.302.0
    volumes:
      - ~/iris.key:/usr/irissys/mgr/iris.key

By default, when you start the docker-compose environment, it will use dev, because ${MODE:-dev} uses the environment variable MODE with dev as the default value. You can change it this way:

MODE=prod docker-compose up --build
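The ${MODE:-dev} syntax is plain shell parameter expansion (docker-compose uses the same rule for variable substitution), so you can see the behavior directly in any POSIX shell:

```shell
# ${MODE:-dev} expands to $MODE if it is set, otherwise to "dev"
unset MODE
echo "compose file: docker-compose.${MODE:-dev}.yml"   # -> docker-compose.dev.yml

MODE=prod
echo "compose file: docker-compose.${MODE:-dev}.yml"   # -> docker-compose.prod.yml
```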

But you can also create a .env file with values for any environment variables you would like to redefine. So, in our case it can look like this, with WEBPORT changed as well:

MODE=prod
WEBPORT=52774

It is not a big problem to do it with Caché, but I would recommend looking at different ways. Using Caché just as a reverse proxy looks like a sledgehammer to crack a nut.

First of all, you should start from the original source of users. I don't see it in your question, but it could be LDAP/AD or something else outside of Caché.

I would look at Nginx, for example, which is actually very flexible.
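As a minimal sketch of what delegating the reverse-proxy role to Nginx could look like: the server name, path, and the upstream address/port below are assumptions for illustration, not a tested configuration.

```nginx
# hypothetical nginx server block: Nginx terminates HTTP and proxies
# application paths to the Caché web server (address and port assumed)
server {
    listen 80;
    server_name example.local;

    location /csp/ {
        proxy_pass http://127.0.0.1:57772;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

Authentication against the external user source (LDAP/AD, OAuth, etc.) can then be handled in front of the application instead of inside Caché.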

Interesting: why did you duplicate lowercase and uppercase? And I'm not sure it's good to uppercase all letters in the transliterated variant when only one letter was uppercase in the original; I mean, Юла -> YUla looks weird. I think it should check the case of the original word: if it is completely uppercase, it should uppercase the resulting word, but if only the first letter is upper, the resulting string should use $zconvert(word, "W").
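The case rule I mean can be sketched in shell (using ASCII stand-ins, since tr is not Unicode-safe; apply_case is a hypothetical helper, not part of the original code): decide from the source word whether to uppercase the whole transliteration or only its first letter.

```shell
# apply_case SOURCE TRANSLIT: if SOURCE is entirely uppercase, uppercase
# the whole transliteration; otherwise title-case it (first letter only).
apply_case() {
  src=$1
  translit=$2
  if [ "$src" = "$(printf '%s' "$src" | tr '[:lower:]' '[:upper:]')" ]; then
    # source was all caps -> all caps result
    printf '%s' "$translit" | tr '[:lower:]' '[:upper:]'
  else
    # only the first letter was upper -> title-case the result
    first=$(printf '%.1s' "$translit" | tr '[:lower:]' '[:upper:]')
    rest=$(printf '%s' "$translit" | cut -c2-)
    printf '%s%s' "$first" "$rest"
  fi
  printf '\n'
}

apply_case "YULA" "yula"   # -> YULA
apply_case "Yula" "yula"   # -> Yula
```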

NPM does have the command npm init, which asks you for basic information about your package and generates a package.json.

Yes, some kind of init command sounds useful. You know, we have many differences from npm anyway: for instance, zpm works inside the database with nothing on disk, while npm works in the OS, close to the source files. But I think we can find the best way to achieve it.

It is already there; it should work the same as semver in npm.
Nice! Does it follow the same format as the one from NPM?

Yes, the same way

1 - Are there any plans to automate the module.xml generation by using something like a wizard?

Any reasons for it? Are you so lazy that you can't write this simple XML by hand? Just kidding. Not yet; I think the best and fastest thing I can do is add IntelliSense in VSCode for such files, to help make it easier. Any UI at the moment is just a waste of time, it is not so important. And anyway, is there any wizard for NPM?

2 - Are there any plans to support non-specific dependency versions like NPM does?

It is already there; it should work the same as semver in npm.

3 - Is it possible to run pre/post-install scripts as well? Kind of what installer classes do.

There is already something like this, but I would like to change how it works.

4 - Is it also possible to use the module.xml to provide a contextual root?

I'm not sure about a contextual root. But speaking of unit tests, yes, there are actually many things which should be changed in the original %UnitTest engine. In this case, though, zpm has a way to run tests without caring about the ^UnitTestRoot global. ZPM itself has its own module.xml; look there and you will find lines about unit tests. With such a definition, you run these commands, and they will run the tests in different phases:

zpm: USER>packagename test

zpm: USER>packagename verify