Actually, with my VSCode extension you are not supposed to need to sync your code. It assumes you have only one source of truth: your local sources. But you can export any code from the server, either into the folder where your sources are already placed or into any other. Just use the Export action in the context menu in Explorer, and it will export all your classes or routines.

When you try to open a file in the explorer and you already have that class or routine locally, it will show your local file instead of the server's file, because it does not expect any difference; as I said, your local sources take priority.

Any file opened from the server will be shown in read-only mode.

It actually does not matter where you install the drivers.

My Caché runs in Docker, so I downloaded the ODBC drivers from the FTP server,

extracted them (just in Downloads), and ran ODBCInstall there from a terminal.

Running ODBCInstall produced the file mgr/cacheodbc.ini, which I copied to ~/Library/ODBC/odbc.ini, where it acts as a User DSN (use /Library/ODBC/odbc.ini for a System DSN).
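For reference, a DSN entry in that odbc.ini looks roughly like this. Every value below is a placeholder: take the driver path, host, port, namespace, and credentials from your own setup, and double-check the exact field names and the driver file name against the generated cacheodbc.ini.

```ini
[ODBC Data Sources]
CacheDocker = InterSystems Cache ODBC

[CacheDocker]
; driver file name and location are assumptions: point this at the
; ODBC library inside the bin folder of the extracted drivers
Driver      = /Users/daimor/Downloads/ODBC-2018.1.2.309.0-macx64/bin/libcacheodbc.dylib
Description = Cache running in Docker
Host        = localhost
Port        = 1972
Namespace   = USER
UID         = _SYSTEM
Password    = SYS
```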

DYLD_LIBRARY_PATH should point to the bin folder inside the extracted ODBC drivers folder;

in my case 

export DYLD_LIBRARY_PATH=/Users/daimor/Downloads/ODBC-2018.1.2.309.0-macx64/bin

You can check the connection with iODBC Administrator, running it right from the terminal:

DYLD_LIBRARY_PATH=/Users/daimor/Downloads/ODBC-2018.1.2.309.0-macx64/bin open /Applications/iODBC/iODBC\ Administrator64.app

and open Excel the same way:

DYLD_LIBRARY_PATH=/Users/daimor/Downloads/ODBC-2018.1.2.309.0-macx64/bin open /Applications/Microsoft\ Excel.app

This way, you'll be able to test the settings without relogging in.

What do you have already?

Did you configure the ODBC DSN and DYLD_LIBRARY_PATH?

You can find some information about configuring ODBC here.

First of all, I think you need iODBC installed on your Mac,

then the Caché ODBC drivers, with the DYLD_LIBRARY_PATH environment variable set correctly; do not forget to relogin after setting the variable.

As proof that it's possible: done with Caché 2018.1.

Look at this:

Class User.Test
{

ClassMethod Test(Args...)
{
  ZWRITE Args
}

ClassMethod Test2(Arg1, Arg2, Arg3, Arg4, Arg5, Arg6, Arg7, Arg8)
{
  ZWRITE Arg1
  ZWRITE Arg2
  ZWRITE Arg3
  ZWRITE Arg4
  ZWRITE Arg5
  ZWRITE Arg6
  ZWRITE Arg7
  ZWRITE Arg8
}

}

and let's call it

USER>do ##class(Test).Test(1,2,,,,,,8)  
Args=8
Args(1)=1
Args(2)=2
Args(8)=8

and the second method:

USER>do ##class(Test).Test2(1,2,,,,,,8) 
Arg1=1
Arg2=2
Arg8=8

And another way:

USER>set args=8 ;just the number of the latest one                      

USER>set args(8)="test"

and the calls:

USER>do ##class(Test).Test(args...)
Args=8
Args(8)="test"

USER>do ##class(Test).Test2(args...)
Arg8="test"

It's not so important to have the key before install; it's much more important to have it when the server is running.

But how can you be sure that your key is suitable for this platform? You can check it on a running container: enter it and open a csession. The documentation describes some interesting methods of $SYSTEM.License which can help you check the license file inside the container.
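As a sketch, inside csession you could try something like this (method names worth double-checking against your version's class reference):

```
USER>do $SYSTEM.License.Help()              ; list the available methods
USER>do $SYSTEM.License.CKEY()              ; print details of the active key
USER>write $SYSTEM.License.KeyCustomerName()
USER>write $SYSTEM.License.KeyExpirationDate()
```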

The most common causes of such an error are a missing license file or an exceeded license limit.

Just check it; you can mount the key during docker run or copy it inside the image during docker build.
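For example, to bake the key into the image during docker build, one Dockerfile line is enough. The destination path here is an assumption: adjust it to your image's install directory, typically the mgr folder of the Caché installation.

```dockerfile
# hypothetical destination path: use your image's <install-dir>/mgr
COPY cache.key /usr/cachesys/mgr/cache.key
```

The docker run equivalent is a bind mount, e.g. -v $(pwd)/cache.key:/usr/cachesys/mgr/cache.key.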

I see that you use quite an old version; I would recommend considering the latest version based on IRIS, given the many limitations of running such an old version in Docker.

Some time ago I made an example of an Angular application with IRIS on the backend.

The source for this project is available on GitLab.

To learn how to develop an Angular application, you should look at the Angular documentation and at frontend development in general. There are tools which help you develop and build your frontend, such as webpack, which does most of the work of building your sources into a production-ready bundle.

For my simple project, you need only Docker and any editor, preferably VSCode.

With the command below, you will get a server running in development mode, so you can edit Angular code and IRIS code and see an immediate result:

docker-compose up -d

This project is also deployable with Kubernetes, so after any push of changes to GitLab, it will be built and tested.

node_modules never goes into source control; it's enough to have package.json and package-lock.json there. node_modules may contain platform-specific libraries.
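A minimal .gitignore for a project like this would be:

```
# dependencies are restored with `npm install` from package-lock.json
node_modules/
# build output (assuming the default Angular output folder)
dist/
```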

I hope I'll manage to write a complete article about this project, with all the details.