Thanks guys! I could not find helper links to these methods in the ADO.NET Managed Provider class Intersystems.Data.CacheTypes.CacheStatus, so I had to proxy these InterSystems methods via my own helper class:

Class App.Status.Helper Extends %RegisteredObject
{

ClassMethod GetOneStatusText(pStatus As %Status) As %String
{
    Quit $system.Status.GetOneStatusText(pStatus)
}

}


This is because, by default, when you have an error status, the CacheStatus.Message property contains a string formatted as ERROR #5001: CustomerID not valid
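As an aside, if you only need the error code and text on the client side, the message string can also be parsed directly. A quick sketch (the helper below is my own illustration, not part of the provider):

```javascript
// Illustrative helper: split a Caché status message such as
// "ERROR #5001: CustomerID not valid" into its code and text parts.
function parseStatusMessage(message) {
  const match = /^ERROR\s*#(\d+):\s*(.*)$/i.exec(message.trim());
  if (!match) return null;
  return { code: Number(match[1]), text: match[2] };
}

const parsed = parseStatusMessage("ERROR #5001: CustomerID not valid");
```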
 

There's one odd thing in the documentation (Caché 2018.1.3) under %SYSTEM.Status, and that is the method signatures for some of these methods:

classmethod GetOneErrorText(statuscode As %Status, index As %Integer, language) as %Boolean
classmethod GetOneStatusText(statuscode As %Status, index As %Integer, language) as %Boolean

Surely, with "Text" in the method name, the return type should be a %String? Why does it say %Boolean?!

Statistical performance metrics aren't what I'm looking for. Rather than a simple number or metric, I'm looking to actually map global structures, similar to what the journal does with sets and kills, but for a specific routine or set of routines. Also, some code doesn't execute but it is still important to identify its references to globals.

A few things you could try:

  1. Restore any changes you made to the registry using a backup.
  2. Use a system restore point to return your system to a point before the Caché installation.
  3. Use a newer build of Caché, e.g. Caché 2018.1.3.414, if you are able to.

Alternatively, you might want to contact InterSystems WRC directly, or try an installation on a clean system to see if you get the same error. We have a number of small teams in our organisation that use Caché. Our application support team wanted to simplify upgrades to Caché, so they designed a simple batch script and published it through SCCM for Windows 10 clients. The script was based on the 'unattended installation' commands described in the installation guide and involved removing previous Caché versions before installing the desired version. You also might not need the full kit for your needs, particularly if you are connecting to a remote Caché instance from the Windows 10 client and you use Atelier or VSCode for development.

Approach 1

I would be tempted to have your Dispatch class define a forwarding rule for the API version, e.g. v1 or v2. This helps ensure a clear hierarchical separation between versions, both in the URL and in the class definitions. You might also be interested in the %request.URL property for checking relative paths. An example based on your route map might look like

<Map Prefix="/v1/customer" Forward="MyApp.APIVersion1.Customer" />
<Map Prefix="/v2/customer" Forward="MyApp.APIVersion2.Customer" />

And your class MyApp.APIVersion1.Customer might look like

<Routes>
        <Route Url="/getcustomer" Method="Post" Call="GetCustomer" />
</Routes>

Personally, I like the classmethod names to reflect the HTTP method, so if I see a GetCustomer method I know it handles an HTTP GET; but this is personal preference and convention rather than a rule.

Approach 2

The alternative approach is to have everything in the same class, but over time this may cause your classes to become rather bloated and unwieldy:

<Route Url="/v1/customer" Method="Post" Call="GetCustomerV1" />
<Route Url="/v2/customer" Method="Post" Call="GetCustomerV2" />

Other Thoughts

I do not know if there's a specific function that can be called prior to the classmethod in the route map to validate or invalidate routes. Perhaps the OnPreHTTP method could be used? I noted that some of your methods had the word "default" in them. You can define a default route as "/" in your route map.

Strange. Our production servers are Caché 2018 on AIX but still show only 16,384 KB.
Upgrades must have preserved the existing setting rather than using the new value.
A fresh local Caché install on Windows shows 256 MB, though.

%SYS>w $zv                                                            
Cache for UNIX (IBM AIX for System Power System-64) 2018.1.2 (Build 309_5) Wed Jun 12 2019 20:08:03 EDT                                                        
%SYS>w $zs
16384         
 

Try this, based on npm link.

File Structure
/projects/my-scss
/projects/my-existingproject

Create a new project
cd /projects
mkdir my-scss
cd my-scss

Initialize the project and answer the prompts
npm init
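npm init writes a package.json into my-scss; a minimal sketch (field values here are illustrative) might look like:

```json
{
  "name": "my-scss",
  "version": "1.0.0",
  "description": "Shared SCSS partials"
}
```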

Dump a SCSS file in there

// _base.scss
$font-stack:    Helvetica, sans-serif;
$primary-color: #333;
body {
  font: 100% $font-stack;
  color: $primary-color;
}

Navigate to your existing project
cd ../my-existingproject
npm link ../my-scss

Verify the my-scss folder exists in the node_modules of your existing project.

Then suppose you want to get all the *.scss files in your my-scss project and put them in the /wwwroot/scss folder of my-existingproject. Gulpfile.js within my-existingproject would look something like

const { src, dest } = require('gulp');
const merge = require('merge-stream');

var deps = {
    "my-scss": {
        "**/*.scss": ""
    }
};

function scripts() {

    var streams = [];

    for (var prop in deps) {
        console.log("Prepping Scripts for: " + prop);
        for (var itemProp in deps[prop]) {
            streams.push(src("node_modules/" + prop + "/" + itemProp)
                .pipe(dest("wwwroot/scss/" + prop + "/" + deps[prop][itemProp])));
        }
    }

    return merge(streams);
}

exports.scripts = scripts;
exports.default = scripts;

Then, provided you have installed gulp and all the required gulp modules, run 'gulp' from the project directory command line. This will run the default task.

This might be a stupid answer but here goes! 

It sounds like you are looking for build tools to help you manage your CSS and JavaScript needs (minification, bundling, polyfills, CSS generation from SCSS, prefixing). Have you looked at Gulp or Grunt before? You can use NPM to get them. These tools have an SCSS/SASS module that you can import before writing your various build tasks.

You can also find good templates for these tools that you can use as 'boilerplate' code that you then customize for your needs. 

Gulp Template

Grunt Template

Another part of your question seemed to refer to using NPM to create local packages that you can then import into various projects. This article describes three solutions:

  1. npm link 
  2. npm install /absolute/path/to/project
  3. npm pack with npm install yourproject.tgz

You can install packages globally using npm install -g <package>, but this is generally used for CLIs.

Adding the scope 'offline_access' to the 'password' grant_type generates a refresh_token in the JSON response. 

endpoint: https://{{SERVER}}:{{SSLPORT}}/oauth2/token

{
  "grant_type": "password",
  "username": "test1",
  "password": "P@ssw0rd",
  "client_id": "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
  "client_secret": "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
  "response_type": "token",
  "state": "myapp",
  "scope": "myapp openid profile offline_access"
}
 

Response JSON

{
  "access_token": "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
  "token_type": "bearer",
  "expires_in": 180,
  "refresh_token": "7NJ7tQbFBLFcUftZr9j4n6o99Og03QeM6rx51L05eIU",
  "scope": "myapp offline_access openid profile",
  "account_enabled": 1,
  "account_never_expires": 1,
  "account_password_never_expires": 1,
  "change_password": 0,
  "comment": "Test User",
  "full_name": "test1",
  "roles": "%DB_CODE,createModify,publish"
}

So... if you detect that the access_token is no longer valid, you could try using the refresh_token to generate a new one without prompting the user for input. It seems a good idea to make the refresh_token lifetime significantly larger than the access_token lifetime. I will need to do more experimentation to find the ideal intervals and review the impact on license usage.
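As a sketch of that client-side logic (the helper and thresholds below are my own invention, not part of any Caché API), you can track expires_in from the token response and refresh slightly before expiry:

```javascript
// Illustrative token bookkeeping based on the response fields above.
// The clock is injectable so the behaviour is easy to test.
function makeTokenStore(now = () => Date.now()) {
  let expiresAt = 0;
  let refreshToken = null;
  return {
    store(response) {
      // expires_in is in seconds (180 in the sample response above)
      expiresAt = now() + response.expires_in * 1000;
      refreshToken = response.refresh_token || refreshToken;
    },
    // Refresh a little early so in-flight requests don't race expiry.
    needsRefresh(marginMs = 10000) {
      return now() >= expiresAt - marginMs;
    },
    getRefreshToken() {
      return refreshToken;
    }
  };
}
```

When needsRefresh() returns true, you would post the stored refresh_token with grant_type=refresh_token to the token endpoint, per the standard OAuth2 flow.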

This is a common problem. Please bear in mind your system config is specific to you, so what is described below may not be the answer.

Initially I used a 3rd-party app called CNTLM and pointed Eclipse to the CNTLM process port, which points to the corporate proxy, but I would no longer recommend this option as it doesn't account for passwords expiring regularly.

I later discovered that Basic proxy authentication was disabled by default as part of a JRE 8u111 update, under the heading 'Disable Basic authentication for HTTPS tunneling'. As the document describes, you can override this behavior either 1) globally on your machine, if you have the necessary permissions, or 2) locally, if Eclipse is installed on a file system to which you have write access.

Try changing your eclipse.ini file to include this after -vmargs:

-Djdk.http.auth.tunneling.disabledSchemes="" 

Leave your Network Connections set to 'Native'. Only HTTP Dynamic should be ticked.

I would not recommend updating from 1.0 to 1.3 because there have been so many changes since then, and projects will need to be migrated. It would be safer to try downloading a fresh install, following the instructions to install the plugin, and then test your 'Check for Updates' button.

Using a fresh workspace is also recommended.

On a different note, you can pass proxy login details in the target URL if you encode them properly. I've used this trick for Node Package Manager (NPM) configuration in the .npmrc file:

proxy=http://DOMAIN%5Cusername:password@myproxyserver.net:8080/
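The %5C is just the URL-encoded backslash in DOMAIN\username. If you want to build the value programmatically (the credentials below are placeholders), encodeURIComponent handles the escaping:

```javascript
// Percent-encode placeholder proxy credentials for a .npmrc-style URL.
// The backslash becomes %5C, '@' becomes %40 and ':' becomes %3A.
const user = encodeURIComponent("DOMAIN\\username");
const pass = encodeURIComponent("p@ss:word");
const proxyUrl = `http://${user}:${pass}@myproxyserver.net:8080/`;
```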

The other common issue you might encounter is 'PKIX path building failed'. This relates to HTTPS connections from the JRE running Eclipse and a missing CA certificate in your cacerts certificate store.

Consider logging the issue with WRC if you are looking for a more bespoke solution.

That looks a little complex. In contrast to the above example, the default $System.Encryption.GenCryptRand() size appears to be 8, as $L(user.Salt) resolves to 8. After a bit of experimentation, I found I didn't need to encode anything at all. In this example, I'm using user test1 with password P@ssw0rd on a non-Unicode 8-bit Caché installation. Use $SYSTEM.Version.IsUnicode() to check your installation.

Do ##class(Security.Users).Exists("test1",.user,.status1)
set storedHash = user.Password 
set computed=$System.Encryption.PBKDF2("P@ssw0rd", 1024, user.Salt,20,160)

Produces the following 20-byte hash using PBKDF2 with 1024 iterations, 64 bits of salt, and SHA-1 (160):

%SYS>zw storedHash
storedHash="n"_$c(138)_"z iSWWs"_$c(11)_"cbM"_$c(27)_"nY'"_$c(3,152)_"H"

%SYS>zw computed
computed="n"_$c(138)_"z iSWWs"_$c(11)_"cbM"_$c(27)_"nY'"_$c(3,152)_"H"

Thanks @Barton Pravin for clarifying scope and providing the code snippets! If you want the claim to be part of the Token endpoint response message, you can use Do properties.ResponseProperties.SetAt(roles,"roles").

Here's a sample of the ValidateUser customization:

 // Use the Caché roles for the user to set up a custom property.
Set sc=##class(Security.Roles).RecurseRoleSet(prop("Roles"),.roles)
If $$$ISERR(sc) Quit 0


Set roles=prop("Roles")
Do properties.CustomProperties.SetAt(roles,"roles") 

// SETUP CUSTOM CLAIMS HERE
Set tClaim = ##class(%OAuth2.Server.Claim).%New()
Do properties.ResponseProperties.SetAt(roles,"roles")
Do properties.IntrospectionClaims.SetAt(tClaim,"roles")
Do properties.UserinfoClaims.SetAt(tClaim,"roles")
Do properties.JWTClaims.SetAt(tClaim,"roles") 
Do properties.SetClaimValue("roles",roles)

And a sample of the REST API application

Class API.DemoBearerToken Extends %CSP.REST
{
Parameter APIHOST = "localhost";

Parameter APIPORT = 57773;

Parameter APIPATH = "/api/demobearertoken";

Parameter CHARSET = "utf-8";

Parameter CONTENTTYPE = "application/json";

Parameter OAUTH2CLIENTREDIRECTURI = "https://localhost:57773/api/demobearertoken/example";

Parameter OAUTH2APPNAME = "demobearertoken";

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
<Route Url="/getToken" Method="Get" Call="GetToken"/>
</Routes>
}


ClassMethod AccessCheck(Output pAuthorized As %Boolean = 0) As %Status
{

Set dayNum = $p($H,",",1)
Set timeNum = $p($H,",",2)


Set accessToken = ..GetAccessTokenFromRequest(%request) 
Set scope = "createModify openid profile publish" 
Set isValidToken=##class(%SYS.OAuth2.Validation).ValidateJWT(..#OAUTH2APPNAME,.accessToken,,,.jsonValidationObject,.securityParameters,.error)
Set ^LOG(dayNum,timeNum,$UserName,"API.DemoBearerToken",$ztimestamp,"AccessCheck")=$zdatetime(dayNum_","_timeNum,4,1,,,4)_"*"_isValidToken

Set:isValidToken=1 pAuthorized=1
Set:isValidToken=0 pAuthorized=0 
Quit $$$OK
}
ClassMethod GetToken() As %Status
{
#dim %response as %CSP.Response
Set %response.Expires = 86400
Set %response.Headers("Cache-Control") = "max-age=86400"

Set dayNum = $p($H,",",1)
Set timeNum = $p($H,",",2) 
Set ^LOG(dayNum,timeNum,$UserName,"API.DemoBearerToken",$ztimestamp,"GetToken")=$zdatetime(dayNum_","_timeNum,4,1,,,4) 
Set accessToken = ..GetAccessTokenFromRequest(%request)
Set scope = "createModify openid profile publish" 
Set valid=##class(%SYS.OAuth2.Validation).ValidateJWT(..#OAUTH2APPNAME,.accessToken,,,.jsonValidationObject,.securityParameters,.error) 

Set introspectionStatus=##class(%SYS.OAuth2.AccessToken).GetIntrospection(..#OAUTH2APPNAME,accessToken,.introspectionJSON) 
Set userInfoStatus = ##class(%SYS.OAuth2.AccessToken).GetUserinfo(..#OAUTH2APPNAME,accessToken,,.userInfoJSON) 
Set jsonResponse = {}.%Set("OAUTH2APPNAME",..#OAUTH2APPNAME)
Do jsonResponse.%Set("ValidateJWT",valid) 

Do jsonResponse.%Set("jsonValidationObject",jsonValidationObject)
Do jsonResponse.%Set("IntrospectionJSON",introspectionJSON)
Do jsonResponse.%Set("sc_userinfo",$$$ISOK(userInfoStatus)) 

If $$$ISOK(userInfoStatus) {
 Do jsonResponse.%Set("UserInfoJSON",userInfoJSON)
}

Write jsonResponse.%ToJSON()

Quit $$$OK 
}

}

And the Postman response for GET https://{{SERVER}}:{{SSLPORT}}/api/demobearertoken/getToken

{
  "OAUTH2APPNAME": "demobearertoken",
  "ValidateJWT": 1,
  "jsonValidationObject": {
    "jti": "https://localhost:57773/oauth2.mxn0URwYVkmaX9BSKHGIzISi-cI",
    "iss": "https://localhost:57773/oauth2",
    "sub": "test1",
    "exp": 1565708268,
    "aud": "xxxxxxxxxxxxxxxxxxxxx",
    "roles": "%DB_CODE,%Manager,createModify,publish"
  },
  "IntrospectionJSON": {
    "active": true,
    "scope": "createModify openid profile publish",
    "client_id": "xxxxxxxxxxxxxxxxxxxxxxxxxxx",
    "username": "test1",
    "token_type": "bearer",
    "exp": 1565708268,
    "sub": "test1",
    "aud": "xxxxxxxxxxxxxxxxxxxxxxx",
    "iss": "https://localhost:57773/oauth2",
    "roles": "%DB_CODE,%Manager,createModify,publish"
  },
  "sc_userinfo": 1,
  "UserInfoJSON": {
    "sub": "test1",
    "roles": "%DB_CODE,%Manager,createModify,publish"
  }
}
You can see 'roles' has been added to JWT, Introspection and UserInfo claim types. In my real-world application it's probably sufficient to add it to the JWT. 

Also, @Eduard Lebedyuk suggested using the AccessCheck method to verify the token in the post From Cache how to Retrieve and Use/Reuse a Bearer Token to authenticate and send data to REST web service? AccessCheck is called before anything else, so if the token is invalid or has expired, a 401 Unauthorized HTTP response is returned. A web application consuming this REST API can then process this unauthorized status and redirect the user to the login screen.

I can also see in the classmethod ##class(OAuth2.Server.Token).ReturnToken(client,token) that there is a section on adding customized response properties, but where are these set?

My AccessToken.ResponseProperties array appears to be empty

Set json.scope=token.Scope
// Add the customized response properties
Set key=""
For {
    Set value=token.Properties.ResponseProperties.GetNext(.key)
    If key="" Quit
    Set $property(json,key)=value
}

You could use https://www.slideshare.net/ or add the document to the GitHub repo.

There is a way to post documents on the InterSystems Community under Edit Post -> Change Additional Settings, which I documented here, but it's not user-friendly: I didn't automatically see links to attached documents within the post, so I had to add the links manually. Community feedback suggests they may turn this feature off at some point, so I'd recommend any of the above options instead.

You could also try a tool like Postman to test service calls and authentication methods. If you tick the Password checkbox, Basic authentication (plain-text username/password) is enabled. You can also use bearer tokens instead, which is a popular authentication scheme.

While authentication/authorization isn't really covered in great detail, REST and Relaxation is a good starting point for REST development, and it comes with a video and source code.

You should also double-check that your URL and resource permissions are correct. You probably only need permissions on the ENSEMBLE namespace, and there might be a resource that defines this. Your URL is probably something like http://yourserver/rest/coffeemakerapp/coffeemaker