· Mar 16, 2018

Translating data ready to push to and from a JSON REST API

Most of my JSON data is single words, but when it comes to sending chunks of a memo/email via JSON, that's totally different.

What is the correct conversion process to prepare data for insertion into the data part of a JSON string, so that all problem characters are converted?

JSON of course breaks if you use the wrong kind of quotes (single or double), and you end up having to use &quot; and \' to get around that. Is there a simple call to something that will just take care of preparing the data for the JSON string, on both the inward and outward side?


Discussion (13)

Hi Keven,

It can depend on what the data is and where (what technology) you are sending it from.

If you are using JavaScript, for instance, then calling its JSON.stringify() will auto-escape the obvious characters for you.

These are mainly the JSON reserved characters (escaped with a backslash, e.g. \" and \\) and control characters, which are escaped as \uXXXX sequences.
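For example, here is the round trip in Node.js (the memo text is made up):

```javascript
// Hypothetical memo text containing characters that would break
// hand-built JSON: double quotes, a newline, and a tab.
const memo = 'He said "hello",\nthen left a tab:\there';

// JSON.stringify escapes the reserved and control characters for us:
// the inner quotes become \", the newline \n and the tab \t.
const json = JSON.stringify({ body: memo });
console.log(json);

// Parsing it back gives the original text unchanged.
const parsed = JSON.parse(json);
console.log(parsed.body === memo); // true
```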

If you have binary data then you will want to Base64 encode it, which will make it JSON safe, or send the binary separately alongside the JSON, for instance in a multipart form-data request.
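A quick sketch of the Base64 approach in Node.js (the bytes here are made up):

```javascript
// A few bytes of made-up binary data (e.g. a small attachment).
const binary = Buffer.from([0x00, 0xff, 0x10, 0x80]);

// Base64 turns arbitrary bytes into a JSON-safe ASCII string.
const payload = JSON.stringify({ attachment: binary.toString('base64') });

// Decoding on the receiving side recovers the original bytes.
const decoded = Buffer.from(JSON.parse(payload).attachment, 'base64');
console.log(decoded.equals(binary)); // true
```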

I have a backwards-compatible (and battle-tested) library for handling JSON inside Caché that demonstrates encoding and decoding these characters; you can find it here...

I also have some updates to push to it which will auto-convert binary data in and out of JSON without hitting the 3.6 MB long string limitation.



Thanks for the quick update.

Technology-wise: Caché, QEWDjs, REST APIs using standard JSON, and JWTs. There will be about 100-150 APIs, and I'm trying to get the concept right before I rattle off the main bulk of them.

As long as I declare the JSON format, the third party will work with it.

I actually downloaded Cogs last night, having followed your earlier thread about it, and then imported JsonClass1.0.4.xml, but it generated errors (I've just re-compiled it so I could capture the error messages):
Compilation started on 03/16/2018 10:50:10 with qualifiers 'ckbry-u'
ERROR #5373: Class 'Cogs.Bookshelf.Page', used by 'Cogs.Lib.Json.Docs.JsonBenchmarks:superclass', does not exist
Skip class Cogs.Lib.Json.Docs.JsonBenchmarks
ERROR #5373: Class 'Cogs.Bookshelf.Page', used by 'Cogs.Lib.Json.Docs.Jsonclass:superclass', does not exist
Skip class Cogs.Lib.Json.Docs.Jsonclass
ERROR #5373: Class 'Cogs.Bookshelf.Page', used by 'Cogs.Lib.Json.Docs.Readme:superclass', does not exist
Skip class Cogs.Lib.Json.Docs.Readme
ERROR #5373: Class 'Cogs.Touchstone.TestSuite', used by 'Cogs.Lib.Json.Tests.JsonClass.JsonClassSuite:superclass', does not exist
Skip class Cogs.Lib.Json.Tests.JsonClass.JsonClassSuite
ERROR #5373: Class 'Cogs.Touchstone.TestClass', used by 'Cogs.Lib.Json.Tests.JsonClass.JsonClassTest1:superclass', does not exist
Skip class Cogs.Lib.Json.Tests.JsonClass.JsonClassTest1
ERROR #5373: Class 'Cogs.Touchstone.TestClass', used by 'Cogs.Lib.Json.Tests.JsonClass.JsonClassTest2:superclass', does not exist
Skip class Cogs.Lib.Json.Tests.JsonClass.JsonClassTest2
ERROR #5373: Class 'Cogs.Touchstone.TestClass', used by 'Cogs.Lib.Json.Tests.JsonClass.JsonClassTest3:superclass', does not exist
Skip class Cogs.Lib.Json.Tests.JsonClass.JsonClassTest3
ERROR #5373: Class 'Cogs.Touchstone.TestClass', used by 'Cogs.Lib.Json.Tests.JsonClass.JsonClassTest4:superclass', does not exist
Skip class Cogs.Lib.Json.Tests.JsonClass.JsonClassTest4
ERROR #5373: Class 'Cogs.Touchstone.TestClass', used by 'Cogs.Lib.Json.Tests.JsonClass.JsonClassTest5:superclass', does not exist
Skip class Cogs.Lib.Json.Tests.JsonClass.JsonClassTest5

  • Are you missing a couple of classes in the project?
  • How do I view the docs?

I was very impressed by the small 90-100 lines of code it takes.

I'm actually interfacing to a third party and busy creating the actual APIs (both into and out of the server).

I followed the way you did the SQL names etc.; I just couldn't see the syntax of the calls to Cogs to create and interpret the JSON strings.

I'm after smallish (10 KB) JSON strings for the memo/email side, so your current limitation is more than enough.


In terms of docs, there is a page here that you can read; it's in the git docs folder...

If you are planning on using the new REST classes in Caché then you might also be interested in some Cogs additions that are not too far off. These include a REST mapping generator with automatic Swagger / OpenAPI documentation generation.


Thanks for the updates and clues.

I'm about to update to the very latest Caché this weekend, and have had a special Caché build that allows us to use Node.js version 8.

There are lots of existing Caché classes to do the fetching and filing, as we had an old setup that sent simple CSV files to an external third-party server. Now it's all coming in house (with the associated short timescales, of course), and I need to hook into all these new realtime REST APIs to do the work.

I've been using EWD since its first incarnation, and the new QEWDjs just makes life easy.

I'd be interested in the new upgrade when it becomes available; any idea of the timeline?

most grateful,


Hi Keven,

The compilation is failing for the unit tests, which I included in the src folder but thought I had removed from the main build file. I was planning a new push, so I will tidy that up. For now you can ignore those particular errors; the main Cogs.JsonClass will not be affected.

If you are on a new version of Caché then you should check whether $ZCVT supports JSON. If you are currently rolling your own JSON and just want to escape it, then just do...
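A minimal sketch, assuming a Caché version (2015.2 or later) where $ZCONVERT has the "JSON" translation table:

```objectscript
 // Escape a string for embedding in a JSON value (output direction).
 set escaped = $ZCONVERT(value, "O", "JSON")

 // And unescape on the way back in (input direction).
 set raw = $ZCONVERT(escaped, "I", "JSON")
```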


Or, if you are targeting old and new versions, you can call the base EscapeJSON method on Cogs, which compiles by version to $ZCVT or its own escape method; you can call that with...
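Something along these lines; the exact class name is an assumption on my part, so check the Cogs source:

```objectscript
 // Hypothetical call: assumes EscapeJSON lives on the Cogs.JsonClass base,
 // which picks $ZCVT or its own escape method at compile time by version.
 set escaped = ##class(Cogs.JsonClass).EscapeJSON(value)
```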


I would imagine that if you are working with QEWD then you might be shuffling lots of data around in globals / arrays.

In this instance you might not want to use the Cogs JSON class, as you would end up having to shim in a whole new set of classes.

The new Cogs push does have more options now, including the ability to work with globals, arrays, and legacy classes (without extending them).

There are also the Zen utilities that might help, again these are only in the more recent versions of Caché.


The way I'd do this is to not do any JSON handling in Caché itself, but output the data into a temporary global whose structure reflects the desired JSON structure. Then, on the Node.js / QEWD side, point a DocumentNode object at the global and use its getDocument() method; that will generate a JS object containing your data. Finally, delete the global using the DocumentNode object's delete() method.
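As a rough illustration of the mapping getDocument() performs, here it is in plain JavaScript, with a made-up global dump standing in for a real DocumentNode:

```javascript
// Each entry is a [subscripts, value] pair, the way a temporary global
// like ^tmp(id, ...) might hold it; the data here is made up.
const globalDump = [
  [['to'], 'fred@example.com'],
  [['body', 'line1'], 'Hello Fred,'],
  [['body', 'line2'], 'See attached memo.']
];

// getDocument() effectively walks the subscripts and builds a nested object.
function toDocument(dump) {
  const doc = {};
  for (const [subs, value] of dump) {
    let node = doc;
    for (const s of subs.slice(0, -1)) {
      node = node[s] = node[s] || {};
    }
    node[subs[subs.length - 1]] = value;
  }
  return doc;
}

console.log(JSON.stringify(toDocument(globalDump)));
// {"to":"fred@example.com","body":{"line1":"Hello Fred,","line2":"See attached memo."}}
```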


So, I'm assuming this is in section 20 (and others) of your QEWDjs training material, whereby the global contains the structure you want in JSON, and then you use DocumentNode to get at the data inside JavaScript.

A couple of questions:

  • In Caché we would normally use $J in the global to keep it unique to the user, but we're using worker processes, so how do we keep the equivalent of $J in QEWDjs? (I'm going to be using JWTs, if that makes any difference.)
  • If I have synchronous APIs happening at the same time (say 5 workers running), I'm assuming we somehow need to keep track of which global node belongs to which API, so is it the worker process ID we use? Is it that simple?
  • If we do the equivalent of $J, how do you clear everything down? In EWD you kept session values, and when the session died/expired the session data was cleared. I know we should use the DocumentNode object's delete() method, but is there any catch-all if anything happens?