Hi Fernando,

Strange. I've never seen this before, and it seems like a bug to me. If no other suggestions appear here, I would reach out to the support team at InterSystems for advice.

If you are just defining classes and routines, VS Code is another alternative IDE. It is being developed further to bring it in line with the features of Atelier and provides another formidable option.

Of course, the more mature IDE is IRIS Studio, another alternative for you if you are running on a Windows client.

 Steve

Hi Wayne,

I'm not sure I can help you with an article, but for what it's worth, the error indicates there may be an issue with the SOAP security policies of the service.

For example, the mock service may be expecting to be called over HTTPS, but it is not.

To try to resolve this, it may be worth reviewing what the mock service expects security-wise and matching that with how it is being invoked.

Steve

Hi Chris,

I agree - note that the Scheduler basically STARTS or STOPS a business host automatically, on a pre-defined schedule (so it applies to Operations and Processes too, not just Services, which are the items that have the CallInterval feature).

For regular invocations of work on services, in almost all cases, absolutely - CallInterval is the way to go, and is what is used most. I would certainly prefer looking at the production and the status of my business hosts and seeing all of them 'green' and running - even though 'running' might actually mean 'idle in between call intervals'. Using the Scheduler, a stopped business host will appear 'gray' when it is not started (ie, it is disabled).

There are valid use cases though - a Schedule on, say, a Business Operation makes sense. For example, you may want to send messages to a business operation that interacts with a fee-per-transaction endpoint that is cheaper during certain times of the day. In this case, you can disable the operation, queue messages to it all day (which will accumulate in its queue), then, at the appropriate time, enable the business operation via the Scheduler, and disable it again after a period of your choice.

In this thread's case, the easiest approach is to use OnInit to prepare and send the data. OnProcessInput (called at the interval, which can be very long) would do nothing but quit. That would work. Of course, there are other approaches.

I wanted to include the Scheduler information as it is often overlooked; sometimes the full story and use case of the original poster is not evident, and the Scheduler might have been appropriate.

Thanks for your feedback.

Steve

Hi Eric,

If you want your service to be part of the framework, but not actually use any specific connection functionality typically offered by adapters (SQL, FILE, etc.), just ensure that the adapter is set to 'Ens.InboundAdapter':

Parameter ADAPTER = "Ens.InboundAdapter";

Also, make sure PoolSize is set to 1, so a job is started with the production. Note that for every cycle of the CallInterval setting, the OnProcessInput method will be called.

If you want to do your work regularly (ie: "go through a list of values in a global and compare dates. If criteria is met, it will send an email."), then do this in the OnProcessInput method at your desired CallInterval.

However - as you said "on start...", I'm assuming you meant on start of the production as a whole. In this case, leave the OnProcessInput method empty with just a

Quit $$$OK

statement, and (as others mentioned here) put the logic in the OnInit() method of your service, which will be invoked on production startup or when the service is (re-)enabled.

Note that without the ADAPTER parameter setting above, and the pool size set to 1, neither OnInit nor OnProcessInput is called.
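
Putting the above together, a minimal sketch of such a service could look like this (the class name is hypothetical, and PoolSize=1 is set on the item in the production configuration, not in the class):

Class Demo.Eric.StartupService Extends Ens.BusinessService
{

Parameter ADAPTER = "Ens.InboundAdapter";

Method OnInit() As %Status
{
    // Runs on production startup, or when the service is enabled
    // ... your logic here: go through the global, compare dates, send emails ...
    Quit $$$OK
}

Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
{
    // Called every CallInterval seconds; nothing to do in this pattern
    Quit $$$OK
}

}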

Now - productions are meant to keep on running. You may eventually move away from putting this logic in OnInit, or anywhere that requires a production restart in order to execute, as this affects other running business hosts. To explore other options further, you can:

(a) Work with the CallInterval, which calls OnProcessInput every n seconds, and build in logic that determines whether a particular cycle should do nothing, or (say, on the change of the day, or based on other controlling factors, like the size of your global entries) should go ahead and send the emails. Note that you can define properties on your business service to record state - initialise them in OnInit, and update them while the service is running if you need to (a rough sketch of this follows after item (b)).

(b) Look at the Scheduler feature. The Scheduler controls the running state of a business host; with it you can elect to enable/disable any service on a pre-defined schedule. So you can enable your service (with OnInit code to check globals and send emails) at an interval of your choice without needing to stop/start the production. See the documentation on the Schedule setting for details.
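
Regarding (a), here is a rough sketch (again with hypothetical names) of using a state property so that most CallInterval cycles do nothing:

Class Demo.Eric.DailyEmailService Extends Ens.BusinessService
{

Parameter ADAPTER = "Ens.InboundAdapter";

/// The last day (+$Horolog) on which the work ran
Property LastRunDay As %Integer [ InitialExpression = 0 ];

Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
{
    // Only do the real work once per calendar day; every other cycle is a no-op
    If +$Horolog = ..LastRunDay Quit $$$OK
    Set ..LastRunDay = +$Horolog
    // ... check the global, compare dates, send the emails ...
    Quit $$$OK
}

}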

Sincerely -

Steve

Hi

I believe this is a work in progress in the product - but I know of no ETA, so at the moment, everyone builds their own synchronisation techniques as other comments here explain.

* Disclaimer - this is not necessarily the 'Answer' you were looking for, but merely a comment you might find useful *

In the past I created a framework for doing this and more. I'll describe it here:

Using a pre-defined base class, one would create any number of subclasses, one for each type of data you wanted to synchronise (for example, a subclass for mirroring security settings), and in these subclasses implement only 2 methods:

- The first method, 'export', deals with collecting a set of data from wherever and saving it as properties of the class (for example, in this case the method would export all security settings and read the XML export back into a global character stream for persistence within the DB). These are persistent subclasses.

- The second method, 'import', is the opposite side, which would unpack the most recently collected data and sync it (for example, write the global character stream of the instance data to a temporary 'security.xml' file and run it through the system APIs to import those settings).

The persistent data created by the classes during the 'export' method call is saved to a mirrored database, so by default it becomes available on the other nodes for the 'import' invocation.

A frequently running scheduled task, executing on any mirror member (primary, secondary or async member), would iterate through the known subclasses and, based on that server's role, invoke either the 'export' or the 'import' method of each subclass (of course, the primary member calls only the 'export' method; the other roles call the 'import' method).
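
To make the shape of this more concrete, here is a minimal sketch of what such a base class might look like (all names are hypothetical and this is not the actual framework; the scheduled task would decide which method to call based on the member's role, e.g. via $SYSTEM.Mirror.IsPrimary()):

Class Sync.AbstractItem Extends %Persistent [ Abstract ]
{

/// Exported payload (e.g. an XML export), stored in a mirrored database
Property Payload As %Stream.GlobalCharacter;

/// Timestamp of the export, so the import side can pick the latest instance
Property ExportedAt As %TimeStamp;

/// Set to 1 only once the export has fully completed,
/// so an import never picks up a half-written record
Property ExportComplete As %Boolean [ InitialExpression = 0 ];

/// Implemented by each subclass: collect the data to replicate and save it
Method Export() As %Status [ Abstract ]
{
}

/// Implemented by each subclass: unpack the latest payload and apply it locally
Method Import() As %Status [ Abstract ]
{
}

}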

There are various checks and balances, for example, to ensure that only the latest instance data is imported on the import side in case some were skipped for some reason, and that no import executes midway - ie, it waits until an export has been flagged as complete.

I wrote this as a framework because I felt there is other data - not just the security data in CACHESYS - that would need replicating between members.

I did a fair amount of testing on the tools, and completed them around the time I heard InterSystems was working on a native solution, so I have not gone further in documenting/completing the framework. I wrote this for someone else, who ended up building a more hardcoded, straightforward approach, so it is not actually in production anywhere.

Steve

Don't see why not...

You've got to ask yourself - do you want to hit that website (which returns the full set) every 5 seconds? Probably not. I would hit it every hour and spend the time in between hits going through and updating the documents in the document database.

It's your choice whether to pause operations, delete all documents, and upload all documents every n seconds as a whole - that would be an easy approach. I think, however, you can get clever: identify an element that can act as a key for you, and use it to extract individual documents and update them with changes, then insert new ones. Keeping track of rows inserted and updated with each cycle via some 'last updated' property will also allow you to purge any rows which have been deleted and should no longer appear in your collection.

The above seems like a good approach; your use case may dictate a slightly different one. I'm not sure there is a technical question here. Technically, you will call the web site for the batch content in the same way, and, given the properties you already set up via %CreateProperty, you can run an SQL query to extract an individual document for updating/deleting.
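
For example, a rough sketch of the 'extract and update an individual document' step, assuming a key property called 'mbid' was exposed via %CreateProperty and the projected table is called MyApp.Artists (both names are illustrative; check the actual class/table your document database projects, and that your version exposes a %DocumentId column and accepts a document ID as the second argument of %SaveDocument):

// Look up the internal document ID for a given artist key (here the 'mbid' column)
Set stmt = ##class(%SQL.Statement).%New()
Do stmt.%Prepare("SELECT %DocumentId FROM MyApp.Artists WHERE mbid = ?")
Set rs = stmt.%Execute(artistMbid)
If rs.%Next() {
    // Replace the existing document with the freshly downloaded JSON for this artist
    Do db.%SaveDocument(newArtistJson, rs.%Get("%DocumentId"))
}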

Steve

Hi,

Sorry for not getting back to you sooner - I was on a flight.

You are on the right track. You just need to understand the makeup  of the data returned.

So - to recap - in IRIS you are preparing a Document database - a collection of JSON documents. Each document represents an 'artist', with 'name', 'playcount', 'listeners' and other  properties.

The JSON string for one such entry (artist), or document in the collection, would look something like this; it is actually embedded in the whole JSON your HTTP request returns:

{
        "name": "Tough Love",
        "playcount": "279426",
        "listeners": "58179",
        "mbid": "d07276bc-3874-4deb-8699-35c9948be0cc",
        "url": "https://www.last.fm/music/Tough+Love",
        "streamable": "0",
        "image": [
          {
            "#text": "https://lastfm-img2.akamaized.net/i/u/34s/3fa24f60a855fdade245138dead7ec...",
            "size": "small"
          },
          ...
        ],
        ...
}

If you extract each artist document in the collection, you can insert it into the database individually like this:

do db.%SaveDocument({"name":"Tough love","playcount":"279426",... })

The %SaveDocument method takes a single JSON dynamic object and inserts it into the collection. The whole JSON blob goes into the document column of the projected table, and elements like 'name' that were specifically created as columns via %CreateProperty will be individually populated as column values.
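
For reference, those column-backed properties come from one-off setup calls along these lines (the database name 'MyApp.Artists' and the exact property list are just illustrative - use whatever you actually created):

// One-off setup: create the document database and expose selected JSON elements
// of each document as real columns on the projected table
Set db = ##class(%DocDB.Database).%CreateDatabase("MyApp.Artists")
Do db.%CreateProperty("name", "%String", "$.name")
Do db.%CreateProperty("playcount", "%Integer", "$.playcount")
Do db.%CreateProperty("mbid", "%String", "$.mbid")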

But, as mentioned earlier, the output from your HTTP call returns JSON which, a few levels deep, has a collection of 'artist' documents:

{
  "artists": {
    "artist": [           <-- This is where the collection of artists starts from
      {
        "name": "Tough Love",
        "playcount": "279426",

Here are two approaches:

1. Option #1 - iterate through the correct part of your returned data to extract each artist individually (I prefer option #2 below).

Using the whole returned JSON, access the element 'artists', and then its property 'artist'. This element (represented by the path "artists.artist" - poorly named imho) is actually the collection. Use an iterator to iterate through each item in the collection.

set wholeJSON={}.%FromJSON(httprequest.HttpResponse.Data)
set iArtist=wholeJSON.artists.artist.%GetIterator()  // iArtist is the iterator for the collection of artist JSON's
while iArtist.%GetNext(.key,.value) {
    // key is the item number in the collection
    // value is a dynamic object of the item in this collection
    do db.%SaveDocument(value)  //  insert 1 artist at a time.
}

2. As you have discovered, you can use db.%FromJSON to import a whole collection of documents in one hit, but what you supply should be a string or stream in JSON format representing an array of documents, which the raw HttpResponse.Data is not, because of the leading 'artists' element, etc. But you can dive in and get the array:

set wholeJSON={}.%FromJSON(httprequest.HttpResponse.Data)
set arrArtists=wholeJSON.artists.artist   // this is the collection of artists
do db.%FromJSON(arrArtists.%ToJSON())  // need to give %FromJSON a json string.

.. and in one GULP, ALL artist documents, ie all items in the collection, are added into the document database (I tried this - I can confirm 1000 rows were created).

Use option 1 if you want to filter the injection of data into your document database, or option 2 if you want to do a batch upload in one hit.

Let us know how you get on...

Steve

Sure thing...

Use the %Net.HttpRequest class to make the HTTP request, and take the response's data:

Set httprequest=##class(%Net.HttpRequest).%New()
// Server takes just the host name, without the http:// scheme
Set httprequest.Server="ws.audioscrobbler.com"
Set URL="/2.0/?method=chart.gettopartists&api_key=65218c8cdd03ba3836f9fc8491fb6957&format=json&limit=1000&page=10"
Do httprequest.Get(URL)

The httprequest object has a property 'HttpResponse', now containing the HTTP response. The HttpResponse, in turn, has a stream property 'Data' containing the entire HTTP response body.

So you could read off this Data stream to get your raw JSON and set up a jstring variable; however, as I see you want to call db.%FromJSON, and that method takes a stream object anyway, you can skip setting up the jstring variable and just do this directly, which should work:

DO db.%FromJSON(httprequest.HttpResponse.Data)

 

Steve

Thanks - I'm going to use COUNT(). (I was aware of the option of introducing a subclassed adapter, but want to keep my design as simple as possible.)

For those who might be following the thread - I thought I'd post a more detailed write-up of the use case and proposed solutions, just for educational purposes.

Problem: I want to group multiple rows of my query into fewer Ensemble messages that get submitted. That is, my query might return rows:

1- A
1- B
1- C
2- A
2- B
...

I want to send only 2 Ensemble messages: the first with a list property containing 1A, 1B and 1C; and the second with the same list property containing items 2A and 2B.

I had this working by collecting the 'current' first-column value in an instance property of the service and checking for when the first column value changes (ie, when the first column value goes from 1 to 2, I need to submit the first message) - but the problem was that message #2 would never get submitted.

Solution #1: Use COUNT() and Business Service Instance properties.

- I'll change the query to return a COUNT() column (thanks for reminding me, Eduardo), which MUST be the number of rows of the query, independent of any state data that Ensemble may be holding on to, like recently processed IDs, etc.
- I'll add a new property on the service called CurrentROW.
- Every invocation of OnProcessInput will increment the property 'CurrentROW'.

I will use the existing logic to fire off Ensemble messages at the correct row intervals, but I will also include a check to see if pInput.Get("ROWCOUNT")=..CurrentROW, confirming I'm on the last row; in that case, fire off the remaining Ensemble message at the end, then set ..CurrentROW back to 0. A rough sketch of this is below.
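
A rough sketch of Solution #1 inside the existing service class (assuming it is driven by EnsLib.SQL.InboundAdapter and receives one EnsLib.SQL.Snapshot row per call, as the pInput.Get() usage above suggests, and that the query now includes a ROWCOUNT column):

/// Which row of the current batch we are processing
Property CurrentROW As %Integer [ InitialExpression = 0 ];

Method OnProcessInput(pInput As EnsLib.SQL.Snapshot, Output pOutput As %RegisteredObject) As %Status
{
    Set ..CurrentROW = ..CurrentROW + 1
    // ... existing logic: accumulate this row, and when the value of the first
    //     column changes, send the Ensemble message built so far ...
    If pInput.Get("ROWCOUNT") = ..CurrentROW {
        // Last row of the batch: send whatever is still accumulated, then reset
        // ... send the final Ensemble message ...
        Set ..CurrentROW = 0
    }
    Quit $$$OK
}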

Solution #2 (thanks Gilberto): use the support of a Business Operation.

- Modify the query to return only distinct rows on the first column - hence only 2 rows - leaving the collection properties of the Ensemble message empty.
- In my current business process, make a Business Operation call *back* into the database to get the 'child' rows (in the case of ID 1, this second query will return A, B and C).
- Add these to the list properties of the BP request message, and continue normal processing.

Thanks for the ideas ...

Steve

Hi Laura,

I would declare the class property without the underscore, as you have tried, given the way the generated code (in the class ending in ...Thread1.cls) interprets this and generates code.

Having said this, I'm interested in knowing how you are going about generating the JSON string. This is probably where you need to focus your efforts and set a JSON element of "status_id".
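
For example, if your version supports dynamic objects (roughly 2016.2 and later), you can keep a legal property name on the class and still emit the underscored key when you build the JSON (the property name here is just illustrative, and this runs inside a method of the class that has that property):

// ObjectScript property names can't contain "_", but JSON keys can,
// so map the property to the underscored key when building the output
Set json = {}
Set json."status_id" = ..statusid
Set jsonString = json.%ToJSON()   // -> {"status_id":"..."}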

How are you generating your JSON, and what version of Ensemble are you on? Answers to these questions will help others looking at your post to contribute a solution.

Thanks - Steve