Do you have a server connection defined?
Tried that; it's not the best idea. Tracing and debugging become much harder, and you can't query as easily via SQL.
Usually I work with JSON input in Ensemble like this:
Just one point: it's Angular 2, not AngularJS. I think AngularJS could be delivered by CSP, but I doubt that would be the best way to deliver Angular 2.
What's the difference between AngularJS and Angular2 in regards to CSP?
AngularJS is the first version; Angular 2-5 are collectively referred to as Angular and are somewhat compatible among themselves. @Sergei Sarkisian?
Connection is not license. From docs:
Maximum number of processes (connections) per user is 12.
Will this affect our own internal tasks?
It shouldn't, as that would be one connection and one license slot. Still, I'd recommend contacting WRC to clarify the issue completely.
source code in your posts
Check this article. No external tools required. More articles about how to get the most from the community are available via the link at the bottom of the page:

Recently I built a REST API; one typical client for it generated dozens of requests per second. Testing locally with one client, I usually got 10 to 20 thousand exceptions in a few minutes per "run" of the client.
In the end I wrote a custom logging system, because even one minimal run hung the UI.
Code.
I'd like to add that you need to purge the log often; in my experience the UI becomes sluggish after about 10,000 entries per namespace per day.
The CSP part (REST for sure and maybe SOAP) could be managed by these two tools:
Additionally, you can use the ^%ZSTART routine to check whether a user can log in.
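A minimal sketch of such a check in ^%ZSTART (the IsAllowed helper is hypothetical; replace it with your own logic):

```
%ZSTART ; system startup/login hooks
    quit
LOGIN ; runs for every login
    // $USERNAME holds the authenticated user;
    // IsAllowed is a hypothetical helper, not a system function
    if '$$IsAllowed($username) {
        write "Access denied",!
        halt  // end the process, effectively rejecting the login
    }
    quit
IsAllowed(user)
    // replace with a real check (role, time of day, etc.)
    quit (user '= "blocked_user")
```

Note that code in the LOGIN entry point runs for every login, so keep it fast and defensive.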
Execute queries and show tables
State is not required (and in fact harmful) for that case. Here's why.
Users do not care for thousands of results.
That doesn't happen. Users most often care about one specific result, or a small group of them: a dozen or two, rarely up to a hundred. To be extremely generous, let's say a user cares about 1,000 individual records at the same time, tops.
So, how do we work with that assumption?
There are several things to do.
One other case where users might actually be interested in the exact number of results is when they need that number only. For example, a user might be interested in the number of incidents per month and calculate it themselves by filtering incidents by date and getting the result count. This requirement could be addressed in several ways:
tl;dr users are not interested in thousands of results, so we need to build systems where they get only the results they really need.
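For example, page-by-page retrieval can be sketched like this in Caché SQL (the News table and its columns are hypothetical; %vid numbers the rows of the enclosing subquery):

```
/// Return one page of news (pages of 10), newest first.
/// News, Title and PubDate are illustrative names.
ClassMethod GetNewsPage(page As %Integer = 1) As %Status
{
    set from = (page - 1) * 10 + 1
    set to = page * 10
    // %vid numbers the subquery rows, so only the requested page is fetched
    set query = "SELECT * FROM (SELECT TOP ALL Title, PubDate FROM News ORDER BY PubDate DESC) WHERE %vid BETWEEN ? AND ?"
    set statement = ##class(%SQL.Statement).%New()
    set sc = statement.%Prepare(query)
    quit:$$$ISERR(sc) sc
    set result = statement.%Execute(from, to)
    while result.%Next() {
        write result.%Get("Title"), !
    }
    quit $$$OK
}
```

This way the server never materializes thousands of rows for a user who only looks at ten.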
The call to
$$setStream^%apiGTW(QHandle,RawData,0,1)
is roughly equivalent to:
#dim gc As %SQLGatewayConnection
#dim RawData As %Stream.Object
//set sc = gc.ParamData(hstmt, .index) //not sure if it's required
while 'RawData.AtEnd {
    set string = RawData.Read(16000)
    quit:string=""
    set sc = gc.PutData(hstmt, string) // PutDataW
}

Examples are available in the EnsLib.SQL.Common class:
Method putLOBStream(pHS As %String, pStream As %Stream.Object, tBin As %Boolean) As %Status
{
    Set tSC=..%Connection.ParamData(pHS,.tInd)
    Set:$$$SQLCODENeedData=..%Connection.sqlcode tSC=$$$OK
    Quit:$$$ISERR(tSC) tSC
    Set temp=pStream.Read(16000)
    If (temp = "") {
        Set err=..%Connection.PutData(pHS,"")
    } Else {
        While ""'=temp {
            If ..IsUnicodeDLL&&'tBin {
                Set tSC=..%Connection.PutDataW(pHS,temp)
            } Else {
                Set tSC=..%Connection.PutData(pHS,temp)
            }
            Quit:$$$ISERR(tSC)
            Set temp=pStream.Read(16000)
        }
    }
    Quit tSC
}

All brokers effectively have Parameter UseSession = 1;
But the default value in %CSP.REST is Parameter UseSession As BOOLEAN = 0;
So the developer has to remember to override this every time (or sub-class)
I recommend having an abstract broker, which does technical stuff like CORS, UseSession, Encoding, JSON transformation. All other brokers must extend it (and they don't do technical stuff, only business logic). More on that.
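A minimal sketch of such an abstract broker (the class name is illustrative; concrete brokers would extend it and define only their own UrlMap):

```
/// Abstract REST broker: technical concerns only.
/// Concrete brokers extend this class and contain only business logic.
Class MyApp.REST.Abstract Extends %CSP.REST [ Abstract ]
{

/// Reuse one session (and one license slot) across requests
Parameter UseSession = 1;

/// All responses are JSON
Parameter CONTENTTYPE = "application/json";

Parameter CHARSET = "UTF-8";

/// Let %CSP.REST answer CORS preflight requests
/// (tighten per-route as your security policy requires)
Parameter HandleCorsRequest = 1;

}
```

Concrete brokers then only need `Extends MyApp.REST.Abstract` plus their route map, and the technical defaults cannot be forgotten.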
Use the same GroupById
If two (or more) web applications share the same GroupById value, then a session opened in one application is valid in all other applications with the same GroupById value. This way the user needs to log in only once (or not at all, if domain/SSO authentication is configured). It's also documented (but hard to find); more docs.
But if it's a third party app then there is no CSP/ZEN app - the use case I have in mind is a 3rd party web developer is creating a complex shop system that needs to communicate with Caché
A CSP app could contain only HTML/JS/CSS. So it could be an AngularJS web application, but also hosted by Caché (as a Caché CSP web application).
I have no idea or interest in what technology they are using and it may be that their programming language does not easily support cookies so the CSPCHD (the session cookie) does not get passed.
If you host a web application via Caché (Ensemble, HealthShare, InterSystems IRIS), then your web application is authorization-aware automatically. The browser sends the relevant cookies/headers with each request, so developers don't need to think about it.
I am thinking that in this case the authentication needs to be passed with each REST call - not an issue.
(or use OAUTH which I know little about)
You can do it like this:
This way you pass login/password only once instead of with every call. But I'd still recommend the Caché security system.
Security (link under construction)
Using JSON - how do you implement a logon? What are the licensing issues?
For password authenticated web applications it is possible by following these steps:
If all these conditions are met, user would only consume one license slot per session and perform only one login.
How do you prevent users hacking restful calls that they have no access to?
Authentication as a start, SQL security for basic data-access checks, and app-level checks for the most specific cases.
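For the app-level checks, something like this sketch can guard an endpoint (the MyApp_Admin resource name and the handler are illustrative):

```
/// Example REST handler method guarding a destructive endpoint.
/// "MyApp_Admin" is an illustrative resource name.
ClassMethod DeleteRecord(id As %Integer) As %Status
{
    // Require USE on the resource for the current user
    if '$SYSTEM.Security.Check("MyApp_Admin", "USE") {
        do ..ReportHttpStatusCode(..#HTTP403FORBIDDEN)
        quit $$$OK
    }
    // ... actual deletion, still subject to SQL/object security ...
    quit $$$OK
}
```

SQL security then acts as a second line of defense: even if a check is missed at the app level, the user's SQL privileges still limit what data the call can touch.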
Sessions or No Sessions
REST mainly disallows sessions as a mechanism of data transfer. Statelessness, as defined in Roy Fielding's REST dissertation, has three aims:
Visibility is improved because a monitoring system does not have to look beyond a single request datum in order to determine the full nature of the request.
In my opinion that's the most important one, and it mainly concerns storing session data between requests.
Let's say you provide a newsfeed API. There's a lot of news so users get them by pages of 10 elements. Clients can access news in two different ways:
Reliability is improved because it eases the task of recovering from partial failures. Where partial failure is defined as (from Waldo J,
Wyant G, Wollrath A, Kendall S. A Note on Distributed Computing.):
Partial failure is a central reality of distributed computing. Both the local and the distributed world contain components that are subject to periodic failure.
In the case of local computing, such failures are either total, affecting all of the entities that are working together in an application, or detectable by some central resource allocator (such as the operating system on the local machine).
This is not the case in distributed computing, where one component (machine, network link) can fail while the others continue. Not only is the failure of the distributed components independent, but there is no common agent that is able to determine what component has failed and inform the other components of that failure, no global state that can be examined that allows determination of exactly what error has occurred.
In a distributed system, the failure of a network link is indistinguishable from the failure of a processor on the other side of that link.
Sessions as an authentication/authorisation mechanism do not affect Reliability.
Scalability is improved because not having to store state between requests allows the server component to quickly free resources, and further simplifies implementation because the server doesn't have to manage resource usage across requests.
This is not relevant in our case, as a session still gets created, just destroyed immediately after the request is done.
To sum up: REST APIs can use sessions as authentication mechanism, but not as a data transfer mechanism.
Atelier can now be installed as a plugin; check the installation instructions. If you already have Eclipse, skip to step 2.
Maybe remove $zf calls?
Here's sample code:
ClassMethod Test()
{
    //Create new Gateway connection object
    set gc=##class(%SQLGatewayConnection).%New()
    if gc=$$$NULLOREF quit $$$ERROR($$$GeneralError,"Cannot create %SQLGatewayConnection.")

    //Make connection to target DSN
    set pDSN="Cache Samples"
    set usr="_system"
    set pwd="SYS"
    set sc=gc.Connect(pDSN,usr,pwd,0)
    if $$$ISERR(sc) w !, $SYSTEM.OBJ.DisplayError(sc) QUIT 0
    if gc.ConnectionHandle="" W !, $$$ERROR($$$GeneralError,"Connection failed") QUIT

    set sc=gc.AllocateStatement(.hstmt)
    if $$$ISERR(sc) w !, $SYSTEM.OBJ.DisplayError(sc) QUIT 0

    //Prepare statement for execution
    set pQuery= "select Name, DOB from Sample.Person WHERE Name %STARTSWITH ?"
    set sc=gc.Prepare(hstmt,pQuery)
    if $$$ISERR(sc) w !, $SYSTEM.OBJ.DisplayError(sc) QUIT 0

    //Bind Parameters
    set sc=gc.BindParameter(hstmt,1,1,1,12,30,0,30)
    if $$$ISERR(sc) w !, $SYSTEM.OBJ.DisplayError(sc) QUIT 0
    set var = "A"
    set sc=gc.SetParameter(hstmt,$LB(var),1)
    if $$$ISERR(sc) w !, $SYSTEM.OBJ.DisplayError(sc) QUIT 0

    //Execute statement
    set sc=gc.Execute(hstmt)
    if $$$ISERR(sc) w !, $SYSTEM.OBJ.DisplayError(sc) QUIT 0

    //Get list of columns returned by query
    set sc=gc.DescribeColumns(hstmt, .columnlist)
    if $$$ISERR(sc) w !, $SYSTEM.OBJ.DisplayError(sc) QUIT 0

    //display column headers delimited by ":"
    set numcols=$listlength(columnlist)-1 //get number of columns
    for colnum=2:1:numcols+1 { Write $listget($listget(columnlist,colnum),1),":" }
    write !

    //Return first 20 rows
    set sc=gc.Fetch(hstmt)
    if $$$ISERR(sc) w !, $SYSTEM.OBJ.DisplayError(sc) QUIT 0
    s rownum=1
    while ((gc.sqlcode'=100) && (rownum<=20)) {
        for ii=1:1:numcols {
            s sc=gc.GetData(hstmt, ii, 1, .val)
            w " "_val
            if $$$ISERR(sc) break
        }
        s rownum=rownum+1
        write !
        set sc=gc.Fetch(hstmt)
        if $$$ISERR(sc) break
    }

    //Close cursor and then disconnect
    set sc=gc.CloseCursor(hstmt)
    if $$$ISERR(sc) w !, $SYSTEM.OBJ.DisplayError(sc) QUIT 0
    set sc=gc.Disconnect()
    quit sc
}

Check the BindParameter method and EnsSQLTypes for SQL type macro definitions. Varchar, maybe.
You can also try to call PutData method and similar methods (by passing parts of stream there).
You can convert object into json:
set sc = ##class(%ZEN.Auxiliary.jsonProvider).%ObjectToJSON(obj)
And work with that.
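A minimal usage sketch (assuming the Sample.Person class from the SAMPLES namespace; the ID is illustrative):

```
// Open a persistent object and write it to the current device as JSON
set obj = ##class(Sample.Person).%OpenId(1)
set sc = ##class(%ZEN.Auxiliary.jsonProvider).%ObjectToJSON(obj)
if $$$ISERR(sc) write $System.Status.GetErrorText(sc)
```

%ObjectToJSON writes to the current device by default, which is convenient inside a CSP/REST handler where the device is the HTTP response.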
So I gather that you HAVE to pass in a list of IDs to export.
Yes.
The problem with write *-3 is that it is currently added to a system code. I'm searching for a better solution.
Can you post XML file you're trying to import?
Looks like it's empty.
It's on Cache side as adding
Write *-3
fixes the problem.
What error are you getting from Import method?
#2 limits you to 2048 characters.
Nowadays it's fine. Most browsers work with long URLs.
So is your redirect just a URL, or URL?urlparams=......
It's URL?urlparams=......
One parameter is 5000+ characters long.
I think that using the ExportTasks and ImportTasks methods of the %SYS.TaskSuper class, as proposed by @Sean Connelly and @John Murray, would be a better solution as:
MONEXT is an include file, you can see it in %SYS namespace.
You can map it to your namespace (in SMP - Namespaces - Routine mappings).
ServerSideRedirect doesn't work in that context (OAuth authentication) unfortunately.
I also tried setting LOCATION header directly (via SetHeader method) - in that case I get the same truncated output.
You can export/import ^SYS("Task","TaskD") global directly, which contains task definitions data.
Some properties are computed (e.g. LastFinished) and could probably cause this error.
Try to export/import minimal set of columns.
Alternative approach would be writing code to generate tasks.
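A minimal sketch of creating a task in code (the task name and schedule values are illustrative; check the %SYS.Task class reference for the exact property semantics):

```
// Create a daily task running a built-in task class
set task = ##class(%SYS.Task).%New()
set task.Name = "NightlyPurge"               // illustrative name
set task.NameSpace = "USER"
set task.TimePeriod = 0                      // 0 = daily
set task.DailyFrequency = 0                  // run once per day
set task.TaskClass = "%SYS.Task.PurgeErrorsAndLogs"
set sc = task.%Save()
if $$$ISERR(sc) write $System.Status.GetErrorText(sc)
```

Generating tasks like this keeps the definitions in source control and avoids exporting computed properties altogether.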
export portal settings
What settings exactly?
Most settings like users, web applications, etc. could be recreated via %Installer (en article, ru article).
You can also use Version Control and Continuous Delivery to automate movement between different servers (en articles: 1, 2, 3, 4, 5, 6, ru article covers parts 1 to 4)
Please post the relevant piece of the BPL XML definition.
Should they have access?
The best solution here is restricting access.
The InterSystems security model allows for customizable access management.
Application building/testing/deployment tasks could be fully automated and so not require human intervention at all.