Try calling your API from Postman or a similar REST client.
Set the username/password in the Basic Auth section of the Postman request.
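If you'd rather test from a terminal session instead, here's a minimal sketch using %Net.HttpRequest (server, port and path are placeholders to adjust):

```objectscript
set req = ##class(%Net.HttpRequest).%New()
set req.Server = "localhost"   // your web server host
set req.Port = 57772           // your web server port
// Basic Auth: these are sent in the Authorization header
set req.Username = "user"
set req.Password = "pass"
set sc = req.Get("/myapp/api/endpoint")
if sc { write req.HttpResponse.StatusCode }
```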
That would be the case if the property were defined as:
Property Test As list Of %String;
But in this case it's %Collection.ListOfDataTypes and not %ListOfDataTypes, which can save strings of arbitrary length by default.
Property Test As %ListOfDataTypes;
A simple test case:
Class Test.Col Extends %Persistent
{
Property Test As list Of %String;
Property Test2 As %ListOfDataTypes;
/// do ##class(Test.Col).Test()
ClassMethod Test(test2)
{
set p = ..%New()
set val = $tr($j("",100)," ", "a")
if test2 {
do p.Test2.Insert(val)
} else {
do p.Test.Insert(val)
}
set sc = p.%Save()
zw sc
}
}
Show the relevant part of your XSD.
To connect, you'll need to create a new Cache connection (as the driver seems to be installed) and specify these parameters:
Can you explain the issue you're having?
To inspect the contents of the %request, %response and %session objects, add this to the beginning of your code:
set %response.ContentType = "html"
do ##class(%CSP.Utils).DisplayAllObjects()
quit $$$OK
It returns detailed information about the request as an HTML page.
You need to tell Spring Boot to fetch data in display mode, or use the %EXTERNAL function on the column value. That's the Java side.
Alternatively, on InterSystems side you can create a calculated property and access it in Java (replace !!! with full class name):
Property PetNameDisplay As %String [ SqlComputeCode = {##class(!!!).PetNameLogicalToDisplay(##class(!!!).PetNameGetStored({%%ID}))}, SqlComputed, SqlComputeOnChange = (%%INSERT, %%UPDATE) ];
Remove SqlComputeOnChange and add Calculated if you want the value always computed on access instead of computed and stored on change.
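For example, the always-computed variant of the same property (with !!! still standing in for the full class name) would look like:

```objectscript
Property PetNameDisplay As %String [ Calculated, SqlComputeCode = {##class(!!!).PetNameLogicalToDisplay(##class(!!!).PetNameGetStored({%%ID}))}, SqlComputed ];
```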
I agree with the general approach offered by @Julius Kavay, just wanted to add some notes:
1. Use $zf(-100) as a more secure alternative to $zf(-1) if it's available.
2. I'd go straight for ImageMagick as it's cross-platform.
3. If speed is an issue, consider using the high-level MagickWand API (here's a resize example) or the low-level MagickCore API (here are the resize functions) via the Callout functionality. The Core API would be faster, as it offers in-memory image initializers, so you can skip input/output file creation and consumption and work with in-memory streams.
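For completeness, a hedged sketch of shelling out to ImageMagick via $zf(-100) (the convert binary name and file paths are assumptions for your system):

```objectscript
// Resize in.png to fit within 800x600, writing out.png
// "/SHELL" runs the command through the OS shell; arguments are passed separately
set binary = "convert"
set status = $zf(-100, "/SHELL", binary, "in.png", "-resize", "800x600", "out.png")
write "exit code: ", status
```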
Enable ODBC log. What statement throws an error?
Are you using a custom email adapter? I'm unable to find a PutStream method in EnsLib.EMail.OutboundAdapter.
Is there any Node.js used to fetch the data?
No, RESTForms is used to fetch the data. It's a REST API.
Is any Bootstrap involved too, to beautify it?
Probably.
The server part is explained in the linked articles. I can answer additional questions if you have any.
For the client part, maybe @Sergey.Sarkisyan can weigh in?
Can you show us a %GSIZE report?
This code outputs an arbitrary query to CSV.
/// w $System.Status.GetErrorText(##class(!!!).ToCSV())
ClassMethod ToCSV(file = {##class(%File).NormalizeDirectory(##class(%SYS.System).TempDirectory())_ "out"}, query As %String = "SELECT 1,2,'a'", args...) As %Status
{
#dim sc As %Status = $$$OK
// Can't do $zcvt($e(file,*-3,*), "l") as it can have unexpected effects on case-sensitive filesystems
// A possible solution is to pass file by reference, but that should be done in application code
set:$e(file,*-3,*)=".csv" file = $e(file, 1, *-4)
set dir = ##class(%File).GetDirectory(file)
set exists = ##class(%File).DirectoryExists(dir)
if (exists=$$$NO) {
set success = ##class(%File).CreateDirectoryChain(dir, .code)
set:success=$$$NO sc = $$$ERROR($$$GeneralError, "Unable to create directory: '" _ dir _ "', reason: " _ code)
}
quit:$$$ISERR(sc) sc
#dim rs As %SQL.StatementResult
set rs = ##class(%SQL.Statement).%ExecDirect(,query, args...)
if rs.%SQLCODE=0 {
do rs.%DisplayFormatted("CSV", file)
} else {
set sc = $$$ERROR($$$SQLError, rs.%SQLCODE, rs.%Message)
}
quit sc
}
Show examples of the row variable. What condition do you want to check?
Something like this?
while (ind '= ""){
set row = ^CacheTemp(repid,"MAIN",ind)
if row [ "keyword" {
use filemain write row,!
} else {
use filemain2 write row,!
}
; Get next row index for MAIN report
set ind = $order(^CacheTemp(repid,"MAIN",ind))
}
close filemain
Can you show your code and explain what you are trying to achieve?
set n=10
for i=1:1:n { write "set txt("_i_")="""_i_"""",!}
Would output:
set txt(1)="1"
set txt(2)="2"
set txt(3)="3"
set txt(4)="4"
set txt(5)="5"
set txt(6)="6"
set txt(7)="7"
set txt(8)="8"
set txt(9)="9"
set txt(10)="10"
You can paste it where needed.
Sure, why not?
There are already prebuilt Docker containers; just pull one from Docker Hub:
docker pull intersystemscommunity/irispy:latest
The binaries are available from the releases page.
I completely agree, and to get to a "standard installing mechanism" for USERS, we need to zpm-enable as many existing projects as possible. To enable these projects we need to simplify zpm-enabling, leveraging existing code where possible (or at least not preventing developers from leveraging the existing code). I think allowing developers to use already-existing installers (whatever form they may take) would help with this goal.
Great article!
I have some questions:
I completely support inclusion of projections.
The ObjectScript language allows execution of arbitrary code at compile time through three different mechanisms:
All these instruments are entirely unlimited in their scope, so I don't see why we need to prohibit one particular way of executing code at compile time.
Furthermore, ZPM itself uses projections to install itself, so closing this avenue to other projects seems strange.
Great to hear that!
If you're interested in performance, upgrading to 2016.2+ would help tremendously with JSON processing due to the addition of dynamic objects.
Furthermore, upgrading to IRIS 2019.1.1 would add %JSON.Adaptor which simplifies JSON (de)serialization of normal objects.
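For illustration, a minimal %JSON.Adaptor sketch (class and property names here are made up):

```objectscript
Class Demo.Person Extends (%RegisteredObject, %JSON.Adaptor)
{

Property Name As %String;

Property Age As %Integer;

}
```

After compiling, `set p = ##class(Demo.Person).%New(), p.Name = "Alice", p.Age = 30  do p.%JSONExport()` writes the object as JSON to the current device, and %JSONImport does the reverse.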
Here's how.
1. Create a hard link to the original database in a new, empty folder.
2. Add this new database.
3. Mount the new database as read-only.
4. Map the package from this new database.
This way you can access the code read-write (in the original namespace/database) and read-only (in the hard-link database).
Just tried this setup and it worked.