Hi Neerav,

No, I am returning it as a JSON string.

Something along the lines of:

ClassMethod GetDetails() As %String
{
    set result = []
    &sql(declare a1 cursor for select ID, GName, DOB into :id, :gName, :dob from MyPackage.MyClass)
    &sql(open a1)
    for {
        &sql(fetch a1)
        quit:SQLCODE'=0
        set rec = {"id":(id), "gName":(gName), "dob":(dob)}
        do result.%Push(rec)
        // do result.$push(rec) ; for older versions of Caché
    }
    &sql(close a1)
    quit result.%ToJSON()
    // quit result.$toJSON() ; for older versions
}

Regards

Anil

Hi Neerav,

You could try http://tabulator.info/. It is quite fast and responsive; I have tested it with over 10,000 records, and it supports Bootstrap as well.

Please find some code to get you started.  There are lots of examples in the documentation.

<head>

        <link href="https://unpkg.com/tabulator-tables@4.5.2/dist/css/tabulator.min.css" rel="stylesheet">
        <script type="text/javascript" src="https://unpkg.com/tabulator-tables@4.5.2/dist/js/tabulator.min.js"></script>

</head>

<body>

<div><table id="tblAbcd"></table></div>

<script type="text/javascript">

     var ret = #server(MyPackage.MyClass.GetDetails())#;
     var retJSON = JSON.parse(ret);

     var ht = 600;

     var pageRows = 50;

     var  table = new Tabulator("#tblAbcd", {
          data:retJSON,           //load row data from array
          height:(ht+"px"),
          layout:"fitColumns",      //fit columns to width of table
          responsiveLayout:"hide",  //hide columns that don't fit on the table
          tooltips:true,            //show tool tips on cells
          addRowPos:"top",          //when adding a new row, add it to the top of the table
          history:true,             //allow undo and redo actions on the table
          pagination:"local",       //paginate the data
          paginationSize:pageRows,  //set the number of rows per page of data
          movableColumns:true,      //allow column order to be changed
          resizableRows:true,       //allow row height to be resized
          initialSort:[             //set the initial sort order of the data
               {column:"gstName", dir:"asc"},
          ],
          columns:[                 //define the table columns
              {title:"Id", field:"id",visible:false},
              {title:"Name", field:"gstName", editor:"input", headerFilter:"input",sorter:"text"},
              {title:"DOB", field:"dobDt", editor:"input",headerFilter:"input",sorter:"date"},               
          ],

    });    

</script>

</body>

Hope this helps
Regards

Anil

Hi Peter,

I am not sure if it is appropriate to post my concern here. Please let me know if I need to move this out and post it separately.

The reason I posted it here is that similar posts have been redirected to this discussion. This is a wonderful discussion, and we ourselves are probably on a similar journey, but still at an exploratory stage. Both you and Eduard seem to have travelled further down this road, and you might be able to point me in the proper direction. My concern is below.

Is it mandatory for the application to be rendered using the CSP Server to maintain the CSP Session?

Would it be possible for an HTML and JavaScript based application to connect to the REST service and maintain the session and the license?

If possible, would someone be able to post the login code and the headers or parameters to be passed in the subsequent calls to the REST service?

I made some trials with delegated login and ZAUTHENTICATE (hard-coded username and password) for the REST application.

I tried all of the steps below (only for the REST web application; the client web application is not CSP). A minimal sketch of the kind of dispatch class involved is shown further down.

  1. All brokers effectively have Parameter UseSession = 1;
  2. REST web application and client web application allow only authenticated (i.e. password) access.
  3. REST web application and client web application have reasonable Session timeout (i.e. 900, 3600).
  4. REST web application and client web application have the same GroupById value.
  5. REST web application and client web application have the same cookie path.

Still, each call adds a new session.
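
For reference, here is a minimal sketch of the kind of REST dispatch class involved (the class name, route and method are placeholders, not our actual code). The UseSession parameter is the setting from step 1, and the Test method simply echoes the session id so the client can check whether the same session is being reused:

Class MyApp.RESTHandler Extends %CSP.REST
{

/// Step 1: reuse one CSP session (and one licence unit) across calls
/// instead of creating a new session for every request.
Parameter UseSession = 1;

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
<Route Url="/test" Method="GET" Call="Test"/>
</Routes>
}

/// Returns the current session id as JSON.
ClassMethod Test() As %Status
{
    set obj = {}
    set obj.sessionId = %session.SessionId
    write obj.%ToJSON()
    quit $$$OK
}

}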

Hi Mike,

In 2013, we did some performance tests to figure out the same on Caché 2010, and we found the following:

Accessing data for reporting etc

      $Order was the fastest, SQL was next, and objects were the least efficient, probably because a lot of information is fetched every time you open an object reference.
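
      To show what was actually compared, here is a minimal sketch of the three read patterns (the class, global and property names are illustrative only):

ClassMethod CompareReads()
{
    // 1. Direct global access with $Order (fastest in our tests)
    set id = ""
    for {
        set id = $order(^MyPackage.MyClassD(id))
        quit:id=""
        set name = $listget(^MyPackage.MyClassD(id), 2)
    }

    // 2. Embedded SQL cursor
    &sql(declare r1 cursor for select GName into :name from MyPackage.MyClass)
    &sql(open r1)
    for {
        &sql(fetch r1)
        quit:SQLCODE'=0
    }
    &sql(close r1)

    // 3. Object access - %OpenId() loads the whole object each time (slowest)
    set id = ""
    for {
        set id = $order(^MyPackage.MyClassD(id))
        quit:id=""
        set obj = ##class(MyPackage.MyClass).%OpenId(id)
        set name = obj.GName
    }
    quit
}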

Insert /Update

     SQL was better, as it took care of the indices, SQL computed fields, etc. consistently and efficiently.

     Object save had some inconsistency in terms of SQL computed fields.
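
     For completeness, the two insert paths compared were along these lines (class and property names again illustrative):

ClassMethod CompareInserts(name As %String, dob As %Date)
{
    // 1. SQL insert - indices and SQL computed fields are maintained
    //    as part of the INSERT statement itself
    &sql(insert into MyPackage.MyClass (GName, DOB) values (:name, :dob))

    // 2. Object insert - indices are maintained when %Save() runs
    set obj = ##class(MyPackage.MyClass).%New()
    set obj.GName = name
    set obj.DOB = dob
    set sc = obj.%Save()
    quit
}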

Coding

     Caché Studio supported objects the most :)

I am not sure if this still holds true with the current version.

Anil

Hi,

You can make a VBScript file (.vbs) with the script below and then run it to first convert your .xls file to tab-delimited text.

If you wish to save as comma-delimited, you could use (oBook.SaveAs WScript.Arguments.Item(1), 6) instead of (oBook.SaveAs WScript.Arguments.Item(1), -4158) in the script below.

I usually execute the .vbs using $zf(-2,"scriptfilename.vbs filetoconvert.xls outputfile.txt").

Then you can work with the resulting file (a small example of reading it back follows the script). Hope this helps.

============================

if WScript.Arguments.Count < 2 Then
    WScript.Echo "Error! Please specify the source path and the destination. Usage: XlsToCsv SourcePath.xls Destination.csv"
    Wscript.Quit
End If
Dim oExcel
Set oExcel = CreateObject("Excel.Application")
Dim oBook
Set oBook = oExcel.Workbooks.Open(Wscript.Arguments.Item(0))
oBook.SaveAs WScript.Arguments.Item(1), -4158   ' -4158 = xlCurrentPlatformText (tab-delimited); use 6 (xlCSV) for comma-delimited
oBook.Close False
oExcel.Quit
WScript.Echo "Done"

===========================
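
Once the converted file exists, you can read it line by line and split on the tab character. A minimal sketch in Caché ObjectScript (the file path and column handling are just examples):

ClassMethod ReadTabFile(file As %String = "C:\temp\outputfile.txt")
{
    set stream = ##class(%Stream.FileCharacter).%New()
    set sc = stream.LinkToFile(file)
    if $$$ISERR(sc) {
        write "Cannot open ",file,!
        quit
    }
    while 'stream.AtEnd {
        set line = stream.ReadLine()
        // columns are separated by a tab character
        set firstCol = $piece(line, $char(9), 1)
        set secondCol = $piece(line, $char(9), 2)
        write firstCol," | ",secondCol,!
    }
    quit
}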

Regards

Anil