Thanks Marc,

Your code works great.

We gained a factor of ten with this implementation.

Below is our current implementation.

/// The structure of pParms is a Caché multidimensional array,
/// where:
/// pParms indicates the number of rows in the batch
/// pParms(integer) indicates the number of parameters in that row
/// pParms(integer,integerParam) indicates the value of the parameter whose position is integerParam.
Method ExecuteUpdateBatchParmArray(Output pNumRowsAffected As %Integer, pUpdateStatement As %String, pParms...) As %Status
{

set tStatus = $$$OK

try{

set pSQLStatement=pUpdateStatement

// JDBCGwy is an instance of the JDBC Gateway object. EnsLib.SQL.OutboundAdapter instantiates this automatically and stores a reference to it in ..%Connection.%JGProxy

// Prepare the SQL statement
Set pHS=..%JGProxy.prepareStatement(..%ConnHandle,pSQLStatement)

// executeParametersBatch expects tArgs to be a $LIST, with the following format:
// ParamCount, ParamSets, Type1, Param1, Type2, Param2, ..., TypeN, ParamN (the Type/Param pairs are repeated for every row in the batch)
// 
// ParamCount is the number of parameters ("?") the query expects
// ParamSets is the number of rows we will be inserting in this batch
// Type1, Type2, ..., TypeN is an integer indicating the JDBC data type for the corresponding Param value (e.g. Param1, Param2, ..., ParamN)
// Param1, Param2, ..., ParamN is the value for the query parameter
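// 
// For example (illustrative values only, not taken from the original post), a batch of
// 2 rows with 2 VARCHAR parameters each would be passed as the flat list:
//   $LISTBUILD(2, 2, 12, "A001", 12, "Smith", 12, "A002", 12, "Jones")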

set nbParam = pParms(1,1)
set nbBatch = pParms(1)

set $LIST(tArgs,1)=nbParam // How many parameters ("?") in it
set $LIST(tArgs,2)=nbBatch // We will insert nbBatch rows in this batch

set i = 2 // positions 1 and 2 of tArgs are already used, so the Type/Param pairs start at position 3

for k=1:1:nbBatch {



for l=1:1:pParms(1,k){

set i = i + 1
set $LIST(tArgs,i)=12 // The JDBC data type for VARCHAR is 12
set i = i + 1
set $LIST(tArgs,i)=pParms(1,k,l) // Value of parameter l for row k

}

}


// Perform the batch insert
// tResultCodes is a $LIST of integers indicating success/failure for each row in the batch
set tResultCodes = ..%JGProxy.executeParametersBatch(pHS,tArgs)

// TODO: read tResultCodes to compute the real count, e.g. set pNumRowsAffected = $LISTLENGTH(tResultCodes)
set pNumRowsAffected = nbBatch 
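
// A possible sketch for deriving the count from tResultCodes (untested; assumes every entry
// is an update count and not java.sql.Statement.SUCCESS_NO_INFO):
// set pNumRowsAffected = 0
// for p=1:1:$LISTLENGTH(tResultCodes) { set pNumRowsAffected = pNumRowsAffected + $LISTGET(tResultCodes,p) }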


// Remove Statement to avoid CURSOR Leaks
set sc = ..%JGProxy.removeStatement(pHS)

kill tArgs // clean up the argument list

}

catch exp{

Set tStatus = exp.AsStatus()
}

Quit tStatus
}
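
For reference, here is a hypothetical example of how the method can be called (the table, column names, and values are placeholders, and it assumes the method above lives in a custom subclass of EnsLib.SQL.OutboundAdapter, so a Business Operation reaches it through ..Adapter):

// Build the multidimensional array described in the method's documentation:
// a batch of 2 rows, each with 3 parameters
set tParms = 2            // number of rows in the batch
set tParms(1) = 3         // number of parameters in row 1
set tParms(1,1) = "A001"  // value of parameter 1 of row 1
set tParms(1,2) = "Smith"
set tParms(1,3) = "2020-01-01"
set tParms(2) = 3         // number of parameters in row 2
set tParms(2,1) = "A002"
set tParms(2,2) = "Jones"
set tParms(2,3) = "2020-01-02"

set tSQL = "INSERT INTO MySchema.MyTable (Id, Name, CreatedOn) VALUES (?,?,?)"

// Passing the array by reference makes it arrive as pParms(1,...) inside the variadic parameter
set tStatus = ..Adapter.ExecuteUpdateBatchParmArray(.tRowsAffected, tSQL, .tParms)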

My guess is that you are concatenating strings with the "&" (logical AND) operator instead of "_".

If you do, ObjectScript casts your strings to boolean false, and the result of (false AND false AND false) is 0, which is the result you see in your question.
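
A minimal illustration (the values are just placeholders):

WRITE "GUID-1" _ ":" _ "MRN"   // prints GUID-1:MRN
WRITE "GUID-1" & ":" & "MRN"   // prints 0, because non-numeric strings evaluate to false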

Method PatientInfo(ID As %String) As %Status
{
  #dim status as %Status = $$$OK
  SET myquery="SELECT GUID, IDType,IDValue FROM MergeHyland.TypeTwoDimesionCollection WHERE GUID ="_ID
  SET rset=##class(%ResultSet.SQL).%Prepare(myquery,.err,"")
    WHILE rset.%Next() {
    WRITE !,rset.GUID _ ":" _ rset.IDType_ ":" _ rset.IDValue
    }
  WRITE "End of data"
    return status
}

Hi Benjamin,

In some cases we have to use Ensemble as an ETL, not an ESB.

So we extract a lot of data, transform it, and load it through the JDBC SQL adapter in EnsLib. To do so, we don't process it line by line through messages but ResultSet by ResultSet (one ResultSet can have more than 500,000 rows and 30 columns). This pattern works well.

But we have started to have processing-time issues. Our process takes more than 8 hours, and when we analyzed it, what costs the most time is the select and insert processing.

We solved the select performance problem with this post: https://community.intersystems.com/post/jdbc-large-query-optimisation

So now we are looking for a way to improve insert time. Our guess is to implement, through the JDBC SQL adapter in EnsLib, the Java pattern of PreparedStatement.addBatch() followed by executeBatch().

Do you have any ideas for improving the insert processing?

Hi Marc,

I'm interested in your sample code for this kind of optimisation, even if we have to use a different EnsLib.SQL.GatewayResultSet.
Furthermore, we are currently looking for the same kind of optimisation in the other direction: inserts in JDBC batch mode (https://www.tutorialspoint.com/jdbc/jdbc-batch-processing.htm).
I got in touch with our Sales Engineer and we will continue to discuss this by email.

When we have improvements, I'll update this thread.