Question
· Oct 14, 2024

Using Caché, Python and pypyodbc, struggling with dates

New to Python.  I'm attempting to use pypyodbc to select data from a table in one Caché database and insert it into a similarly configured table in another.  The process works fine except for tables containing DATE columns.  NULL values in date columns are handled without issue, but when a date is present, the insert fails with:

An error occurred: argument 7: TypeError: 'NoneType' object cannot be interpreted as an integer.

Source table:

CREATE TABLE "SAT"."AuditAttribute" (
    "ID"                 INTEGER NOT NULL PRIMARY KEY DEFAULT $i(^SAT.AuditAttributeD),
    "AddDelete"          VARCHAR(50),
    "ConstituentId"      VARCHAR(50),
    "CreatedDate"        DATE,
    "CreatedTime"        TIME,
    "DeleteDate"         DATE,
    "DeleteTime"         TIME,
    "FinderId"           VARCHAR(50),
    "Tag"                VARCHAR(50),
    "UserName"           VARCHAR(50)
);

Target table:

CREATE TABLE "SAT_D3"."AuditAttribute_DWN" (
    "DBase"              VARCHAR(6),
    "ID_OLD"             INTEGER,
    "AddDelete"          VARCHAR(50),
    "ConstituentId"      VARCHAR(50),
    "CreatedDate"        DATE,
    "CreatedTime"        TIME,
    "DeleteDate"         DATE,
    "DeleteTime"         TIME,
    "FinderId"           VARCHAR(50),
    "Tag"                VARCHAR(50),
    "UserName"           VARCHAR(50)
)
;

Queries:

select_query = 'select \'DWN\' as DBase, ID as "ID_OLD", AddDelete, ConstituentId, Tag, UserName, "CreatedDate" from SAT.AuditAttribute'

insert_query = 'insert into SAT_D3.AuditAttribute_DWN (DBase, "ID_OLD",  AddDelete, ConstituentId, Tag, UserName, "CreatedDate") values (?,?,?,?,?,?,?)'

Displaying row[6] for the first failing row shows this: "datetime.date(2018, 6, 28)"

I have tried various methods, datetime.strftime() and datetime.strptime() among them, but haven't hit on the magic strategy.  I assume it's something simple.
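For context, this is what the fetched value looks like in isolation: pypyodbc returns the DATE column as a datetime.date object, and its str() form is the ISO yyyy-mm-dd text (a minimal sketch, using the date value shown above):

```python
import datetime

# The value pypyodbc returned for the DATE column (as shown for row[6])
value = datetime.date(2018, 6, 28)

# str() yields the ISO yyyy-mm-dd text form of the date
print(str(value))  # 2018-06-28
```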

Thanks for your help!

Product version: Caché 2018.1
$ZV: Cache for UNIX (Red Hat Enterprise Linux for x86-64) 2018.1.2 (Build 309U) Mon Mar 4 2019 15:07:46 EST

Solved:

exp_sql = "select rundate, filename, batchid, starttimestamp, endtimestamp, lastname, firstname, birthdate from src_tbl"

ins_sql = "insert into dest_tbl (rundate, filename, batchid, starttimestamp, endtimestamp, lastname, firstname, birthdate) values (?, ?, ?, ?, ?, ?, ?, ?)"

date_cols = [0, 3, 4, 7]    # list of column indexes that are of date, time or timestamp type
batch_size = 1000           # number of rows to process at a time

try:
    src_csr.execute(exp_sql)
    while True:
        # Fetch a batch of rows from the source table
        rows = src_csr.fetchmany(batch_size)
        if not rows:
            break                                       # exit loop if no more rows to process
        if len(date_cols) == 0:                         # no date/times to deal with in this table
            tgt_csr.executemany(ins_sql, rows)
        else:
            fixed_rows = []                             # build a mutable list of fixed rows
            for row in rows:
                row_list = list(row)                    # copy the row tuple to a list
                for date_col in date_cols:              # for each date column index
                    if row_list[date_col] is not None:
                        # convert to a string, formatted like %ODBCOUT()
                        row_list[date_col] = str(row_list[date_col])
                fixed_rows.append(row_list)             # append the fixed record
            tgt_csr.executemany(ins_sql, fixed_rows)
        tgt_cnx.commit()                                # commit after each batch
except Exception as e:
    tgt_cnx.rollback()
    print(f"An error occurred: {e}")
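The per-batch conversion step can also be factored into a small standalone helper so it's easy to test on its own (a sketch; `stringify_dates` is a hypothetical name, not part of the script above):

```python
import datetime

def stringify_dates(rows, date_cols):
    """Return copies of rows with the values at the indexes in date_cols
    converted to str, so the ODBC driver binds them as text rather than
    as Python date/time objects. NULLs (None) pass through unchanged."""
    fixed_rows = []
    for row in rows:
        row_list = list(row)                  # copy the row to a mutable list
        for i in date_cols:
            if row_list[i] is not None:       # leave NULLs alone
                row_list[i] = str(row_list[i])
        fixed_rows.append(row_list)
    return fixed_rows

rows = [
    (1, datetime.date(2018, 6, 28), datetime.time(9, 30)),
    (2, None, None),                          # NULL date/time pass through
]
print(stringify_dates(rows, [1, 2]))
# [[1, '2018-06-28', '09:30:00'], [2, None, None]]
```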