Article · Apr 21, 2023 · 2m read

Copy data into an IRIS Cloud SQL deployment using sqlalchemy-iris

We have a rule to disable a user account if it has not been used to log in for a certain number of days. The IRIS audit database logs many events, such as login failures, and it can be configured to log successful logins as well. We have IRIS clusters with many IRIS instances, and I want to run queries against audit data from ALL IRIS instances to identify user accounts that have not logged into ANY IRIS instance.

Instead of running queries against each IRIS instance separately, I developed audit-consolidator, which consolidates audit data from many IRIS instances into ONE database table so that queries can be run against the consolidated data. The IRIS code is deployed in the %SYS namespace.
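As an illustration of the kind of query this enables, the sketch below looks for accounts with no successful login on any instance in the last 90 days. It is only a sketch: the table name (AuditConsolidator_Data.UserEvent), the column names, and the event value are assumptions for illustration, not the actual audit-consolidator schema, and the connection string uses placeholder credentials.

```python
# Sketch only: find accounts with no successful login anywhere in 90 days.
# Table, column, and event names are assumptions, not the real schema.
from datetime import datetime, timedelta, timezone

from sqlalchemy import create_engine, text

# Placeholder connection to wherever the consolidated table lives
engine = create_engine("iris://_SYSTEM:SYS@localhost:1972/USER")
cutoff = datetime.now(timezone.utc) - timedelta(days=90)

with engine.connect() as conn:
    stale = conn.execute(
        text(
            "SELECT Username, MAX(UTCTimeStamp) AS LastLogin "
            "FROM AuditConsolidator_Data.UserEvent "  # hypothetical consolidated table
            "WHERE Event = 'Login' "
            "GROUP BY Username "
            "HAVING MAX(UTCTimeStamp) < :cutoff"
        ),
        {"cutoff": cutoff},
    ).fetchall()

for username, last_login in stale:
    print(username, last_login)
```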

Step 1 was setting up a task that runs daily on every IRIS instance and exports its audit data.

Step 2 was creating a persistent class that imports and stores all the exported audit data.

Step 3 was creating an SQL view that filters the audit data to events related to user changes (a sketch of such a view follows this step list).

Step 4 was implementing sqlalchemy-iris to copy data from audit-consolidator to the IRIS Cloud SQL deployment.
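For Step 3, the view might look roughly like the sketch below, executed here through sqlalchemy-iris. The view name, base table, columns, and event filter are all assumptions for illustration; the real audit-consolidator classes define their own names.

```python
# Sketch only: create a view over the consolidated audit table that keeps
# user-related events. All identifiers and event names below are assumptions.
from sqlalchemy import create_engine, text

engine = create_engine("iris://_SYSTEM:SYS@localhost:1972/USER")  # placeholder connection

ddl = """
CREATE VIEW AuditConsolidator_Data.UserEventView AS
SELECT UTCTimeStamp, SystemID, Event, Username, Description
FROM AuditConsolidator_Data.UserEvent
WHERE Event IN ('Login', 'LoginFailure', 'UserChange')
"""

with engine.connect() as conn:
    conn.execute(text(ddl))
    conn.commit()
```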

I created a Python script called audit.py. It uses two environment variables to supply the Cloud SQL server hostname and password. I connected to the EC2 instance in AWS where audit-consolidator was running and ran:

export ICSHOST='k8s-a34cb3c6-aa6428f3-181bcb4a5c-1d7a6ab2ab286107.elb.us-east-1.amazonaws.com'

export ICSPASSWORD='Passw0rd123!'

python3 python/audit.py
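audit.py itself is not reproduced here, but a minimal sketch of a script doing the same job with sqlalchemy-iris could look like the following. The local connection details, the Cloud SQL user, port, and namespace, and the table and view names are assumptions; an actual Cloud SQL deployment may also require TLS settings.

```python
# Sketch only: copy rows from the local consolidated audit view into an
# IRIS Cloud SQL deployment. Identifiers, ports, users, and namespaces
# below are assumptions; TLS options may also be needed for Cloud SQL.
import os

from sqlalchemy import create_engine, text
from sqlalchemy.engine import URL

# Local IRIS instance that holds the consolidated audit data
local = create_engine(
    URL.create("iris", username="_SYSTEM", password="SYS",
               host="localhost", port=1972, database="%SYS")
)

# IRIS Cloud SQL deployment, identified by the two environment variables
cloud = create_engine(
    URL.create("iris", username="SQLAdmin",
               password=os.environ["ICSPASSWORD"],
               host=os.environ["ICSHOST"], port=443, database="USER")
)

with local.connect() as src, cloud.connect() as dst:
    rows = src.execute(text(
        "SELECT UTCTimeStamp, SystemID, Event, Username, Description "
        "FROM AuditConsolidator_Data.UserEventView"
    )).fetchall()

    if rows:
        dst.execute(
            text(
                "INSERT INTO AuditConsolidator_Data.UserEvent "
                "(UTCTimeStamp, SystemID, Event, Username, Description) "
                "VALUES (:UTCTimeStamp, :SystemID, :Event, :Username, :Description)"
            ),
            [dict(r._mapping) for r in rows],
        )
        dst.commit()
```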


Step 5 will be to use machine learning to predict which user accounts are at risk of expiring and to identify suspicious activity.

Please look for a YouTube video coming soon…
