
What I’ve Learned from Multiple Data Migrations

Hello!

Data migration often sounds like a simple "move data from A to B" task until you actually do it. In reality, it is a complex process that blends planning, validation, testing, and technical precision.

Over several projects where I handled data migration into a HIS running on InterSystems IRIS (TrakCare), I realized that success comes from a mix of discipline and automation.

Here are a few points I want to highlight.

1. Start with a Defined Data Format.

Before you even open your first file, make sure everyone, especially data providers, clearly understands the exact data format you expect. Defining templates early avoids unnecessary back-and-forth and rework later.

While Excel or CSV formats are common, I personally feel using a tab-delimited text file (.txt) for data upload is best. It's lightweight, consistent, and avoids issues with commas inside text fields. 

PatID   DOB Gender  AdmDate
10001   2000-01-02  M   2025-10-01
10002   1998-01-05  F   2025-10-05
10005   1980-08-23  M   2025-10-15
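
Each line of such a file can be split on the tab character with $Piece. A minimal sketch, assuming the four-column layout above (the variable names are just illustrative):

```objectscript
 // Split one tab-delimited line into fields with $Piece.
 // Field positions follow the sample layout above.
 Set line="10001"_$Char(9)_"2000-01-02"_$Char(9)_"M"_$Char(9)_"2025-10-01"
 Set patId=$Piece(line,$Char(9),1)    // "10001"
 Set dob=$Piece(line,$Char(9),2)      // "2000-01-02"
 Set gender=$Piece(line,$Char(9),3)   // "M"
 Set admDate=$Piece(line,$Char(9),4)  // "2025-10-01"
```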

Make sure the date formats in the file are correct and consistent throughout, because these files are usually exported from an Excel file, and a basic Excel user can easily hand you dates in the wrong format. Wrong date formats will come back to bite you when you convert them to horolog.
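In ObjectScript, an ODBC-format date (YYYY-MM-DD) can be checked and converted to horolog in one step with $ZDateH; a quick sketch:

```objectscript
 // Format code 3 = ODBC (YYYY-MM-DD). An invalid or inconsistent date
 // throws <ILLEGAL VALUE>, so wrap the conversion in Try/Catch.
 Set admDate="2025-10-01"
 Try {
     Set h=$ZDateH(admDate,3)   // horolog day count
     Write admDate," -> horolog ",h,!
 } Catch ex {
     Write admDate," is not a valid ODBC date",!
 }
```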

2. Validate data before you load it.

Never, ever skip data validation. At the very least, give the file a quick sanity check. IRIS gives us the performance and flexibility to handle large volumes, but that is only useful if your data is clean.

Always keep a flag (0 or 1) as a parameter of your upload function, where 0 means validate the data without processing it, and 1 means validate and process it.

If validation fails for any record, maintain an error log that tells you exactly which record threw the error. If your code cannot tell you which record failed, tracking down the bad data becomes very tough.
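A sketch of the flag idea, with illustrative routine and field names (not my actual production code):

```objectscript
 // mode=0 : validate only;  mode=1 : validate and load.
UploadPatients(file,mode)
 Set (ok,bad)=0
 Open file:("R"):5 If '$Test Quit "Cannot open "_file
 Use file
 Try {
     For lineNo=1:1 {
         Read line                          // <ENDOFFILE> ends the loop
         If lineNo=1 Continue               // skip the header row
         Set patId=$Piece(line,$Char(9),1)
         Set dob=$Piece(line,$Char(9),2)
         Set err=""
         If patId'?1.N Set err="Bad PatID"
         Try { Set d=$ZDateH(dob,3) } Catch { Set err="Bad DOB" }
         If err'="" {
             Set bad=bad+1
             // the error log points at the exact record and line number
             Set ^LOG("xUpload",+$Horolog,patId)="ERR^line "_lineNo_"^"_err
             Continue
         }
         Set ok=ok+1
         If mode=1 { /* insert the record here */ }
     }
 } Catch ex { /* end of file reached */ }
 Close file
 Quit ok_" valid, "_bad_" invalid"
```

Called as an extrinsic function, $$UploadPatients^ROUTINE(file,0) gives you a dry-run validation pass before you commit to loading anything.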

3. Keep detailed and searchable logs.

You can use either globals or tables to capture logs. Make sure you capture the timestamp, the filename, a record identifier that is easy to trace, and the status.

If the data set is small, you can skip success logs and capture only the error logs. Below is an example of how I store error logs.

Set ^LOG("xUpload",+$Horolog,patId)=status_"^"_SQLCODE_"^"_$Get(%msg)

Every embedded SQL insert sets SQLCODE; if the insert fails (SQLCODE is negative), %msg contains the error message.

This can also be used while validating data. 
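For example, logging around an embedded SQL insert (the table and column names here are made up for illustration):

```objectscript
 // &sql sets SQLCODE: 0 = success, 100 = no row, negative = error,
 // with the error text in %msg.
 &sql(INSERT INTO Demo.Patient (PatID, DOB, Gender)
      VALUES (:patId, :dob, :gender))
 If SQLCODE<0 {
     Set ^LOG("xUpload",+$Horolog,patId)="ERR^"_SQLCODE_"^"_$Get(%msg)
 } Else {
     Set ^LOG("xUpload",+$Horolog,patId)="OK^"_SQLCODE
 }
```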

4. Insert data in an Efficient and Controlled Manner.

Efficiency in insertion is not just about speed; it's about data consistency, auditability, and control. Before inserting, make sure every single record has passed validation and that no mandatory fields are skipped. Missing required fields can silently break relationships or lead to rejected records later in the workflow.

When performing insert:

  • Always include InsertDateTime and UpdateDateTime fields for tracking. This helps in reconciliation, incremental updates and debugging.
  • Maintain a dedicated backend user for all automated or migration-related activities. This makes it easier to trace changes in audit logs, and clearly separates system actions from human input.
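
A sketch of such an insert with dynamic SQL; the schema, the tracking columns, and the MIG_USER name are all hypothetical:

```objectscript
 // CURRENT_TIMESTAMP fills the tracking columns; a fixed CreatedBy value
 // separates migration inserts from human activity in audit queries.
 Set patId=10001,dob="2000-01-02"
 Set sql="INSERT INTO Demo.Patient (PatID, DOB, InsertDateTime, UpdateDateTime, CreatedBy)"
 Set sql=sql_" VALUES (?, ?, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP, 'MIG_USER')"
 Set stmt=##class(%SQL.Statement).%New()
 Do stmt.%Prepare(sql)
 Set rs=stmt.%Execute(patId,dob)
 If rs.%SQLCODE<0 Write "Insert failed: ",rs.%Message,!
```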

5. Reconcile after Migration/Upload.

Once the activity is completed, perform a reconciliation between source and destination:

  • Record count comparison.
  • Field-by-field checksum validation.
  • Referential integrity checks.

Even a simple hash-based comparison script can help confirm that nothing was lost or altered.
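A minimal count reconciliation with dynamic SQL (both table names are assumed); a per-record hash, e.g. $SYSTEM.Encryption.MD5Hash over the concatenated fields, can extend this to field-by-field checks:

```objectscript
 // Compare row counts between the staging (source) and destination tables.
 Set stmt=##class(%SQL.Statement).%New()
 Do stmt.%Prepare("SELECT COUNT(*) AS N FROM Staging.Patient")
 Set rs=stmt.%Execute() Do rs.%Next() Set srcCount=rs.N
 Do stmt.%Prepare("SELECT COUNT(*) AS N FROM Demo.Patient")
 Set rs=stmt.%Execute() Do rs.%Next() Set dstCount=rs.N
 Write "Source: ",srcCount,"  Destination: ",dstCount,!
 If srcCount'=dstCount Write "Count mismatch - investigate before sign-off",!
```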

 

These are some of the basic yet essential practices for smooth and reliable data migration. Validations, proper logging, consistent inserts, and attention to master data make a huge difference in quality and traceability.

Keep it clean, automated and well documented. The rest will fall into place.

Feel free to reach out to me for any queries, or discussions around IRIS data migration!


Hi Alexey,

Thank you for the suggestion; using the time works well too! I was simply trying to provide an example. I'll be posting another article with my code shortly, and I truly look forward to any further feedback you might have.

We can also do it this way:

 Set ^LOG("xUpload",+$Horolog,patId,$Increment(^LOG("xUpload",+$Horolog,patId)))=status_"^"_SQLCODE_"^"_$Get(%msg)

We have had a few people ask for a CSV using a pipe as a separator. If you run into that and need to open it in Excel, go to the Data tab and click From Text/CSV. Excel may detect the separator character automatically, but if not, there is a place where you can set it.