Aug 22 · 12m read

REST API with IRIS on Python with SQL migrations

For the upcoming Python contest, I would like to make a small demo on how to create a simple REST application using Python, with IRIS as a database, using these tools:

  • FastAPI framework, high performance, easy to learn, fast to code, ready for production
  • SQLAlchemy is the Python SQL toolkit and Object Relational Mapper that gives application developers the full power and flexibility of SQL
  • Alembic is a lightweight database migration tool for usage with the SQLAlchemy Database Toolkit for Python.
  • Uvicorn is an ASGI web server implementation for Python.

Prepare environment

Assuming that Python (at least version 3.7) is already installed, create a project folder, and create a requirements.txt file in it with this content
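Based on the tools listed above, a minimal requirements.txt would look something like this (the IRIS dialect for SQLAlchemy is published as sqlalchemy-iris, which the iris:// URLs used later rely on; leaving the versions unpinned is my assumption):

```
fastapi
uvicorn[standard]
sqlalchemy
alembic
sqlalchemy-iris
```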


I'd recommend using a virtual environment in Python. Let's create a new environment and activate it

python -m venv env && source env/bin/activate

And now we can install our dependencies

pip install -r requirements.txt

Fast start

Let's create the simplest REST API with FastAPI. To do so, create app/main.py

from fastapi import FastAPI

app = FastAPI(
    title='TODO Application',
)


@app.get("/ping")
async def pong():
    return {"ping": "pong!"}

At this point it's enough to start our application, and it should already work. To start the server we will use uvicorn

$ uvicorn app.main:app         
INFO:     Started server process [94936]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on (Press CTRL+C to quit)

And we can make a ping request

$ curl http://localhost:8000/ping

FastAPI also offers a UI at /docs, where we can test the API.

Dockerized environment 

To add IRIS to our application, we will use containers. The IRIS image will be used as is, but we need to build a Docker image for the Python application, so we will need a Dockerfile

FROM python:3.11-slim-buster

WORKDIR /usr/src/app

RUN --mount=type=bind,src=.,dst=. \
    pip install --upgrade pip && \
    pip install -r requirements.txt

COPY . .

ENTRYPOINT [ "/usr/src/app/entrypoint.sh" ]

To start the application inside the container, we need a simple entrypoint.sh script

#!/bin/sh

# Run SQL Migrations, to make DB Schema up to date
alembic upgrade head

# Start Python application
uvicorn app.main:app \
      --workers 1 \
      --host 0.0.0.0 \
      --port 8000 "$@"

Do not forget to add an execution flag

chmod +x entrypoint.sh

And combine with IRIS in docker-compose.yml

version: "3"
services:
  iris:
    image: intersystemsdc/iris-community
    ports:
      - 1972
    environment:
      - IRISUSERNAME=demo
      - IRISPASSWORD=demo
    healthcheck:
      test: /irisHealth.sh
      interval: 5s
  app:
    build: .
    ports:
      - 8000:8000
    environment:
      - DATABASE_URL=iris://demo:demo@iris:1972/USER
    volumes:
      - ./:/usr/src/app
    depends_on:
      iris:
        condition: service_healthy
    command:
      - --reload

Let's build it

docker-compose build

The first data model

Now let's give the application access to our IRIS database. Add the file app/db.py, which configures SQLAlchemy to access the database defined by the URL passed through docker-compose.yml. It also contains a couple of helpers we will use later in the app

import os

from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import DeclarativeMeta, declarative_base
from sqlalchemy.orm import sessionmaker

DATABASE_URL = os.environ.get("DATABASE_URL")
if not DATABASE_URL:
    DATABASE_URL = "iris://demo:demo@localhost:1972/USER"
engine = create_engine(DATABASE_URL, echo=True, future=True)

Base: DeclarativeMeta = declarative_base()

SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


def init_db():
    # Nothing to do here yet: the schema is managed by Alembic migrations
    pass


def get_session():
    session = SessionLocal()
    try:
        yield session
    finally:
        session.close()
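A side note on get_session: FastAPI treats a generator function like this as a dependency with teardown. It advances the generator once to obtain the session for the request, then resumes it after the response is sent, which is where cleanup such as session.close() would go. A stdlib-only sketch of that pattern (all names here are illustrative):

```python
# Minimal illustration of the generator-dependency pattern:
# code before `yield` runs at request start, code after it runs at teardown.
events = []

def get_resource():
    events.append("open")
    yield "resource"
    events.append("close")

gen = get_resource()
resource = next(gen)   # what FastAPI injects into the route handler
# ... the request is handled here ...
next(gen, None)        # FastAPI resumes the generator afterwards
print(events)  # ['open', 'close']
```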

And now we are ready to define the first and only model in our application. Create the file app/models.py; it uses SQLAlchemy to define the model, named Todo, with three columns: id, title, and description.

from sqlalchemy import Column, Integer, String, Text
from app.db import Base

class Todo(Base):
    __tablename__ = 'todo'
    id = Column(Integer, primary_key=True, index=True)
    title = Column(String(200), index=True, nullable=False)
    description = Column(Text, nullable=False)

Prepare SQL migration

In a changing world, we know that our application will be improved in the future: our table structure is not final, and we may add more tables, columns, indexes, and so on. The best approach here is to use an SQL migration tool, which upgrades the actual structure in the database to match the version of our application, and can downgrade it as well, in case something goes wrong. Since this project uses Python and SQLAlchemy, the author of SQLAlchemy offers a tool for exactly this, named Alembic, and we are going to use it here.

We need to start IRIS and the container with our application; at this point we need bash, to be able to run commands

$ docker-compose run --entrypoint bash app
[+] Creating 2/0
 ✔ Network fastapi-iris-demo_default   Created                                                                                                                                                        0.0s 
 ✔ Container fastapi-iris-demo-iris-1  Created                                                                                                                                                        0.0s 
[+] Running 1/1
 ✔ Container fastapi-iris-demo-iris-1  Started                                                                                                                                                        0.1s 

Run the command alembic init app/migrations

root@7bf903cd2721:/usr/src/app# alembic init app/migrations
  Creating directory '/usr/src/app/app/migrations' ...  done
  Creating directory '/usr/src/app/app/migrations/versions' ...  done
  Generating /usr/src/app/app/migrations/README ...  done
  Generating /usr/src/app/app/migrations/env.py ...  done
  Generating /usr/src/app/app/migrations/script.py.mako ...  done
  Generating /usr/src/app/alembic.ini ...  done
  Please edit configuration/connection/logging settings in '/usr/src/app/alembic.ini' before proceeding.

This prepared the alembic configuration, and we need to adjust it to fit our application's needs. To do so, edit the app/migrations/env.py file. Shown below is only the beginning of the file, which is supposed to be updated; focus on updating sqlalchemy.url and target_metadata. Everything below this fragment remains the same

import os
import urllib.parse
from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

DATABASE_URL = os.environ.get("DATABASE_URL")

decoded_uri = urllib.parse.unquote(DATABASE_URL)
config.set_main_option("sqlalchemy.url", decoded_uri)

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
from app.models import Base
target_metadata = Base.metadata
# target_metadata = None
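The urllib.parse.unquote call is there, presumably, to handle percent-encoded characters in DATABASE_URL; for example, a password containing @ or / must be percent-encoded to fit into a URL. A quick illustration with a hypothetical DSN:

```python
import urllib.parse

# Hypothetical DSN whose password "p@ss/word" had to be percent-encoded
encoded = "iris://demo:p%40ss%2Fword@iris:1972/USER"
decoded = urllib.parse.unquote(encoded)
print(decoded)  # iris://demo:p@ss/word@iris:1972/USER
```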

We already have a model; now we need to create a migration with the command alembic revision --autogenerate

root@7bf903cd2721:/usr/src/app# alembic revision --autogenerate
INFO  [alembic.runtime.migration] Context impl IRISImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
INFO  [] Detected added table 'todo'
INFO  [] Detected added index 'ix_todo_id' on '['id']'
INFO  [] Detected added index 'ix_todo_title' on '['title']'
  Generating /usr/src/app/app/migrations/versions/ ...  done
Let's look at the generated migration.
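The revision file lands in app/migrations/versions/; for this model, an autogenerated revision should look roughly like the sketch below (the docstring and revision identifier match the log output above, the rest follows the standard Alembic template):

```
"""empty message

Revision ID: 1e4d3b4d51ca
Revises:
Create Date: ...
"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic
revision = '1e4d3b4d51ca'
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.create_table(
        'todo',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('title', sa.String(length=200), nullable=False),
        sa.Column('description', sa.Text(), nullable=False),
        sa.PrimaryKeyConstraint('id'),
    )
    op.create_index(op.f('ix_todo_id'), 'todo', ['id'], unique=False)
    op.create_index(op.f('ix_todo_title'), 'todo', ['title'], unique=False)


def downgrade() -> None:
    op.drop_index(op.f('ix_todo_title'), table_name='todo')
    op.drop_index(op.f('ix_todo_id'), table_name='todo')
    op.drop_table('todo')
```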

Now it's time to apply this to the database with the command alembic upgrade head, where head is a keyword to upgrade to the latest version

root@7bf903cd2721:/usr/src/app# alembic upgrade head
INFO  [alembic.runtime.migration] Context impl IRISImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
INFO  [alembic.runtime.migration] Running upgrade  -> 1e4d3b4d51ca, empty message

You can check the current state at any time; it will tell you if any migrations are missing

root@7bf903cd2721:/usr/src/app# alembic check
INFO  [alembic.runtime.migration] Context impl IRISImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
No new upgrade operations detected.

Make data accessible

So, we can now go back to the REST part, and we need it up and running. Exit from the current container and run the app service as normal now; uvicorn has the flag --reload, so it will watch for changes in the Python files and restart when we change them

$ docker-compose up app
[+] Running 2/0
 ✔ Container fastapi-iris-demo-iris-1  Running                                                                                                                                                        0.0s 
 ✔ Container fastapi-iris-demo-app-1   Created                                                                                                                                                        0.0s 
Attaching to fastapi-iris-demo-app-1, fastapi-iris-demo-iris-1
fastapi-iris-demo-app-1   | INFO  [alembic.runtime.migration] Context impl IRISImpl.
fastapi-iris-demo-app-1   | INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
fastapi-iris-demo-app-1   | INFO:     Will watch for changes in these directories: ['/usr/src/app']
fastapi-iris-demo-app-1   | INFO:     Uvicorn running on (Press CTRL+C to quit)
fastapi-iris-demo-app-1   | INFO:     Started reloader process [8] using StatReload
fastapi-iris-demo-app-1   | INFO:     Started server process [10]
fastapi-iris-demo-app-1   | INFO:     Waiting for application startup.
fastapi-iris-demo-app-1   | INFO:     Application startup complete.

FastAPI uses the pydantic project to declare the data schema, and we need it too. Let's create app/schemas.py, with the same columns as in models.py but in a simple Python form

from pydantic import BaseModel

class TodoCreate(BaseModel):
    title: str
    description: str

class Todo(TodoCreate):
    id: int

    class Config:
        from_attributes = True
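The from_attributes flag (called orm_mode in pydantic v1) is what allows a response_model to be populated straight from a SQLAlchemy row object rather than from a dict. A self-contained sketch of that behavior, using a plain object as a stand-in for a real ORM row:

```python
from types import SimpleNamespace
from pydantic import BaseModel

class TodoCreate(BaseModel):
    title: str
    description: str

class Todo(TodoCreate):
    id: int

    class Config:
        from_attributes = True

# Stand-in for a SQLAlchemy model instance: data lives in attributes, not keys
row = SimpleNamespace(id=1, title="demo", description="it's really working")
todo = Todo.model_validate(row)
print(todo.id, todo.title)  # 1 demo
```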

Declare the CRUD operations in app/crud.py, where we work with the database using the SQLAlchemy ORM

from sqlalchemy.orm import Session
from . import models, schemas

def get_todos(db: Session, skip: int = 0, limit: int = 100):
    return db.query(models.Todo).offset(skip).limit(limit).all()

def create_todo(db: Session, todo: schemas.TodoCreate):
    db_todo = models.Todo(**todo.dict())
    db.add(db_todo)
    db.commit()
    db.refresh(db_todo)
    return db_todo
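The skip/limit pair in get_todos maps to SQL OFFSET/LIMIT, which behaves like list slicing. A stdlib sketch of the paging semantics:

```python
# offset(skip).limit(limit) returns at most `limit` rows starting at `skip`,
# i.e. items[skip:skip + limit] over an ordered sequence
todos = [f"todo-{i}" for i in range(10)]

def page(items, skip=0, limit=100):
    return items[skip:skip + limit]

print(page(todos, skip=2, limit=3))  # ['todo-2', 'todo-3', 'todo-4']
```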

And finally, we can update our app/main.py and add routes to read and create todos

from fastapi import FastAPI, Depends
from .db import init_db, get_session
from . import crud, schemas

app = FastAPI(
    title='TODO Application',
)


@app.on_event("startup")
def on_startup():
    init_db()


@app.get("/ping")
async def pong():
    return {"ping": "pong!"}


@app.get("/todo", response_model=list[schemas.Todo])
async def read_todos(skip: int = 0, limit: int = 100, session=Depends(get_session)):
    todos = crud.get_todos(session, skip=skip, limit=limit)
    return todos


@app.post("/todo", response_model=schemas.Todo)
async def create_todo(todo: schemas.TodoCreate, session=Depends(get_session)):
    return crud.create_todo(db=session, todo=todo)

The docs page was updated accordingly, and now we can play with it

Give it a try

And let's check it in IRIS

$ docker-compose exec iris irissqlcli iris+emb:///
Server:  IRIS for UNIX (Ubuntu Server LTS for ARM64 Containers) 2023.2 (Build 227U) Mon Jul 31 2023 17:43:25 EDT
Version: 0.5.4
[SQL]irisowner@/usr/irissys/:USER> .tables
| TABLE_NAME              |
| SQLUser.alembic_version |
| SQLUser.todo            |
Time: 0.043s
[SQL]irisowner@/usr/irissys/:USER> select * from todo
| id | title | description         |
| 1  | demo  | it's really working |
1 row in set
Time: 0.004s
[SQL]irisowner@/usr/irissys/:USER> select * from alembic_version
| version_num  |
| 1e4d3b4d51ca |
1 row in set
Time: 0.045s


Hope you enjoyed the ease of using Python and FastAPI for creating REST applications. The source code for this project is available on GitHub.

Discussion (5)

I didn't get the meaning of Field(ge=0) immediately.

I see 6 variants to achieve this in IRIS, in order of my personal preference:

  1. use SQL NULLIF function
  2. use SQL CASE..END block if more complex
  3. use a custom SQL Function = ClassMethod projected as SQLProcedure
  4. create a TRIGGER if this is not just a single case 
  5. use calculated properties with on Change clause
  6. create a customized  OdbcToLogical method 

There are probably more options.