
Connecting InterSystems IRIS and Firebase Cloud Firestore

I recently needed to monitor, from HealthConnect, the records present in a NoSQL database in the cloud, specifically Cloud Firestore deployed on Firebase. A quick look was enough to see how easy it would be to create an ad-hoc adapter for the connection by taking advantage of Embedded Python, so I got to work.

Preparing the environment

To start, we need an instance of the database to test against. From the Firebase console we create a new project and add a Cloud Firestore database to it.

Next we create a collection in our database called data_poc and add 3 documents to it, which we will later retrieve from our production.

With the database deployed, we now obtain the JSON file with the keys needed to connect from our IRIS production. To do this, from the console of our Firebase project we open the service accounts page found under the project configuration (Project Settings -> Service Accounts) and generate a new private key:

This downloads a JSON file that we place somewhere on our server; in this example we will leave it in the /shared/durable folder of our Docker container.
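
Before wiring anything into the production, it can be useful to check that the downloaded key works and, if you prefer, to seed the data_poc collection from code instead of from the console. The sketch below is only illustrative and not part of the original project: the class name Local.Util.FirebaseSeed, the key file name firebase-key.json and the description field are assumptions; only the id field matters, since the adapter will later use it to filter new documents.

Class Local.Util.FirebaseSeed Extends %RegisteredObject
{

/// Illustrative helper: adds a document with the given numeric id to a Firestore
/// collection, authenticating with the service account key file
ClassMethod AddDocument(keyPath As %String, collection As %String, id As %Integer) As %Status [ Language = python ]
{
    import firebase_admin
    from firebase_admin import credentials
    from firebase_admin import firestore

    # Initialize the Firebase app only once per process, as the adapter does
    if not firebase_admin._apps:
        cred = credentials.Certificate(keyPath)
        firebase_admin.initialize_app(cred)

    db = firestore.client()
    # "id" is the field the adapter filters on; "description" is just an example field
    db.collection(collection).document(str(id)).set({
        "id": id,
        "description": "Test document " + str(id)
    })
    return 1
}

}

Calling it from a terminal session with ids 1 to 3 creates the three initial documents, for example:

    do ##class(Local.Util.FirebaseSeed).AddDocument("/shared/durable/firebase-key.json", "data_poc", 1)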

Creating the adapter

In order to connect to our database in Firebase, we have to create a specific adapter that handles the connection. As mentioned before, we will use the capabilities of Embedded Python, so we first install the library that manages the connection, firebase-admin (for example, with pip install firebase-admin run against the Python environment used by IRIS).

With our library already installed we can now create our adapter:

Class Local.Adapter.FirebaseInboundAdapter Extends Ens.InboundAdapter
{

Property KeyPath As %String(MAXLEN = 100);
Property DocName As %String(MAXLEN = 100);
Parameter SETTINGS = "KeyPath,DocName";
Method OnTask() As %Status
{
    $$$TRACE("Connecting")
    set tSC = $$$OK
    set listOfDocs = ##class(%Library.ListOfDataTypes).%New()
    // Initialize the last-read pointer for this collection the first time we poll it
    if ('$DATA(^LASTFIREBASE(..DocName))) {
        set ^LASTFIREBASE(..DocName) = 0
    }
    
    do ..ConnectAndQuery(..KeyPath, ^LASTFIREBASE(..DocName), listOfDocs, ..DocName)

    // Send each recovered document to the Business Host and record the highest id processed
    for i = 1:1:listOfDocs.Count() {
        set msg = ##class(Local.Message.FirebaseDocRequest).%New()
        set msg.Doc = listOfDocs.GetAt(i)
        set tSC=..BusinessHost.ProcessInput(msg)
        set docRead = ##class(%DynamicAbstractObject).%FromJSON(msg.Doc)
        set ^LASTFIREBASE(..DocName) = docRead.id
        $$$TRACE("Index: "_^LASTFIREBASE(..DocName))
    }
    $$$TRACE("Finishing connection")

    Quit tSC
}

/// Using Embedded Python to connect with Firebase
ClassMethod ConnectAndQuery(keyPath As %String, lastId As %String, listOfDocs As %Library.ListOfDataTypes, docName As %String) [ Language = python ]
{
        import iris
        import firebase_admin
        from firebase_admin import credentials
        from firebase_admin import firestore

        if not firebase_admin._apps:
            cred = credentials.Certificate(keyPath)
            firebase_admin.initialize_app(cred)

        db = firestore.client()

        # Read Data
        docs_refer = db.collection(docName)
        docs = docs_refer.where("id",">",lastId).stream()
        # docs = docs_refer.stream()

        for doc in docs:
            # listOfDocs.Insert(doc.to_dict())            
            listOfDocs.Insert(str(doc.to_dict()).replace("'", '"'))
        return 1
}

}

As you can see, we use an identifier field id as the criterion to obtain the latest documents registered in the system; each time we read a new document we take its identifier and store it in the ^LASTFIREBASE global. We have included two settings in the adapter:

  • KeyPath: the path and name of the JSON file that contains the keys to access our database in Firebase.
  • DocName: the name of the collection that we read from the database. For each collection we must add a new Business Service with a different value for this setting.

With the connection keys in place we can connect to the database using the method developed with Embedded Python. We previously installed the firebase-admin library that manages the connection; you can find it in the requirements.txt file of the associated project. Once connected, we retrieve every document whose identifier is greater than the last one we read. Each recovered document is converted from a Python dict to a JSON-style string (that is the reason for the quote replacement) and inserted into a list of strings that we then iterate over and send to our Business Service.

# Read Data
docs_refer = db.collection(docName)
docs = docs_refer.where("id",">",lastId).stream()
# docs = docs_refer.stream()

for doc in docs:
    listOfDocs.Insert(str(doc.to_dict()).replace("'", '"'))

With the adapter developed, we only need to implement the Business Service that will use it in our production:

Class Local.BS.FirebaseBS Extends Ens.BusinessService
{

Parameter ADAPTER = "Local.Adapter.FirebaseInboundAdapter";
Method OnProcessInput(pRequest As Local.Message.FirebaseDocRequest, pResponse As %RegisteredObject) As %Status
{
        $$$TRACE(pRequest.Doc)

        Quit $$$OK
}

}

For our example we only write a trace with the message received from the adapter. The message has a single property of type String, which we have called Doc, and it contains the recovered document.
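
For reference, here is a minimal sketch of what this message class could look like; assuming it extends Ens.Request, and the unlimited MAXLEN is just one possible choice:

Class Local.Message.FirebaseDocRequest Extends Ens.Request
{

/// JSON document recovered from Firestore, stored as plain text
Property Doc As %String(MAXLEN = "");

}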

Testing the adapter

We already have everything we need to retrieve documents from our database, so we only need to configure our production.
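
As an illustration, this is roughly how the production definition could declare two Business Services reading two different collections. The production and item names, the second collection name and the key file name are assumptions of this sketch; the CallInterval of 5 seconds is the polling interval used in the test below:

Class Local.Production.FirebaseDemo Extends Ens.Production
{

XData ProductionDefinition
{
<Production Name="Local.Production.FirebaseDemo" LogGeneralTraceEvents="true">
  <Item Name="Firebase data_poc" ClassName="Local.BS.FirebaseBS" PoolSize="1" Enabled="true" LogTraceEvents="true">
    <Setting Target="Adapter" Name="KeyPath">/shared/durable/firebase-key.json</Setting>
    <Setting Target="Adapter" Name="DocName">data_poc</Setting>
    <Setting Target="Adapter" Name="CallInterval">5</Setting>
  </Item>
  <Item Name="Firebase other_collection" ClassName="Local.BS.FirebaseBS" PoolSize="1" Enabled="true" LogTraceEvents="true">
    <Setting Target="Adapter" Name="KeyPath">/shared/durable/firebase-key.json</Setting>
    <Setting Target="Adapter" Name="DocName">other_collection</Setting>
    <Setting Target="Adapter" Name="CallInterval">5</Setting>
  </Item>
</Production>
}

}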

Perfect! We now have our BS configured to query the database every 5 seconds (CallInterval = 5). Let's start the production and check the log:

Here we have our documents. Let's now see what happens when we add a new one to our database:
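
If you prefer to add this test document from code rather than from the Firebase console, the illustrative helper sketched earlier can be reused from a terminal session with an id greater than the last one read (4, if the collection was seeded with ids 1 to 3):

    do ##class(Local.Util.FirebaseSeed).AddDocument("/shared/durable/firebase-key.json", "data_poc", 4)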

There we have our new record. And with that, our adapter is up and running.

Well, that's it for today's article. If you have any questions or suggestions, don't hesitate to leave them in the comments.
