Developing a Microservice Application using IRIS, Kafka, and REST API

By Yuri Marx, Software Architect at Visum

What is a Microservice?

A microservice is an architectural style that structures an application as a collection of small, autonomous services. Each component is developed around a specific business capability, can be deployed independently, and is typically managed by a small, specialized, self-governing team. (Source: https://microservices.io/)

 

Key Characteristics and Benefits:

  • Compact and Focused: Services are engineered to excel at a single business function (domain).
     
  • Decentralized Governance: Teams are free to select the optimal technology stack for their specific service.
     
  • Independent Deployment: Modules can be updated and deployed without disrupting the rest of the ecosystem.
     
  • Resilience: A failure in one service is unlikely to trigger a total system collapse.
     
  • Communication via Lightweight Mechanisms: Services generally interact via REST APIs or message brokers (e.g., Kafka, as mentioned in the title).
     
  • Technological Autonomy and Isolation: Every microservice is self-contained in its technology stack and internal structure. Each service maintains its own data and business logic layers, utilizing the most appropriate tools for its particular function without being constrained by the technological choices of other services. This autonomy minimizes coordination and coupling between services.
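To make the "lightweight mechanisms" idea concrete, here is a minimal, hypothetical Python sketch (the event and field names are illustrative, not part of any product): one service serializes an event as JSON, and another consumes it, with the agreed field names as the only shared contract.

```python
import json

def build_order_event(order_id: int, amount: float) -> str:
    """Producer side: serialize a domain event to a plain JSON string."""
    return json.dumps({"orderId": order_id, "amount": amount, "type": "OrderCreated"})

def handle_order_event(payload: str) -> str:
    """Consumer side: parse the event; no shared classes or libraries beyond JSON."""
    event = json.loads(payload)
    if event["type"] == "OrderCreated":
        return f"billing order {event['orderId']} for {event['amount']}"
    return "ignored"

# The two services could be written in different languages; the JSON contract
# is the only coupling point between them.
message = build_order_event(1001, 99.9)
print(handle_order_event(message))
```

Because the consumer depends only on field names, either side can change its internal technology stack without affecting the other.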

The Shift from Monolithic to Microservices Architecture

Before the advent of microservices, the dominant application style was the Monolithic Architecture. In a monolith, the entire application (business logic, data access layers, and user interface) is built as a single, indivisible unit.
 

Problems with Monolithic Architecture

While straightforward at the start, monolithic applications often encountered significant challenges as they expanded:

  • Tight Coupling: All components are so intertwined that a minor change in a single part requires recompiling and redeploying the entire application.
     
  • Slow Development Cycle: A massive codebase makes it difficult for multiple teams to navigate simultaneously, slowing down feature development and deployment.
     
  • Difficulty in Scaling: Since the entire application must be scaled, even if only a small component (e.g., a specific service or function) is under heavy load, it is inefficient in terms of resource utilization.
     
  • Technology Lock-in: Because the entire application is built employing a single technology stack (language, framework), adopting the latest technologies or migrating to newer versions becomes a massive, high-risk undertaking.
     
  • Lower Fault Isolation (Resilience): A bug or failure in any single module can potentially bring down the entire application, leading to a complete system outage.
     
  • Barriers to Continuous Deployment (CD): Due to deployment involving the whole system, the process is often slow, risky, and infrequent.
     

How Microservices Resolve These Issues

The Microservices Architecture directly addresses the shortcomings of the monolith by breaking the application into a collection of small, independently deployable units:

| Monolithic Problem | Microservice Solution |
| --- | --- |
| Tight Coupling | Loose Coupling: Since services are independent, changing one service requires only redeploying the affected unit. |
| Slow Development Cycle | Faster Development and Deployment: Smaller, focused teams can work autonomously on their services, leading to quicker iterations and continuous delivery. |
| Difficulty in Scaling | Independent and Elastic Scaling: Only the services experiencing high load need to be scaled, optimizing resource usage. |
| Technology Lock-in | Decentralized Governance/Technological Autonomy: Teams can choose the best language, database, and framework for each specific service (e.g., using Python for machine learning services and Java for core business logic). |
| Lower Fault Isolation (Resilience) | Improved Resilience: A failure in one microservice is isolated and less likely to cascade, preventing a total system downfall. |
| Barriers to Continuous Deployment (CD) | Enabling CD: Services can be deployed frequently and independently, reducing risk and time-to-market. |

Microservices are Not a "Silver Bullet"

Despite its numerous advantages, Microservices architecture is not a "silver bullet" that fixes all software development problems. Adopting it introduces its own complexity and operational challenges.

In many cases, especially for smaller projects or teams with limited resources, the monolithic architecture remains the best, easiest, and most efficient choice.

When Monolith is the Best Option

Monolithic architecture is preferable in the following scenarios:

  • Small and Simple Projects: For applications that do not require extreme scalability, have well-defined business requirements, and a small codebase, the operational complexity of managing multiple services, databases, and networks is not justified.
     
  • Early Stage Startups (Proof of Concept): When speed of development and rapid product iteration are priorities, a monolith allows a small team to build and launch a product faster, postponing the complexity of migrating to microservices until it is required.
     
  • Small and Inexperienced Teams: Since developing and operating microservices demand proficiency in areas such as container management (Docker, Kubernetes), distributed monitoring, and network resilience, a monolith is significantly easier to manage and debug for small teams or those just starting out.
     
  • Comprehensive Transaction Requirements (Distributed Transactions): In a monolith, transactions spanning multiple components are simple (ACID compliance). In microservices, however, they become "distributed transactions," a notoriously difficult problem to solve, often requiring complex patterns like SAGA.
     
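Because the SAGA pattern comes up whenever distributed transactions do, a minimal sketch may help. The Python below is purely illustrative (not from the sample project) and shows the core idea: every local step carries a compensating action, and a failure rolls back the completed steps in reverse order.

```python
def run_saga(steps):
    """Execute (action, compensation) pairs; on failure, compensate in reverse."""
    done = []
    for action, compensation in steps:
        try:
            action()
            done.append(compensation)
        except Exception:
            # A step failed: undo every previously completed local transaction
            for comp in reversed(done):
                comp()
            return False
    return True

def fail():
    raise RuntimeError("shipping failed")

log = []
steps = [
    (lambda: log.append("debit account"), lambda: log.append("refund account")),
    (lambda: log.append("reserve stock"), lambda: log.append("release stock")),
    (fail, lambda: None),  # third service is down, saga must compensate
]
ok = run_saga(steps)
print(ok, log)
```

Real sagas must also survive process crashes (persisting progress, retrying compensations), which is exactly the operational complexity a monolith avoids by using a single ACID transaction.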

Monolith vs. Microservices: Usage Scenarios

The choice of architecture should be dictated by the particular needs and constraints of the project, organization, and team.
 

| Characteristic / Scenario | Monolithic Architecture | Microservices Architecture |
| --- | --- | --- |
| Application Size | Small to Medium | Medium to Large and Complex |
| Delivery Speed | Initially Fast | Slower at first, but accelerates in the long term |
| Scalability | Difficult (scale everything) | Easy (scale individual services) |
| Operational Complexity | Low (single deployment) | High (network management, distributed monitoring) |
| Team Size | Small and Cohesive | Multiple autonomous teams |
| Technology | Single Technology (Lock-in) | Polyglot Technology (Flexibility) |
| Ideal Use Cases | MVP, PoC, Simple Internal Applications, Projects with limited budget/time | High-traffic systems, E-commerce, SaaS Platforms, Systems with rapidly evolving components |
| Resilience (Fault Tolerance) | Low (cascading failure) | High (failure isolation) |


InterSystems IRIS for Monolithic or Microservice Architecture

The InterSystems IRIS Data Platform is designed to support a wide range of application architectures, from traditional monolithic systems to highly distributed microservices deployments.

Regardless of whether you choose a monolithic or a microservices approach, IRIS always provides a robust foundation. For monolithic applications, IRIS serves as a powerful, unified platform that consolidates data management, integration, and business logic into a single high-performance engine.

However, it is in the Microservices Architecture where InterSystems IRIS delivers its most significant market advantage. IRIS provides a single, comprehensive platform that eliminates the need to stitch together multiple vendor products to meet core microservice requirements, offering unparalleled integration and speed.

InterSystems IRIS offers a complete ecosystem for developing, deploying, and managing interconnected services, effectively solving the "distributed complexity" problem often associated with microservices.
 

Key Integrated Features for Microservices:

| Feature Category | IRIS Component / Capability | Microservice Benefit |
| --- | --- | --- |
| Data Management | High-Performance Multi-Model Database (Relational, Document, Object, Key-Value) | Services can leverage a powerful, consistent data store while maintaining technological autonomy for internal data structures. |
| Interoperability & Integration | Integrated Interoperability Engine (EAI, SOA, ESB, Business Processes) | Facilitates seamless, high-speed (synchronous and asynchronous) communication between microservices and external systems, which is crucial for resilient communication. |
| API Management | Built-in REST and API Gateway Capabilities | Simplifies the creation, exposure, and governance of service APIs, effectively managing ingress/egress traffic. |
| Business Automation | Multi-Language Support for Business Logic (Python, Java, .NET, BPEL, ObjectScript) | Allows development teams to select the best language for each service's business logic, enabling polyglot development while keeping core automation unified. |
| Data Analysis & Intelligence | Business Intelligence (BI) and Reporting Tools | Enables real-time analytics across distributed service data without complex ETL processes. |
| Data Science & ML | IntegratedML (AutoML) and Python Gateway for Machine Learning | Embeds AI and ML models (both automated and custom Python-based) directly into service workflows for intelligent decision-making at the point of action. |

Unlike other vendors' solutions, where a microservice must take on external dependencies for databases, integration mechanisms, data analysis systems, and Machine Learning engines, IRIS provides all of these within the microservice's own environment. This reduces external dependencies, eliminates single points of failure, and lets a microservice fully adhere to the principles of the microservices architecture.

 

Microservice Sample

Download the microservice sample from Open Exchange: the project ms-iris-credit-risk at https://openexchange.intersystems.com/package/ms-iris-credit-risk. This application is a microservice developed on the InterSystems IRIS platform to manage and predict customer credit risks. It exposes a RESTful API for CRUD operations (Create, Read, Update, Delete) and a dedicated Artificial Intelligence/Machine Learning endpoint that evaluates credit risk using IRIS's IntegratedML. Additionally, the system integrates with Apache Kafka via the interoperability engine (IRIS Interoperability) for asynchronous messaging. The application follows a well-defined architecture with separation of concerns (API, Service/Business, and Integration Processes).
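The separation of concerns just mentioned (API, Service/Business, and Integration layers) is language-independent. The following Python sketch is a hypothetical illustration of that layering, not code from the project: the API layer handles transport concerns (status codes), while the service layer holds only business rules.

```python
class CreditRiskRepository:
    """Data layer: in-memory stand-in for the persistent store."""
    def __init__(self):
        self._rows, self._next_id = {}, 1
    def save(self, record: dict) -> int:
        rid = self._next_id
        self._rows[rid] = record
        self._next_id += 1
        return rid

class CreditRiskServiceLayer:
    """Service layer: business rules only; knows nothing about HTTP."""
    def __init__(self, repo: CreditRiskRepository):
        self._repo = repo
    def create(self, record: dict) -> int:
        # Hypothetical business rule for illustration only
        if record.get("CreditAmount", 0) <= 0:
            raise ValueError("CreditAmount must be positive")
        return self._repo.save(record)

class CreditRiskApiLayer:
    """API layer: translates service results and errors into status codes."""
    def __init__(self, service: CreditRiskServiceLayer):
        self._service = service
    def post(self, body: dict) -> tuple[int, dict]:
        try:
            return 201, {"id": self._service.create(body)}
        except ValueError as err:
            return 400, {"error": str(err)}

api = CreditRiskApiLayer(CreditRiskServiceLayer(CreditRiskRepository()))
status, payload = api.post({"Age": 27, "CreditAmount": 3123})
```

Keeping the layers separate means the business rules can be reused by both the REST API and the Kafka-driven interoperability flow, which is exactly how the sample structures its classes.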

Download and run the sample:
1. Clone/git pull the repo into any local directory:

$ git clone https://github.com/yurimarx/ms-iris-credit-risk.git

2. Open the terminal in this directory and execute the following:

$ docker-compose build

3. Run the IRIS container with your project:

$ docker-compose up -d

Test the REST API of the Microservice

1. Download and import the Postman collection of the project from https://github.com/yurimarx/ms-iris-credit-risk/blob/master/Credit%20Risk%20API.postman_collection.json or use the local file Credit Risk API.postman_collection.json.
2. To get all current credit risk data, execute GET http://localhost:52795/api/creditrisk/creditrisk. The response should resemble the one below:
 

[
    {
        "CreditRiskId": 1,
        "Age": 27,
        "Sex": "female",
        "Job": 2,
        "Housing": "own",
        "SavingAccounts": "little",
        "CheckingAccount": "little",
        "CreditAmount": 3123,
        "Duration": 24,
        "Purpose": "car",
        "CreditRisk": 2,
        "Id": "1"
    }
    …
]

3. To get a single credit risk record, do a GET http://localhost:52795/api/creditrisk/creditrisk/1 (1 or any other existing ID). The response should be as follows:

{
        "CreditRiskId": 1,
        "Age": 27,
        "Sex": "female",
        "Job": 2,
        "Housing": "own",
        "SavingAccounts": "little",
        "CheckingAccount": "little",
        "CreditAmount": 3123,
        "Duration": 24,
        "Purpose": "car",
        "CreditRisk": 2,
        "Id": "1"
}

4. To create a credit risk record, do a POST http://localhost:52795/api/creditrisk/creditrisk with a body like the one below:

{
        "Age": 27,
        "Sex": "female",
        "Job": 2,
        "Housing": "own",
        "SavingAccounts": "little",
        "CheckingAccount": "little",
        "CreditAmount": 3123,
        "Duration": 24,
        "Purpose": "car",
        "CreditRisk": 2
}

5. To update, do a PUT http://localhost:52795/api/creditrisk/creditrisk/1 (1 or any other existing ID of the record to be updated) with a body as follows:

{
        "Age": 27,
        "Sex": "female",
        "Job": 2,
        "Housing": "own",
        "SavingAccounts": "little",
        "CheckingAccount": "little",
        "CreditAmount": 5000,
        "Duration": 12,
        "Purpose": "tv",
        "CreditRisk": 2
}

6. To delete a record, do a DELETE http://localhost:52795/api/creditrisk/creditrisk/1 (1 or any other existing ID of the record to be deleted).

Test the Kafka Integration of the Microservice

1. Open and start the production: http://localhost:52795/csp/user/EnsPortal.ProductionConfig.zen?PRODUCTION=dc.creditrisk.CreditRiskProduction
2. Open Kafka UI: http://localhost:8080/
3. Go to Topics and create a CreditRiskInTopic if it does not exist.
4. Click the CreditRiskInTopic link and then click Produce Message (located on the top right corner of the page).
5. Enter iris into the Key field and the following JSON data into the Value field:
 

{
   "Age": 42,
   "Sex": "male",
   "Job": 1,
   "Housing": "own",
   "SavingAccounts": "rich",
   "CheckingAccount": "little",
   "CreditAmount": 10000,
   "Duration": 6,
   "Purpose": "car"
}

6. Click the Produce button.
7. Go back to Topics and click the CreditRiskOutTopic link.
8. Open the Messages tab and check the response message that was sent.
9. Additionally, examine the request and response flow in the Interoperability production.
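The flow just exercised (message into CreditRiskInTopic, prediction, result into CreditRiskOutTopic) can be pictured with a small self-contained Python simulation, where in-memory queues stand in for the Kafka topics; this illustrates the asynchronous pattern only, not the actual adapter code.

```python
from collections import deque

# Stand-ins for CreditRiskInTopic / CreditRiskOutTopic
in_topic, out_topic = deque(), deque()

def predict(record: dict) -> dict:
    """Placeholder for the IntegratedML prediction step (value is illustrative)."""
    enriched = dict(record)
    enriched["RiskPrediction"] = 1
    return enriched

def process_one():
    """Business process: consume from the in-topic, predict, publish to the out-topic."""
    if in_topic:
        out_topic.append(predict(in_topic.popleft()))

in_topic.append({"Age": 42, "Sex": "male", "CreditAmount": 10000})
process_one()
print(out_topic[0])  # the enriched message, ready for downstream consumers
```

The producer never waits for the consumer; it only observes the result later on the out-topic, which is the decoupling the Interoperability production provides.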
 

Behind the Scenes
 

1. The credit_risk_data.csv file was imported using the OEX app CSVgen:
CSVgen was declared as a dependency in module.xml, and the file was copied into the Docker container:
 

<Dependencies>
       <ModuleReference>
           <Name>csvgen</Name>
           <Version>*</Version>
       </ModuleReference>
</Dependencies>
<FileCopy Name="credit_risk_data.csv" Target="/tmp/credit_risk_data.csv"></FileCopy>

The FileLoader class imported the file to the table dc.creditrisk.CreditRisk, and module.xml called the FileLoader at the build stage:

     
ClassMethod LoadCreditRiskData() As %Status
{
   // Captures the import status returned by CSVgen instead of discarding it
   Set sc = ##class(community.csvgen).Generate("/tmp/credit_risk_data.csv", ",", "dc.creditrisk.CreditRisk")
   Return sc
}

<Invokes>
   <Invoke Class="dc.creditrisk.FileLoader" Method="LoadCreditRiskData">
   </Invoke>
</Invokes>

2. The ML model was created and trained to predict new credit risks:
The CreditRiskModel class executes the CREATE MODEL and TRAIN MODEL statements on the credit risk data, producing a functional model for predicting credit risk:
 

Class dc.creditrisk.CreditRiskModel
{

ClassMethod BuildAndTrain() As %Status
{
   Set sc = $$$OK
  
   Try {
       Set stmt = ##class(%SQL.Statement).%New()
      
       // Drops any previous model; errors are ignored because the model may not exist yet
       Set dropSQL = "DROP MODEL CreditRiskModel"
       Do stmt.%Prepare(dropSQL)
       Do stmt.%Execute()
      
       Write !, "Creating the model definition..."
       Set createSQL = "CREATE MODEL CreditRiskModel PREDICTING (CreditRisk) FROM "_
           "(SELECT Age, CheckingAccount, CreditAmount, CreditRisk, Duration, Housing, "_
           "Job, Purpose, SavingAccounts, Sex FROM dc_creditrisk.CreditRisk)"
       Set sc = stmt.%Prepare(createSQL)
       If $$$ISERR(sc) Quit
      
       Set result = stmt.%Execute()
       If result.%SQLCODE < 0 {
           Set sc = $$$ERROR($$$SQLError, result.%SQLCODE, result.%Message)
           Quit
       }
       Write !, "Model created successfully!"
      
       Write !, "Training the model(This may take a few moments)..."
       Set trainSQL = "TRAIN MODEL CreditRiskModel"
       Set sc = stmt.%Prepare(trainSQL)
       If $$$ISERR(sc) Quit
      
       Set result = stmt.%Execute()
       If result.%SQLCODE < 0 {
           Set sc = $$$ERROR($$$SQLError, result.%SQLCODE, result.%Message)
           Quit
       }
       Write !, "CreditRiskModel model trained successfully!"
      
   } Catch ex {
       Set sc = ex.AsStatus()
       Write !, "Exception occurred: ", $System.Status.GetErrorText(sc)
   }
  
   Return sc
}

}

The CreditRiskModel is created and trained with invokes using module.xml:
 

     <Invokes>
       <Invoke Class="dc.creditrisk.CreditRiskModel" Method="BuildAndTrain">
       </Invoke>
       <Invoke Class="dc.creditrisk.CreditRiskModel" Method="TestPredictions">
       </Invoke>
     </Invokes>

3. The credit risk data was defined in the class CreditRisk:

Class dc.creditrisk.CreditRisk Extends (%Persistent, %JSON.Adaptor)
{

Property CreditRiskId As %Integer [ Calculated, SqlComputeCode = { set {*}={%%ID}}, SqlComputed ];

Property Age As %Integer;

Property Sex As %String;

Property Job As %Integer;

Property Housing As %String;

Property SavingAccounts As %String;

Property CheckingAccount As %String;

Property CreditAmount As %Integer;

Property Duration As %Integer;

Property Purpose As %String;

Property CreditRisk As %Integer;

}

The CreditRisk property is the target of our prediction: as mapped in the Predict method, 1 indicates good credit and 2 indicates poor credit. All other properties are employed by the model to forecast the CreditRisk value. For more details about the properties, access https://www.kaggle.com/datasets/benjaminmcgregor/german-credit-data-set….
4. All the business logic of the microservice was encapsulated in the class CreditRiskService:

The CRUD business logic:

ClassMethod Exists(pId As %String) As %Boolean
{
   Return ##class(dc.creditrisk.CreditRisk).%ExistsId(pId)
}

ClassMethod GetAll(Output pArray As %DynamicArray) As %Status
{
   Set sc = $$$OK
   Try {
       Set pArray = []
       Set stmt = ##class(%SQL.Statement).%New()
       Set sc = stmt.%Prepare("SELECT ID FROM dc_creditrisk.CreditRisk")
       If $$$ISERR(sc) Quit
      
       Set rs = stmt.%Execute()
       If rs.%SQLCODE < 0 {
           Set sc = $$$ERROR($$$SQLError, rs.%SQLCODE, rs.%Message)
           Quit
       }
      
       While rs.%Next() {
           Set obj = ##class(dc.creditrisk.CreditRisk).%OpenId(rs.%Get("ID"))
           If $IsObject(obj) {
               Do obj.%JSONExportToString(.jsonString)
               Set jsonObj = {}.%FromJSON(jsonString)
               Set jsonObj.Id = obj.%Id()
               Do pArray.%Push(jsonObj)
           }
       }
   } Catch ex {
       Set sc = ex.AsStatus()
   }
   Return sc
}

ClassMethod GetById(pId As %Integer, Output pObj As dc.creditrisk.CreditRisk) As %Status
{
   Set sc = $$$OK
   Try {
       Set pObj = ##class(dc.creditrisk.CreditRisk).%OpenId(pId)
       If '$IsObject(pObj) {
           Set sc = $$$ERROR($$$GeneralError, "Object not found")
       }
   } Catch ex {
       Set sc = ex.AsStatus()
   }
   Return sc
}

ClassMethod Create(data As %DynamicObject, Output pId As %String) As %Status
{
   Set sc = $$$OK
   Try {
       Set obj = ##class(dc.creditrisk.CreditRisk).%New()
       Set sc = obj.%JSONImport(data)
       If $$$ISERR(sc) Quit
      
       Set sc = obj.%Save()
       If $$$ISERR(sc) Quit
      
       Set pId = obj.%Id()
   } Catch ex {
       Set sc = ex.AsStatus()
   }
   Return sc
}

ClassMethod Update(pId As %String, data As %DynamicObject) As %Status
{
   Set sc = $$$OK
   Try {
       Set obj = ##class(dc.creditrisk.CreditRisk).%OpenId(pId)
       If '$IsObject(obj) {
           Set sc = $$$ERROR($$$GeneralError, "Object not found")
           Quit
       }
       Set sc = obj.%JSONImport(data)
       If $$$ISERR(sc) Quit
      
       Set sc = obj.%Save()
   } Catch ex {
       Set sc = ex.AsStatus()
   }
   Return sc
}

ClassMethod Delete(pId As %String) As %Status
{
   Return ##class(dc.creditrisk.CreditRisk).%DeleteId(pId)
}

The IntegratedML prediction and classification logic (SELECT with the PREDICT and PROBABILITY functions):

ClassMethod Predict(data As %DynamicObject, Output pResponse As %DynamicObject) As %Status
{
   Set sc = $$$OK
   Try {
       Set sql = "SELECT PREDICT(CreditRiskModel) AS RiskPrediction, "_
                 "       PROBABILITY(CreditRiskModel FOR 1) AS RiskProbability "_
                 "  FROM (SELECT ? As Age, ? As Sex, ? As Job, ? As Housing, ? As SavingAccounts, "_
                 "               ? As CheckingAccount, ? As CreditAmount, ? As Duration, ? As Purpose) "
      
       Set tStatement = ##class(%SQL.Statement).%New()
       Set sc = tStatement.%Prepare(sql)
       If $$$ISERR(sc) Quit
      
       Set rset = tStatement.%Execute(data.Age, data.Sex, data.Job, data.Housing, data.SavingAccounts,
                                      data.CheckingAccount, data.CreditAmount, data.Duration, data.Purpose)
       Do rset.%Next()

       Set pResponse = {}
       Set pResponse.Age = data.Age
       Set pResponse.Sex = data.Sex
       Set pResponse.Job = data.Job
       Set pResponse.Housing = data.Housing
       Set pResponse.SavingAccounts = data.SavingAccounts
       Set pResponse.CheckingAccount = data.CheckingAccount
       Set pResponse.CreditAmount = data.CreditAmount
       Set pResponse.Duration = data.Duration
       Set pResponse.Purpose = data.Purpose
       Set pResponse.RiskPrediction = rset.RiskPrediction
       Set pResponse.RiskProbability = rset.RiskProbability
       If (pResponse.RiskPrediction = "1") {
           Set pResponse.RiskAnalisysPrediction = "Good credit"
       } Else {
           Set pResponse.RiskAnalisysPrediction = "Poor credit"
       }
      
   } Catch ex {
       Set sc = ex.AsStatus()
   }
   Return sc
}


5. The CreditRiskAPI class used the service to expose REST API operations for CRUD and prediction:

Class dc.creditrisk.CreditRiskAPI Extends %CSP.REST
{

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
   <Route Url="/creditrisk/:id" Method="GET" Call="GetById"/>
   <Route Url="/creditrisk" Method="GET" Call="GetAll"/>
   <Route Url="/creditrisk" Method="POST" Call="Create"/>
   <Route Url="/creditrisk/:id" Method="PUT" Call="Update"/>
   <Route Url="/creditrisk/:id" Method="DELETE" Call="Delete"/>
   <Route Url="/creditrisk/predict" Method="POST" Call="Predict"/>
   <Route Url="/_spec" Method="GET" Call="SwaggerSpec" />
</Routes>
}

Parameter CHARSET = "utf-8";

Parameter CONTENTTYPE = "application/json";

Parameter HandleCorsRequest = 1;

/// Implements GET /creditrisk
ClassMethod GetAll() As %Status
{
   Set sc = $$$OK
   Try {
       Set sc = ##class(dc.creditrisk.CreditRiskService).GetAll(.jsonArray)
       If $$$ISERR(sc) {
           Set %response.Status = 500
           Write { "error": { "summary": "Failed to get all", "details": ($System.Status.GetErrorText(sc)) } }.%ToJSON()
           Quit
       }
      
       Write jsonArray.%ToJSON()
      
   } Catch ex {
       Set sc = ex.AsStatus()
       Set %response.Status = 500 // Internal Server Error
       Write { "error": { "summary": (ex.Name), "details": ($System.Status.GetErrorText(sc)) } }.%ToJSON()
   }
   Return sc
}

ClassMethod GetById(id As %Integer) As %Status
{
   Set sc = $$$OK
   Try {
      
       If '##class(dc.creditrisk.CreditRiskService).Exists(id) {
           Set %response.Status = 404 // Not Found
           Return 1
       }
      
       Set sc = ##class(dc.creditrisk.CreditRiskService).GetById(id, .obj)
       If $$$ISERR(sc) {
           Set %response.Status = 500
           Return 1
       }
      
       Do obj.%JSONExport()
       Return 1
      
   } Catch ex {
       Set sc = ex.AsStatus()
       Set %response.Status = 500 // Internal Server Error
       Write { "error": { "summary": (ex.Name), "details": ($System.Status.GetErrorText(sc)) } }.%ToJSON()
   }
   Return sc
}

/// Implements POST /creditrisk/predict
ClassMethod Predict() As %Status
{
  
   Set sc = $$$OK

   Try {

     If '$IsObject(%request.Content) || (%request.Content.Size = 0) {
         Set %response.Status = 400 // Bad Request
         Write { "error": { "summary": "Bad Request", "details": "Request body is empty" } }.%ToJSON()
         Return sc
     }
    
     Set data = {}.%FromJSON(%request.Content)
    
     Set sc = ##class(dc.creditrisk.CreditRiskService).Predict(data, .responseObj)
     If $$$ISERR(sc) {
         Set %response.Status = 500
         Write { "error": { "summary": "Prediction failed", "details": ($System.Status.GetErrorText(sc)) } }.%ToJSON()
         Return sc
     }
    
     Write responseObj.%ToJSON()
   } Catch ex {
       Set sc = ex.AsStatus()
       Set %response.Status = 500 // Internal Server Error
       Write { "error": { "summary": (ex.Name), "details": ($System.Status.GetErrorText(sc)) } }.%ToJSON()
   }

   Return sc
}

/// Implements POST /creditrisk
ClassMethod Create() As %Status
{
   Set sc = $$$OK
   Try {
       If '$IsObject(%request.Content) || (%request.Content.Size = 0) {
           Set %response.Status = 400 // Bad Request
           Write { "error": { "summary": "Bad Request", "details": "Request body is empty" } }.%ToJSON()
           Return sc
       }
      
       // Gets the JSON from the request body
       Set json = {}.%FromJSON(%request.Content)
      
       Set sc = ##class(dc.creditrisk.CreditRiskService).Create(json, .newId)
       If $$$ISERR(sc) {
           Set %response.Status = 400 // Bad Request
           Write { "error": { "summary": "Failed to create", "details": ($System.Status.GetErrorText(sc)) } }.%ToJSON()
           Return sc
       }
      
       Set %response.Status = 201 // Created
       Set %response.Headers("Location") = %request.URL _ "/" _ newId
       Write { "id": (newId) }.%ToJSON()
      
   } Catch ex {
       Set sc = ex.AsStatus()
       Set %response.Status = 500 // Internal Server Error
       Write { "error": { "summary": (ex.Name), "details": ($System.Status.GetErrorText(sc)) } }.%ToJSON()
   }
   Return sc
}

/// Implements PUT /creditrisk/:id
ClassMethod Update(pId As %String) As %Status
{
   Set sc = $$$OK
   Try {
       If '##class(dc.creditrisk.CreditRiskService).Exists(pId) {
           Set %response.Status = 404 // Not Found
           Return sc
       }
      
       If '$IsObject(%request.Content) || (%request.Content.Size = 0) {
           Set %response.Status = 400 // Bad Request
           Write { "error": { "summary": "Bad Request", "details": "Request body is empty" } }.%ToJSON()
           Return sc
       }
      
       Set json = {}.%FromJSON(%request.Content)
      
       Set sc = ##class(dc.creditrisk.CreditRiskService).Update(pId, json)
       If $$$ISERR(sc) {
           Set %response.Status = 400 // Bad Request
           Write { "error": { "summary": "Failed to update", "details": ($System.Status.GetErrorText(sc)) } }.%ToJSON()
           Return sc
       }
      
       Set %response.Status = 200 // OK
      
   } Catch ex {
       Set sc = ex.AsStatus()
       Set %response.Status = 500 // Internal Server Error
       Write { "error": { "summary": (ex.Name), "details": ($System.Status.GetErrorText(sc)) } }.%ToJSON()
   }
   Return sc
}

/// Implements DELETE /creditrisk/:id
ClassMethod Delete(pId As %String) As %Status
{
   Set sc = $$$OK
   Try {
       If '##class(dc.creditrisk.CreditRiskService).Exists(pId) {
           Set %response.Status = 404 // Not Found
           Return sc
       }
      
       Set sc = ##class(dc.creditrisk.CreditRiskService).Delete(pId)
       If $$$ISERR(sc) {
           Set %response.Status = 500 // Internal Error
           Return sc
       }
      
       Set %response.Status = 204 // No Content
      
   } Catch ex {
       Set sc = ex.AsStatus()
       Set %response.Status = 500 // Internal Server Error
       Write { "error": { "summary": (ex.Name), "details": ($System.Status.GetErrorText(sc)) } }.%ToJSON()
   }
   Return sc
}

ClassMethod SwaggerSpec() As %Status
{
 Set tSC = ##class(%REST.API).GetWebRESTApplication($NAMESPACE, %request.Application, .swagger)
 Do swagger.info.%Remove("x-ISC_Namespace")
 Set swagger.basePath = "/api/creditrisk"
 Set swagger.info.title = "Credit Risk API"
 Set swagger.info.version = "1.0.0"
 Set swagger.host = "localhost:52773"
 Write swagger.%ToJSON()
 Return $$$OK
}

}

6. The CreditRiskPredictService Interoperability Business Service received a Kafka event to trigger a credit risk prediction:

Class dc.creditrisk.CreditRiskPredictService Extends Ens.BusinessService
{

Parameter ADAPTER = "EnsLib.Kafka.InboundAdapter";

Property TargetConfigName As Ens.DataType.ConfigName [ InitialExpression = "CreditRiskProcess" ];

Parameter SETTINGS = "TargetConfigName:Basic:selector?context={Ens.ContextSearch/ProductionItems?targets=1&productionName=@productionId}";

Method OnProcessInput(pInput As EnsLib.Kafka.Message, Output pOutput As Ens.StringResponse) As %Status
{
   Set sc = $$$OK
   Try {
       // Extracts the Kafka message payload to a persistent object
       Set request = ##class(Ens.StringRequest).%New()
       Set request.StringValue = pInput.value
      
       // Sends asynchronously to the Business Process
       Set sc = ..SendRequestAsync(..TargetConfigName, request)

       Set pOutput = ##class(Ens.StringResponse).%New()
       Set pOutput.StringValue = "OK"
      
   } Catch ex {
       Set sc = ex.AsStatus()
   }
   Return sc
}

}

7. The CreditRiskProcess Interoperability Business Process class received the credit risk message and called the CreditRiskService to perform the prediction. The results were sent to the CreditRiskOperation:

Class dc.creditrisk.CreditRiskProcess Extends Ens.BusinessProcess
{

Property TargetConfigName As Ens.DataType.ConfigName [ InitialExpression = "CreditRiskOperation" ];

Parameter SETTINGS = "TargetConfigName:Basic:selector?context={Ens.ContextSearch/ProductionItems?targets=1&productionName=@productionId}";

Method OnRequest(pRequest As Ens.StringRequest, Output pResponse As Ens.StringResponse) As %Status
{
   Set sc = $$$OK
   Try {
       Set data = {}.%FromJSON(pRequest.StringValue)
      
       Set sc = ##class(dc.creditrisk.CreditRiskService).Predict(data, .responseObj)
       If $$$ISERR(sc) Quit
      
       Set request = ##class(EnsLib.Kafka.Message).%New()
       Set request.topic = "CreditRiskOutTopic"
       Set request.value = responseObj.%ToJSON()
       Set request.key = "iris"
      
       Set sc = ..SendRequestAsync(..TargetConfigName, request, 0)
       If $$$ISERR(sc) Quit

       Set pResponse = ##class(Ens.StringResponse).%New()
       Set pResponse.StringValue = "OK"
      
   } Catch ex {
       Set sc = ex.AsStatus()
   }
   Return sc
}

Storage Default
{
<Data name="CreditRiskProcessDefaultData">
<Subscript>"CreditRiskProcess"</Subscript>
<Value name="1">
<Value>TargetConfigName</Value>
</Value>
</Data>
<DefaultData>CreditRiskProcessDefaultData</DefaultData>
<Type>%Storage.Persistent</Type>
}

}
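On the other end of the flow, downstream consumers read the prediction from CreditRiskOutTopic. A hedged Python sketch of decoding such a record follows; the "PredictedRisk" field name is an assumption about the JSON that CreditRiskService produces:

```python
import json

def parse_prediction(record_value: bytes) -> dict:
    # Decode a JSON prediction message published to CreditRiskOutTopic.
    return json.loads(record_value.decode("utf-8"))

# Fabricated sample record; the "PredictedRisk" field name is an assumption.
sample = b'{"PredictedRisk": "good risk"}'
result = parse_prediction(sample)

# Consuming for real (assuming the compose stack is up, pip install kafka-python):
# from kafka import KafkaConsumer
# consumer = KafkaConsumer("CreditRiskOutTopic",
#                          bootstrap_servers="kafka:9092",  # resolvable inside the network
#                          group_id="demo-consumer")
# for rec in consumer:
#     print(parse_prediction(rec.value))
```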

8. The CreditRiskProduction orchestrates all the Interoperability components (Business Service, Business Process, and Business Operation) with Kafka and the CreditRiskService, enabling consumption and production of credit risk forecasts via Kafka:

Class dc.creditrisk.CreditRiskProduction Extends Ens.Production
{

XData ProductionDefinition
{
<Production Name="dc.creditrisk.CreditRiskProduction" LogGeneralTraceEvents="false">
 <Description></Description>
 <ActorPoolSize>2</ActorPoolSize>
 <Item Name="CreditRiskPredictService" Category="" ClassName="dc.creditrisk.CreditRiskPredictService" PoolSize="1" Enabled="true" Foreground="false" Comment="" LogTraceEvents="false" Schedule="">
   <Setting Target="Adapter" Name="GroupID">iris</Setting>
   <Setting Target="Adapter" Name="Topic">CreditRiskInTopic</Setting>
   <Setting Target="Adapter" Name="Servers">kafka:9092</Setting>
 </Item>
 <Item Name="CreditRiskOperation" Category="" ClassName="EnsLib.Kafka.Operation" PoolSize="1" Enabled="true" Foreground="false" Comment="" LogTraceEvents="false" Schedule="">
   <Setting Target="Adapter" Name="ClientID">iris</Setting>
   <Setting Target="Adapter" Name="Servers">kafka:9092</Setting>
 </Item>
 <Item Name="CreditRiskProcess" Category="" ClassName="dc.creditrisk.CreditRiskProcess" PoolSize="1" Enabled="true" Foreground="false" Comment="" LogTraceEvents="false" Schedule="">
 </Item>
</Production>
}

}

9. Docker Compose and Docker are used to deploy the classes as a microservice.

Docker Compose:

services:
 iris:
   build:
     context: .
     dockerfile: Dockerfile
   restart: always
   environment:
     - ISC_CPF_MERGE_FILE=/home/irisowner/dev/merge.cpf
   networks:
       - ms-credit-risk-network    
   ports:
     - 1972
     - 52795:52773
     - 53773
   volumes:
     - ./:/home/irisowner/dev

 zookeeper:
     image: confluentinc/cp-zookeeper:7.5.0
     container_name: zookeeper
     hostname: zookeeper
     networks:
       - ms-credit-risk-network
     ports:
       - "2181:2181"
     environment:
       ZOOKEEPER_CLIENT_PORT: 2181
       ZOOKEEPER_TICK_TIME: 2000

 kafka:
   image: confluentinc/cp-kafka:7.5.0
   container_name: kafka
   hostname: kafka
   networks:
     - ms-credit-risk-network
   ports:
     - "9092:9092"
   depends_on:
     - zookeeper
   environment:
     KAFKA_BROKER_ID: 1
     KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
     KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
     KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://kafka:9092
     KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
     KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0
 kafka-ui:
     image: provectuslabs/kafka-ui:latest
     container_name: kafka-ui
     hostname: kafka-ui
     networks:
       - ms-credit-risk-network
     ports:
       - "8080:8080"
     depends_on:
       - kafka
     environment:
       KAFKA_CLUSTERS_0_NAME: local_kafka_cluster
       KAFKA_CLUSTERS_0_BOOTSTRAPSERVERS: kafka:29092

networks:
 ms-credit-risk-network:
   driver: bridge
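Note that the broker advertises two listeners: PLAINTEXT://kafka:29092 for other containers on the bridge network (kafka-ui uses it) and PLAINTEXT_HOST://kafka:9092, which the IRIS production points at. A small Python check, assuming the port mappings in the compose file above, verifies that a listener is accepting TCP connections:

```python
import socket

def listener_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    # True if a TCP connection to the broker listener succeeds.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# From the host, the mapped broker port is 9092 (per the compose file above):
# print(listener_reachable("localhost", 9092))
```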

Dockerfile:
# Alternative base image: intersystemsdc/irishealth-community
ARG IMAGE=intersystemsdc/iris-ml-community
FROM $IMAGE

WORKDIR /home/irisowner/dev

ARG TESTS=0
ARG MODULE="objectscript-template"
ARG NAMESPACE="USER"

## Embedded Python environment
ENV IRISUSERNAME="_SYSTEM"
ENV IRISPASSWORD="SYS"
ENV IRISNAMESPACE="USER"
ENV PYTHON_PATH=/usr/irissys/bin/
ENV PATH="/usr/irissys/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/irisowner/bin"

COPY .iris_init /home/irisowner/.iris_init

RUN --mount=type=bind,src=.,dst=. \
   iris start IRIS && \
   iris session IRIS < iris.script && \
   ([ $TESTS -eq 0 ] || iris session iris -U $NAMESPACE "##class(%ZPM.PackageManager).Shell(\"test $MODULE -v -only\",1,1)") && \
   iris stop IRIS quietly

10. Congratulations! You now have a complete microservice alive and kicking. Enjoy!