Question

Using the Workflow Engine to capture Data Quality Issues

I am looking for a way to capture Data Quality issues with the source data that is populating HealthShare Provider Directory. One way is to use Managed Alerts, but since the issues could span multiple Providers and different messages, it seems excessive to alert on every message that has the error. Instead, I was thinking of using the Workflow Engine so it could populate a worklist for someone to review and work.

Looking over the Demo.Workflow engine example, I do not understand how to send a task to the Workflow manager to populate the worklist from a DTL.

Does anyone have any examples on how this could be done? Do I need to do it in a Process or can it be sent via a DTL?

Product version: IRIS 2024.2
$ZV: HealthShare Provider Directory 2024.2.0 Build: 1009 [HealthShare Modules: Core:28.0 + Provider Directory:28.0] - IRIS for UNIX (Red Hat Enterprise Linux 8 for x86-64) 2024.1 (Build 267_2_24113U) Wed Jul 2 2025 16:36:55 EDT [Core:28.0|Provider Directory:28

Capturing Data Quality issues using the Workflow Engine to populate a worklist can be managed effectively in InterSystems IRIS. Here’s a detailed explanation and the steps involved:

  1. Workflow Process Creation:
    To automate tasks with the Ensemble workflow engine, create a workflow process using the Business Process Designer. Define the tasks to be handled manually (which correspond to data quality issues) as asynchronous calls within your BPL logic. Each call should create a task object using the EnsLib.Workflow.TaskRequest class. This class can hold details such as the message subject, additional contextual messages, and even form fields for user input or review. Task responses are carried by the EnsLib.Workflow.TaskResponse class [1][2].
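As a sketch, the task object can be built like this in ObjectScript, either inside a custom business process method or as the request behind a BPL <call> activity. The target name "DataQualityRole" and the ProviderId property are hypothetical and would match your own production and request class:

```
 // Sketch: build and send a workflow task from a business process.
 // "DataQualityRole" is a hypothetical Workflow Operation name;
 // pRequest.ProviderId is a hypothetical property of your request class.
 Set task = ##class(EnsLib.Workflow.TaskRequest).%New()
 Set task.%Subject = "Data Quality issue for Provider "_pRequest.ProviderId
 Set task.%Message = "Review the source record and correct it before resubmission."
 Set task.%Actions = "Corrected,Ignore"  // buttons offered in the Workflow portal
 Set tSC = ..SendRequestAsync("DataQualityRole", task)
```

In BPL, the equivalent is a `<call>` element whose request class is EnsLib.Workflow.TaskRequest, with the same properties set in the request builder.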

  2. Integration with Workflow Operations:
    Define roles and users associated with these tasks within the Management Portal under Workflow Roles and Workflow Users. Ensure that Workflow Operations are correctly configured to correspond to these roles in your production. Each operation handles tasks assigned under its scope, making them visible in the connected Workflow portal [3][2].
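For the production side of this step, each Workflow role is served by one Workflow Operation. A sketch of the corresponding production-definition entry, where the item name is hypothetical and should match the Workflow role name:

```xml
<Item Name="DataQualityRole" ClassName="EnsLib.Workflow.OperationStandard" Enabled="true">
  <!-- AutoCreateRole creates the matching Workflow role if it does not already exist -->
  <Setting Target="Host" Name="AutoCreateRole">1</Setting>
</Item>
```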

  3. DTL Usage:
    The Data Transformation Language (DTL) in Ensemble primarily handles format conversions and transformations. To capture data issues and link them to workflows, however, you need to go beyond the DTL alone. The workflow subsystem requires orchestration of tasks across specified roles and users, which is typically managed through business processes, not directly through a DTL [4][2].
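One way to bridge the gap is a thin business process that runs the DTL and raises a task when the transformation flags a problem. A minimal sketch, in which DQ.IssueProcess, the DTL class DQ.ProviderToPD, and the "DataQualityRole" target are all hypothetical names:

```
Class DQ.IssueProcess Extends Ens.BusinessProcess
{

Method OnRequest(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
{
    // Run the DTL; an error status here stands in for whatever
    // data-quality check you apply to the transformed record
    Set tSC = $CLASSMETHOD("DQ.ProviderToPD", "Transform", pRequest, .tTarget)
    If $$$ISERR(tSC) {
        Set task = ##class(EnsLib.Workflow.TaskRequest).%New()
        Set task.%Subject = "Data Quality issue detected"
        Set task.%Message = $System.Status.GetErrorText(tSC)
        // Fire-and-forget: no response required from the task
        Set tSC = ..SendRequestAsync("DataQualityRole", task, 0)
    }
    Quit tSC
}

}
```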

  4. Interaction with Workflow Portal:
    All created tasks then populate the Workflow portal for the assigned users to review, accept, or address. This ensures that Data Quality issues flow into a reviewable worklist rather than overwhelming the Managed Alerts system [5][1].

  5. Workflow Sample and Testing:
    InterSystems provides a sample workflow setup in its HelpDesk application and other examples where automated workflows interact with manual tasks via class definitions like EnsLib.Workflow.TaskRequest. Reviewing these examples can help clarify how to connect DTL transformations with the Workflow responses effectively [6][2].

Tasks from the workflow system are sent to users with the defined roles, and their responses flow back to the originating process once the data quality issue is resolved.
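If the process does wait for the task outcome (i.e. the call is made with a response required), the answer arrives as an EnsLib.Workflow.TaskResponse. A sketch of handling it in a custom business process:

```
Method OnResponse(request As Ens.Request, ByRef response As Ens.Response, callrequest As Ens.Request, callresponse As Ens.Response, pCompletionKey As %String) As %Status
{
    If callresponse.%IsA("EnsLib.Workflow.TaskResponse") {
        // %Action is the button the user chose; %UserName is who completed the task
        $$$TRACE("Task closed with action "_callresponse.%Action_" by "_callresponse.%UserName)
    }
    Quit $$$OK
}
```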

For any further configurations or testing specifics, please refer to Ensemble Workflow testing techniques and sample configurations [7][2].

Sources: