Question
· Jul 20, 2017

SQL Inbound Adapter - Not returning all the records it should from a View in Microsoft SQL Server

I am currently working on an issue with the WRC where one of my inbound SQL adapters is not returning all the records it should. If I look at the count of records in Ensemble and compare it to that of the Microsoft SQL view, Ensemble is off by a few records here and there. I am using a fully dynamic SELECT statement in the settings of the adapter:

SELECT Text, PhysicianLastName, PhysicianFirstName, PhysicianAddress1, PhysicianAddress2, PhysicianCity,
       PhysicianState, PhysicianZip, PhysNum, InsertDate, FaxK, EnsIndex, MRN, AccountNum, FirstName, LastName
FROM vUHEpicMailEns
ORDER BY FaxK

I have added some trace elements, but the missing records are not even getting to the point of the trace.

Has anyone run into this issue?

I am thinking of changing the dynamic SQL to call a stored procedure instead, thereby eliminating some of the overhead and possible contention. That way it is just reading from Microsoft SQL memory into Ensemble memory. How would one go about executing a stored procedure in OnProcessInput() of the SQL Inbound Adapter class? Something along the lines of the sketch below is what I have in mind.
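(Untested sketch of what I'm picturing; I'm assuming the adapter exposes an ExecuteProcedure() call similar to the one on the outbound SQL adapter, and spGetEpicMailEns is a made-up procedure name.)

Class UH.Svc.EpicMailEns Extends Ens.BusinessService
{
Parameter ADAPTER = "EnsLib.SQL.InboundAdapter";

Method OnProcessInput(pInput As EnsLib.SQL.Snapshot, Output pOutput As %RegisteredObject) As %Status
{
    // Assumption: call the stored procedure through the adapter rather than the Query setting;
    // the exact ExecuteProcedure signature would need to be confirmed against the class reference.
    Set tSC = ..Adapter.ExecuteProcedure(.tSnapshots, .tOutParms, "{ call spGetEpicMailEns() }")
    If $$$ISERR(tSC) Quit tSC
    // ... walk the returned snapshot(s) and send the rows on to the BPL ...
    Quit $$$OK
}
}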

Discussion (5)

Hi Scott,

It does sound like you have duplicate or previously processed records. The SQL inbound adapter will skip these.

One thing to note about the inbound adapter is that the underlying adapter will call the OnProcessInput() of your service for every row in the resultset. The potential problem with this is that the service could be killed before it has finished all of the rows in the resultset (forced shutdown, etc.). For this reason, the adapter has to keep track of where it last processed a record, so that it can continue from where it left off.
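To make the per-row behaviour concrete, a typical service over the inbound SQL adapter looks roughly like this (sketch only; the class and target names are assumptions, the column name comes from your query):

Class My.App.SQLRowService Extends Ens.BusinessService
{
Parameter ADAPTER = "EnsLib.SQL.InboundAdapter";

Property TargetConfigName As %String(MAXLEN = 128);
Parameter SETTINGS = "TargetConfigName";

/// Called once per row of the adapter's resultset. The adapter remembers the
/// KeyFieldName value (e.g. FaxK) of rows it has already handed to this method,
/// which is why rows it thinks it has seen before get skipped.
Method OnProcessInput(pInput As EnsLib.SQL.Snapshot, Output pOutput As %RegisteredObject) As %Status
{
    Set tRequest = ##class(Ens.StringContainer).%New(pInput.Get("FaxK"))
    Quit ..SendRequestAsync(..TargetConfigName, tRequest)
}
}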

In your instance, it sounds like this behavior does not fit your data.

If you want to implement your own ground-up service solution then you would have to build your own adapter and perform the query execution from that adapter's OnTask() method. Your adapter should probably extend EnsLib.SQL.Common and implement ExecuteQuery* as a member method of the adapter.

If you are going to go down this route then be mindful of building in resilience to handle forced shutdowns so that it can continue from where it left off.

Also, it's good behavior for adapters not to block the production for long periods. Any adapter such as this will be looping around a set of data, calling out to the ProcessInput() method of its business host. If there are many rows, this loop could run for minutes. It's only when an adapter drops out of its OnTask() method that the Ensemble Director can cleanly shut down a production. This is why you sometimes see a production struggling to shut down. To avoid blocking the production, the adapter will need to chunk its work down, for instance by limiting the query size and continuing from that point on the next OnTask(). A skeleton of that kind of adapter is sketched below.
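(A rough sketch only, under the assumptions above; RunQueryChunk() and the high-water-mark global are placeholders you would have to implement yourself, e.g. on top of EnsLib.SQL.Common's connection handling.)

Class My.Adapter.ChunkedSQL Extends (Ens.InboundAdapter, EnsLib.SQL.Common)
{
/// How many rows to hand to the service per OnTask() so the production
/// is never blocked for long and can shut down cleanly.
Property ChunkSize As %Integer [ InitialExpression = 500 ];

Parameter SETTINGS = "ChunkSize";

Method OnTask() As %Status
{
    // Placeholder: fetch at most ..ChunkSize rows past the last key we processed.
    // Persisting the high-water mark yourself is what gives resilience across forced shutdowns.
    Set tSC = ..RunQueryChunk(.tSnapshot)   // hypothetical helper
    Quit:$$$ISERR(tSC) tSC
    While tSnapshot.Next() {
        Set tSC = ..BusinessHost.ProcessInput(tSnapshot)
        Quit:$$$ISERR(tSC)
    }
    Quit tSC
}
}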

Alternatively, you could look at using the SQL outbound adapter, which avoids all of the high-water-mark functionality. Your query will always return everything you expect. I often do it this way and have never had any problems with skipped rows (I run separate audit jobs that also double-check this).
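For example, a business operation over the outbound adapter can run the same query on demand and walk the full snapshot itself (sketch; the operation name and the trigger message type are assumptions):

Class My.Op.EpicMailExtract Extends Ens.BusinessOperation
{
Parameter ADAPTER = "EnsLib.SQL.OutboundAdapter";

/// Triggered by a simple start message; every row comes back every time,
/// no high-water mark is applied.
Method RunExtract(pRequest As Ens.StringContainer, Output pResponse As Ens.Response) As %Status
{
    Set pResponse = ##class(Ens.Response).%New()
    Set tSQL = "SELECT Text, PhysicianLastName, FaxK, MRN FROM vUHEpicMailEns ORDER BY FaxK"
    Set tSC = ..Adapter.ExecuteQuery(.tSnapshot, tSQL)
    Quit:$$$ISERR(tSC) tSC
    While tSnapshot.Next() {
        $$$TRACE("FaxK="_tSnapshot.Get("FaxK"))  // process/write the row here
    }
    Quit $$$OK
}

XData MessageMap
{
<MapItems>
  <MapItem MessageType="Ens.StringContainer"><Method>RunExtract</Method></MapItem>
</MapItems>
}
}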

Sean.

The SQL inbound service has a set start and stop time. I give it about half an hour to run before closing the connection. This is because I am treating it like a job; it doesn't need to be up all the time, just once a night.

I have used SQL outbound adapters in many other operations I have built. What do you use as the trigger to say, yes, go ahead and process these records? In this case I send the records to a BPL, which then uses an outbound SQL adapter operation I have created to get whatever data wasn't in the record but is needed in the outbound file.

There are a few approaches.

The schedule setting on a service can be hijacked to trigger some kind of start job message to an operation. It's not a real scheduler and IMHO a bit of a fudge.

A slightly non-Ensemble solution is to use the Caché Task Manager to trigger an Ensemble service at specific times. The service would be adapterless and would only need to send a simple start message (Ens.StringContainer) to its job target. A custom task class (extends %SYS.Task.Definition) would use the CreateBusinessService() method on Ens.Director to create an instance of this service and call its ProcessInput() method, along the lines of the sketch below.
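(A minimal sketch; the task and service names are made up, and the adapterless service would simply forward the start message to its target from OnProcessInput().)

Class My.Task.TriggerEpicMail Extends %SYS.Task.Definition
{
Parameter TaskName = "Trigger Epic Mail Extract";

/// Config name of the adapterless business service in the production.
Property ServiceName As %String [ InitialExpression = "Epic Mail Trigger Service" ];

Method OnTask() As %Status
{
    // Create an instance of the service and poke it with a simple start message.
    Set tSC = ##class(Ens.Director).CreateBusinessService(..ServiceName, .tService)
    Quit:$$$ISERR(tSC) tSC
    Quit tService.ProcessInput(##class(Ens.StringContainer).%New("start"))
}
}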

The only downside to this is those scheduled configuration settings are now living outside of the production settings. If you can live with that then this would be an ok approach.

Alternatively, you could write your own custom schedule adapter that uses custom settings for target names and start times. The adapter's OnTask() would get called every n seconds via its call interval setting and would check to see if it's time to trigger a ProcessInput() message for one of the targets. The service would then send a simple start message to that target.
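A rough outline of such a schedule adapter (illustrative only; the time check and the ^MySchedulerFired global used to guard against double-firing are assumptions):

Class My.Adapter.SimpleScheduler Extends Ens.InboundAdapter
{
/// Comma-separated "HH:MM" times at which to fire a start message.
Property StartTimes As %String [ InitialExpression = "01:30" ];

Parameter SETTINGS = "StartTimes";

Method OnTask() As %Status
{
    // Called every CallInterval seconds; fire once when a start time is reached.
    Set tNow = $Extract($ZTime($Piece($Horolog,",",2)),1,5)
    If $ListFind($ListFromString(..StartTimes,","),tNow) {
        Set tStamp = +$Horolog_"~"_tNow
        If $Get(^MySchedulerFired(..BusinessHost.%ConfigName)) '= tStamp {
            // Remember that we already fired for this date/time slot.
            Set ^MySchedulerFired(..BusinessHost.%ConfigName) = tStamp
            Quit ..BusinessHost.ProcessInput(##class(Ens.StringContainer).%New(tNow))
        }
    }
    Quit $$$OK
}
}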

I prefer this last approach because it's more transparent to an Ensemble developer who is new to the production; also, the settings stay with the production and are automatically mirrored to failover members.

In working with the WRC, I think I am going to have to go down the route of creating a custom service to send a tickler to a BPL to begin selecting the data. In theory it makes sense, but I am having trouble wrapping my head around how to code it. Without having to build a message structure to send the request to the BPL, is there an easier way?
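Based on the Ens.StringContainer suggestion above, I'm assuming I could avoid a custom message class with something as simple as this inside the service's OnProcessInput() (untested; the BPL target name is made up, and the BPL's request class would need to be set to Ens.StringContainer):

Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
{
    // No custom message class - just send a generic "tickler" string to the BPL.
    Set tTickler = ##class(Ens.StringContainer).%New("RunNightlyExtract")
    Quit ..SendRequestAsync("UH Epic Mail BPL", tTickler)
}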