Thanks. We will be exporting as code to compile on the customers' version (in this specific instance it is 2017), but ideally we'd like to ensure it will run on any prior version (within reason) without having to maintain a bunch of prior versions and test-compile on each. The exportversion qualifier may be useful, but what we really want to catch is any dependency on updated APIs like the ones you mention.
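If we do use it, I'd expect it to look something like the sketch below (the qualifier's value format, and whether it is available at all, depends on the release we're exporting from, so we'd check the qualifier documentation on our build first):

```objectscript
// Sketch only: export a package in a format an older (e.g. 2017.x) instance can import.
// The /exportversion value format differs between releases; verify it against the
// %SYSTEM.OBJ documentation for the version you are exporting from.
Do $system.OBJ.ExportPackage("MyApp", "/tmp/myapp-export.xml", "/exportversion=2017.2")
```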

MESH is just a message transport mechanism and is agnostic of payload type. The key thing is that the recipient system understands the format you're sending. You can embed base64-encoded PDF files in a Kettering document, but whether that will work depends on whether EMIS is configured to ingest those PDFs.
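If you do go the embedded-PDF route, the encoding itself is the easy part. A minimal sketch, assuming a hypothetical file path and leaving the surrounding Kettering/FHIR document structure to whatever the recipient is configured to accept:

```objectscript
// Base64-encode a PDF from disk into a character stream, ready to embed in the outbound document.
Set pdf = ##class(%Stream.FileBinary).%New()
Set sc = pdf.LinkToFile("/tmp/discharge-summary.pdf")   // hypothetical file
Set encoded = ##class(%Stream.GlobalCharacter).%New()
While 'pdf.AtEnd {
    // Read in multiples of 3 bytes so no base64 padding appears mid-stream;
    // the second argument suppresses inserted line breaks.
    Do encoded.Write($system.Encryption.Base64Encode(pdf.Read(5700), 1))
}
```

The hard part, as above, is agreeing with the receiving system what wraps that encoded content and how it gets filed at their end.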

You need to configure the correct MESH Workflow ID so that EMIS understands what kind of message it is. You can't fire arbitrary FHIR messages at another system and expect it to just understand them. EMIS does understand standard NHSD FHIR Transfer of Care messages and Digital Medicines messages, for example.
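If you end up calling the MESH REST API directly (rather than dropping files for a local MESH client to forward), the workflow ID travels as a Mex-WorkflowID header on the outbox POST. A rough sketch, with placeholder endpoint, mailbox IDs and workflow ID, and with the NHSMESH authentication headers omitted:

```objectscript
// Rough sketch of posting to a MESH outbox with a workflow ID set.
// All names here are placeholders; authentication and error handling are omitted.
Set req = ##class(%Net.HttpRequest).%New()
Set req.Server = "mesh.example.nhs.uk"            // placeholder MESH endpoint
Set req.Https = 1
Set req.SSLConfiguration = "MESH_SSL"             // placeholder TLS configuration name
Do req.SetHeader("Mex-From", "SENDER_MAILBOX")    // your MESH mailbox ID
Do req.SetHeader("Mex-To", "RECIPIENT_MAILBOX")   // the receiving system's mailbox ID
Do req.SetHeader("Mex-WorkflowID", "EXAMPLE_WORKFLOW_ID")  // must be a workflow ID the recipient is set up to process
Do req.SetHeader("Content-Type", "application/octet-stream")
Do req.EntityBody.Write(payload)                  // the document/message you are sending
Set sc = req.Post("/messageexchange/SENDER_MAILBOX/outbox")
```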

What kind of documents are these, and have you checked the NHSD standard API catalogue for a suitable FHIR standard and (assuming MESH transport) checked the list of approved MESH workflow IDs?

What issues, if any, are you experiencing?

Due to a change of job it's been a couple of years since I used Health Connect or Ensemble, so I've not used the specific FHIR functionality and can't give specific answers on that, though I've done something similar for bespoke RESTful APIs. Hence my answer is generic.

It would be a standard pattern for integration middleware to provide a FHIR facade in front of another database, or to orchestrate calls to several data sources and return a single response (e.g. the Content Enricher pattern).

Your FHIR business service needs to deal with the request. I assume the built-in FHIR support parses the URL and any FHIR JSON/XML resources on the request into objects for you. You then pass the data items or objects you need into a business process via an internal request message that you define. That process then calls into a SQL business operation that does the querying you need on the remote data source and passes the results back in an internal response message.

If there's any other business logic you need, you'd normally put that in the business process.
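A minimal sketch of that wiring, under those assumptions (all class, property and configuration-item names are hypothetical, and the mapping from the parsed FHIR request onto the internal message is up to you):

```objectscript
/// Hypothetical internal request carrying the identifier extracted from the FHIR request
Class Demo.Msg.PatientQueryRequest Extends Ens.Request
{
Property NHSNumber As %String;
}

/// Hypothetical internal response carrying what came back from the remote data source
Class Demo.Msg.PatientQueryResponse Extends Ens.Response
{
Property GivenName As %String;
Property FamilyName As %String;
Property DateOfBirth As %String;
}

/// Business process: takes the internal request, calls the SQL operation, applies any business logic
Class Demo.BP.PatientLookup Extends Ens.BusinessProcess
{
Method OnRequest(pRequest As Demo.Msg.PatientQueryRequest, Output pResponse As Demo.Msg.PatientQueryResponse) As %Status
{
    // Synchronous call to the SQL business operation ("Demo.BO.RemoteDB" is the config item name)
    Set tSC = ..SendRequestSync("Demo.BO.RemoteDB", pRequest, .pResponse)
    If $$$ISERR(tSC) Quit tSC
    // Any other business logic (filtering, mapping, enrichment from further sources) goes here
    Quit $$$OK
}
}

/// Business operation using the SQL outbound adapter to query the remote data source
Class Demo.BO.RemoteDB Extends Ens.BusinessOperation
{
Parameter ADAPTER = "EnsLib.SQL.OutboundAdapter";
Parameter INVOCATION = "Queue";

Method Lookup(pRequest As Demo.Msg.PatientQueryRequest, Output pResponse As Demo.Msg.PatientQueryResponse) As %Status
{
    Set pResponse = ##class(Demo.Msg.PatientQueryResponse).%New()
    // Parameterised query against whatever remote database the adapter is configured for
    Set tSQL = "SELECT GivenName, FamilyName, DateOfBirth FROM Patient WHERE NHSNumber = ?"
    Set tSC = ..Adapter.ExecuteQuery(.tRS, tSQL, pRequest.NHSNumber)
    If $$$ISERR(tSC) Quit tSC
    If tRS.Next() {
        Set pResponse.GivenName = tRS.Get("GivenName")
        Set pResponse.FamilyName = tRS.Get("FamilyName")
        Set pResponse.DateOfBirth = tRS.Get("DateOfBirth")
    }
    Quit $$$OK
}

XData MessageMap
{
<MapItems>
  <MapItem MessageType="Demo.Msg.PatientQueryRequest">
    <Method>Lookup</Method>
  </MapItem>
</MapItems>
}
}
```

The FHIR service maps the parsed request onto Demo.Msg.PatientQueryRequest and sends it to the process by its configuration name; if you want the Content Enricher style, the process is where you'd add further SendRequestSync calls to other operations and merge the responses before replying.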