
This article covers handing control of provisioning over to the InterSystems Kubernetes Operator and starting your journey toward managing your own "Cloud" of InterSystems solutions through GitOps practices. This deployment pattern is also the fulfillment path for the PID^TOO||| FHIR Breathing Identity Resolution Engine.
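In a GitOps flow the cluster definition lives in Git and is applied by your CD tooling, but a minimal sketch with the Kubernetes Python client shows the shape of the resource the operator watches. This assumes the operator is already installed; the group/version/plural values follow the IrisCluster CRD, and the spec fields here (license secret name, image tag, namespace) are illustrative placeholders, not a production manifest:

```python
# Hypothetical sketch: creating an IrisCluster custom resource with the
# official Kubernetes Python client. Assumes the InterSystems Kubernetes
# Operator (IKO) is installed; spec values are placeholders.
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() inside a pod

iris_cluster = {
    "apiVersion": "intersystems.com/v1alpha1",
    "kind": "IrisCluster",
    "metadata": {"name": "pidtoo-fhir", "namespace": "iris"},
    "spec": {
        # licenseKeySecret and image are illustrative assumptions
        "licenseKeySecret": {"name": "iris-key-secret"},
        "topology": {
            "data": {
                "image": "containers.intersystems.com/intersystems/irishealth:2023.1",
            }
        },
    },
}

api = client.CustomObjectsApi()
api.create_namespaced_custom_object(
    group="intersystems.com",
    version="v1alpha1",
    namespace="iris",
    plural="irisclusters",
    body=iris_cluster,
)
```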


Hi,

This is Jayanth from OAK Technologies.

Hope you are all doing well!!

We have a position for an InterSystems IRIS technology role for our client. If anyone is interested, please drop your resume to jayanth@oaktechinc.com

Job Role: IRIS technology role

Location: Chicago, Illinois (Remote Work)

Contract: 1+ year, W2 or 1099 contract


A simple production that enables FHIR transaction bundles to be loaded into the InterSystems® FHIR® Server via Box and Dropbox. Using the included MFT Connection Components and a 14-line Custom Business Process, this production will turn your transaction bundles into FHIR resources for immediate consumption, with Harry Potter-like wizardry. Great for hackathons, research, and FHIR® cocktail parties.
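Conceptually, the production ends in a FHIR transaction POST. Here is a minimal sketch of that last hop, assuming a local bundle file in place of the Box/Dropbox MFT pickup; the endpoint path and credentials are hypothetical and should be adjusted for your instance:

```python
# Hypothetical sketch: POST a FHIR transaction bundle to the server base URL.
# The endpoint path and credentials below are assumptions for illustration.
import json
import requests

FHIR_BASE = "https://iris.example.com/csp/healthshare/fhirserver/fhir/r4"  # assumed path

with open("transaction-bundle.json") as f:
    bundle = json.load(f)

# A FHIR transaction bundle is POSTed to the server base; the server
# executes each entry and returns a transaction-response bundle.
resp = requests.post(
    FHIR_BASE,
    json=bundle,
    headers={"Content-Type": "application/fhir+json"},
    auth=("_SYSTEM", "SYS"),  # demo credentials, replace in real use
)
resp.raise_for_status()
print(resp.json()["type"])  # expect "transaction-response"
```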


We are ridiculously good at mastering data. The data is clean, multi-sourced, and related, and we publish it with levels of decay low enough to guarantee it is current. We chose the HL7 Reference Information Model (RIM) to land the data, and we enable exchange of the data through Fast Healthcare Interoperability Resources (FHIR®).


Deploying InterSystems HealthShare code, supporting lookups, and artifacts like SSL certs and keys is relatively straightforward using GitLab Runners. Not only does this approach enable managing the code base and deploying with Git-style workflows, it also lends itself to speedy recovery and repeatable environments for some implementations.

Article · Dec 7, 2020 · 6m read
IRIS Python Native API in AWS Lambda

If you are looking for a slick way to integrate your IRIS solution into the Amazon Web Services ecosystem, a serverless application, or a boto3-powered Python script, the IRIS Python Native API could be the way to go. You don't have to build very far into a production implementation before you need to get or set something in IRIS to make your application do its awesome sauce, so hopefully you will find value in this article and build something that matters, or something that matters to nobody but you, as that is equally important.
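A minimal sketch of what such a Lambda handler could look like with the intersystems-irisnative package (bundled into the deployment zip or a layer); the host, credentials, and the ^Lambda global used here are assumptions for illustration:

```python
# Hypothetical sketch of a Lambda handler using the IRIS Python Native API.
# Connection details and the global name are placeholder assumptions.
import os
import irisnative

def handler(event, context):
    conn = irisnative.createConnection(
        os.environ["IRIS_HOST"],      # e.g. a private IP reachable from the Lambda's VPC
        int(os.environ.get("IRIS_PORT", "1972")),
        os.environ.get("IRIS_NAMESPACE", "USER"),
        os.environ["IRIS_USER"],
        os.environ["IRIS_PASSWORD"],
    )
    try:
        iris = irisnative.createIris(conn)
        # Set and read back a node of a hypothetical ^Lambda global
        iris.set(event.get("value", "hello"), "^Lambda", event.get("id", "demo"))
        return {"stored": iris.get("^Lambda", event.get("id", "demo"))}
    finally:
        conn.close()
```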



Loading your IRIS data into your Google Cloud BigQuery data warehouse and keeping it current can be a hassle with bulky commercial off-the-shelf ETL platforms, but it is made dead simple by the iris2bq utility.

Let's say IRIS is contributing to the workload for a hospital system: routing DICOM images, ingesting HL7 messages, posting FHIR resources, or pushing CCDAs to the next provider in a transition of care. Natively, IRIS persists these objects at various stages of the pipeline through the nature of its business processes and anything you included along the way. Let's send that up to Google BigQuery to augment and complement the rest of our data warehouse data, and ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) to our heart's desire.

A reference architecture diagram may be worth a thousand words, but 3 bullet points may work out a little bit better:

  • It exports the data from IRIS into DataFrames.
  • It saves them into GCS as .avro to keep the schema along with the data; this avoids having to specify or create the BigQuery table schema beforehand.
  • It starts BigQuery jobs to import those .avro files into the BigQuery tables you specify (sketched after this list).
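As a sketch of that last bullet in isolation, assuming placeholder bucket, dataset, and table names, the load step with the google-cloud-bigquery client looks roughly like this:

```python
# Hypothetical sketch: load .avro files staged in GCS into a BigQuery table.
# Bucket, project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.AVRO,  # schema travels with the Avro files
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://my-iris-staging/patients/*.avro",
    "my-project.my_dataset.patients",
    job_config=job_config,
)
load_job.result()  # block until the load job finishes
print(client.get_table("my-project.my_dataset.patients").num_rows, "rows loaded")
```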


For Global Summit 2016, I set out to showcase a reference architecture I had been working on for a National Provider Directory solution, with state-level instances and a national instance all running HealthShare Provider Directory on AWS infrastructure.

In short, I wanted to highlight:

  • The implementation of Amazon Web Services to provision the infrastructure, including the auto-creation of the state-level instances through CloudFormation (see the sketch after this list).
  • The use of the HSPD Broadcast functionality to notify upstream systems of changes in master provider data.
  • The implementation of a transformation from the standard Broadcast object to HL7 MFN for interoperability.
  • The principles of Master Data Management applied to the Provider Directory.
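A rough sketch of that provisioning step with boto3, assuming a shared state-level template and illustrative stack, parameter, and instance-type names:

```python
# Hypothetical sketch: create one CloudFormation stack per state from a
# shared template. Template URL, parameter names, and the state list are
# illustrative assumptions.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

for state in ["TX", "NY", "CA"]:
    cfn.create_stack(
        StackName=f"hspd-{state.lower()}",
        TemplateURL="https://s3.amazonaws.com/my-bucket/hspd-state.template.yaml",
        Parameters=[
            {"ParameterKey": "StateCode", "ParameterValue": state},
            {"ParameterKey": "InstanceType", "ParameterValue": "m5.large"},
        ],
        Capabilities=["CAPABILITY_IAM"],  # template creates IAM resources
    )
```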
