Article
· Dec 7, 2020 6m read
IRIS Python Native API in AWS Lambda

If you are looking for a slick way to integrate your IRIS solution into the Amazon Web Services ecosystem, a serverless application, or a boto3-powered Python script, using the IRIS Python Native API could be the way to go. You don't have to build a production implementation out very far before you need to get or set something in IRIS to make your application do its awesome sauce, so hopefully you will find value in this article and build something that matters, or that matters to nobody but you, which is equally important.
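
To make that concrete, here is a minimal sketch of what a Lambda handler using the Native API can look like. Everything below is illustrative: the host, port, namespace, credentials, and the ^Lambda global are placeholder assumptions, and in practice the connection details would come from environment variables or Secrets Manager.

# Minimal sketch: a Lambda handler that writes to and reads from an IRIS
# global via the Python Native API. All connection details are placeholders.
import irisnative

IRIS_HOST = "iris.example.internal"  # assumed host, reachable from the Lambda VPC
IRIS_PORT = 1972                     # default IRIS superserver port
IRIS_NAMESPACE = "USER"

def lambda_handler(event, context):
    connection = irisnative.createConnection(
        IRIS_HOST, IRIS_PORT, IRIS_NAMESPACE, "myuser", "mypassword")
    try:
        iris = irisnative.createIris(connection)
        # set a node in the (hypothetical) ^Lambda global, keyed by request id,
        # then read it back
        iris.set(event.get("payload"), "Lambda", context.aws_request_id)
        return {"stored": iris.get("Lambda", context.aws_request_id)}
    finally:
        connection.close()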


Question
· Dec 12, 2019
ComplexMap Riddle

I'm usually pretty good at ComplexMaps and have implemented a couple, but I have one that is stumping me on how to implement it. My problem is that I have no real "leading data" to key off of and need something else...

It goes a little bit like this:

D123456 THING1 THING2 THING3 THING4

D789101 THING1 THING2 THING3 THING4


Loading your IRIS data into your Google Cloud BigQuery data warehouse and keeping it current can be a hassle with bulky commercial off-the-shelf ETL platforms, but it is made dead simple with the iris2bq utility.

Let's say IRIS is contributing to the workload for a hospital system: routing DICOM images, ingesting HL7 messages, posting FHIR resources, or pushing CCDAs to the next provider in a transition of care. Natively, IRIS persists these objects at various stages of the pipeline through the nature of the business processes and anything you included along the way. Let's send that up to Google BigQuery to augment and complement the rest of our data warehouse, where we can ETL (Extract Transform Load) or ELT (Extract Load Transform) to our hearts' content.

A reference architecture diagram may be worth a thousand words, but 3 bullet points may work out a little bit better:

  • It exports the data from IRIS into DataFrames.
  • It saves them into GCS as .avro files to keep the schema along with the data: this avoids having to specify or create the BigQuery table schema beforehand.
  • It starts BigQuery load jobs to import those .avro files into the respective BigQuery tables you specify (see the sketch below).
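
As a sketch of that last step (not part of iris2bq itself, which does this for you), a BigQuery load job from GCS Avro looks roughly like the following; the bucket, project, dataset, and table names are made up for illustration.

# Sketch of step three: load .avro files from GCS into BigQuery. Because the
# Avro files carry their own schema, no table schema needs to be declared.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.AVRO,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-iris-exports/patients-*.avro",  # assumed GCS path
    "my-project.my_warehouse.patients",      # assumed destination table
    job_config=job_config,
)
load_job.result()  # block until the load job completes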


Hello,

I am pretty stuck here and would appreciate any help or advice on an approach to this...

I have ingested a single claim file that has 7 claims inside of it. I am pulling each claim out based on a qualifier, and then I want to remove all of the others and do something with the one that is left over.

My problem is I can't seem to figure out how to remove the claims programmatically...


Hello,

I am running a DTL transform through COS that transforms an EDI document to JSON, and I would like to validate the document and throw an error if it does not validate or build a map.

I am looking at the available methods and haven't been able to find one that does something like: Set tSC = ##class(*).Validate(tDoc,"HIPAA_5100").

Any help here would be appreciated, it seems like I am missing something simple.

Question
· Mar 12, 2018
Compilation Status after Import

Is there a way to enumerate the compilation status of a package?

Currently, after deployment, we are doing something like this to validate a successful load and compile of the classes:

successful_compilation_count=`grep -a "Compilation finished successfully" output.log | wc -l`
successful_load_count=`grep -a "Load finished successfully" output.log | wc -l`

Is there a method to do this that is a little more elegant/dynamic, without having to maintain counts for comparison?
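
One possible angle, offered as a sketch rather than a drop-in answer: the class dictionary is projected to SQL, so you can compare the loaded class definitions against the compiled ones for a package instead of counting log lines. The DSN and package name below are assumptions.

# Sketch: compare loaded vs. compiled classes for a package by querying the
# class dictionary over SQL. Assumes an ODBC DSN named "IRIS" is configured;
# the package name is a placeholder.
import pyodbc

PACKAGE = "My.Package."

conn = pyodbc.connect("DSN=IRIS")
cursor = conn.cursor()

cursor.execute(
    "SELECT COUNT(*) FROM %Dictionary.ClassDefinition WHERE Name %STARTSWITH ?",
    PACKAGE)
loaded = cursor.fetchone()[0]

cursor.execute(
    "SELECT COUNT(*) FROM %Dictionary.CompiledClass WHERE Name %STARTSWITH ?",
    PACKAGE)
compiled = cursor.fetchone()[0]

print(f"loaded={loaded} compiled={compiled} ok={loaded == compiled}")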


For Global Summit 2016, I set out to showcase a reference architecture I had been working on for a National Provider Directory solution, with state-level instances and a national instance all running HealthShare Provider Directory, all on AWS infrastructure.

In short, I wanted to highlight:

  • The implementation of Amazon Web Services to provision the infrastructure, including the auto-creation of the state-level instances through CloudFormation.
  • The use of the HSPD Broadcast functionality to notify upstream systems of changes in master provider data.
  • The implementation of a transformation of the standard Broadcast object to HL7 MFN for interoperability.
  • The principles of Master Data Management applied to the Provider Directory.

Article
· Mar 24, 2016 3m read
HealthShare 2 Slack Business Operation

It is a stretch for this to be useful, but it was sorta fun. If you just so happen to have a use case to make your HealthShare productions talk to a Slack channel, this is the Business Operation for you.

In Slack, it is a dead simple process to enable an incoming webhook:

  • Name It
  • Give it an Icon or Emoji
  • Declare a Channel to Interact With

Once you supply those, it spits out a URL, and you can go to town posting to your channel using that endpoint.
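
The Business Operation wraps this up for HealthShare, but the webhook contract itself is simple enough to show in a few lines; the URL below is a placeholder for the one Slack generates for you.

# Sketch of the webhook contract the Business Operation drives: an HTTP POST
# with a JSON body to the URL Slack generated. The URL is a placeholder.
import requests

WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder

resp = requests.post(WEBHOOK_URL, json={"text": "Hello from HealthShare!"})
resp.raise_for_status()  # Slack responds 200 with body "ok" on success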
