
Article
· Jan 10, 2022 4m read

12 Medical Datasets along with 43 tables dynamically created by using all-new LOAD DATA SQL functionality

 

Application to import 12 datasets (43 tables) dynamically by using the LOAD DATA command, which loads data from a source file into an IRIS SQL table.
List of Datasets

Features

  • Dynamic creation of tables based on the CSV file data.
  • Import a dataset with the LOAD DATA functionality, from the Terminal or from the web application.
  • Remove a dataset programmatically from the Terminal or from the web application.
  • Dynamically view imported data from the web application.
  • Functionality to export data in CSV, Excel, or PDF format.
  • Responsive web application to view the status of each dataset.
  • Any or all datasets can be installed or removed with a single command.
  • Use of the %SQL_Diag.Result and %SQL_Diag.Message tables with LOAD DATA.

It is recommended to read the related documentation: LOAD DATA (SQL).

 

How to install or remove any Dataset from Terminal

Use the command below to import a particular dataset by passing its ID, or pass 999 to import all datasets:

do ##class(dc.data.medical.utility).ImportDS(1)

Below is the main script that creates the table dynamically and loads the data with the LOAD DATA functionality. Note that the table is created dynamically from the CSV header:

//Get file name 
SET filename=tRS.Get("Name")
//Remove .csv from the file name
SET tableName=$REPLACE("dc_data_"_ds_"."_tRS.Get("ItemName"),".csv","") 
//Get columns based on the header row of csv file
Do ##class(dc.data.medical.utility).GetColTypes(filename,.coltype) 
//Dynamically create table based on tablename and column types
SET qry = "CREATE TABLE "_tableName_" ("_coltype_")"
SET rset = ##class(%SQL.Statement).%ExecDirect(,qry)
//Check if table created successfully
IF rset.%SQLCODE
{
   WRITE "ERROR : ",rset.%Message,!    
}
ELSE
{
  //Dynamically construct LOAD DATA statement
  SET qry = "LOAD DATA FROM FILE '"_filename_"' INTO "_tableName_" USING {""from"":{""file"":{""header"":""1""}}}"
  SET rset = ##class(%SQL.Statement).%ExecDirect(,qry)
  // Check result set sqlcode, In case of error write resultset message
  IF rset.%SQLCODE
  {
   WRITE "ERROR Table : " _tableName_" IMPORT FAILED: ",rset.%Message,!
  }
  ELSE
  {
  WRITE "SUCCESS table : " _tableName_" created and "_rset.%ROWCOUNT_" Rows Imported Successfully",!
  }
}
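The GetColTypes helper referenced above is not shown in the article. As a rough illustration only, here is a minimal Python sketch of the same idea: inferring SQL column definitions from a CSV header and its first data row, then assembling the CREATE TABLE and LOAD DATA statements the ObjectScript executes. The function names and type-inference rules here are my assumptions, not the actual dc.data.medical.utility implementation.

```python
import csv
import io

def infer_coltypes(csv_text, varchar_len=250):
    """Infer simple SQL column types from a CSV header and its first data row.
    Illustrative only; the real GetColTypes may use different rules."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, first = rows[0], rows[1]
    cols = []
    for name, value in zip(header, first):
        try:
            int(value)
            sqltype = "INTEGER"
        except ValueError:
            try:
                float(value)
                sqltype = "DOUBLE"
            except ValueError:
                sqltype = f"VARCHAR({varchar_len})"
        cols.append(f"{name} {sqltype}")
    return ", ".join(cols)

def build_statements(filename, table_name, coltypes):
    """Assemble the two SQL statements executed by the ObjectScript above."""
    create = f"CREATE TABLE {table_name} ({coltypes})"
    load = (f"LOAD DATA FROM FILE '{filename}' INTO {table_name} "
            'USING {"from":{"file":{"header":"1"}}}')
    return create, load
```

Looking at the first data row only is a deliberate simplification; a more robust version would scan several rows before committing to a type.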

Use the command below to remove a particular dataset by passing its ID, or pass 999 to remove all datasets:

do ##class(dc.data.medical.utility).RemoveDS(1)

 

Below is the main script to remove a table dynamically:

//Get file name
SET filename=tRS.Get("Name")
//Remove .csv from file name
SET tableName=$REPLACE("dc_data_"_ds_"."_tRS.Get("ItemName"),".csv","")
//Drop table
SET qry = "DROP TABLE "_tableName
SET rset = ##class(%SQL.Statement).%ExecDirect(,qry)
//Check if table deleted successfully
IF rset.%SQLCODE
{
  WRITE "ERROR : ",rset.%Message,!          
}
ELSE
{
  WRITE "Table "_tableName_" deleted successfully",!
}

 

How to install or remove any Dataset from Web Application

Navigate to the application dashboard at http://localhost:52773/csp/datasets/index.csp

If a dataset is not installed, an Install Dataset button is visible; if it is already installed, a Remove Dataset button is visible.

Just press the desired button to install or remove any dataset.

 

How to view and export data from Dataset

Navigate to the View DataSets page at http://localhost:52773/csp/datasets/datasets.csp

Select a dataset and then a table from the list. Press the Excel, CSV, or PDF button to export the data.

 

Thanks

Question
· Jan 4, 2022

How to make an HL7 ORM message wait in IRIS until an acknowledgement is received from the EMR system?

We will receive an ORM message in a Business Service in IRIS. From that ORM, an ADT^A31 message will be generated and sent to the EMR before the ORM message is sent out. The ORM message has to wait until IRIS receives an acknowledgement from the EMR indicating the ADT message was received; then the ORM message will be sent to the EMR. What do I need to set up to hold the ORM message until the acknowledgement is received?

Thank you!

Patty Aguirre-Romero

Article
· Jan 2, 2022 3m read

DB Migration using SQLgateway

Thanks to @Yuri Marx we have seen a very nice example of DB migration from Postgres to IRIS.
My personal problem with it is the use of DBeaver as the migration tool,
especially since one of the strengths of IRIS (and of Caché before it) is the availability of
SQL gateways, which allow access to any external DB for which JDBC or ODBC access
is available. So I extended the package to demonstrate this.

It is a classic OEX package with a complete Docker setup.
The SQL gateway is installed during the Docker build, and the required JDBC driver for Linux
is included in this repo. To make this demo faster, the tables to migrate
have been shrunk a bit.

How to test

All migration actions can be executed directly from SMP.

1.
Verify the gateway connection in
SMP > Administration > Configuration > Connectivity > SqlGateway_Configuration

To test the connection, click Edit

 

and Test Connection

 

- Verify "Connection successful".
- Be patient at this point; Postgres containers sometimes take quite some time to talk to you.
- Wait a little, reload the page in the browser, and try the test again.

2.
Identify the source tables. In SMP, change to namespace USER, then go to
SMP > Explorers > SQL > Wizards > Data Migration

3.
Set the required import parameters

- Destination Namespace
- Type = TABLE
- Gateway = postgres  >>> now the first connection is established and you can select
- Schema = public
- Tables to migrate = all

4.
Identify the target, but change the schema from public to dc_public to be OEX compatible.

- Don't forget to click "change all".
- We migrate both Definitions and Data, so both sides are selected.
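The wizard does the schema rename internally once you click "change all". As a small illustration of that rename logic, here is a hedged Python sketch of a hypothetical helper that maps qualified source table names to OEX-compatible target names; it is not part of the wizard, just the idea in code.

```python
def map_target_schema(source_tables, old_schema="public", new_schema="dc_public"):
    """Map source tables ("schema.table") to target names, mirroring the
    public -> dc_public rename applied with "change all" in step 4.
    Purely illustrative; the Data Migration wizard handles this itself."""
    mapping = {}
    for qualified in source_tables:
        schema, _, table = qualified.partition(".")
        target_schema = new_schema if schema == old_schema else schema
        mapping[qualified] = f"{target_schema}.{table}"
    return mapping
```

Tables in other schemas are left untouched, which matches the per-schema nature of the rename.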

5.
Skipping the special settings (we use the defaults), we start the task in the background.
       

6.
Now we check the results and see that everything worked without errors.

- You might see errors if tables depend on content not yet migrated.
- Wait for completion until the status shows Done.

7.
We close the Migration Wizard and return to the normal table view, filtered by dc*.

- All 8 tables are visible and show meaningful columns.
 

8.
Selecting a table and clicking Open Table shows reasonable contents.
                         

        

9.
A look into the related generated class definitions confirms the result and successful completion.

GitHub

Article
· Dec 30, 2021 1m read

Did you know that IRIS Terminal could do this?!?! History and Aliases!

Hi all,

When is a Developer Community post not a Developer Community post? When it's just a few sentences wrapped around a link to InterSystems documentation! And what better way to finish up 2021 than by telling you about something cool that's been available since v2020.3? As the ball drops in Times Square, curl up with this:

Repeating Previous Commands

I think it will bring you joy! Post your favorite aliases below. Happy New Year!

Article
· Dec 25, 2021 2m read

AOC2021-rcc

After more than 40 years of writing countless lines of code in M*/COS/ISOS (and a bunch of archaic languages),
I decided to set a strong signal for the future. We have Embedded Python available
(still pre-release)! It just felt like a sacrilege to ignore this excellent NEW opportunity and
stay with the old sermon that I had used for decades.
Advent means the time of waiting and preparing for something important to come.
So to me, it meant the Advent of Embedded Python Code, which finally showed up in time.
ALL class methods of the 25 exercises plus the utility class are written exclusively in Embedded Python.
And I'm proud that it ranked #3 in completions by registered members of the Developer Community.

For later use I have also added

  • all full descriptions of the exercises as Day*.md,
  • a snapshot of the private leaderboard at the time each exercise was completed,
  • all test data, exercise input data, and alternate exercise input data,
  • result summaries for all test examples, all exercises, and all alternate exercises.

The package is fully compatible with ZPM and therefore also usable without the Docker image, so you are able to follow along in all details.

Prerequisites

Make sure you have git and Docker desktop installed.
ATTENTION: The Community License of the IRIS preview image expires 12/1/2022

Installation

Clone/git pull the repo into any local directory

 git clone https://github.com/isc-at/AoC2021-rcc.git

Open the terminal in this directory and build and run the IRIS container with your project:

 docker-compose up -d --build

iris.script will import everything you place under /src into IRIS.

How to Test it

Open IRIS terminal:

$ docker-compose exec iris iris session iris
USER>do ##class(dc.aoc2021.Day1).Run()
  • Extended Run parameters: do ##class(dc.aoc2021.Day3).Run(part,test)
      part = 1 or 2 : run only the first or second part of the exercise; anything else runs both
      test = 0 : use the alternate input set
      test = 1..n : run other tests as provided by the example

; to run all exercises use
for day=0:1:25 do $classmethod("dc.aoc2021.Day"_day,"Run")
; to run all alternate exercises use test=0
for day=0:1:25 do $classmethod("dc.aoc2021.Day"_day,"Run",,0)
; to run all first test examples use
for day=0:1:25 do $classmethod("dc.aoc2021.Day"_day,"Run",,1)
; to run ALL of this with just one line
for day=0:1:25 for test=-1:1:1 do $classmethod("dc.aoc2021.Day"_day,"Run",,test)
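The $classmethod loops above dispatch to a class whose name is built at runtime from the day number. As a rough Python analogy of that dynamic-dispatch pattern (the registry and run signature here are assumptions for illustration, not the repo's actual API):

```python
def run_all(registry, days=range(0, 26), test=None):
    """Dispatch dynamically by constructed name, analogous to
    $classmethod("dc.aoc2021.Day"_day, "Run", , test).
    'registry' maps names like "Day3" to callables; illustrative only."""
    results = {}
    for day in days:
        runner = registry[f"Day{day}"]  # name built at runtime, like "Day"_day
        results[day] = runner(test=test)
    return results
```

In ObjectScript the "registry" is the class dictionary itself, which is why $classmethod needs only the concatenated class name.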

Some exercises have more than 1 example:
Day8:2, Day12:3, Day16:15, Day18:10, Day22:3, Day24:none

; to run all test examples of day 16
for example=1:1:15 do ##class(dc.aoc2021.Day16).Run(,example)

Files /data/ALL*.txt hold all available results

GitHub
