The pandemic that struck the world in 2020 made everyone follow the news and the numbers related to COVID-19.

Why not take that opportunity to create something simple and pleasant to follow the number of vaccinations worldwide?

To face this challenge, I'm using the data provided by Our World in Data (Research and data to make progress against the world's largest problems).

They maintain a dedicated repository on GitHub with COVID-19 data, and I took the vaccination data from there to build my tracker.
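As a minimal sketch, the vaccination CSV can be pulled straight from that repository with %Net.HttpRequest (the SSL configuration name below is an assumption; use one defined on your instance):

Set req = ##class(%Net.HttpRequest).%New()
Set req.Server = "raw.githubusercontent.com"
Set req.Https = 1, req.Port = 443
Set req.SSLConfiguration = "default"  // assumption: an SSL/TLS config named "default" exists
Do req.Get("/owid/covid-19-data/master/public/data/vaccinations/vaccinations.csv")
Write req.HttpResponse.Data.Read(200)  // peek at the first bytes of the CSV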

Article · Mar 27, 2021 · 3m read
IRIS easy ECP workbench

Testing ECP-based applications often takes quite some effort for setup and preparation.
I have created a Docker-based workbench that lets you have one quickly at hand.
And if you crash it? You just give your containers a fresh start.
The whole setup runs from code during the start-up of your instance.
In that sense, it is also a portable coding example using ZPM and the objectscript-docker-template.

See the video.


Deploying InterSystems HealthShare code, supporting lookups, and artifacts like SSL certs and keys is relatively straightforward using GitLab Runners. Not only does this approach enable managing the code base and deploying with Git-style workflows, but it also lends itself to speedy recovery and repeatable environments for some implementations.
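As a rough sketch of the idea, a deploy job in .gitlab-ci.yml might boil down to something like this (the runner tag, namespace, and source path are hypothetical placeholders, not the article's actual pipeline):

stages:
  - deploy

deploy_healthshare:
  stage: deploy
  tags:
    - healthshare  # hypothetical tag of the runner registered on the target server
  script:
    # load and compile the checked-out ObjectScript sources into the instance
    - echo 'Do $SYSTEM.OBJ.LoadDir("/opt/deploy/src","ck",,1) Halt' | iris session IRIS -U HSCUSTOM
  only:
    - master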


Recently I wanted to get a list of all cached queries and their texts. Here's how to do that.

First, create an SQL procedure that returns the cached query text from a cached query routine name:

Class test.CQ
{

/// Returns the query text of a cached query, given its routine name.
/// Usage from SQL: SELECT test.CQ_GetText(Name)
ClassMethod GetText(routine As %String) As %String [ CodeMode = expression, SqlProc ]
{
##class(%SQLCatalog).GetCachedQueryInfo(routine)
}

}

And after that you can execute this query:
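For example, something along these lines lists cached query routines together with their texts (filtering generated routines via %Library.RoutineMgr_StudioOpenDialog is my assumption here, not necessarily the original query):

SELECT Name, test.CQ_GetText(Name) AS QueryText
FROM %Library.RoutineMgr_StudioOpenDialog('*.INT',1,1,1,1,0,1)
WHERE Name LIKE '%sqlcq%'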


Google has an interesting tool named Data Studio. This tool allows you to create interactive dashboards, available on the internet, based on your data. It already offers hundreds of community-developed connectors to all sorts of data, as well as a number of community-developed visualizations. Most importantly, Google offers a way to develop your own connector to your data.

FHIRaaS provides a REST API, and it's available from the internet. So I decided to try to create a basic report on the data stored there. And in the end, I got this.
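For context, a Data Studio community connector is an Apps Script project that implements getAuthType(), getConfig(), getSchema(), and getData(). A much-simplified sketch of getData() fetching from a FHIR endpoint (the URL is a placeholder and schema handling is omitted):

function getData(request) {
  // Placeholder endpoint; real code would build the URL from getConfig() values and add auth headers.
  var response = UrlFetchApp.fetch('https://<your-fhiraas-endpoint>/Patient?_count=50');
  var bundle = JSON.parse(response.getContentText());
  var rows = (bundle.entry || []).map(function (e) {
    return { values: [e.resource.id, e.resource.gender] };
  });
  // Simplified: the real return value pairs rows with full field definitions (see the connector reference).
  return { schema: request.fields, rows: rows };
}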


Hi developers!

Just want to share an old but always relevant best practice on changing namespaces that @Dmitry Maslennikov shared with me (again).

Consider the method:

ClassMethod DoSomethingInSYS() As %Status
{
    Set sc = $$$OK
    Set ns = $namespace
    ZN "%SYS"
    // try-catch in case there is an error
    Try {
        // do something, e.g. a config change
    }
    Catch {}
    ZN ns  // return to the namespace we came from
    Return sc
}

And with new $namespace the method can be rewritten as:

ClassMethod DoSomethingInSYS() As %Status
{
    Set sc = $$$OK
    New $namespace
    Set $namespace = "%SYS"
    // do something, e.g. a config change
    Return sc
}

So! The difference is that we don't need to change the namespace back manually, as it is restored automatically once we return from the method.

And we don't need a try-catch (at least for this purpose) either.


Making a blog using Python + IRIS Globals

Since I started using the internet (in the late 90's), I have always had a CMS (content management system) at hand to make it easier to post information on a blog, on social media, or even on an enterprise page. In later years, as I put all my code on GitHub, I used to document it in a markdown file. Observing how easy persisting data into InterSystems IRIS can be with the Native API, I decided to build this application and force myself to forget SQL for a little while and stay open to the key-value database model.
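As a minimal sketch of what that looks like (the connection parameters and the blog global layout below are my assumptions, not necessarily the application's):

import irisnative

# connect to a local IRIS instance (host, port, namespace, user, password are placeholders)
connection = irisnative.createConnection("localhost", 1972, "USER", "_SYSTEM", "SYS")
iris = irisnative.createIris(connection)

# key-value style: a post is just subscripted nodes of the ^blog global
iris.set("My first post", "blog", 1, "title")
iris.set("Hello, globals!", "blog", 1, "body")

print(iris.get("blog", 1, "title"))  # -> My first post
connection.close()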


When you have been using cubes for business intelligence in a namespace for some time, you may find that there are many cubes in the namespace, only some of which are actively being used. However, it can be difficult to tell which cubes users are or are not querying, and maintaining unused cubes can be costly both in terms of storage and of computation to keep them up to date. This article provides some suggestions and examples for monitoring which cubes are in active use, and for removing cubes that you determine are no longer necessary.


Challenges of real-time AI/ML computations

We will start with examples that we, as the Data Science practice at InterSystems, have faced:

  • A “high-load” customer portal is integrated with an online recommendation system. The plan is to reconfigure promo campaigns at the level of the entire retail network (we will assume that a “segment-tactic” matrix will be used instead of a “flat” promo campaign master). What will happen to the recommender mechanisms? What will happen to data feeds and updates into the recommender mechanisms (the volume of input data having increased 25,000 times)? What will happen to the recommendation rule generation setup (the need to reduce the recommendation rule filtering threshold 1,000-fold due to a thousandfold increase in the volume and “assortment” of the rules generated)?
  • An equipment health monitoring system uses “manual” data sample feeds. Now it is connected to a SCADA system that transmits thousands of process parameter readings each second. What will happen to the monitoring system (will it be able to handle equipment health monitoring on a second-by-second basis)? What will happen once the input data receives a new block of several hundred columns with readings from sensors recently added to the SCADA system (will it be necessary to shut down the monitoring system to integrate the new sensor data into the analysis, and for how long)?
  • A complex of AI/ML mechanisms (recommendation, monitoring, forecasting) depends on each other's results. How many man-hours will it take every month to adapt those AI/ML mechanisms' functioning to changes in the input data? What is the overall “delay” in supporting business decision making by the AI/ML mechanisms (the refresh frequency of supporting information against the feed frequency of new input data)?


Hey everyone,

Need to change your PRIMARY email address (login email) without losing all your activity on the Developer Ecosystem resources: Community, Global Masters, and Open Exchange?

It's easy! We will take care of it!

1️⃣ We will correctly transfer all your information from the old DC account to the new one.

All your posts, comments, mentions, likes, etc. will be saved on the new account.


TOGAF is The Open Group Architecture Framework. It provides an approach to plan, design, implement, deploy, and govern your EA (Enterprise Architecture) projects.

TOGAF has a concept called a Building Block: any element that can be used, reused, and executed to deliver value and new functions to the business.

In the picture above, I present the main IRIS building blocks for creating fantastic apps.

Article · Dec 19, 2021 · 5m read
IntegratedML walkthrough

The InterSystems IRIS IntegratedML feature is used to get predictions and probabilities using the AutoML technique. AutoML is a machine learning technology that selects the best machine learning algorithm/model to predict statuses, numbers, and general results based on past data (the data used to train the AutoML model). You don't need a data scientist, because AutoML tests the most common machine learning algorithms and selects the best one for you, based on the data features analyzed. See more in this article.
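The whole cycle happens in SQL; a minimal sketch (the table and column names are placeholders):

-- create a model that predicts the Outcome column from the other columns
CREATE MODEL PatientModel PREDICTING (Outcome) FROM patient_data

-- let AutoML pick and fit the best algorithm on the training data
TRAIN MODEL PatientModel

-- apply the trained model to new rows
SELECT PREDICT(PatientModel) AS PredictedOutcome FROM patient_data_new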


Hey everyone!

I recently learnt something new while working with the WRC on an issue, and I wanted to share it with everyone on the off chance it could help someone else.

Scenario:

Files are being inexplicably written to a folder on your server and, due to the number of files in the folder and general system throughput, it is not possible to work through the files to track down the source.


The first installment of this article series discussed how to read a big chunk of data from the raw body of an HTTP POST method and save it to a database as a stream property of a class. The second installment discussed how to send files and their names wrapped in a JSON format.

Now let’s look closer at the idea of sending large files in parts to the server. There are several approaches we can use to do this. This article discusses using the Transfer-Encoding header to indicate chunked transfer. The HTTP/1.1 specification introduced the Transfer-Encoding header, and the RFC 7230 section 4.1 described it, but it’s absent from the HTTP/2 specification.
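On the InterSystems IRIS client side, chunked upload is done by assigning a subclass of %Net.ChunkedWriter to the request's EntityBody; a simplified sketch of such a subclass (the class name, file property, and 32 KB chunk size are my assumptions):

Class demo.FileChunkedWriter Extends %Net.ChunkedWriter
{

Property FileStream As %Stream.FileBinary;

/// Called by %Net.HttpRequest to produce the chunked request body.
Method OutputStream()
{
    Set ..TranslateTable = "RAW"
    Do ..WriteFirstChunk(..FileStream.Read(32000))
    While '..FileStream.AtEnd {
        Do ..WriteChunk(..FileStream.Read(32000))
    }
    Do ..WriteLastChunk("")
}

}

An instance of this class, with FileStream linked to the file, is then set as the EntityBody of a %Net.HttpRequest before calling Post.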


When you have more than ten thousand users in your database, it becomes time-consuming and inconvenient to assign group access rights through the standard IRIS interface. In this article I want to introduce an application that automates this process.

I'll show you how to assign and change the role lists of users selected by context, and I will also show you how to expand this application's functionality, applying your work as an administrator and developer to the new features of the well-proven apptools software complex. With the addition of the AdminLTE template, you can now quickly and easily create interface interactions for any entity and many functional modules.
The goal while developing this toolkit was to write as little JavaScript as possible and to implement as much of the dynamic behavior as possible in ObjectScript.
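For reference, the role change itself boils down to a few calls against the documented Security.Users API (a minimal sketch; the comma-delimited Roles handling is simplified):

ClassMethod AddRoleToUser(username As %String, role As %String) As %Status
{
    New $namespace
    Set $namespace = "%SYS"  // the Security.* classes live in %SYS
    Set sc = ##class(Security.Users).Get(username, .props)
    Quit:$$$ISERR(sc) sc
    // Roles is a comma-delimited list; append the new role
    Set props("Roles") = $Select(props("Roles")="":role, 1:props("Roles")_","_role)
    Quit ##class(Security.Users).Modify(username, .props)
}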

Article · May 24, 2021 · 1m read
Data Platform Levels

As a software architect, it's a challenge to design a corporate architecture that meets current business requirements; you need to achieve level 5. With InterSystems IRIS it's possible: with one product you get SQL + NoSQL + ESB + BI + Open Analytics + real-time virtual cubes + NLP + AutoML + ML (with Python), plus advanced cloud and sharding support.

Article · Nov 22, 2021 · 2m read
Apache Zeppelin + IRIS Quick Start

Apache Zeppelin is a multi-purpose notebook that allows:

  • Data Ingestion
  • Data Discovery
  • Data Analytics
  • Data Visualization and Collaboration.

The Apache Zeppelin interpreter concept allows any language or data-processing backend to be plugged into Zeppelin. Currently, Apache Zeppelin supports many interpreters, such as Apache Spark, Apache Flink, Python, R, JDBC, Markdown, and Shell.
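To point Zeppelin's JDBC interpreter at InterSystems IRIS, it should be enough to set the standard interpreter properties to the InterSystems JDBC driver (host, port, namespace, and credentials below are placeholders) and add the intersystems-jdbc artifact as an interpreter dependency:

default.driver    com.intersystems.jdbc.IRISDriver
default.url       jdbc:IRIS://localhost:1972/USER
default.user      _SYSTEM
default.password  SYS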
