Best Practices

In my previous article, we reviewed possible use cases for macros, so let’s now proceed to a more comprehensive example of what macros can do. In this article we will design and build a logging system.

Logging system

A logging system is a useful tool for monitoring the work of an application, and it saves a lot of time during debugging and monitoring. Our system will consist of two parts:

  • Storage class (for log records)
  • Set of macros that automatically add a new record to the log
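
As a rough sketch of these two parts (the class, macro, and method names here are illustrative, not the final design), the storage class can be a simple persistent class and the macros can wrap a call to one of its class methods:

    /// App.Log.inc -- macros that add a record to the log
    #define LogEvent(%type, %message) Do ##class(App.Log.Record).Add(%type, $ClassName(), %message)
    #define LogInfo(%message)  $$$LogEvent("INFO", %message)
    #define LogError(%message) $$$LogEvent("ERROR", %message)

    /// App.Log.Record -- storage class for log records
    Class App.Log.Record Extends %Persistent
    {
    Property EventType As %String;
    Property ClassName As %String;
    Property Message As %String(MAXLEN = "");
    Property TimeStamp As %TimeStamp;

    /// Create and save a new log record
    ClassMethod Add(type As %String, class As %String, message As %String) As %Status
    {
        Set record = ..%New()
        Set record.EventType = type
        Set record.ClassName = class
        Set record.Message = message
        Set record.TimeStamp = $ZDateTime($Horolog, 3)
        Quit record.%Save()
    }
    }

With something like this in place, any method that includes App.Log.inc can add an entry with a single call such as $$$LogInfo("Record saved").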

Keywords: Python, JDBC, SQL, IRIS, Jupyter Notebook, Pandas, Numpy, and Machine Learning 

1. Purpose

This is another simple 5-minute note on invoking the IRIS JDBC driver via Python 3 from, for example, a Jupyter Notebook, to read data from and write data into an IRIS database instance using SQL syntax, for demo purposes.

Article
Tony Pepper · May 25, 2016 5m read
Random Read IO Storage Performance Tool

Purpose

This tool is used to generate random read Input/Output (IO) from within the database. The goal of this tool is to drive as many jobs as possible to achieve target IOPS and ensure acceptable disk response times are sustained. Results gathered from the IO tests will vary from configuration to configuration based on the IO sub-system. Before running these tests, ensure that the corresponding operating-system and storage-level monitoring is configured to capture IO performance metrics for later analysis.
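
To illustrate the underlying idea only (this is not the tool's actual code; the global name, method, and defaults below are hypothetical), random read IO can be generated by repeatedly fetching randomly chosen nodes of a populated data global:

    /// Illustrative only -- not the actual tool: read randomly chosen nodes of a
    /// populated data global to generate random IO against the database
    ClassMethod RandomRead(maxId As %Integer, count As %Integer = 100000, global As %String = "^MyApp.DataD")
    {
        For i=1:1:count {
            Set id = $Random(maxId) + 1
            // $Get forces a database block read whenever the block is not already cached
            Set value = $Get(@global@(id))
        }
        Quit
    }

Several such processes started in parallel (for example with the JOB command) are what drive IOPS toward the target.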

Article
Bob Kuszewski · Jun 19, 2020 5m read
Migrate from Java Business Host to PEX

With the release of PEX in InterSystems IRIS 2020.1 and InterSystems IRIS for Health 2020.1, customers have a better way to build Java into productions than the Java Business Host. PEX provides a complete set of APIs for building interoperability components and is available in both Java and .NET. The Java Business Host has been deprecated and will be retired in a future release.

Advantages of PEX

Article
Eduard Lebedyuk · Feb 5, 2016 11m read
Class Queries in InterSystems IRIS

Class queries in InterSystems IRIS (and Caché, Ensemble, HealthShare) are a useful tool that separates SQL queries from ObjectScript code. Basically, it works like this: suppose you want to use the same SQL query with different arguments in several different places. In this case you can avoid code duplication by declaring the query body as a class query and then calling this query by name. This approach is also convenient for custom queries, in which the task of obtaining the next row is defined by the developer. Sounds interesting? Then read on!
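
For a quick taste (the class and query names below are illustrative), the SQL is declared once as a class query and then prepared by name wherever it is needed:

    Class Sample.Person Extends %Persistent
    {
    Property Name As %String;

    /// The SQL lives in one place, as a class query...
    Query ByName(name As %String = "") As %SQLQuery
    {
        SELECT ID, Name
        FROM Sample.Person
        WHERE Name %STARTSWITH :name
        ORDER BY Name
    }
    }

    // ...and is called by name from any ObjectScript code that needs it
    Set statement = ##class(%SQL.Statement).%New()
    Set status = statement.%PrepareClassQuery("Sample.Person", "ByName")
    Set resultSet = statement.%Execute("A")
    While resultSet.%Next() {
        Write resultSet.%Get("Name"), !
    }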

$LIST string format and %DynamicArray and %DynamicObject classes

IRIS, and previously Caché, contains several different ways to create a sequence holding a mixture of data values. A data sequence that has been available for many years is the $LIST string. Another, more recent data sequence is the %DynamicArray class, which, along with the %DynamicObject class, is part of the IRIS support for JSON string representation. These two sequences involve very different tradeoffs.
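
As a small illustration of the two representations (the values are chosen arbitrarily):

    // $LIST string: a compact, binary-encoded list
    Set list = $ListBuild("red", 42, 3.14)
    Write $ListGet(list, 2), !       // 42 (positions are 1-based)
    Write $ListLength(list), !       // 3

    // %DynamicArray: a JSON-style array object
    Set array = ["red", 42, 3.14]
    Write array.%Get(1), !           // 42 (positions are 0-based)
    Write array.%Size(), !           // 3
    Write array.%ToJSON(), !         // ["red",42,3.14]

The $LIST form is compact and fast to work with from ObjectScript, while the %DynamicArray form converts naturally to and from JSON text; the sections below look at the tradeoffs in more detail.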

$LIST String Format

Your application is deployed and everything is running fine. Great, hi-five! Then out of the blue the phone starts to ring off the hook – it’s users complaining that the application is sometimes ‘slow’. But what does that mean? Sometimes? What tools do you have and what statistics should you be looking at to find and resolve this slowness? Is your system infrastructure up to the task of the user load? What infrastructure design questions should you have asked before you went into production? How can you capacity plan for new hardware with confidence and without over-spec'ing? How can you stop the phone ringing? How could you have stopped it ringing in the first place?

Article
Michael Smart · Oct 7, 2016 4m read
Forwarding Requests in a REST Service

One useful feature of our REST framework is the ability for a dispatch class to identify request prefixes and forward them to another dispatch class. This approach of modularizing your URL map will improve code readability, enable you to easily maintain separate versions of an interface, and provide a means to protect API calls that only certain users will be allowed to access.
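
As a minimal sketch (the class names are invented for illustration), forwarding is declared in the dispatch class's URL map with Map elements, alongside the Routes the class handles itself:

    Class API.Dispatch Extends %CSP.REST
    {
    XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
    {
    <Routes>
      <!-- Requests whose path starts with /v1 or /v2 are handed off to other dispatch classes -->
      <Map Prefix="/v1" Forward="API.v1.Dispatch"/>
      <Map Prefix="/v2" Forward="API.v2.Dispatch"/>
      <!-- Routes handled directly by this class -->
      <Route Url="/ping" Method="GET" Call="Ping"/>
    </Routes>
    }

    ClassMethod Ping() As %Status
    {
        Write "{""status"":""ok""}"
        Quit $$$OK
    }
    }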

This time I want to talk about something that is not specific to InterSystems IRIS, but that I think is important if you want to work with Docker and your server at work is a PC or laptop with Windows 10 Pro or Enterprise.

As you likely know, container technology comes essentially from the Linux world, and it is on Linux hosts where it shows its maximum potential. Those who use Windows regularly have seen that Microsoft and Docker have made significant efforts over the last few years to let us run containers based on Linux images on a Windows system in a really easy way. However, this is not supported for production systems and, this is the big problem, it is not reliable if we want to keep persistent data outside of the containers, in the host system, mostly because of the big differences between the Windows and Linux file systems. In the end, Docker for Windows itself uses a small Linux virtual machine (MobyLinux) to run the containers; it does this transparently for the Windows user, and it works perfectly well if, as I said, you don't require that your databases survive longer than the container.

Well, let's get to the point. Many times, to avoid issues and simplify things, we need a full Linux system and, if our server is based on Windows, the only way of having it is through a virtual machine. At least until WSL 2 is released for Windows, but that will be another story, and it will surely take a bit of time to become robust enough.

In this article, I'll tell you, step by step, how to install an environment where you'll be able to work, if you need to, with Docker containers on an Ubuntu system on your Windows server. Let's go...

So, one day you're working away at WidgetsDirect, the leading supplier of widgets and widget accessories, when your boss asks you to develop the new customer-facing portal to allow the client base to access the next generation of Widgets... and he wants you to use Angular 1.x to read from the department's Caché server.

There's only one problem:  You've never used Angular, and don't know how to make it talk to Caché.

This guide is going to walk through the process of setting up a full Angular stack which communicates with a Caché backend using JSON over REST.  

Date range queries going too slow for you?  SQL Performance got you down?  I have one weird trick that might just help you out! (SQL Developers hate this!)*

If you have a class that records timestamps when the data is added, that data will be in sequence with your IDKEY values (that is, TimeStamp1 < TimeStamp2 if and only if ID1 < ID2 for all IDs and TimeStamp values in the table), and you can use this knowledge to increase performance for queries against TimeStamp ranges.  Consider the following table:
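
For illustration (the table MyApp.LogEntry, its EventTime column, and the :fromId/:toId host variables are invented here, not taken from the article):

    -- Plain range query: the EventTime of every candidate row must be examined
    SELECT COUNT(*) FROM MyApp.LogEntry
    WHERE EventTime >= '2016-01-01 00:00:00'
      AND EventTime <  '2016-02-01 00:00:00'

    -- Because IDs and timestamps increase together, the same rows can be selected
    -- by an ID range, which the optimizer can satisfy with a fast IDKEY range scan
    -- (:fromId and :toId are the period's boundary IDs, looked up beforehand)
    SELECT COUNT(*) FROM MyApp.LogEntry
    WHERE ID >= :fromId AND ID <= :toId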

In this post I show strategies for backing up Caché using External Backup, with examples of integration with snapshot-based solutions. The majority of solutions I see today are deployed on Linux on VMware, so much of the post shows how solutions integrate VMware snapshot technology as examples.
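
At the core of these integrations is the freeze/thaw pattern; a minimal sketch (error handling trimmed, and the snapshot step itself depends entirely on your VMware or storage tooling) looks like this:

    // Quiesce database writes before the snapshot is taken
    Set status = ##class(Backup.General).ExternalFreeze()
    If $System.Status.IsError(status) { Write "Freeze failed", !  Quit }

    // ... trigger the VMware or storage-level snapshot from the backup tooling ...

    // Resume normal write activity once the snapshot exists
    Set status = ##class(Backup.General).ExternalThaw()

In practice these calls are usually issued from the snapshot orchestration script (for example via a csession or iris session command) rather than typed interactively.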

In this article, I would like to talk about the spec-first approach to REST API development.

While traditional code-first REST API development goes like this:

  • Writing code
  • REST-enabling it
  • Documenting it (as a REST API)

Spec-first follows the same steps, but in reverse. We start with a spec, which also doubles as documentation, generate a boilerplate REST app from it, and finally write some business logic.

This is advantageous because:

  • You always have relevant and useful documentation for external or frontend developers who want to use your REST API
  • A specification created in OAS (Swagger) can be imported into a variety of tools for editing, client generation, API management, unit testing, and automation or simplification of many other tasks
  • Improved API architecture. In the code-first approach an API is developed method by method, so a developer can easily lose track of the overall API architecture; with spec-first, the developer is forced to interact with the API from the position of an API consumer, which usually helps in designing a cleaner API architecture
  • Faster development: as all boilerplate code is automatically generated, you won't have to write it; all that's left is developing the business logic
  • Faster feedback loops: consumers can get a view of the API immediately and can more easily offer suggestions simply by modifying the spec

Let's develop our API in a spec-first approach!
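
As a hedged sketch of the generation step (the file path and application name are placeholders, and the exact %REST.API argument list is worth checking against the documentation for your IRIS version), building the boilerplate application from a saved spec can look like this:

    // Load the OpenAPI 2.0 (Swagger) spec from disk; the path is a placeholder
    Set stream = ##class(%Stream.FileCharacter).%New()
    Do stream.LinkToFile("/tmp/spec.json")
    Set spec = ##class(%DynamicObject).%FromJSON(stream)

    // Generate the spec, dispatch, and implementation classes for the application
    Set status = ##class(%REST.API).CreateApplication("myapp", spec)
    If $System.Status.IsError(status) { Do $System.Status.DisplayError(status) }

The business logic is then written in the generated implementation class.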

Hi Developers!

Many of you publish your InterSystems ObjectScript libraries on Open Exchange and GitHub.

But what do you do to make it easier for developers to use your project and collaborate on it?

In this article, I want to show an easy way to launch and contribute to any ObjectScript project just by copying a standard set of files to your repository.

Let's go!

This article is a continuation of Deploying InterSystems IRIS solution on GKE Using GitHub Actions, in which, with the help of a GitHub Actions pipeline, our zpm-registry was deployed in a Google Kubernetes cluster created by Terraform. In order not to repeat ourselves, we’ll take as a starting point that:

Article
Eduard Lebedyuk · May 21, 2018 10m read
Adding your own provider to MFT

The Managed File Transfer (MFT) feature of InterSystems IRIS enables easy inclusion of a third-party file transfer service directly in an InterSystems IRIS production. Currently, Dropbox, Box, and Kiteworks cloud disks are available.

In this article, I'd like to describe how to add more cloud storage platforms.

Here's what we're going to talk about:

  • What is MFT
  • Reference: Dropbox
    • Connection
    • Interoperability
    • Direct access
  • Interfaces you need to implement
    • Connection
    • Logic
  • Installation

InterSystems supports use of the InterSystems IRIS Docker images it provides on Linux only. Rather than executing containers as native processes, as on Linux platforms, Docker for Windows creates a Linux VM running under Hyper-V, the Windows virtualizer, to host containers. These additional layers add complexity that prevents InterSystems from supporting Docker for Windows at this time.

Imagine you want to see what InterSystems can give you in terms of data analytics. You studied the theory and now you want some practice. Fortunately, InterSystems provides a project that contains some good examples: Samples BI. Start with the README file, skipping anything associated with Docker, and go straight to the step-by-step installation. Launch a virtual instance, install IRIS there, follow the instructions for installing Samples BI, and then impress the boss with beautiful charts and tables. So far so good. 

Inevitably, though, you’ll need to make changes.

In an earlier article (I hope you’ve read it), we took a look at the CircleCI deployment system, which integrates perfectly with GitHub. Why then would we want to look any further? Well, GitHub has its own CI/CD platform called GitHub Actions, which is worth exploring. With GitHub Actions, you don’t need to rely on some external, albeit cool, service.

In this article we’re going to try using GitHub Actions to deploy the server part of  InterSystems Package Manager, ZPM-registry, on Google Kubernetes Engine (GKE).

Some time ago, a WRC case was transferred to me in which a customer asked about the availability of a raw DEFLATE compression/decompression function built into Caché.

When we talk about DEFLATE we need to talk about Zlib as well, since Zlib is the de facto standard free compression/decompression library, developed in the mid-90s.

Zlib implements the DEFLATE compression/decompression algorithm together with the idea of encapsulating the compressed data within a wrapper (gzip, zlib, etc.).
https://en.wikipedia.org/wiki/Zlib

Last time we deployed a simple IRIS application to the Google Cloud. Now we’re going to deploy the same project to Amazon Web Services using its Elastic Kubernetes Service (EKS).

We assume you’ve already forked the IRIS project to your own private repository. It’s called <username>/my-objectscript-rest-docker-template in this article. <root_repo_dir> is its root directory.

Before getting started, install the AWS command-line interface and, for Kubernetes cluster creation, eksctl, a simple CLI utility. For AWS you can try to use aws2, but you’ll need to set aws2 usage in the kube config file as described here.
