Beginner

Hi all. Today we are going to upload an ML model into IRIS Manager and test it.

Note: I have done the following on Ubuntu 18.04, Apache Zeppelin 0.8.0, Python 3.6.5.

Introduction

These days, many different data mining tools are available that let you develop predictive models and analyze your data with unprecedented ease. The InterSystems IRIS Data Platform provides a stable foundation for your big data and fast data applications, offering interoperability with modern data mining tools.

Last comment 3 days ago · 414 views · +5 rating

The last time I created a playground for experimenting with machine learning using Apache Spark and an InterSystems data platform (see Machine Learning with Spark and Caché), I installed and configured everything directly on my laptop: Caché, Python, Apache Spark, Java, some Hadoop libraries, to name a few. It required some effort, but eventually it worked. Paradise. But I worried. Would I ever be able to reproduce all those steps? Maybe. Would it be possible for a random Windows or Java update to wreck the whole thing in an instant? Almost certainly.

Last comment 3 days ago · 135 views · +8 rating

Launching IRIS Using Docker

 

 

This brief document will walk through the steps to launch the IRIS Community Edition using Docker on the Mac. For those new to Docker, the first two pages explain how to install Docker and run some basic commands needed to get the IRIS Community Edition running. Experienced Docker users can skip directly to page three.

 

Last comment 23 April 2019 · 73 views · +3 rating

Terminal scripts can be used to run pre-designed commands in the terminal, much like a batch file. You can write anything that can be executed in the terminal, such as for loops, if/else branches, and so on, inside Terminal scripts. In this article, I will show you how to call Terminal scripts, how to use parameters in Terminal scripts, and how to avoid the session being disconnected while running Terminal scripts. If you have any information about how to use Terminal scripts, or any feedback, please feel free to leave a comment.
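For a sense of what such a script can drive, here is the kind of ObjectScript a Terminal script might send to the terminal. This is only an illustrative sketch of a loop with an if/else branch; the script-file syntax itself is covered in the article.

  // a loop with an if/else branch, as it could be typed (or scripted) in the terminal
  for i=1:1:10 {
    if i#2 {
      write i, " is odd", !
    } else {
      write i, " is even", !
    }
  }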

Last comment 23 April 2019 · 356 views · +4 rating

I needed to know programmatically whether the last test run failed or not.

After some exploring, here's the code:

ClassMethod isLastTestOk() As %Boolean
{
  // ^UnitTest.Result holds the id of the most recent unit test run
  set in = ##class(%UnitTest.Result.TestInstance).%OpenId(^UnitTest.Result)
  for i=1:1:in.TestSuites.Count() {
    #dim suite As %UnitTest.Result.TestSuite
    set suite = in.TestSuites.GetAt(i)
    // Status=0 means the suite failed, so report failure immediately
    return:suite.Status=0 $$$NO
  }
  quit $$$YES
}

Last comment 13 April 2019 · 247 views · +1 rating

How Tax Service, OpenStreetMap, and InterSystems IRIS could help developers get clean addresses

 

Pieter Brueghel the Younger, Paying the Tax (The Tax Collector), 1640

 

In my previous article, we just skimmed the surface of objects. Let's continue our reconnaissance. Today's topic is a tough one. It's not quite BIG DATA, but it's still data that isn't easy to work with: we're talking about a fairly large amount of data. It won't all fit into RAM at once, and some of it won't even fit on the drive (not due to lack of space, but because there's a lot of junk). The name of our subject is the FIAS DB: the Federal Information Address System database, the database of addresses in Russia. The archive is 5.5 GB of compressed XML. After extraction, it takes a full 53 GB (set aside 110 GB for extraction). And when you start to parse and convert it, even that 110 GB won't be enough. There won't be enough RAM either.

0 comments · 126 views · +2 rating

 

Keywords: Jupyter Notebook, TensorFlow GPU, Keras, Deep Learning, MLP, and HealthShare

 

1. Purpose and Objectives

In the previous "Part I" we set up a deep learning demo environment. In this "Part II" we will test what we can do with it.

Many people of my age started with the classic MLP (Multi-Layer Perceptron) model. It is intuitive, and hence conceptually easier to start with.

So let's try a Keras "deep learning MLP" with the standard demo data that everybody in the AI/NN community has been using. It is a kind of so-called "supervised learning". We will see how simple it is to run at the Keras level.

We may later touch on its history and on why it's called "deep learning", the buzzword, and what has actually evolved over the past 20 years.

0 comments · 139 views · +1 rating

In a recent discussion on CachéQuality I was (in a friendly way) blamed for promoting old syntax and deliberately obfuscating code. Therefore I decided to clarify my point and shed some light on one possible source of side effects that may unexpectedly occur with the RETURN command when it takes an argument.

Last comment 15 March 2019 · 179 views · 0 rating

The InterSystems Data Platform includes utilities and tools for system monitoring and alerting; however, system administrators new to solutions built on the InterSystems Data Platform (a.k.a. Caché) need to know where to start and what to configure.

This guide shows the path to a minimum monitoring and alerting solution, using references from the online documentation and developer community posts to show you how to enable and configure the following:

  1. Caché Monitor: Scans the console log and sends email alerts.

  2. System Monitor: Monitors system status and resources, generating notifications (alerts and warnings) based on fixed parameters and also tracks overall system health.

  3. Health Monitor: Samples key system and user-defined metrics and compares them to user-configurable parameters and established normal values, generating notifications when samples exceed applicable or learned thresholds.

  4. History Monitor: Maintains a historical database of performance and system usage metrics.

  5. pButtons: Operating system and Caché metrics collection scheduled daily.

Remember, this guide is a minimum configuration; the included tools are flexible and extensible, so more functionality is available when needed. This guide skips through the documentation to get you up and going. You will need to dive deeper into the documentation to get the most out of the monitoring tools; in the meantime, think of this as a set of cheat sheets to get up and running.
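As a quick pointer, the System Monitor and Health Monitor mentioned above are managed through the ^%SYSMONMGR utility routine. A minimal way to open it from a terminal looks like this:

  // switch to the %SYS namespace and start the interactive System Monitor manager
  zn "%SYS"
  do ^%SYSMONMGR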

Last comment 11 March 2019 · 672 views · +10 rating

In this series of articles, I'd like to present and discuss several possible approaches toward software development with InterSystems technologies and GitLab. I will cover such topics as:

  • Git 101
  • Git flow (development process)
  • GitLab installation
  • GitLab Workflow
  • Continuous Delivery
  • GitLab installation and configuration
  • GitLab CI/CD

In the previous article, we covered Git basics, why a high-level understanding of Git concepts is important for modern software development, and how Git can be used to develop software. Still, our focus there was on the implementation part of software development. This part presents:

  • GitLab Workflow - a complete software life cycle process - from idea to user feedback
  • Continuous Delivery - a software engineering approach in which teams produce software in short cycles, ensuring that the software can be reliably released at any time. It aims at building, testing, and releasing software faster and more frequently.

Last comment 1 March 2019 · 1052 views · +5 rating

In this series of articles, I'd like to present and discuss several possible approaches toward software development with InterSystems technologies and GitLab. I will cover such topics as:

  • Git 101
  • Git flow (development process)
  • GitLab installation
  • GitLab Workflow
  • Continuous Delivery
  • GitLab installation and configuration
  • GitLab CI/CD

In the first article, we covered Git basics, why a high-level understanding of Git concepts is important for modern software development, and how Git can be used to develop software.

In the second article, we covered the GitLab Workflow - a complete software life cycle process - and Continuous Delivery.

Last comment 26 February 2019 · 638 views · +2 rating

Hi,

This is a quick tutorial on how to install and use TFS in Atelier. It is based on my own experience and some tricks that I've noted.

If you are used to Visual Studio, you may feel that it is a bit slow and heavy, but you have the same TFS panel as in Visual Studio, so you don't need any special "training" to use it. :)

It's important not to store the .buildpath file, because it holds the server definition. When you are working in a team, each member has a personal configuration, so the server name could be different.

To prevent this, create a file called .tfignore with this content:

Last comment 15 February 2019 · 770 views · +12 rating

Good News!! You can now use the free InterSystems IRIS Community Edition in the AWS Cloud

Hello,

It's very common for people new to InterSystems IRIS to want to start working on a personal project in a completely free environment. If you are one of them, good news!! You can now use the free InterSystems IRIS Community Edition in the AWS Cloud.

It is pretty easy to create a new EC2 instance from the InterSystems IRIS Community Edition listing in the AWS Marketplace.

After that you have to launch your instance, and then you'll be able to access it using ssh like this:

(Note: be sure you have run 'chmod 400' on the .pem file.)

Last comment 5 February 2019 · 586 views · +6 rating

Recently I needed a class method that returns an annotation value based on the name of an activity.

As doing it at runtime seemed inefficient, I wrote a compile-time utility that iterates over all business process activities and generates the relevant code.

This code can be used in a variety of situations when you need to iterate over business process activities; just add it as a secondary superclass to your BPL processes.

Last comment 27 January 2019 · 100 views · +1 rating

Headache-free stored objects: a simple example of working with InterSystems Caché objects in ObjectScript and Python

Neuschwanstein Castle

Tabular data storage based on what is formally known as the relational data model will be celebrating its 50th anniversary in June 2020. Here is the official document – that very famous article. Many thanks for it go to Dr. Edgar Frank Codd. By the way, the relational data model is on the list of the most important global innovations of the past 100 years published by Forbes.

On the other hand, oddly enough, Codd viewed relational databases and SQL as a distorted implementation of his theory. For general guidance, he created 12 rules that any relational database management system must comply with (there are actually 13 rules). Honestly speaking, there is not a single DBMS on the market that observes even Rule 0. Therefore, no one can call their DBMS 100% relational. :) If you know of any exceptions, please let me know.
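To give a flavor of the object side of the story before the article's own example, here is a minimal, hypothetical sketch in ObjectScript; the class name Demo.Person and its single property are made up for illustration.

Class Demo.Person Extends %Persistent
{
Property Name As %String;
}

Once compiled, an instance can be created and stored in a couple of lines:

  // create, populate and save an instance of the stored object
  set person = ##class(Demo.Person).%New()
  set person.Name = "Edgar"
  set status = person.%Save()
  if status '= 1 { do $system.Status.DisplayError(status) }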

0 comments · 274 views · +4 rating

The following post outlines an architectural design of intermediate complexity for DeepSee. As in the previous example, this implementation includes separate databases for storing the DeepSee cache, DeepSee implementation and settings. This post introduces two new databases: the first to store the globals needed for synchronization, the second to store fact tables and indices.

Last comment 2 January 2019 · 308 views · +4 rating

InterSystems products (IRIS, Caché, Ensemble) already include a built-in Apache web server. But the built-in server is designed for development and administration tasks and thus has certain limitations. Though you may find some useful workarounds for these limitations, the more common approach is to deploy a full-scale web server for your production environment. This article describes how to set up Apache to work with InterSystems products and how to provide HTTPS access. We will be using Ubuntu, but the configuration process is almost the same for all Linux distributions.

Last comment 25 December 2018 · 343 views · +3 rating

Some weeks ago, I was reading a book by Stephen Hawking and Leonard Mlodinow, The Grand Design. At a certain point, trying to define why we exist, why we use the models we use in physics... those kinds of things, you know... they pointed to the Game of Life example invented by the mathematician John Conway in 1970... Basically, he wanted to show that a system with really basic fundamental laws (Physics) could evolve and "live" to become a more complex system (Chemistry) in which "something" (humans) could work out its own model and complex rules to explain its reality… The rules for this deterministic model were so basic that I thought it could be fun to implement them in ObjectScript when I had some spare time... there are other implementations in JavaScript and other languages... but not in ObjectScript... and that had to be corrected!!… so here you are!
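For reference, the classic rules are: a live cell survives with two or three live neighbours, and a dead cell becomes alive with exactly three. A minimal sketch of one generation in ObjectScript might look like this; the grid representation here is an illustrative assumption, not necessarily the one used in the article.

ClassMethod NextGeneration(ByRef grid, size As %Integer)
{
  // grid(x,y)=1 marks a live cell; anything undefined is dead
  kill next
  for x=1:1:size {
    for y=1:1:size {
      // count the eight neighbours of the cell at (x,y)
      set neighbours = 0
      for dx=-1:1:1 {
        for dy=-1:1:1 {
          if (dx'=0)||(dy'=0) {
            if $get(grid(x+dx,y+dy),0) set neighbours = neighbours + 1
          }
        }
      }
      if $get(grid(x,y),0) {
        // a live cell survives with two or three neighbours
        set:(neighbours=2)||(neighbours=3) next(x,y) = 1
      } else {
        // a dead cell becomes alive with exactly three neighbours
        set:neighbours=3 next(x,y) = 1
      }
    }
  }
  kill grid
  merge grid = next
}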

Last comment 25 December 2018 · 304 views · +7 rating

This article was written as an attempt to share the experience of installing the InterSystems Caché DBMS for a production environment.
We all know that the development configuration of a DBMS is very different from real-life conditions.
As a rule, development is carried out in “hothouse conditions” with a bare minimum of security measures, but when we publish our project online, we must ensure its reliable and uninterrupted operation in a very aggressive environment.

The process of installing the InterSystems Caché DBMS with maximum security settings

OS security settings

The first step is the operating system. You need to do the following:

Last comment 19 December 2018 · 668 views · +6 rating

In this article, I will show how you can upload and download files from InterSystems products via HTTP.

Questions about working with files over HTTP arise fairly often on the community, and I usually link to my FileServer project, which demonstrates file upload/download, but I'd like to talk a bit more about how we can serve and receive files from InterSystems products.

Downloading a file

If you have a file on the file system and you know its path, you can serve it from a REST or CSP context by calling a single method.
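The original article shows its own helper for this. As an illustration of the general approach only (not necessarily the author's method), a hedged sketch using %Stream.FileBinary and the %response object could look like this:

ClassMethod ServeFile(path As %String) As %Status
{
  // link a binary stream to the file on disk
  set stream = ##class(%Stream.FileBinary).%New()
  set sc = stream.LinkToFile(path)
  quit:$$$ISERR(sc) sc

  // set the response headers before writing the body
  set %response.ContentType = "application/octet-stream"
  do %response.SetHeader("Content-Disposition", "attachment; filename="_##class(%File).GetFilename(path))

  // write the file contents to the current device (the HTTP response)
  do stream.OutputToDevice()
  quit $$$OK
}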

Last comment 26 November 2018 · 430 views · +3 rating

Everybody has a testing environment.

Some people are lucky enough to have a totally separate environment to run production in.

-- Unknown


In this series of articles, I'd like to present and discuss several possible approaches toward software development with InterSystems technologies and GitLab. I will cover such topics as:

  • Git 101
  • Git flow (development process)
  • GitLab installation
  • GitLab WorkFlow
  • GitLab CI/CD
  • CI/CD with containers

This first part deals with the cornerstone of modern software development - the Git version control system - and various Git flows.

Last comment 22 November 2018 · 1539 views · +10 rating

Here I'll walk you through the process of creating a simple Node/Express API and connecting it to an InterSystems IRIS instance.

I won't go into much detail about how to work with any of the technologies I mention in this tutorial, but I will leave links in case you want to learn more.

The objective here is to give you a practical guide on how to set up and connect a Node.js back-end API to IRIS.

Before we get our hands dirty, make sure you have Node.js installed on your machine. So let's check:

➜ node --version
v8.12.0

Version 8.12.0 is the current LTS (Long Term Support) version of Node.js.

In case you need to install it, go to https://nodejs.org

Last comment 4 November 2018 · 312 views · +5 rating

Ensemble supports publish and subscribe message delivery. Publish and subscribe refers to the technique of routing a message to one or more subscribers based on the fact that those subscribers have previously registered to be notified about messages on a specific topic.

This article demonstrates how several Ensemble capabilities can work together:

In this article we will send emails about:

  • New workflow tasks
  • Unassigned workflow tasks
  • Uncompleted workflow tasks
  • Ensemble alerts

Email recipients will be determined using a Publish/Subscribe operation, and each user will receive only a digest email whenever possible.

Last comment 23 October 2018 · 189 views · 0 rating

Hi all. We are going to find duplicates in a dataset using Apache Spark Machine Learning algorithms.

Note: I have done the following on Ubuntu 18.04, Python 3.6.5, Zeppelin 0.8.0, Spark 2.1.1

Introduction

In previous articles, we did the following:

0 comments · 173 views · 0 rating

The following post outlines a more flexible architectural design for DeepSee. As in the previous example, this implementation includes separate databases for storing the DeepSee cache, DeepSee implementation and settings, and synchronization globals. This example introduces one new database to store the DeepSee indices. We will redefine the global mappings so that the DeepSee indices are not mapped together with the fact and dimension tables.

Last comment 17 September 2018 · 195 views · +2 rating