Beginner

Keywords: Jupyter Notebook, TensorFlow GPU, Keras, Deep Learning, MLP, and HealthShare

 

1. Purpose and Objectives

In the previous "Part I" we set up a deep learning demo environment. In this "Part II" we will test what we can do with it.

Many people of my age started with the classic MLP (Multi-Layer Perceptron) model. It is intuitive and hence conceptually easier to start with.

So let's try a Keras "deep learning MLP" with the standard demo data that everybody in the AI/NN community has been using. It is a kind of so-called "supervised learning". We will see how simple it is to run it at the Keras level.

We could later touch on its history and on why it is called "deep learning" - the buzzword - and what has actually evolved over the past 20 years.
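
As a taste of what this part walks through, a minimal Keras MLP on the standard MNIST digit data might look like the sketch below (layer sizes, epochs and the tf.keras calls are illustrative choices, not the article's exact code):

  # A minimal MLP sketch on MNIST with tf.keras (illustrative only)
  import tensorflow as tf

  # Load the standard demo data: 60,000 training / 10,000 test images
  (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
  x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

  # A classic MLP: flatten the 28x28 image, one hidden layer, softmax output
  model = tf.keras.Sequential([
      tf.keras.layers.Flatten(input_shape=(28, 28)),
      tf.keras.layers.Dense(512, activation="relu"),
      tf.keras.layers.Dropout(0.2),
      tf.keras.layers.Dense(10, activation="softmax"),
  ])

  model.compile(optimizer="adam",
                loss="sparse_categorical_crossentropy",
                metrics=["accuracy"])

  # Supervised learning: fit on labelled digits, then evaluate on held-out data
  model.fit(x_train, y_train, epochs=5)
  model.evaluate(x_test, y_test)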

In a recent discussion on CachéQuality I was (in a friendly way) blamed for promoting old syntax and deliberately obfuscating code. Therefore I decided to clarify my point and shed some light on one possible source of side effects that may unexpectedly occur with the RETURN command with an argument.

InterSystems Data Platform includes utilities and tools for system monitoring and alerting; however, System Administrators new to solutions built on the InterSystems Data Platform (a.k.a. Caché) need to know where to start and what to configure.

This guide shows the path to a minimum monitoring and alerting solution, using references from the online documentation and developer community posts to show you how to enable and configure the following:

  1. Caché Monitor: Scans the console log and sends email alerts.

  2. System Monitor: Monitors system status and resources, generating notifications (alerts and warnings) based on fixed parameters and also tracks overall system health.

  3. Health Monitor: Samples key system and user-defined metrics and compares them to user-configurable parameters and established normal values, generating notifications when samples exceed applicable or learned thresholds.

  4. History Monitor: Maintains a historical database of performance and system usage metrics.

  5. pButtons: Operating system and Caché metrics collection scheduled daily.

Remember, this guide is a minimum configuration; the included tools are flexible and extensible, so more functionality is available when needed. This guide skips through the documentation to get you up and going. You will need to dive deeper into the documentation to get the most out of the monitoring tools; in the meantime, think of this as a set of cheat sheets to get up and running.
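
For orientation, the tools above are driven from Caché Terminal in the %SYS namespace. The routine names below are the documented entry points (^MONMGR for the Caché Monitor's console log email alerts, ^%SYSMONMGR for the System Monitor and Health Monitor, ^pButtons to run or schedule a pButtons collection); the menu choices inside each utility are what this guide walks through:

  %SYS> DO ^MONMGR
  %SYS> DO ^%SYSMONMGR
  %SYS> DO ^pButtons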

In this series of articles, I'd like to present and discuss several possible approaches toward software development with InterSystems technologies and GitLab. I will cover such topics as:

  • Git 101
  • Git flow (development process)
  • GitLab installation
  • GitLab Workflow
  • Continuous Delivery
  • GitLab installation and configuration
  • GitLab CI/CD

In the previous article, we covered Git basics, why a high-level understanding of Git concepts is important for modern software development, and how Git can be used to develop software. There, however, our focus was on the implementation part of software development; this part presents:

  • GitLab Workflow - a complete software life cycle process - from idea to user feedback
  • Continuous Delivery - a software engineering approach in which teams produce software in short cycles, ensuring that the software can be reliably released at any time. It aims at building, testing, and releasing software faster and more frequently.
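
To make "short cycles, releasable at any time" a bit more concrete: a GitLab pipeline is described by a .gitlab-ci.yml file at the root of the repository. A minimal sketch (stage names and script commands are placeholders, not the series' actual pipeline) could look like this:

  stages:
    - build
    - test
    - deploy

  build:
    stage: build
    script:
      - ./build.sh          # compile / import the code (placeholder)

  test:
    stage: test
    script:
      - ./run-tests.sh      # run the unit tests (placeholder)

  deploy:
    stage: deploy
    script:
      - ./deploy.sh         # push the build to the target environment (placeholder)
    when: manual            # releasable at any time, on demand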

In this series of articles, I'd like to present and discuss several possible approaches toward software development with InterSystems technologies and GitLab. I will cover such topics as:

  • Git 101
  • Git flow (development process)
  • GitLab installation
  • GitLab Workflow
  • Continuous Delivery
  • GitLab installation and configuration
  • GitLab CI/CD

In the first article, we covered Git basics, why a high-level understanding of Git concepts is important for modern software development, and how Git can be used to develop software.

In the second article, we covered GitLab Workflow - a complete software life cycle process - and Continuous Delivery.

Hi,

This is a quick tutorial on how to install and use TFS in Atelier. It is based on my own experience and some tricks that I've noted.

If you are used to Visual Studio, Atelier may feel a bit slow and heavy, but you have the same TFS panel as in Visual Studio, so you don't need any special "training" to use it.

It's important not to store the .buildpath file, because it contains the server definition. When you are working in a team, each member has a personal configuration, so the server name could be different.

To prevent this, create a file called .tfignore.
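
As an illustrative sketch (not the post's exact file), a .tfignore that simply keeps the per-developer .buildpath file out of version control could contain gitignore-style patterns such as:

  # Do not check in the per-developer server definition
  .buildpath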

Good news!! You can now use the free InterSystems IRIS Community Edition in the AWS Cloud

Hello,

It's very common that people new to InterSystems IRIS want to start working on a personal project in a completely free environment. If you are one of them, good news!! You can now use the free InterSystems IRIS Community Edition in the AWS Cloud.

It is pretty easy to create a new EC2 instance from the InterSystems IRIS Community Edition listing in the AWS Marketplace.

After that you have to launch your instance, and then you can access it using ssh like this:

(Note: be sure you have run 'chmod 400' on the .pem file.)
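
A typical session looks roughly like this (the key file name and host are placeholders, and the login user depends on the AMI - for example ubuntu or ec2-user):

  chmod 400 my-iris-key.pem
  ssh -i my-iris-key.pem ubuntu@<ec2-public-dns>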

Recently I needed a classmethod that returns an annotation value based on the name of an activity.

As doing it at runtime seemed inefficient, I wrote a compile-time utility that iterates over all business process activities and generates the relevant code.

This code can be used in a variety of situations where you need to iterate over business process activities; just add it as a secondary superclass to your BPL processes.

Headache-free stored objects: a simple example of working with InterSystems Caché objects in ObjectScript and Python

Neuschwanstein Castle

Tabular data storage based on what is formally known as the relational data model will celebrate its 50th anniversary in June 2020. Here is an official document - that very famous article. Many thanks for it go to Dr. Edgar Frank Codd. By the way, the relational data model is on the list of the most important global innovations of the past 100 years published by Forbes.

On the other hand, oddly enough, Codd viewed relational databases and SQL as a distorted implementation of his theory. For general guidance, he created 12 rules that any relational database management system must comply with (there are actually 13 rules). Honestly speaking, there are zero DBMSs on the market that observe even Rule 0. Therefore, no one can call their DBMS 100% relational :) If you know of any exceptions, please let me know.

The following post outlines an architectural design of intermediate complexity for DeepSee. As in the previous example, this implementation includes separate databases for storing the DeepSee cache, DeepSee implementation and settings. This post introduces two new databases: the first to store the globals needed for synchronization, the second to store fact tables and indices.

InterSystems products (IRIS, Caché, Ensemble) already include a built-in Apache web server. But the built-in server is designed for development and administration tasks and thus has certain limitations. Though you may find some useful workarounds for these limitations, the more common approach is to deploy a full-scale web server for your production environment. This article describes how to set up Apache to work with InterSystems products and how to provide HTTPS access. We will be using Ubuntu, but the configuration process is almost the same for all Linux distributions.
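
As a flavour of where the configuration ends up, an HTTPS virtual host in Apache looks roughly like the sketch below (the server name and certificate paths are placeholders; the InterSystems-specific Web Gateway/CSP directives are what the article itself covers):

  <VirtualHost *:443>
      ServerName www.example.com
      SSLEngine on
      SSLCertificateFile    /etc/ssl/certs/example.crt
      SSLCertificateKeyFile /etc/ssl/private/example.key
      # InterSystems Web Gateway / CSP configuration goes here
      # (see the article and the official documentation)
  </VirtualHost>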

Some weeks ago, I was reading a book by Stephen Hawking and Leonard Mlodinow, The Grand Design. At a certain point, trying to define why we exist, why we use the models we use in physics... those kinds of things, you know... they pointed at the Game of Life example invented by the mathematician John Conway in 1970... Basically he wanted to show that a system with really basic fundamental laws (Physics) could evolve and "live" to become a more complex system (Chemistry) in which "something" (humans) could work out its own model and complex rules to explain its reality... The rules for this deterministic model that he exposed were so basic that I thought it would be fun to implement them in ObjectScript when I had some spare time... There are other implementations in JavaScript and other languages... but not in ObjectScript... and that had to be corrected!!... So here you are!

Terminal scripts can be used to run pre-designed commands in the Terminal, like a batch file. You can write anything that can be executed in the Terminal, such as for loops, if/else and so on, inside Terminal scripts. In this article, I will show you how to call Terminal scripts, how to use parameters in Terminal scripts, and how to avoid the session being disconnected when running Terminal scripts. If you have any information about how to use Terminal scripts, or any feedback, please feel free to leave a comment.

This article was written as an attempt to share the experience of installing the InterSystems Caché DBMS for a production environment.
We all know that the development configuration of a DBMS is very different from real-life conditions.
As a rule, development is carried out in “hothouse conditions” with a bare minimum of security measures, but when we publish our project online, we must ensure its reliable and uninterrupted operation in a very aggressive environment.

The process of installing the InterSystems Caché DBMS with maximum security settings

OS security settings

The first step is the operating system. You need to do the following:

In this article, I will show how you can upload and download files from InterSystems products via HTTP.

Questions about working with files over HTTP come up fairly often on the community, and I usually link to my FileServer project, which demonstrates file upload/download, but I'd like to talk a bit more about how we can serve and receive files from InterSystems products.

Downloading a file

If you have a file on a file system and you know the path, you can serve it from a REST or CSP context by calling this method

Everybody has a testing environment.

Some people are lucky enough to have a totally separate environment to run production in.

-- Unknown


In this series of articles, I'd like to present and discuss several possible approaches toward software development with InterSystems technologies and GitLab. I will cover such topics as:

  • Git 101
  • Git flow (development process)
  • GitLab installation
  • GitLab WorkFlow
  • GitLab CI/CD
  • CI/CD with containers

This first part deals with the cornerstone of modern software development - the Git version control system - and various Git flows.

Here I'll walk you through the process of creating a simple Node/Express API and connecting it to an InterSystems IRIS instance.

I won't go into much detail about how to work with any of the technologies mentioned in this tutorial, but I will leave links in case you want to learn more.

The objective here is to give you a practical guide on how to set up a Node.js back-end API and connect it to IRIS.

Before we get our hands dirty, make sure you have Node.js running on your machine. So I'll check:

➜ node --version
v8.12.0

Version 8.12.0 is the current LTS (Long Term Support) version of Node.js.

In case you have to install it, go to https://nodejs.org
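
With Node.js in place, a bare Express project can be scaffolded along these lines (the directory name is a placeholder; wiring the API to IRIS is covered later in the tutorial):

  mkdir iris-express-api && cd iris-express-api
  npm init -y
  npm install express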

Ensemble supports publish and subscribe message delivery. Publish and subscribe refers to the technique of routing a message to one or more subscribers based on the fact that those subscribers have previously registered to be notified about messages on a specific topic.

This article demonstrates how several Ensemble capabilities can work together.

In this article we will send emails about:

  • New workflow tasks
  • Unassigned workflow tasks
  • Uncompleted workflow tasks
  • Ensemble alerts

Email recipients will be determined using a Publish/Subscribe operation, and each user will receive only a digest email whenever possible.

Hi all. We are going to find duplicates in a dataset using Apache Spark Machine Learning algorithms.

Note: I have done the following on Ubuntu 18.04, Python 3.6.5, Zeppelin 0.8.0, Spark 2.1.1

Introduction

In previous articles we have done the following

The following post outlines a more flexible architectural design for DeepSee. As in the previous example, this implementation includes separate databases for storing the DeepSee cache, DeepSee implementation and settings, and synchronization globals. This example introduces one new database to store the DeepSee indices. We will redefine the global mappings so that the DeepSee indices are not mapped together with the fact and dimension tables.

Introduction

Any election is a highly mysterious process, and when you look at its results, the overall picture is not quite clear. I decided to put them, region by region, on the map of Moscow using InterSystems technologies that offer both storage and data analysis functionality. In this particular case, I used InterSystems Ensemble, a platform for application development and integration, but you can also build this solution using the multi-model InterSystems Caché DBMS, as well as InterSystems’ new product called IRIS Data Platform.

Generally speaking, InterSystems products have supported dynamic objects and JSON for a long while, but version 2016.2 came with a completely new implementation of these features, and the corresponding code was moved from the ObjectScript level to the kernel/C level, which made for a substantial performance boost in these areas. This article is about the innovations in the new version and the migration process (including ways of preserving backward compatibility).

Below is a simple alert processor based on the EnsLib.HTTP.OutboundAdapter that sends text alerts via an SMS gateway service. Typically, all that is needed to send an HTTP POST to the gateway service is the destination phone number, a source phone number, credentials, and the URL.

The code below is based on the Anveo gateway, whose interface is as follows

Hi all. Today we are going to install Jupyter Notebook and connect it to Apache Spark and InterSystems IRIS.

Note: I have done the following on Ubuntu 18.04,  Python 3.6.5.

Introduction

If you are looking for a well-known, widespread notebook that is especially popular among Python users, instead of Apache Zeppelin you should choose Jupyter Notebook. Jupyter Notebook is a very powerful data science tool. It has a big community and a lot of additional software and integrations. Jupyter Notebook allows you to create and share documents that contain live code, equations, visualizations and narrative text. Uses include data cleaning and transformation, numerical simulation, statistical modeling, data visualization, machine learning, and much more. And most importantly, there is a big community that will help you solve the problems you face.
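
One common way to wire the pieces together is to install Jupyter and then tell PySpark to use it as its driver shell (the Spark path is a placeholder; connecting Spark to InterSystems IRIS is what the article goes on to configure):

  pip3 install jupyter
  export PYSPARK_DRIVER_PYTHON=jupyter
  export PYSPARK_DRIVER_PYTHON_OPTS='notebook'
  /path/to/spark/bin/pyspark    # launches a Jupyter notebook backed by Spark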

Hi all. Today we are going to upload an ML model into IRIS Manager and test it.

Note: I have done the following on Ubuntu 18.04, Apache Zeppelin 0.8.0, Python 3.6.5.

Introduction

These days, many different Data Mining tools are available that let you develop predictive models and analyze your data with unprecedented ease. The InterSystems IRIS Data Platform provides a stable foundation for your big data and fast data applications, providing interoperability with modern Data Mining tools.

Hi all. Today we are going to use the k-means algorithm on the Iris Dataset.

Note: I have done the following on Ubuntu 18.04, Apache Zeppelin 0.8.0, Python 3.6.5.

Introduction

K-Means is one of the simplest unsupervised learning algorithms that solve the clustering problem. It groups all the objects in such a way that objects in the same group (a group is a cluster) are more similar (in some sense) to each other than to those in other groups. For example, assume you have an image with a red ball on green grass. K-Means will split all the pixels into two clusters: the first cluster will contain the pixels of the ball, and the second will contain the pixels of the grass.
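
As a minimal sketch of what this looks like with Spark ML in Python (the file path and column names are illustrative assumptions, not the article's exact code):

  from pyspark.sql import SparkSession
  from pyspark.ml.feature import VectorAssembler
  from pyspark.ml.clustering import KMeans

  spark = SparkSession.builder.appName("iris-kmeans").getOrCreate()

  # Load the Iris dataset (4 numeric features plus the species label)
  df = spark.read.csv("iris.csv", header=True, inferSchema=True)

  # Assemble the numeric columns into a single feature vector
  assembler = VectorAssembler(
      inputCols=["sepal_length", "sepal_width", "petal_length", "petal_width"],
      outputCol="features")
  features = assembler.transform(df)

  # Fit K-Means with k=3 (one cluster per Iris species)
  model = KMeans(k=3, seed=1).fit(features)

  # Attach the predicted cluster to each row and compare with the label
  model.transform(features).select("species", "prediction").show(10)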
