Article
· Apr 8, 2019 4m read
Should we use computers?

The titular question was quite relevant and often discussed some thirty years ago. The thought went: “Sure, there are industries where computers are the norm, but in my industry we have gotten along just fine so far; the benefits are questionable, the problems innumerable and unsolved. Can we continue as before, or should we embrace this new technology?”

Today, everyone asks the same question, but about Machine Learning and Artificial Intelligence. The doubts are the same – lack of expertise, lack of a known path, perceived irrelevance to the industry.

Yet, as before, the correct, even the only possible answer is a resounding yes. Read on to find out why.


Keywords: IRIS, IntegratedML, Machine Learning, Covid-19, Kaggle

Purpose

Recently I noticed a Kaggle dataset for predicting whether a Covid-19 patient will be admitted to the ICU. It is a spreadsheet of 1925 encounter records with 231 columns of vital signs and observations, where the last column, "ICU", is 1 for Yes or 0 for No. The task is to predict whether a patient will be admitted to the ICU based on the known data.
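A quick first look at the spreadsheet with pandas makes the shape of the problem concrete. This is a minimal sketch; the file name is a placeholder for whatever the Kaggle download is called on your machine, and pandas with openpyxl is assumed to be installed.

    import pandas as pd

    # Hypothetical file name for the downloaded Kaggle spreadsheet
    df = pd.read_excel("covid19_icu_dataset.xlsx")

    print(df.shape)                  # expected: (1925, 231)
    print(df["ICU"].value_counts())  # class balance of the 1/0 target column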



Hi Community,
In this article, I will demonstrate the steps below to create your own chatbot using spaCy (spaCy is an open-source software library for advanced natural language processing, written in Python and Cython):

  • Step 1: Install required libraries

  • Step 2: Create patterns and responses file

  • Step 3: Train the Model

  • Step 4: Create ChatBot Application based on the trained model

So let us start.
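Before diving into the steps, here is a minimal taste of the training idea behind Step 3, using spaCy 3.x's "textcat" pipe. The intents, phrases, and number of iterations below are illustrative placeholders, not the article's actual patterns-and-responses file.

    import spacy
    from spacy.training import Example

    # Blank English pipeline with a text categorizer for intent detection
    nlp = spacy.blank("en")
    textcat = nlp.add_pipe("textcat")
    textcat.add_label("greeting")
    textcat.add_label("goodbye")

    # Tiny, made-up training set; a real patterns file would be much larger
    train_data = [
        ("hi there",      {"cats": {"greeting": 1.0, "goodbye": 0.0}}),
        ("hello",         {"cats": {"greeting": 1.0, "goodbye": 0.0}}),
        ("see you later", {"cats": {"greeting": 0.0, "goodbye": 1.0}}),
        ("bye",           {"cats": {"greeting": 0.0, "goodbye": 1.0}}),
    ]
    examples = [Example.from_dict(nlp.make_doc(text), ann) for text, ann in train_data]

    optimizer = nlp.initialize(lambda: examples)
    for _ in range(20):
        nlp.update(examples, sgd=optimizer)

    print(nlp("hello there").cats)  # e.g. {'greeting': 0.97, 'goodbye': 0.03}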


Keywords: PyODBC, unixODBC, IRIS, IntegratedML, Jupyter Notebook, Python 3

Purpose

A few months ago I wrote a brief note on "Python JDBC connection into IRIS", and since then I have referred to it more frequently than my own scratchpad hidden deep in my PC. Hence, here comes another 5-minute note on how to make a "Python ODBC connection into IRIS".
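As a preview, a DSN-less connection from Python typically boils down to a few lines like the sketch below. The driver name, port, namespace, and credentials are assumptions; adjust them to whatever your unixODBC/odbcinst configuration registers for IRIS.

    import pyodbc

    # Assumed connection details; replace with your own driver name, host,
    # port, namespace (DATABASE), and credentials
    conn_str = (
        "DRIVER={InterSystems ODBC35};"
        "SERVER=localhost;"
        "PORT=1972;"
        "DATABASE=USER;"
        "UID=SuperUser;"
        "PWD=SYS"
    )
    conn = pyodbc.connect(conn_str, autocommit=True)

    cursor = conn.cursor()
    cursor.execute("SELECT CURRENT_TIMESTAMP")  # quick sanity check of the link
    print(cursor.fetchone()[0])
    conn.close()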


Challenges of real-time AI/ML computations

We will start with examples that we, the Data Science practice at InterSystems, have faced:

  • A “high-load” customer portal is integrated with an online recommendation system. The plan is to reconfigure promo campaigns at the level of the entire retail network (we will assume that instead of a “flat” promo campaign master, a “segment-tactic” matrix will be used). What will happen to the recommender mechanisms? What will happen to the data feeds and updates into the recommender mechanisms (the volume of input data having increased 25,000 times)? What will happen to the recommendation rule generation setup (the recommendation rule filtering threshold needing to be reduced by a factor of 1,000 due to a thousandfold increase in the volume and “assortment” of the rules generated)?
  • An equipment health monitoring system uses “manual” data sample feeds. Now it is connected to a SCADA system that transmits thousands of process parameter readings every second. What will happen to the monitoring system (will it be able to handle equipment health monitoring on a second-by-second basis)? What will happen once the input data receives a new block of several hundred columns with readings from sensors recently added to the SCADA system (will the monitoring system need to be shut down, and for how long, to integrate the new sensor data into the analysis)?
  • The mechanisms in a complex of AI/ML components (recommendation, monitoring, forecasting) depend on one another’s results. How many man-hours will it take every month to adapt those AI/ML mechanisms to changes in the input data? What is the overall “delay” in supporting business decision making by the AI/ML mechanisms (the refresh frequency of supporting information versus the feed frequency of new input data)?


Introduction

In InterSystems IRIS 2024.3 and subsequent versions, the AutoML component is delivered as a separate Python package that is installed after the IRIS installation. Unfortunately, some recent versions of the Python packages that AutoML relies on have introduced incompatibilities, which can cause failures when training models (the TRAIN MODEL statement). If you see an error mentioning "TypeError" and the keyword argument "fit_params" or "sklearn_tags", read on for a quick fix.
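As a first diagnostic step, it helps to confirm which versions of AutoML's Python dependencies are installed in the Python environment IRIS uses; the fix then usually amounts to pinning the offending package (often scikit-learn) to an earlier release. The package list below covers the usual suspects and is an assumption, not an authoritative set.

    from importlib.metadata import version, PackageNotFoundError

    # Print the installed versions of packages AutoML commonly pulls in
    for pkg in ("scikit-learn", "xgboost", "numpy", "pandas"):
        try:
            print(pkg, version(pkg))
        except PackageNotFoundError:
            print(pkg, "not installed")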


This article shares analysis from the solution cycle of the Open Exchange application TOOT.

The hypothesis

A button on a web page can capture the user's voice. IRIS integration could process the recordings to extract semantic meaning, which IRIS vector search can then offer up for new kinds of AI solution opportunities.


Hi all. We are going to find duplicates in a dataset using Apache Spark Machine Learning algorithms.

Note: I have done the following on Ubuntu 18.04, Python 3.6.5, Zeppelin 0.8.0, Spark 2.1.1
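As a taste of what follows, here is one way Spark ML can surface candidate duplicates: MinHash locality-sensitive hashing, available since Spark 2.1. The toy records, column names, and distance threshold are illustrative assumptions, not the article's actual pipeline.

    from pyspark.sql import SparkSession
    from pyspark.ml.feature import Tokenizer, HashingTF, MinHashLSH

    spark = SparkSession.builder.appName("dedup-sketch").getOrCreate()

    # Toy data with two near-duplicate rows
    df = spark.createDataFrame(
        [(0, "john smith 123 main st"),
         (1, "jon smith 123 main street"),
         (2, "alice jones 9 elm ave")],
        ["id", "text"],
    )

    tokens = Tokenizer(inputCol="text", outputCol="words").transform(df)
    features = HashingTF(inputCol="words", outputCol="features").transform(tokens)

    lsh = MinHashLSH(inputCol="features", outputCol="hashes", numHashTables=3)
    model = lsh.fit(features)

    # Pairs whose approximate Jaccard distance is below 0.6 are duplicate candidates
    pairs = model.approxSimilarityJoin(features, features, 0.6, distCol="dist")
    pairs.filter("datasetA.id < datasetB.id") \
         .select("datasetA.id", "datasetB.id", "dist").show()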

Introduction

In previous articles we have done the following:


This is the second post of a series explaining how to create an end-to-end Machine Learning system.

Exploring Data

InterSystems IRIS already has what we need to explore the data: an SQL engine! For people who are used to exploring data in CSV or text files, this can help accelerate the step. Basically, we explore all the data to understand the intersections (joins), which helps us create a dataset ready to be used by a machine learning algorithm.
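For example, a single join pulled into pandas already tells you a lot about how the tables intersect. The table and column names below are hypothetical, and the snippet assumes an ODBC DSN configured for the IRIS instance.

    import pandas as pd
    import pyodbc

    conn = pyodbc.connect("DSN=IRIS-ML")  # assumed DSN pointing at IRIS

    # Hypothetical tables: explore how patients and encounters join
    query = """
        SELECT p.PatientID, p.Age, e.EncounterDate, e.ICU
        FROM Demo.Patients p
        JOIN Demo.Encounters e ON e.PatientID = p.PatientID
    """
    df = pd.read_sql(query, conn)

    print(df.describe())             # quick look at ranges and missing values
    print(df["ICU"].value_counts())  # how the target is distributed after the join
    conn.close()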


Introducing Smart Clinical Sidechick — the intelligent, no-drama partner your EHR wishes it could be. She reads FHIR data in real time, interprets lab results without ghosting, and explains clinical alerts like she actually cares. Built with GPT-4 brains and YAML sass, she’s not here to replace your main EHR—just to make it look bad. Tired of irrelevant alerts and cryptic warnings? Sidechick serves up real, explainable insights, not vague “elevated risk” vibes. And when your backend crashes, she doesn’t panic—she self-heals.


Keywords: Anaconda, Jupyter Notebook, Tensorflow GPU, Deep Learning, Python 3 and HealthShare

1. Purpose and Objectives

This "Part I" is a quick record on how to set up a "simple" but popular deep learning demo environment step-by-step with a Python 3 binding to a HealthShare 2017.2.1 instance . I used a Win10 laptop at hand, but the approach works the same on MacOS and Linux.


Thank you, community, for translating an earlier article into Portuguese.
I am returning the favor with a new release of the Pattern Match Workbench demo app.

Added support for Portuguese.

The labels, buttons, feedback messages, and help text of the user interface have been updated.

Pattern Descriptions can be requested for the new language.


Keywords: IRIS, IntegratedML, Machine Learning, Covid-19, Kaggle

Continued from Part I, where we walked through traditional ML approaches on this Covid-19 dataset on Kaggle.

In this Part II, let's run the same data and task, in their simplest possible form, through IRIS IntegratedML, a nice and sleek SQL interface to backend AutoML options. It uses the same environment.
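To make that "sleek SQL interface" concrete, the whole IntegratedML flow is three statements: define, train, predict. The sketch below runs them from Python over ODBC; the model name, table names, and connection details are placeholders, not the article's actual setup.

    import pyodbc

    conn = pyodbc.connect("DSN=IRIS-ML", autocommit=True)  # assumed DSN for IRIS
    cur = conn.cursor()

    # Define a model that predicts the ICU column, then train it on the table
    cur.execute("CREATE MODEL ICUPrediction PREDICTING (ICU) FROM SQLUser.CovidTrain")
    cur.execute("TRAIN MODEL ICUPrediction")

    # Apply the trained model and compare predictions with the actual labels
    cur.execute("""
        SELECT TOP 10 PREDICT(ICUPrediction) AS PredictedICU, ICU
        FROM SQLUser.CovidTest
    """)
    for row in cur.fetchall():
        print(row)
    conn.close()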


As an AI language model, ChatGPT is capable of performing a variety of tasks like language translation, writing songs, answering research questions, and even generating computer code. With its impressive abilities, ChatGPT has quickly become a popular tool for various applications, from chatbots to content creation.
But despite its advanced capabilities, ChatGPT is not able to access your personal data. So in this article, I will demonstrate the steps below to build a custom ChatGPT AI using the LangChain framework:
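Before the step-by-step walkthrough, here is a minimal sketch of the retrieval-augmented pattern that makes this possible: index your own documents, then let the LLM answer from them. It is written against the classic langchain 0.0.x-style imports; newer releases move these classes into langchain-community and langchain-openai, so adjust the imports to your installed version, and note that the file name and question are placeholders.

    from langchain.document_loaders import TextLoader
    from langchain.text_splitter import CharacterTextSplitter
    from langchain.embeddings import OpenAIEmbeddings
    from langchain.vectorstores import FAISS
    from langchain.chains import RetrievalQA
    from langchain.llms import OpenAI

    # Load and chunk your private data (requires OPENAI_API_KEY in the environment)
    docs = TextLoader("my_notes.txt").load()
    chunks = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0).split_documents(docs)

    # Embed the chunks into a local FAISS index and wire up a retrieval QA chain
    db = FAISS.from_documents(chunks, OpenAIEmbeddings())
    qa = RetrievalQA.from_chain_type(llm=OpenAI(), retriever=db.as_retriever())

    print(qa.run("What do my notes say about project deadlines?"))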



Hi Community

In this article, I will introduce my application IRIS-GenLab.

IRIS-GenLab is a generative AI application that leverages the Flask web framework, the SQLAlchemy ORM, and InterSystems IRIS to demonstrate Machine Learning, LLM, NLP, Generative AI API, Google AI LLM, Flan-T5-XXL model, Flask Login, and OpenAI ChatGPT use cases.

Article
· Jan 16, 2020 2m read
Python Gateway VI: Jupyter Notebook

This series of articles covers Python Gateway for InterSystems Data Platforms, which lets you execute Python code and more from InterSystems IRIS. This project brings the power of Python right into your InterSystems IRIS environment:

  • Execute arbitrary Python code
  • Seamlessly transfer data from InterSystems IRIS into Python
  • Build intelligent Interoperability business processes with Python Interoperability Adapter
  • Save, examine, modify and restore Python context from InterSystems IRIS

Other articles

The plan for the series so far (subject to change).

Intro

The Jupyter Notebook is an open-source web application that allows you to create and share documents that contain live code, equations, visualizations and narrative text.

This extension allows you to browse and edit InterSystems IRIS BPL processes as Jupyter notebooks.


Artificial intelligence has solved countless human challenges – and medical coding might be next.
As organizations prepare for ICD-11, medical coding is about to become more complicated. Healthcare organizations in the United States already manage 140,000+ codes in ICD-10. With ICD-11, that number will rise.
Some propose artificial intelligence as a solution. AI could aid computer-based medical coding systems, identifying errors, enhancing patient care, and optimizing revenue cycles, among other benefits.


What is Distributed Artificial Intelligence (DAI)?

Attempts to find a “bullet-proof” definition have not produced a result: it seems the term is slightly “ahead of its time”. Still, we can analyze the term itself semantically, deriving that distributed artificial intelligence is the same AI (see our effort to suggest an “applied” definition), though partitioned across several computers that are not clustered together (neither data-wise, nor via applications, nor by providing access to particular computers in principle). That is, ideally, distributed artificial intelligence should be arranged in such a way that none of the computers participating in that “distribution” has direct access to the data or applications of another computer: the only alternative is the transmission of data samples and executable scripts via “transparent” messaging. Any deviation from that ideal leads to the advent of “partially distributed artificial intelligence” – an example being distributed data with a central application server, or its inverse. One way or the other, we obtain as a result a set of “federated” models (i.e., models each trained on their own data sources, or each trained by their own algorithms, or “both at once”).


Fixing the terminology

A robot is not expected to be huge or humanoid, or even material (in disagreement with Wikipedia, although the latter softens its initial definition a paragraph later and admits a virtual form of robot). A robot is an automaton: from an algorithmic viewpoint, an automaton for the autonomous (algorithmic) execution of concrete tasks. A light detector that triggers street lights at night is a robot. Email software separating messages into “external” and “internal” is also a robot. Artificial intelligence (in the applied, narrow sense; Wikipedia again interprets it differently) consists of algorithms for extracting dependencies from data. It will not execute any tasks on its own; for that, one would need to implement it as concrete analytic processes (input data, plus models, plus output data, plus process control). The analytic process acting as an “artificial intelligence carrier” can be launched by a human or by a robot, and it can be stopped, and managed, by either of the two as well.
