Unlike the movie referenced in the image (for those who don't know it: The Matrix, 1999), the choice between Dynamic SQL and Embedded SQL is not a choice between truth and fantasy, but it is still a decision to be made. Below, I will try to make that choice easier.

If what you need is interaction between the client and the application (and consequently the database), Dynamic SQL may be more appropriate, since it adapts easily to changing queries. However, this dynamism has a cost: each new query has to be rebuilt and prepared at run time, which can make it more expensive to execute. Below is a simple example in a Python code snippet.
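
Something along these lines gives the idea (a minimal sketch only: the driver import name, connection parameters and the Sample.Person table are assumptions used for illustration, not necessarily what the original snippet used):

```python
# Minimal sketch of Dynamic SQL from Python; the driver import name,
# connection parameters and table are assumptions, not taken from the article.
import iris  # assumed import name of the official InterSystems DB-API style driver

conn = iris.connect("localhost", 1972, "USER", "_SYSTEM", "SYS")
cursor = conn.cursor()

# The statement text is built (and re-prepared) at run time,
# so it can change with every client interaction.
column = "Name"  # e.g. chosen by the client at run time
sql = f"SELECT {column} FROM Sample.Person WHERE Age > ?"

cursor.execute(sql, [30])  # values stay bound as parameters; only the query shape is dynamic
for row in cursor.fetchall():
    print(row)

cursor.close()
conn.close()
```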


After so many years of waiting, we finally have an official driver available on PyPI.

Additionally, I found that the JDBC driver has already been available on Maven for three months, and the .NET driver on NuGet for more than a month.

As the author of many implementations of IRIS support for various Python libraries, I wanted to check it out. Implementing DB-API means the driver should be interchangeable and should provide at least the functions defined in the standard. The only difference should be in the SQL.

And the beauty of using already existing libraries is that they have already implemented other databases through the DB-API standard, so these libraries already know how a driver is expected to behave.

I decided to test the official InterSystems driver by implementing support for it in the SQLAlchemy-iris library.
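
For orientation, this is roughly what using the driver through SQLAlchemy looks like (a minimal sketch; the iris:// URL format and the credentials are assumptions taken from typical sqlalchemy-iris examples):

```python
# Minimal sketch: the driver used indirectly through the SQLAlchemy-iris dialect.
# URL format and credentials are assumptions; adjust them for your instance.
from sqlalchemy import create_engine, text

engine = create_engine("iris://_SYSTEM:SYS@localhost:1972/USER")

with engine.connect() as conn:
    # Any DB-API compliant driver underneath should make this "just work";
    # only the SQL dialect differs between databases.
    print(conn.execute(text("SELECT 1")).scalar())
```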


Artificial intelligence (AI) is getting a lot of attention lately because it can change many areas of our lives. Better computing power and more data have helped AI do amazing things, like improving medical tests and making self-driving cars possible. AI can also help businesses make better decisions and work more efficiently, which is why it is becoming more popular and widely used. So how can one integrate OpenAI API calls into an existing IRIS Interoperability application?


Introduction

As AI-driven automation becomes an essential part of modern information systems, integrating AI capabilities into existing platforms should be seamless and efficient. The IRIS Agent project showcases how generative AI can work effortlessly with InterSystems IRIS, leveraging its powerful interoperability framework—without the need to learn Python or build separate AI workflows from scratch.


Over the last few days I've been working with the great new feature LOAD DATA. With this post I would like to share my first experiences with you. The following points are in no particular order and carry no further evaluation; they are simply things I noticed while using the LOAD DATA command. It should also be noted that these observations are based on IRIS version 2021.2.0.617, which is a preview release, so they may not apply to newer IRIS versions.
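
For context, the basic shape of the command looks roughly like this when issued from Python (a sketch only: the file path, target table and driver import name are assumptions, and the exact LOAD DATA syntax should be checked against the documentation of your IRIS version):

```python
# Sketch of issuing LOAD DATA through the DB-API driver; the path, table and
# driver import name are placeholders, not taken from the article.
import iris  # assumed import name of the official driver

conn = iris.connect("localhost", 1972, "USER", "_SYSTEM", "SYS")
cursor = conn.cursor()

# Bulk-load a CSV file straight into an existing SQL table.
cursor.execute("LOAD DATA FROM FILE '/tmp/persons.csv' INTO Sample.Person")
conn.commit()
```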

How to find the dataset you need?

Hey community! How are you doing?

I hope to find everyone well, and a happy 2022 to all of you!

Over the years, I've been working on a lot of different projects, and I've been able to find a lot of interesting data.

But most of the time, the datasets I worked with were customer data. When I started joining the contests over the past couple of years, I began to look for specific web datasets.

I've curated a few datasets myself, but I kept wondering: "Is this dataset enough to help others?"


The pandemic that struck the world in 2020 made everyone follow the news and the numbers surrounding COVID-19.

Why don’t you take that opportunity to create something simple and pleasant, to follow the number of vaccinations worldwide?

To face this challenge, I'm using the data provided by Our World in Data - Research and data to make progress against the world’s largest problems.

They have a dedicated repository on Github with the data of COVID-19, and I took the vaccination data to help me with my tracker.


Currently, the process of using machine learning is difficult and consumes a great deal of data scientists' time. AutoML technology was created to help organizations reduce this complexity and their dependence on specialized ML personnel.

AutoML lets the user point to a data set, select the subject of interest to predict (the label) and the variables that influence it (the features). From there, the user names the model and creates a predictive or classification model based on machine learning.
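
As a rough illustration of that workflow, IntegratedML-style SQL follows this pattern (a sketch under the assumption that an IntegratedML-like AutoML provider is used; the table, columns and model name are made up):

```python
# Sketch of the point / select label / train / predict AutoML workflow via SQL;
# the table, columns and model name are invented for illustration.
import iris  # assumed import name of the official DB-API driver

conn = iris.connect("localhost", 1972, "USER", "_SYSTEM", "SYS")
cursor = conn.cursor()

# 1. Point to a data set and choose the label to predict (remaining columns act as features).
cursor.execute("CREATE MODEL ChurnModel PREDICTING (Churned) FROM Demo.Customers")
# 2. Train the model; the AutoML provider selects the algorithm and hyperparameters.
cursor.execute("TRAIN MODEL ChurnModel")
# 3. Use the trained model like any other SQL function.
cursor.execute("SELECT ID, PREDICT(ChurnModel) FROM Demo.Customers")
print(cursor.fetchall())
```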


Prefer not to read? Check out the demo video I created:

https://www.youtube.com/embed/-OwOAHC5b3s


Let's imagine you would like to write a real web application, for instance a simple clone of medium.com. Such an application can be written with any language on the backend side or any framework on the frontend side. There are many ways to build the same application, and you can look at this project, which offers a bunch of frontend and backend realizations of exactly the same application. You can easily mix them: any chosen frontend should work with any backend.

Let me introduce a realization of the same application with InterSystems IRIS on the backend side.


This is a series of programming challenges for beginners and experienced Caché programmers.

For an introduction, go to the article https://community.intersystems.com/post/advent-code-2016-day1-no-time-ta...

Today's challenge is about decompressing input that is compressed in an experimental format.
In this format, markers indicate how many times a number of characters needs to be repeated.

For example, the marker (3x3) means that the three characters that follow it are repeated three times, so (3x3)XYZ decompresses to XYZXYZXYZ.
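
A small sketch of that rule (in Python for brevity, although the challenge series itself targets Caché ObjectScript; the marker format is assumed to be the Advent of Code 2016 day 9 one):

```python
# Decompress data containing "(NxM)" markers: the N characters following a
# marker are emitted M times; everything else is copied through unchanged.
import re

def decompress(data: str) -> str:
    out = []
    i = 0
    while i < len(data):
        m = re.match(r"\((\d+)x(\d+)\)", data[i:])
        if m:
            length, times = int(m.group(1)), int(m.group(2))
            start = i + m.end()
            out.append(data[start:start + length] * times)
            i = start + length
        else:
            out.append(data[i])
            i += 1
    return "".join(out)

print(decompress("A(1x5)BC"))   # ABBBBBC
print(decompress("(3x3)XYZ"))   # XYZXYZXYZ
```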


A few months ago, I read this interesting article from MIT Technology Review explaining how the COVID-19 pandemic has posed challenges to IT teams worldwide regarding their machine learning (ML) systems.

That article inspired me to think about how to deal with performance issues after an ML model has been deployed.


In the past, the information that could be read from a bar code was limited to a short alphanumeric code. The creation of bar codes with more than one dimension (2D), especially the QR Code, made it possible to increase the amount and variety of data stored in a bar code. While conventional bar codes can store a maximum of approximately 20 digits, a QR Code can handle tens to hundreds of times more information. This revolutionized many markets. QR codes are now everywhere and can be very useful for storing textual, numeric, alphanumeric and even binary data.
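
Purely as an illustration (using the widely available Python qrcode package, which is not necessarily what the article itself uses), generating a QR code from arbitrary text can be as short as this:

```python
# Generate a QR code image from a text payload; the payload and output
# file name are placeholders for illustration.
import qrcode

img = qrcode.make("https://community.intersystems.com")
img.save("example-qr.png")
```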



Hi Community,

In this article I will demonstrate the functionality of my app iris-energy-isodata.
The application accesses energy data (production, demand and supply) from the major Independent System Operators (ISOs) in the United States, in support of SDG 12 (sustainable consumption and production patterns).


While brainstorming the project we would build to showcase in the current female-health-themed InterSystems FHIR Contest, our girl band decided that we needed to do something practical for the ordinary user and solve some burning issues of modern life. This discussion led to the idea of a project that helps women not to forget their health in the daily grind: FemTech Reminder.
