Article
· Feb 10 7m read

Using SQL Gateway with Python, Vector Search, and Interoperability in InterSystems Iris - Part 2 – Python and Vector Search


Now that we have access to the data in our external table, we can use everything Iris has to offer with it. As an example, let's read the data from the external table and fit a polynomial regression to it.

For more information on using Python with Iris, see the documentation available at https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=AFL_epython

Let's now consume the data from the external database to calculate a polynomial regression. To do this, we will use Python code to run a SQL query that reads our MySQL table and turns the result into a pandas DataFrame:

ClassMethod CalcularRegressaoPolinomialODBC() As %String [ Language = python ]
{
    import iris
    import json
    import pandas as pd
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_absolute_error
    import numpy as np
    import matplotlib
    matplotlib.use("Agg")  # non-interactive backend, since we only save the figure to disk
    import matplotlib.pyplot as plt

    # Use degree 2 for the regression
    grau = 2
    
    # Retrieve the data from the remote table via ODBC (SQL Gateway)
    rs = iris.sql.exec("select venda as x, temperatura as y from estat.fabrica")
    df = rs.dataframe()
   
    # Keep x as a 2D matrix, as required by scikit-learn
    X = df[['x']]
    y = df['y']
    
    # Transform the features to include polynomial terms
    poly = PolynomialFeatures(degree=grau)
    X_poly = poly.fit_transform(X)

    # Initialize and fit the polynomial regression model
    model = LinearRegression()
    model.fit(X_poly, y)

    # Extract the coefficients of the fitted model
    coeficientes = model.coef_.tolist()  # Polynomial coefficients
    intercepto = model.intercept_       # Intercept
    r_quadrado = model.score(X_poly, y) # R squared

    # Predictions for the regression curve
    x_pred = np.linspace(df['x'].min(), df['x'].max(), 100).reshape(-1, 1) 
    x_pred_poly = poly.transform(x_pred)
    y_pred = model.predict(x_pred_poly)
    
    # Compute Y_pred from the original X values
    Y_pred = model.predict(X_poly)
            
    # Compute the MAE (mean absolute error)
    MAE = mean_absolute_error(y, Y_pred)
    
    # Generate the regression plot
    plt.figure(figsize=(8, 6))
    plt.scatter(df['x'], df['y'], color='blue', label='Original data')
    plt.plot(df['x'], df['y'], color='black', label='Original data line')
    plt.scatter(df['x'], Y_pred, color='green', label='Predicted data')
    plt.plot(x_pred, y_pred, color='red', label='Polynomial regression curve')
    plt.title(f'Polynomial Regression (Degree {grau})')
    plt.xlabel('X')
    plt.ylabel('Y')
    plt.legend()
    plt.grid(True)

    # Save the plot as an image file
    caminho_arquivo = 'c:\\temp\\RegressaoPolinomialODBC.png'
    plt.savefig(caminho_arquivo, dpi=300, bbox_inches='tight')
    plt.close()
    
    resultado = {
        'coeficientes': coeficientes,
        'intercepto': intercepto,
        'r_quadrado': r_quadrado,
        'MAE': MAE
    }

    return json.dumps(resultado)
}

The first thing the code does is read the data from our external table via SQL and turn it into a pandas DataFrame. Remember that the data is physically stored in MySQL and is accessed via ODBC through the SQL Gateway configured in Iris. From there we can use Python libraries for the calculations and for plotting, as the code shows.

Executing our routine, we get the information about the fitted model (coefficients, intercept, R squared, and MAE):
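
A minimal sketch of how this routine can be invoked from an embedded Python shell is shown below. The class name dc.odbc.Estatistica is hypothetical; replace it with the class where you actually defined the method:

# Hypothetical class name; replace with the class that holds the method above
import iris
import json

resultado = iris.cls("dc.odbc.Estatistica").CalcularRegressaoPolinomialODBC()
info = json.loads(resultado)

print("Coefficients:", info["coeficientes"])
print("Intercept:", info["intercepto"])
print("R squared:", info["r_quadrado"])
print("MAE:", info["MAE"])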

Our routine also generates a graph that provides visual support for the polynomial regression. Let's see how it turned out:

Another thing we can do with the data now available is use Vector Search and RAG together with an LLM. To do this, we will vectorize the model of our table and then ask the LLM for information based on it.

For more information on using Vector Search in Iris, see the material available at https://www.intersystems.com/vectorsearch/

First, let's vectorize our table model. Below is the code that carries out this task:

ClassMethod IngestRelacional() As %String [ Language = python ]
{

    import json
    from langchain_iris import IRISVector
    from langchain_openai import OpenAIEmbeddings
    from langchain_text_splitters import RecursiveCharacterTextSplitter
    import iris    

    try:
    
        apiKey = iris.cls("Vector.Util").apikey()
        collectionName = "odbc_work"

        metadados_tabelas = [
            "Tabela: estat.fabrica; Colunas: chave(INT), venda(INT), temperatura(INT)"
        ]
        
        text_splitter = RecursiveCharacterTextSplitter(chunk_size=2048, chunk_overlap=0)
        documents=text_splitter.create_documents(metadados_tabelas)
            
        # Vectorize the table definitions
        vectorstore = IRISVector.from_documents(
            documents=documents,
            embedding=OpenAIEmbeddings(openai_api_key=apiKey),
            dimension=1536,
            collection_name=collectionName
        )
        
        return json.dumps({"status": True})
    except Exception as err:
        return json.dumps({"error": str(err)})
}

 

Note that we don't pass the contents of the table to the ingest code, only its model. This way, when the LLM receives the columns and their properties, it can build an SQL statement that matches our request.

This ingest code creates the odbc_work table, which will be used to run a similarity search over the table model and then ask the LLM to return an SQL statement. For this, we created an API key at OpenAI and used the langchain_iris Python library. For more details about langchain_iris, see https://github.com/caretdev/langchain-iris
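
To give an idea of how this collection will be queried in the next part, here is a minimal sketch (not taken from the article) of a similarity search against odbc_work. The constructor arguments simply mirror the from_documents() call above; check the langchain_iris README for the exact signature:

import iris
from langchain_iris import IRISVector
from langchain_openai import OpenAIEmbeddings

apiKey = iris.cls("Vector.Util").apikey()

# Reconnect to the collection created by IngestRelacional()
vectorstore = IRISVector(
    embedding_function=OpenAIEmbeddings(openai_api_key=apiKey),
    dimension=1536,
    collection_name="odbc_work"
)

# Retrieve the table definition closest to a natural-language question;
# this text is what will later be placed in the LLM prompt to generate the SQL
docs = vectorstore.similarity_search("sales and temperature by factory", k=1)
print(docs[0].page_content)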

After ingesting the definition of our table, the odbc_work table is generated:

Now let's move on to our third part, where we will build a REST API that consumes the vectorized data to assemble a RAG pipeline.

See you later!

Article
· Feb 10 4m read

Using SQL Gateway with Python, Vector Search, and Interoperability in InterSystems Iris - Part 1 - SQL Gateway


Hello

In this article we will look at the use of the SQL Gateway in Iris. The SQL Gateway allows Iris to access tables from other (external) databases via ODBC or JDBC. We can access tables or views from various databases, such as Oracle, PostgreSQL, SQL Server, MySQL, and others.

From a usage point of view, it is as if the table were local to our Iris instance, even though the data is stored externally.

For more details on this component, see the documentation available at https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=BSQG_overview

So let's go through the necessary settings:

First, to configure the SQL Gateway in Iris, we need to configure access to an external database. For our example we will use a MySQL instance accessed via ODBC.

We then install a MySQL instance on a separate server and create the table that will be accessed in our example:

Then we populate the table with data. Below is an example of the table already populated:
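
As a reference, below is a rough sketch (not from the original setup) of how such a table could be created and populated from Python using the mysql-connector-python driver. The table and column names follow the estat.fabrica schema referenced in Part 2; the host, credentials, and sample values are placeholders, so adapt them to your environment:

# Assumes the "estat" database already exists on the MySQL server
import mysql.connector

conn = mysql.connector.connect(
    host="mysql-server",      # placeholder host name
    port=3306,
    user="iris_user",         # placeholder user
    password="secret",        # placeholder password
    database="estat"
)
cur = conn.cursor()

# Table layout matching the schema metadata used in Part 2
cur.execute("""
    CREATE TABLE IF NOT EXISTS fabrica (
        chave INT PRIMARY KEY,
        venda INT,
        temperatura INT
    )
""")

# Illustrative rows only; load your real data here
cur.executemany(
    "INSERT INTO fabrica (chave, venda, temperatura) VALUES (%s, %s, %s)",
    [(1, 100, 18), (2, 120, 22), (3, 150, 27)]
)
conn.commit()
conn.close()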

Now let's create a DSN in Windows to connect to MySQL. First, we need to install the MySQL ODBC driver on Windows. The driver can be easily obtained from MySQL's own website.

After installing the ODBC driver for MySQL, open the ODBC Data Source Administrator in Windows:

Click on the "System DSN" tab and then on the Add button:

Select the MySQL ODBC ANSI driver and click Finish.

Fill in the configuration screen with the name of the data source to be created, a description, the IP address and communication port of the MySQL server (the default is port 3306 – check which port your MySQL instance is actually using), and the username and password to access MySQL (create a MySQL user with permission to access the tables you want to use). Then select the database to access:

Click the Test button to verify that your configuration can reach the MySQL instance configured in the DSN:

Now let's run a query via ODBC to check that everything is OK in our setup. We will use Excel for that. Open Excel and go to Data->Get Data->From Other Sources->From ODBC:

Select the DSN you created to access MySQL, and then select the schema and table you want to check. In our case we will look at the venda table of the estat schema:

Then click the Load button and you will see your table data in the Excel cells:

Our ODBC connection is now created and able to access the remote table. Let's go back to Iris and configure the SQL Gateway connection. Open the Iris Management Portal, go to System Administration->Configuration->Connectivity->SQL Gateway Connections, and click the Create New Connection button:

Enter the DSN access data as follows:

Connection Type: ODBC
Connection Name: enter a name at your discretion
Select an existing DSN: select the DSN created for ODBC access to MySQL
User: the access user (same as in the DSN)
Password: the access password (same as in the DSN)

For the test we will leave all the configuration options unchecked, except for "Do not convert values that are not characters":

Finally, click on "Test Connection" to check if everything is OK. You'll see the text "Connection succeeded." if everything is set up correctly:

Once the DSN is created and the SQL Gateway access is configured, we can link the tables so that Iris can access the remote data. To do this, go back to the Management Portal, open System Explorer->SQL, and switch to the namespace where the generated class definitions will be stored:

Click the "Wizards" link, then click "Link Table". You will see the Iris table linking wizard screen:

Look for the SQL Gateway connection that was created in the previous step:

Select the schema and table you want to link in Iris:

Click Next and select the columns of the table you want to access:

Click Next and adjust the names and read/write characteristics in the available fields:

Click Next and look for the field that will be the primary key for the records:

Select the fields that form the primary key of the record:

Click Save. Enter the name of the class to be created and the name of the SQL table:

Then click Finish.

Our class has been created in Iris. Open Studio and take a look at it:

Let's test data access by running a SQL query. Go back to the Management Portal, open System Explorer->SQL, and switch to the namespace where we created our class. Then run a SQL query against our linked table:

Done! We now have access to the data in our MySQL table. Remember that, now that the table is linked, we can perform INSERT/UPDATE/DELETE statements or use the Iris object manipulation syntax, as we see below:

 
And using the Iris object syntax:
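
As a rough illustration of both access paths from embedded Python (the class name dc.estat.fabrica is hypothetical, so use the class name you entered in the wizard; the SQL name and columns follow the estat.fabrica table used in Part 2):

import iris

# SQL access through the linked table; the data still lives in MySQL
# and is fetched through the SQL Gateway connection
rs = iris.sql.exec("SELECT chave, venda, temperatura FROM estat.fabrica")
for row in rs:
    print(row)

# Object access through the generated class (hypothetical class name);
# the INSERT is pushed to MySQL when the object is saved
obj = iris.cls("dc.estat.fabrica")._New()
obj.chave = 999          # primary key column selected in the wizard
obj.venda = 130
obj.temperatura = 25
status = obj._Save()
print(iris.cls("%SYSTEM.Status").IsOK(status))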

 

So we conclude this first part: we are able to access MySQL data through Iris using the SQL Gateway.

See you in the next part of our article!

Announcement
· Feb 10

[Video] Make Upgrades Easier - Introducing the Upgrade Impact Checklist

Hi Community,

Enjoy the new video on InterSystems Developers YouTube:

⏯ Make Upgrades Easier - Introducing the Upgrade Impact Checklist @ Global Summit 2024

Join us to learn about the importance of upgrading InterSystems IRIS, and how the Upgrade Impact Checklist tool makes it easier than ever.

Presenters
🗣 @Vivian Lee, Applications Developer, InterSystems
🗣 @Peter Zubiago, Technical Writer, InterSystems

Love what you see? There’s more where that came from — subscribe now! 👍

InterSystems Official
· Feb 10

What's new with InterSystems Language Server 2.7

First, let me wish the developer community a Happy New Year! We hope to bring you many good things this year, and today I'd like to introduce the latest version of the InterSystems Language Server extension for VS Code. Most Language Server improvements are experienced via the ObjectScript extension UI, so you may not be aware of the many improvements in areas like IntelliSense and hover information that were released throughout 2024. Please take a quick read through the Language Server's CHANGELOG and find out what you missed. Most recently, version 2.7.0 brings support for the Windows ARM platform, so if you have a device such as the Surface Pro 11 (which I'm happily writing this post on), you can now have a great ObjectScript developer experience on your machine. Try it out and let us know how it goes in the comments below.

Question
· Feb 10

ssldefs.ini file error [Matching SSL server config not found]

I got this error

Connection via irisconnect failed: 
Matching SSL server config not found in ssldefs.ini or registry

and I put the ssldefs.ini file in C:\Program Files (x86)\Common Files\InterSystems\IRIS.

Why do I still get this error?
