
Article
· Feb 7 3m read

Introduction to InterSystems IRIS: A Complete Guide

Updated 01/04/25

Hello Community,

You can unlock the full potential of InterSystems IRIS, and help your team get onboard, with the full range of InterSystems learning resources offered online and in person, for every role in your organization. Developers, system administrators, data analysts, and integrators can all get up to speed quickly.

Onboarding Resources for Every Role

Developers

System Administrators

Data Analysts

Integrators

Implementers

Project Managers

Other Learning Services Resources

  • 💻 Online Learning: Register for free at learning.intersystems.com to access self-paced courses, videos, and exercises. You can also complete task-based learning paths or role-based programs to advance your career.
  • 👩‍🏫 Classroom Training: Check the schedule of live classroom training, in person or virtual, or request a private course for your team. Find details at classroom.intersystems.com.
  • 📘 InterSystems IRIS Documentation: Comprehensive reference material, guides, and how-to articles. Explore the documentation.
  • 📧 Support: For technical support, email support@intersystems.com.

Certification Opportunities

Once you and your team members have gained enough training and experience, earn the certification that matches your role!

Learn with the Community

💬 Join the learning on the Developer Community: Chat with other developers, post questions, read articles, and stay up to date with the latest announcements. See this post for tips on how to learn on the Developer Community.

With these learning resources, your team will be well equipped to get the most out of InterSystems IRIS, driving growth and success for your organization. For additional assistance, post questions here or ask your dedicated Sales Engineer.

Article
· Feb 7 9m read

Orchestrating a local LLM with IRIS Interoperability

Learning LLM Magic

The world of Generative AI has been pretty inescapable for a while; commercial models running on paid Cloud instances are everywhere. With your data stored securely on-prem in IRIS, it might seem daunting to start experimenting with Large Language Models without navigating a minefield of governance and rapidly evolving API documentation. If only there were a way to bring an LLM to IRIS, preferably with a very small code footprint...

Some warnings before we start

  1. This article targets any recent version of IRIS (2022+) that includes Embedded Python support. It should work without issue on IRIS Community Edition.
  2. LLMs are typically optimised for GPU processing. This code will run correctly on a CPU-only system, but it will be an order of magnitude slower than on a system that can leverage a GPU.
  3. This article uses fairly small Open Source models to keep performance on less powerful hardware at a sensible level. If you have more resources, this tutorial will work on larger models without any major changes (just substitute the model name, in most cases).

Step 1 - Isn't hosting an LLM difficult?

The LLM ecosystem has evolved rapidly. Luckily for us, the tooling for this ecosystem has evolved to keep pace. We are going to use the Ollama package, which can be installed on your platform of choice using its installation tools (available at https://ollama.com/). Ollama lets us spin up an interactive session to begin using LLM models, but also allows very easy programmatic access via its Python API. I am not going to cover installing Ollama in this article, but come back here when you have completed the install.

Excellent, you made it back! Time to spool up a model. We are going to use the reasonably lightweight Open Source GEMMA model, at its lowest entry point (2 billion parameters): https://ollama.com/library/gemma:2b. With Ollama installed, running this is easy. We just need to run

ollama run gemma:2b

On our first run, the model will download (it's quite large, so it might take a minute), install, and finally you will get an interactive prompt into the LLM. Feel free to ask it a question to verify that it's operating correctly.

We now have an LLM cached and available on our instance! Now, let's connect it to IRIS.

Step 2 - Accessing Ollama from IRIS to summarise text data

Before we begin, we will need to install the Ollama Python library. This provides very easy, automated access to the Ollama instance and model. Refer to the documentation for your specific version to ensure you are running the correct installation command (https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls... is the current version). On my instance, I ran

python3 -m pip install --target /db/iris/mgr/python ollama
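Before wiring anything into a production, it can be worth a quick sanity check that an IRIS process can reach Ollama and the model through this library. A minimal sketch using the same embedded Python pattern we will use below (the class and method names here are just for illustration):

Class Demo.OllamaCheck Extends %RegisteredObject
{

/// Round-trip a trivial prompt through the local gemma:2b model; any reply proves
/// that embedded Python can see both the ollama library and the Ollama server
ClassMethod Ping() As %String [ Language = python ]
{
    import ollama
    response = ollama.chat(model='gemma:2b', messages=[
        {'role': 'user', 'content': 'Reply with the single word: pong'}
    ])
    return response['message']['content']
}

}

Running Write ##class(Demo.OllamaCheck).Ping() from a terminal in the namespace where you installed the library should print a short reply from the model.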

 We are now ready to create a Business Operation which will use this library to access the model. Create a class that extends Ens.BusinessOperation, plus Message classes to hold our requests and responses.

Class Messages.SummaryInput Extends Ens.Request
{

Property jsonText As %String(MAXLEN = "");
Property plainText As %String(MAXLEN = "");
Property issueId As %String(MAXLEN = "");
}
Class Messages.SummaryOutput Extends Ens.Response
{

Property summaryText As %String(MAXLEN = "");
}
Class Operations.GenerateSummary Extends Ens.BusinessOperation
{

Property ollama As %SYS.Python;
Property json As %SYS.Python;
Method GetSummaryFromText(request As Messages.SummaryInput, Output response As Messages.SummaryOutput) As %Status
{
   #dim sc As %Status = $$$OK
   Try {
      // Ask the model to summarise the plain text payload
      Set summary = ..PyTransform(request.plainText)

      Set response = ##class(Messages.SummaryOutput).%New()
      Set response.summaryText=summary
      // Keep a copy of the summary, keyed by issue, for later review
      Set ^zSummary(request.issueId)=summary
   } Catch ex {
      Set sc = ex.AsStatus()
   }

   Return sc
}

Method OnInit() As %Status
{
   #dim sc As %Status = $$$OK
   Try {
      Do ..PyInit()
   } Catch ex {
      Set sc = ex.AsStatus()
   }
   Quit sc
}

Method PyInit() [ Language = python ]
{
  
   import os
   import json
   import ollama
   import sys
   
   os.environ['TRANSFORMERS_CACHE'] = '/caches'
   os.environ['HF_HOME'] = '/caches'
   os.environ['HOME'] = '/caches'
   os.environ['HF_DATASETS_CACHE'] = '/caches'
   self.ollama = ollama
   self.json = json
}

Method PyTransform(text As %String) As %String [ Language = python ]
{
 
    import os
    import json
    import ollama
    import sys

    response = ollama.chat(model='gemma:2b', messages=[
        {
        'role': 'system',
        'content': 'Your goal is to summarize the text given to you in roughly 300 words. It is from a meeting between one or more people. Only output the summary without any additional text. Focus on providing a summary in freeform text with what people said and the action items coming out of it. Give me the following sections: Problem, Solution and Additional Information.  Please give only the detail, avoid being polite'
        },
        {
        'role': 'user',
        'content': text,
        },
    ])

    
    return response['message']['content']
}

XData MessageMap
{
<MapItems>
  <MapItem MessageType="Messages.SummaryInput">
    <Method>GetSummaryFromText</Method>
  </MapItem>
</MapItems>
}

}

Once we have these classes in place, we can add this Operation to an Interoperability production. Make sure to enable Testing at the Production level so we can feed in some test conversation data and check that the model is working. In the example code above, the message allows either jsonText or plainText to be passed. For now, only plainText is read, so we should populate that field in testing. We should also pass in an issueId, as this is used to transparently store the result of summarisation in IRIS for later review.
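If you would rather drive the test from a terminal than from the Test facility UI, EnsLib.Testing.Service can send the same message. A minimal sketch (it assumes Testing is enabled, that the Operation is configured in the production as "Operations.GenerateSummary", and the sample text is made up):

   // Build a test request and send it synchronously through the testing service
   Set request = ##class(Messages.SummaryInput).%New()
   Set request.plainText = "Alice: the nightly build failed again. Bob: OK, I will pin the compiler version and rerun it today."
   Set request.issueId = "TEST-1"
   Set sc = ##class(EnsLib.Testing.Service).SendTestRequest("Operations.GenerateSummary", request, .response, .sessionId, 1)
   If $$$ISOK(sc) Write response.summaryText,!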

Let's give this a test:

And the model gives us in return...  

So, we now have an Operation which can access our local LLM, pass in data, and get a response! That was pretty easy; what else can we do? Let's add a second Operation using a different model.

Step 3 - Adding an image classification model

Ollama is able to run a wide range of models seamlessly. Llava (https://llava-vl.github.io/) is a model optimised for analysing visual data such as images. We can pass in an array of image data, encoded as Base64, and then ask the model to analyse the images. In this example, we will just ask it for a basic summary of what it sees, but other use cases could be to extract any text data, compare two images for likeness, and so on. Before we start, drop to your OS terminal and run the model once to download all the required files.

ollama run llava

As we are working with stream data here, testing is a little more challenging. Typically a stream would be retrieved from somewhere in your codebase and passed into Python. In this example, I have Base64-encoded my Developer Community avatar, as it's small enough to embed in the source file. Let's see what Llava has to say about this image.
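If you do want to feed a real stream through, one approach is to Base64-encode it in chunks on the ObjectScript side before handing it to the model. A sketch, assuming stream is a %Stream.GlobalBinary holding the image:

   // Read the binary stream in chunks that are a multiple of 3 bytes so each chunk
   // Base64-encodes cleanly; the second argument suppresses line breaks in the output
   Set b64 = ""
   Do stream.Rewind()
   While 'stream.AtEnd {
      Set b64 = b64_$SYSTEM.Encryption.Base64Encode(stream.Read(57000),1)
   }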

Class Operations.ClassifyImage Extends Ens.BusinessOperation
{

Property ollama As %SYS.Python;
Property json As %SYS.Python;
Method GetImageSummary(request As Messages.SummaryInput, Output response As Messages.SummaryOutput) As %Status
{
   #dim sc As %Status = $$$OK
   Try {
      // Fetch the stored image stream for this issue and ask the model to describe it
      Set stream = ##class(Issues.Streams).GetStreamByIssueId(request.issueId)
      Set description = ..PyTransform(stream)

      $$$TRACE(description)

      Set response = ##class(Messages.SummaryOutput).%New()
      Set response.summaryText=description
   } Catch ex {
      Set sc = ex.AsStatus()
   }

   Return sc
}

Method OnInit() As %Status
{
   #dim sc As %Status = $$$OK
   Try {
      Do ..PyInit()
   } Catch ex {
      Set sc = ex.AsStatus()
   }
   Quit sc
}

Method PyInit() [ Language = python ]
{
  
   import os
   import json
   import ollama
   import sys
   
   os.environ['TRANSFORMERS_CACHE'] = '/caches'
   os.environ['HF_HOME'] = '/caches'
   os.environ['HOME'] = '/caches'
   os.environ['HF_DATASETS_CACHE'] = '/caches'
   self.ollama = ollama
   self.json = json
}

Method PyTransform(image As %Stream.GlobalBinary) As %String [ Language = python ]
{
 
    import os
    import json
    import ollama
    import sys
    # We would normally pass in the stream from the image parameter, but the Base64 data is hardcoded here for ease of testing
    response = ollama.chat(model='llava', messages=[
        {
      "role": "user",
      "content": "what is in this image?",
      "images": ["/9j/4AAQSkZJRgABAQEAYAB...  Snipped for brevity"]
        }
    ]
    )

    
    return response['message']['content']
}

XData MessageMap
{
<MapItems>
  <MapItem MessageType="Messages.SummaryInput">
    <Method>GetImageSummary</Method>
  </MapItem>
</MapItems>
}

}

Once we have run this using the Test Harness, we get a plain-text summary returned.

This has done a pretty decent job of describing this image (leaving aside 'middle-aged', obviously).   It has correctly classified the main aspects of my appearance, and has also extracted the presence of the word "STAFF" within the image. 

So, with just 4 classes and a couple of external packages installed, we now have the ability to access 2 different LLM models from within IRIS Interoperability. These Operations are available to any other code running on the system, simply by invoking them with the defined message types. The calling code does not need any special modification to leverage the output of the LLMs: plain text is returned, and all of the complex plumbing is abstracted away.
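For example, a Business Process or Business Service elsewhere in the production could obtain a summary with a plain synchronous call. A sketch (the target name matches how the Operation is configured in your production; transcriptText and issueId are placeholders for your own data):

   // Hedged sketch of calling the summariser from another business host
   Set request = ##class(Messages.SummaryInput).%New()
   Set request.plainText = transcriptText
   Set request.issueId = issueId
   Set sc = ..SendRequestSync("Operations.GenerateSummary", request, .response)
   If $$$ISOK(sc) Set summary = response.summaryText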

Step 4 - What's next?

We now have a template to run any model that can be hosted on Ollama (with another reminder that you may need a hefty GPU to run some of the larger models). These Operations are intentionally very simple, so you can use them as building blocks for your own use cases. What else could you do next? Here are some ideas:

  • Convert the summary output to a Vector embedding to store in the IRIS Vector Store. The new Vector Index functionality in IRIS allows very fast comparison of Vector data, letting you find and cluster similarly summarised data very quickly in SQL queries (see the sketch after this list). More details are available at https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls...
  • Allow for selectable prompts based on your input Message. Prompt engineering is a massive subject by itself, but you can offer different processing options for your data simply by switching aspects of the prompt.
  • Allow for REST access. Exposing these services via %CSP.REST is simple and lets IRIS act as an LLM mini-Cloud for your organisation. One of my previous articles has instructions on how to do this easily (https://community.intersystems.com/post/creating-iris-cross-functional-app).
  • Prompt your LLM to return richer data, such as JSON, and process it in IRIS.
  • Customise an Ollama model and host that (https://medium.com/@sumudithalanz/unlocking-the-power-of-large-language-... is a good guide for this).
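As a sketch of the first idea: assuming an IRIS version with vector SQL support (2024.1+), and an embedding produced elsewhere (for example by a sentence-transformer model), the summaries could be stored and searched like this. The table name, column sizes, and 384-dimension length are illustrative only; issueId, summaryText, vectorString, and queryVector are placeholders:

   // Create a table with a vector column alongside the summary text
   Set rs = ##class(%SQL.Statement).%ExecDirect(,
      "CREATE TABLE Demo.SummaryVectors (issueId VARCHAR(64), summary VARCHAR(4000), embedding VECTOR(DOUBLE, 384))")

   // vectorString is a comma-separated list of 384 floats from your embedding model
   Set rs = ##class(%SQL.Statement).%ExecDirect(,
      "INSERT INTO Demo.SummaryVectors (issueId, summary, embedding) VALUES (?, ?, TO_VECTOR(?, DOUBLE, 384))",
      issueId, summaryText, vectorString)

   // Find the five stored summaries closest to a query embedding, by cosine similarity
   Set result = ##class(%SQL.Statement).%ExecDirect(,
      "SELECT TOP 5 issueId, summary FROM Demo.SummaryVectors "_
      "ORDER BY VECTOR_COSINE(embedding, TO_VECTOR(?, DOUBLE, 384)) DESC", queryVector)
   Do result.%Display()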

Example code is available at https://github.com/iscChris/LLMQuickStart. Note that the Docker image does not build with Ollama (sorry, I'm bad at Docker), but the code will work on a properly configured instance (I'm using WSL).

Question
· Feb 7

Visual Studio Code eating licenses

Windows 10/Windows Server 2016:

I am currently monitoring our license use with a new REST service I am implementing, and I noticed that licenses on my instance were being consumed and never released by Visual Studio Code.

To investigate, I restarted my instance and watched the licenses for a while; the count remained at 1 while idle (I am guessing my Management Portal session uses 1).
But when I connect to my instance using Visual Studio Code (with my instance already set up in the extension), suddenly 2 licenses are used (I am guessing 1 for the "studio" and 1 for a terminal session; so far, so good).

But then, when I open any file in the project, another license is used.

All of them have user IDs and the "CSP" type, no grace time, and they stay active.

So that's 3 licenses per active developer.

 

Now here's the problem:
If I do that on a new machine with an "InterSystems IRIS 2024.2 Entree - Concurrent Users for x86-64 (Microsoft Windows): 5, Natural Language Processing (NLP), Develop" license, then even AFTER CLOSING VS Code, 2 of the 3 licenses are still in use. When I reopen VS Code, I get "service unavailable" because I have run out of licenses.

Is this normal behaviour? Do I need more than 5 licenses for 2 developers to program on a new machine?

  

Article
· Feb 7 6m read

IRIS %Status and Exceptions Part-2

In this article, exceptions are covered.

Working with Exceptions

Instead of returning a %Status response, you can raise and throw an exception. You are then responsible for catching the exception and handling it. IRIS provides five main classes to handle exceptions effectively. Additionally, you can create custom exception class definitions based on your needs.

These exceptions are different from %Status, so you can't directly use the exception object with $SYSTEM.OBJ.DisplayError() or $SYSTEM.Status.DisplayError(). Instead, use the DisplayString() method to display the exception message. Alternatively, you can convert the exception to a %Status using the AsStatus() method and then use the display methods mentioned above.

All exceptions inherit from %Exception.AbstractException, so they all share the same %New() signature (handled by the %OnNew() callback) for setting their values:

Set exception = ##Class(%Exception.General).%New(pName, pCode, pLocation, pData, pInnerException)

Where:

  • pName - the name of the error (the error message).
  • pCode - the error code (such as $$$GeneralError, $$$NamespaceDoesNotExist, or a custom number when using a general exception).
  • pLocation - the location where the error occurred.
  • pData - extra information supplied for certain errors.
  • pInnerException - an optional inner exception.
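Putting that together, you can construct an exception, throw it, and inspect it in a Catch block. The name, code, and data below are made up purely for illustration:

ClassMethod GeneralThrow()
{
    Try {
        // Name, code, and data here are arbitrary example values
        Throw ##class(%Exception.General).%New("OrderNotFound",5001,,"order id 42")
    }
    Catch ex {
        Write ex.Name_" / "_ex.Code_" / "_ex.Data,!
        Write ex.DisplayString(),!
    }
}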

Let's continue with the five major exception classes.

%Exception.SystemException

This exception is raised when a process hits a system error; IRIS raises it by default. Additionally, $ZERROR and $ECODE are set when this exception is raised.

ClassMethod SystemException()
{
    Set $ECODE=""
    Set $ZREFERENCE=""
    Set $ZERROR=""
    Try {
        // A naked global reference with no prior full reference triggers a <NAKED> system error
        Write ^(1)
    }
    catch ex {
        Write $ECODE,!
        Write $ZERROR,!
        Write ex.DisplayString()
    }
}


Result

LEARNING>Do ##class(Learning.myexcept).SystemException()
,M1,
<NAKED>SystemException+5^Learning.myexcept.1
<NAKED> 30 SystemException+5^Learning.myexcept.1
LEARNING>

 

%Exception.StatusException

This exception behaves similarly to a %Status error. You can raise it using various macros (a short example follows the list), such as:

  • $$$THROWONERROR(%sc,%expr) - evaluates the expression, stores the resulting %Status in %sc, and throws an exception if it is an error
  • $$$ThrowOnError(%expr) - evaluates the expression and throws an exception if the resulting %Status is an error
  • $$$TOE(%sc,%expr) - shorthand that invokes $$$THROWONERROR(%sc,%expr)
  • $$$ThrowStatus(%sc) - directly throws an exception for an existing %Status
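For example, $$$ThrowOnError turns any failed %Status into a %Exception.StatusException that you can catch; the failing status below is fabricated just to trigger the throw:

ClassMethod ThrowOnErrorExample()
{
    Try {
        // Any expression returning a %Status works here; this one always fails
        $$$ThrowOnError($$$ERROR($$$GeneralError,"demo failure"))
        Write "never reached",!
    }
    Catch ex {
        // ex is a %Exception.StatusException wrapping the failed status
        Write $classname(ex)_": "_ex.DisplayString(),!
    }
}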

The most commonly used way to create this exception is the CreateFromStatus() method, which accepts a %Status as an argument and converts it into a StatusException.

ClassMethod StatusException()
{
	Set sc1 = $$$ERROR($$$UserCTRLC) 
	#dim sc As %Exception.StatusException = ##class(%Exception.StatusException).CreateFromStatus(sc1)
	#; display the error message
	Write sc.DisplayString()
	
	#; convert to %Status again
	Set st = sc.AsStatus()
	Do $SYSTEM.Status.DisplayError(st)
}

Result 

LEARNING>Do ##class(Test.Exception).StatusException()
ERROR #834: Login aborted
ERROR #834: Login aborted

 

%Exception.SQL

This exception handles SQL errors, carrying the SQLCODE and %msg values. Either of the following will raise it:

  • $$$ThrowSQLCODE(%sqlcode,%message)
  • ##class(%Exception.SQL).CreateFromSQLCODE(%sqlcode,%message)

For CreateFromSQLCODE, the first parameter is the SQLCODE and the second is the message.
 

ClassMethod SQLExceptionBySQLCODE()
{
    Set tSQLException = ##class(%Exception.SQL).CreateFromSQLCODE(-30,"SAMPLE.PERSON1 not found")
    Write tSQLException.DisplayString(),!
    Set st = tSQLException.AsStatus()
    Do $SYSTEM.Status.DisplayError(st)
}
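
The $$$ThrowSQLCODE macro listed above raises the same exception type without creating the object yourself; a small sketch:

ClassMethod SQLExceptionByMacro()
{
    Try {
        // Raises a %Exception.SQL with SQLCODE -30 and the supplied %msg text
        $$$ThrowSQLCODE(-30,"SAMPLE.PERSON1 not found")
    }
    Catch ex {
        Write $classname(ex)_": "_ex.DisplayString(),!
    }
}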

%Exception.General

This is the general-purpose exception for application errors. Create an instance directly (optionally passing an inner exception) and then handle it:

ClassMethod GeneralException()
{
    Set ex1 = ##class(%Exception.General).%New(,5001)  ; just for inner exception
    Set exGen = ##class(%Exception.General).%New("RegistrationException",5852,,"Invalid MRN",ex1)
    Write exGen.DisplayString()
    Set sc = exGen.AsStatus()
    Do $SYSTEM.OBJ.DisplayError(sc)
}

%Exception.PythonException

This exception surfaces Python error information. Here is a simplified sample.

ClassMethod pythonException()
{
    Try{
        Do ..pyTest()
    }
    Catch ex {
        Write $classname(ex)_"  "_ex.DisplayString()
    }
}
ClassMethod pyTest() [ Language = python ]
{
    print(1/0)
}

Result 

LEARNING>Do ##class(Learning.myexcept).pythonException()
%Exception.PythonException  <PYTHON EXCEPTION> 246 <class 'ZeroDivisionError'>: division by zero
 
ERROR #5002: ObjectScript error: <PYTHON EXCEPTION> *<class 'ZeroDivisionError'>: division by zero1

 

Last but not least

Custom Exception classes

For application-specific errors, it's crucial to handle exceptions precisely and to incorporate string localization into your error messages. You can do this by creating a custom exception class.

For instance, you can create custom error messages with different translations based on the session's language.

^IRIS.Msg("MyApp")="en"
^IRIS.Msg("MyApp","en",62536)="InvalidMRN"
^IRIS.Msg("MyApp","ru",62536)="недействительный MRN"
 

Simplified custom Exception class sample

 
MyApp.Exception
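A minimal version of MyApp.Exception that would be consistent with the execution example and results below might look like this; the %MessageDictionary lookup and the method bodies are assumptions for illustration, not the author's original code:

Include MyApp.Errors

Class MyApp.Exception Extends %Exception.AbstractException
{

/// Render the message for the current session language from the ^IRIS.Msg("MyApp") domain,
/// followed by the numeric error code (an empty language argument is assumed to fall back
/// to the current session language)
Method DisplayString(pLevel As %Integer = 0) As %String
{
    Set msg = ##class(%MessageDictionary).FormatMessage("","MyApp",..Code)
    Quit msg_" "_..Code
}

/// Allow the custom exception to be converted back to a %Status via AsStatus()
Method OnAsStatus() As %Status
{
    Quit $$$ERROR($$$GeneralError,..DisplayString())
}

}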

 

 
MyApp.Errors.inc
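Likewise, a minimal MyApp.Errors.inc only needs to map the macro used below to the message id defined in ^IRIS.Msg (an assumption consistent with the results shown):

#define InvalidMRN 62536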

Execution of the custom exception

These methods demonstrate how to raise and display the custom exception in different languages, in this case English and Russian.

Include MyApp.Errors
Class MyApp.Utils Extends %RegisteredObject
{
    ClassMethod CustomExpcetionInEng()
    {
        Set tmyappException = ##class(MyApp.Exception).%New(,$$$InvalidMRN)
        Write tmyappException.DisplayString()
    }

    ClassMethod CustomExpcetionInRussian()
    {
        Do ##class(%MessageDictionary).SetSessionLanguage("ru")
        Set tmyappException = ##class(MyApp.Exception).%New(,$$$InvalidMRN)
        write tmyappException.DisplayString()
    }
}

 Results 

LEARNING>d ##Class(MyApp.Utils).CustomExpcetionInEng()
InvalidMRN 62536
LEARNING>d ##Class(MyApp.Utils).CustomExpcetionInRussian()
недействительный MRN 62536
LEARNING>

Conclusion

In these ways you can handle exceptions and %Status errors efficiently in your application code. Handling errors effectively pays off in many ways!

Question
· Feb 7

Need help making a connection between IRIS and SAP RFC

Hello,

We need some help connecting IRIS to SAP RFC, please.

We would appreciate it if anyone could share detailed documentation for this.

Thank you
