We all know that having a proper set of test data before deploying an application to production is crucial for ensuring its reliability and performance. It lets us simulate real-world scenarios and identify potential issues or bugs before they impact end users. Moreover, testing with representative data sets makes it possible to optimize performance, identify bottlenecks, and fine-tune algorithms or processes as needed. Ultimately, a comprehensive set of test data helps deliver a higher-quality product, reducing the likelihood of post-production issues and enhancing the overall user experience.

In this article, let's look at how one can use generative AI, namely Gemini by Google, to generate (hopefully) meaningful data for the properties of multiple objects. To do this, I will call Gemini's RESTful service to generate data in JSON format and then use the received data to create objects.
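As a rough illustration of that flow (not the article's exact code), here is a minimal Python sketch that asks Gemini's REST endpoint for JSON test data; the model name, endpoint version, prompt, and property names are assumptions you may need to adjust:

```python
import json
import requests

# Assumptions: the public generativelanguage endpoint and the "gemini-pro" model name.
API_KEY = "YOUR_API_KEY"
URL = f"https://generativelanguage.googleapis.com/v1beta/models/gemini-pro:generateContent?key={API_KEY}"

prompt = (
    "Generate 5 fictional employees as a JSON array of objects "
    "with the properties Name, Title, and Salary. Return only JSON."
)

resp = requests.post(URL, json={"contents": [{"parts": [{"text": prompt}]}]})
resp.raise_for_status()

# The generated text comes back inside candidates/content/parts.
text = resp.json()["candidates"][0]["content"]["parts"][0]["text"]
employees = json.loads(text)   # may fail if the model wraps the JSON in extra prose
for e in employees:
    print(e["Name"], e["Title"], e["Salary"])
```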

The invention and popularization of Large Language Models (such as OpenAI's GPT-4) have launched a wave of innovative solutions that can leverage large volumes of unstructured data that until recently were impractical, or even impossible, to process manually.

Learning LLM Magic

The world of Generative AI has been pretty inescapable for a while; commercial models running on paid cloud instances are everywhere. With your data stored securely on-prem in IRIS, it might seem daunting to start getting the benefit of experimenting with Large Language Models without having to navigate a minefield of governance and rapidly evolving API documentation. If only there were a way to bring an LLM to IRIS, preferably with a very small code footprint...

1. IRIS RAG Demo

This demo showcases the powerful synergy between IRIS Vector Search and RAG (Retrieval Augmented Generation), providing a cutting-edge approach to interacting with documents through a conversational interface. Utilizing InterSystems IRIS's newly introduced Vector Search capabilities, this application sets a new standard for retrieving and generating information based on a knowledge base.
The backend is crafted in Python, leveraging the prowess of IRIS and IoP; the LLM model is orca-mini, served by the ollama server.
The frontend is a chatbot written with Streamlit.
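To give a feel for the generation step, here is a minimal sketch (not taken from the demo itself) of sending a retrieved passage plus a question to orca-mini through ollama's local REST API; the prompt wording, example context, and default port are assumptions:

```python
import requests

# Assumes ollama is running locally on its default port and `ollama pull orca-mini` has been done.
context = "IRIS Vector Search stores embeddings in SQL tables and ranks them by similarity."
question = "How does the demo retrieve relevant passages?"

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "orca-mini",
        "prompt": f"Answer using only this context:\n{context}\n\nQuestion: {question}",
        "stream": False,
    },
)
print(resp.json()["response"])
```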

Open AI integration with IRIS

As you all know, the world of artificial intelligence is already here, and everyone wants to use it to their benefit.

There are many platforms that offer artificial intelligence services for free, by subscription, or privately. However, the one that stands out because of the amount of "noise" it has made in the world of computing is OpenAI, mainly thanks to its most renowned services: ChatGPT and DALL-E.

With the advent of Embedded Python, a myriad of use cases are now possible from within IRIS directly using Python libraries for more complex operations. One such operation is the use of natural language processing tools such as textual similarity comparison.
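For instance, here is a minimal sketch of such a similarity comparison using the sentence-transformers library (one possible choice; the article may rely on a different package), runnable from Embedded Python or a regular Python shell:

```python
from sentence_transformers import SentenceTransformer, util

# Small general-purpose embedding model; swap in whatever model suits your texts.
model = SentenceTransformer("all-MiniLM-L6-v2")

a = "Patient reports severe headache and sensitivity to light."
b = "The subject complains of a strong migraine and photophobia."

emb = model.encode([a, b], convert_to_tensor=True)
# Cosine similarity: values closer to 1.0 mean the texts are more similar.
print(float(util.cos_sim(emb[0], emb[1])))
```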

Last week, we announced the InterSystems IRIS Data Platform, our new and comprehensive platform for all your data endeavours, whether transactional, analytical, or both. We've included many of the features our customers know and love from Caché and Ensemble, but in this article we'll shed a little more light on one of the new capabilities of the platform: SQL Sharding, a powerful new feature in our scalability story.

Artificial Intelligence (AI) is getting a lot of attention lately because it can change many areas of our lives. Better computing power and more data have helped AI do amazing things, like improving medical tests and making self-driving cars. AI can also help businesses make better decisions and work more efficiently, which is why it's becoming more popular and widely used. How can one integrate OpenAI API calls into an existing IRIS Interoperability application?
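At its core, the integration is an HTTP call to OpenAI that a Business Operation can wrap. Here is a hypothetical sketch of that call with the openai Python package (the model, prompt, and helper name are placeholders, and the Interoperability plumbing is omitted):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize(message_text: str) -> str:
    """What an IRIS Business Operation might call for each inbound message."""
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "Summarize the message in one sentence."},
            {"role": "user", "content": message_text},
        ],
    )
    return resp.choices[0].message.content

print(summarize("Order #123 was delayed because the warehouse ran out of stock."))
```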

With the introduction of vector data types and the Vector Search functionality in IRIS, a whole world of possibilities opens up for application development. One example is an application I recently saw requested in a public contest by the Ministry of Health of Valencia: a tool to assist in ICD-10 coding using AI models.

How could we implement an application similar to the one requested? Let's see what we would need:
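Among other pieces, we would need a way to rank ICD-10 descriptions by semantic similarity to a clinical text. Below is a hypothetical sketch of that step with IRIS Vector Search from Embedded Python; the table, column names, and embedding size (384) are assumptions for illustration:

```python
import iris                                   # Embedded Python (irispython)
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")      # produces 384-dimensional embeddings

# Assumed table: ICD10.Codes(Code VARCHAR, Description VARCHAR, Embedding VECTOR(DOUBLE, 384))
clinical_text = "Acute appendicitis with localized peritonitis"
query_vec = ",".join(str(x) for x in model.encode(clinical_text))

rs = iris.sql.exec(
    "SELECT TOP 5 Code, Description "
    "FROM ICD10.Codes "
    "ORDER BY VECTOR_COSINE(Embedding, TO_VECTOR(?, DOUBLE, 384)) DESC",
    query_vec,
)
for code, description in rs:
    print(code, description)
```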

Apache Spark has rapidly become one of the most exciting technologies for big data analytics and machine learning. Spark is a general data processing engine created for use in clustered computing environments. Its heart is the Resilient Distributed Dataset (RDD), which represents a distributed, fault-tolerant collection of data that can be operated on in parallel across the nodes of a cluster. Spark is implemented using a combination of Java and Scala and so comes as a library that can run on any JVM.
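A tiny PySpark sketch of that model: build an RDD, transform it in parallel across its partitions, and reduce the result back to the driver (the numbers and app name are arbitrary):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rdd-demo").getOrCreate()
sc = spark.sparkContext

# An RDD is partitioned across the cluster; map/filter run on each partition in parallel.
rdd = sc.parallelize(range(1, 1_000_001), numSlices=8)
total = rdd.map(lambda x: x * x).filter(lambda x: x % 2 == 0).sum()

print(total)
spark.stop()
```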

In today's data landscape, businesses encounter a number of different challenges. One of them is doing analytics on top of a unified and harmonized data layer available to all consumers: a layer that can deliver the same answers to the same questions regardless of the dialect or tool being used.

Last week saw the launch of the InterSystems IRIS Data Platform in sunny California.

For the engaging eXPerience Labs (XP-Labs) training sessions, my first customer and favourite department (Learning Services) was working hard, assisting and supporting us all behind the scenes.

We have a yummy dataset of recipes written by multiple Reddit users; however, most of the information is free text, such as the title or description of a post. Let's find out how we can very easily load the dataset, extract some features, and analyze it using an OpenAI large language model with Embedded Python and the LangChain framework.
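As a rough sketch of the feature-extraction step (not the article's exact code, and LangChain import paths vary by version), one could prompt an OpenAI chat model to pull structured fields out of a post; the field names and example post are invented:

```python
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)   # needs OPENAI_API_KEY
prompt = ChatPromptTemplate.from_template(
    "Extract cuisine, main ingredient and difficulty (easy/medium/hard) as JSON "
    "from this Reddit recipe post:\n\n{post}"
)

chain = prompt | llm
result = chain.invoke({"post": "Grandma's 30-minute garlic butter shrimp pasta, so easy!"})
print(result.content)
```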

LangChain fixed the SQL for me

This article is a simple quick-starter (just what I did) with SqlDatabaseChain.

Hope this ignites some interest.

Many thanks to:

sqlalchemy-iris author @Dmitry Maslennikov

Your project made this possible today.

The article script uses the OpenAI API, so take care not to share table information and records externally that you didn't intend to.

A local model could be plugged in instead, if needed.
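For reference, here is a minimal sketch of the kind of setup involved (the connection string, table, and question are placeholders; in newer LangChain releases SQLDatabaseChain lives in langchain_experimental):

```python
from langchain_community.utilities import SQLDatabase
from langchain_experimental.sql import SQLDatabaseChain
from langchain_openai import OpenAI

# sqlalchemy-iris registers the iris:// dialect; credentials and namespace are placeholders.
db = SQLDatabase.from_uri("iris://_SYSTEM:SYS@localhost:1972/USER")
llm = OpenAI(temperature=0)                    # needs OPENAI_API_KEY

chain = SQLDatabaseChain.from_llm(llm, db, verbose=True)
print(chain.run("How many rows are there in Sample.Person?"))
```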

Artificial intelligence is not limited to generating images from text instructions or creating narratives from simple directions.

You can also make variations of a picture or add a special background to an existing one.

Additionally, you can obtain a transcription of audio regardless of its language and the speed of the speaker.

So, let's analyze how file management works.
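For example, here is a hypothetical sketch of those two file operations with the openai Python package (the file names are placeholders):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Transcribe an audio file, whatever the language or speaking speed.
with open("interview.mp3", "rb") as audio:
    transcript = client.audio.transcriptions.create(model="whisper-1", file=audio)
print(transcript.text)

# Generate a variation of an existing picture (a square PNG under 4 MB).
with open("logo.png", "rb") as image:
    variation = client.images.create_variation(image=image, n=1, size="1024x1024")
print(variation.data[0].url)
```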

Fun or No Fun - how serious is it?


Large language models have been stirring up quite a phenomenon in recent months. So inevitably I was playing with ChatGPT too over the last weekend, to probe whether it would be complementary to some BERT-based "traditional" AI chatbots I was knocking up, or whether it would simply sweep them away.

Preview releases are now available for InterSystems IRIS Advanced Analytics and InterSystems IRIS for Health Advanced Analytics! The Advanced Analytics add-on for InterSystems IRIS introduces IntegratedML as a key new feature.
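For context, the IntegratedML workflow boils down to a few SQL statements; here is a hypothetical sketch (the table, model, and column names are invented for illustration), run through Embedded Python for brevity:

```python
import iris  # Embedded Python (irispython)

# Hypothetical training data: Hospital.Encounters with a Readmitted column to predict.
iris.sql.exec("CREATE MODEL ReadmissionModel PREDICTING (Readmitted) FROM Hospital.Encounters")
iris.sql.exec("TRAIN MODEL ReadmissionModel")

rs = iris.sql.exec(
    "SELECT TOP 5 PREDICT(ReadmissionModel) AS Prediction, Readmitted FROM Hospital.Encounters"
)
for prediction, actual in rs:
    print(prediction, actual)
```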

Problem

Do you resonate with this: the capability and impact of a technology are truly discovered when it's packaged in the right way for its audience. The finest example would be how generative AI took off when ChatGPT was made publicly available for easy access, not when the capabilities of Transformers/RAG were identified. At the very least, much higher usage came once the audience was empowered to explore the possibilities.
