Hello Community,

In this article, I will outline and illustrate how to use ObjectScript from within embedded Python. Along the way, I will reference other articles related to embedded Python and address questions that have been beneficial to my own learning journey.

As you may know, integrating Python features into IRIS has been possible for quite some time. This article focuses on how to seamlessly call ObjectScript from embedded Python.
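
To give a concrete taste of what this looks like, here is a minimal sketch run from an embedded Python context (for example, the shell started with "do ##class(%SYS.Python).Shell()" or a class method marked [Language = python]); the global name ^DemoGlobal and the query below are illustrative:

import iris  # only available inside IRIS embedded Python

# Call an ObjectScript class method (a system class present in every instance)
print(iris.cls("%SYS.System").GetInstanceName())

# Read and write a global directly from Python
g = iris.gref("^DemoGlobal")
g["greeting"] = "written from embedded Python"
print(g["greeting"])

# Run SQL against IRIS from the same context
rs = iris.sql.exec("SELECT Name FROM %Dictionary.ClassDefinition WHERE Name = '%SYS.Python'")
for row in rs:
    print(row[0])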

Hello Community,

This article aims to walk you through the process of setting up and using the Flexible Python Runtime feature for embedded Python. Prior to version 2024.2, the InterSystems IRIS installer included a preinstalled version of Python; you can find its libraries and application files in the \lib\python directory of your IRIS installation folder (for example, C:\InterSystems\IRIS20242\lib\python).
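
Starting with 2024.2, the flexible runtime feature lets IRIS use a separately installed Python instead. As a minimal sketch, assuming the PythonRuntimeLibrary and PythonRuntimeLibraryVersion settings documented for 2024.2 and later (the path and version below are only examples for a Windows install; adjust them to your own Python installation), the relevant entries go in the [config] section of iris.cpf:

[config]
PythonRuntimeLibrary=C:\Python311\python3.dll
PythonRuntimeLibraryVersion=3.11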

We are currently exploring how to allocate additional disk space to our environment, as we have seen significant growth in our database files. We have three namespaces, each with a single IRIS.dat that contains both the globals and the routines.

Since we started down the route of keeping everything in a single IRIS.dat file per namespace, is it reasonable, as growth continues, to split each namespace's current IRIS.dat into one IRIS.dat for globals and another IRIS.dat for routines in a mirrored environment?

Accessing Azure Blob Storage

Accessing Azure cloud storage to upload/download blobs is quite easy using the dedicated %Net.Cloud.Storage.Client class API methods, or the EnsLib.CloudStorage.* inbound/outbound adapters.

Note that you'll need the %JavaServer External Language Server up and running to use the cloud storage API or adapters, since both rely on the PEX framework via the Java server.

Here is a quick summary:

InterSystems IRIS Adaptive Analytics version 2024.1.3 is now available from the InterSystems Software Distribution page. This release includes AtScale 2024.1.3 and an updated User Defined Aggregate Function (UDAF) file, along with the following new modeling and BI capabilities:

$ docker login -u=<username> -p=<password> containers.intersystems.com
WARNING! Using --password via the CLI is insecure. Use --password-stdin.
Login Succeeded

$ docker pull containers.intersystems.com/intersystems/irishealth:2024.1
Error response from daemon: unauthorized: The client does not have permission for manifest: Download request for repo:path 'docker-customer-remote-cache:intersystems/irishealth/2024.1/list.manifest.json' is forbidden for user: 'anonymous'.

Why is the download not working?
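
As an aside, the WARNING shown after docker login can be avoided by piping the credential in via stdin rather than passing -p on the command line, for example (token.txt is a placeholder file containing your password or token):

cat token.txt | docker login -u <username> --password-stdin containers.intersystems.com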

Hello Community,

The Certification Team of InterSystems Learning Services is excited to announce the release of our new InterSystems IRIS SQL Specialist exam. It is now available for purchase and scheduling in the InterSystems exam catalog. Potential candidates can review the exam topics and practice questions to orient themselves to the exam's question approaches and content. Candidates who successfully pass the exam will receive a digital certification badge that can be shared on social media accounts such as LinkedIn.

During an upgrade, a customer wants to load custom schemas in a particular order. For this, they renamed a file from EPIC_MDM.HL7 to 1EPIC_MDM.HL7. Upon importing this file into their server using VSCode, the custom schema was renamed to 1EPIC_MDM.HL7 inside the <Category>.

This is impacting the upgrade, and they are looking for a way to import a custom schema without renaming the schema itself.

I have some services using EnsLib.File.InboundAdapter that route directly to corresponding operations using EnsLib.File.OutboundAdapter, each with a 'File Path' specified.

Using this File Path as a root directory, I'd instead like to pass the message through a router where I could inject a subdirectory on the outbound side, chosen based on the source service the file came from. There will likely be several inbound services writing to each outbound operation, and I'd like to be able to sort the output into subfolders.
