#include with # is used in MAC routines; for classes it should be just Include, without the #.
When you start a background job, in the master process you can collect $zchild, which holds the PID of the started process, and record it in the logs. Then, in a loop, you can check $data(^$JOB(child)) to see whether it is still running.
That's the simplest approach.
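The pattern above could be sketched like this in ObjectScript (the routine and entry names are illustrative, not from the original post):

```objectscript
    ; start the worker in a background job and remember its PID
    job DoWork^MyRoutine()
    set child = $zchild
    write "started child process ", child, !
    ; poll the ^$JOB structured system variable until the child terminates
    while $data(^$JOB(child)) {
        hang 1
    }
    write "child ", child, " finished", !
```

A minimal sketch, assuming DoWork^MyRoutine exists; a real master process would likely log the PID and watch several children in one loop.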
Implemented for Interoperability: it can check status (including the items in it), restart, update, recover, and check for queues and errors.
There is also SQL query execution.
It looks like there is no single specific use case for the server, and there are many ways it could be implemented.
Do you have something in mind? How would you use it?
I'm just thinking about the list of tools to add to the server implementation;
it looks like an interesting challenge.
The value should be static, set at the time the record is created in the table.
For dates it uses the range feature, so you can split it by month or year (I suppose)
and move the whole bunch of records, including indexes, to another database.
We moved our application to AWS, and we have some data which we need to keep for a while. With this feature, we can move old data to cheaper storage.
I believe the ability to move to cheaper storage is the main use case. Another option is that some table is too big, and someone would like to split it across multiple databases, together with the indexes.
I don't see any difference
Time taken: 23.218561416957527 seconds to insert 100000 at 4306.899045302852 records per second.
Time taken: 23.179011167027056 seconds to insert 100000 at 4314.247889152987 records per second.
from sqlalchemy import create_engine
import numpy as np
import time
import iris

hostname = "localhost"
port = 1972
namespace = "USER"
username = "_SYSTEM"
password = "SYS"

# Create the SQLAlchemy engine
DATABASE_URL = (
    f"iris+intersystems://{username}:{password}@{hostname}:{port}/{namespace}"
)
engine = create_engine(DATABASE_URL, echo=True)

args = {
    "hostname": hostname,
    "port": port,
    "namespace": namespace,
    "username": username,
    "password": password,
}
connection = iris.connect(**args)
# connection = engine.raw_connection()

# Generate data for each row (50 fields)
columns_count = 50
drop_table_sql = "DROP TABLE IF EXISTS test_table"
columns = [f"field_{i + 1}" for i in range(columns_count)]
create_table_sql = f"CREATE TABLE test_table ({', '.join([f'{col} DOUBLE' for col in columns])})"
num_records = 100000

# Define SQL insert statement
sql_insert = f"INSERT INTO SQLUser.test_table ({', '.join(columns)}) VALUES ({', '.join(['?'] * columns_count)})"

# Execute SQL insert
try:
    start_time = time.perf_counter()  # Capture start time
    batch = 0
    cursor = connection.cursor()
    cursor.execute(drop_table_sql)
    cursor.execute(create_table_sql)
    connection.commit()
    for _ in range(num_records):
        record_values = [np.random.rand() for _ in range(columns_count)]
        cursor.execute(sql_insert, record_values)
        batch += 1
        if batch >= 10000:
            connection.commit()
            print("Batch inserted successfully!")
            batch = 0
    connection.commit()
    end_time = time.perf_counter()  # Capture end time
    elapsed_time = end_time - start_time
    print(
        f"Time taken: {elapsed_time} seconds to insert {num_records} at ",
        num_records / elapsed_time,
        " records per second.",
    )
except Exception as e:
    print("Error inserting record:", e)
finally:
    cursor.close()
    connection.close()
    engine.dispose()
I think you're just confusing $listbuild with a list in the BPL context. In BPL, when you define it as a List Collection, it will use the class %Collection.ListOfDT, or in the case of an array, %Collection.ArrayOfDT.
That means you should use:
if 'context.Facilities.Find(##class(Ens.Rule.FunctionSet).SubString(context.EpicDepartmentID,1,4))
{
do context.Facilities.Insert(##class(Ens.Rule.FunctionSet).SubString(context.EpicDepartmentID,1,4))
}
The support for Node.js in IRIS is quite primitive and limited to native functions, globals, and methods only; no SQL.
Check in the IRIS folder what you have installed with IRIS for Node.js.
I don't have Windows, only Docker, and in my case:
/usr/irissys/bin/iris1200.node /usr/irissys/bin/iris800.node /usr/irissys/bin/iris1600.node /usr/irissys/bin/iris1400.node /usr/irissys/bin/iris1000.node /usr/irissys/dev/nodejs/intersystems-iris-native/bin/lnxubuntuarm64/irisnative.node
If you want to use IRIS SQL from Node.js, you can try my package, which you can install with npm:
npm install intersystems-iris

const { IRIS } = require("intersystems-iris");
async function main() {
const db = new IRIS('localhost', 1972, 'USER', '_SYSTEM', 'SYS')
console.log('connected')
let res = await db.sql("select 1 one, 2 two")
console.log(res.rows);
await db.close()
}
main()
It's quite simple at the moment, it only supports SQL without parameters, but it should work, I believe.
I'm not familiar with HL7 processing, but I suppose it should be something like this:
foreach(<field>)
{
set $list(list, * + 1) = <field>
}
If you need to create a list, then put $list on the left side, with *+1 as the position to append to the list, and on the right side any value you would like to add to the list.
USER>kill list for i=1:1:10 { set $list(list, *+1) = i } zw list
list=$lb(1,2,3,4,5,6,7,8,9,10)
Is this what you're after?
Or do GenerateEmbedded:
do $system.OBJ.GenerateEmbedded("*")
This can help discover hidden issues, like when some old SQL no longer compiles (a real case).
If you happened to install the official Python driver, it could interfere here.
Which version of IRIS do you use? Try the latest preview.
Awesome, thanks for using sqlalchemy-iris
By the way, when you run it inside IRIS, you can use embedded Python mode, this way:
engine = create_engine('iris+emb:///USER')
Studio is already deprecated and works only on Windows.
Accessing the Management Portal depends on how you installed IRIS and which version you used. Your installation may not have the private web server installed, in which case you would need to install a web server yourself.
Run this command:
iris list
It will output something like this:
Configuration 'IRIS' (default)
directory: /usr/irissys
versionid: 2025.1.0L.172.0com
datadir: /usr/irissys
conf file: iris.cpf (WebServer port = 52773)
status: running, since Sat Feb 15 06:34:51 2025
SuperServers: 1972
state: ok
product: InterSystems IRIS
Here the WebServer port is what you are looking for, and the full URL will be http://localhost:52773/csp/sys/UtilHome.csp
If you don't have a private web server, check the documentation on how to configure a separate web server.
Yes, it looks like it's solved now; I pulled again and got 2025.1.
I had tried to check 2025.1, but did not find it:
$ docker inspect --format '{{ index .Config.Labels "com.intersystems.platform-version" }}' containers.intersystems.com/intersystems/iris:latest-{em,cd,preview}
2024.1.3.456.0
2024.3.0.217.0
2024.1.3.456.0
For some reason, the preview points to the em version. Why is that?
For the community edition,
$ docker inspect --format '{{ index .Config.Labels "com.intersystems.platform-version" }}' containers.intersystems.com/intersystems/iris-community:latest-{em,cd,preview}
2024.1.2.398.0com
2024.3.0.217.0com
2025.1.0.204.0com
At least here it is the 2025.1 version.
But I've noticed that the em version is now not the same as for the Enterprise edition.
Anyway, where is 2025.1 for the Enterprise edition? The Community edition is too limited to check everything on it.
The post is two months old at this point, but it's still not available.
If you are using the Community edition, this is most probably related to the number of connections used.
Try the Enterprise version.
I'm curious why you would need it; are there any issues when working via shared memory?
As already mentioned, you have some options: use a different host, or use DriverParameters:
Optional. Boolean indicating whether or not to always use shared memory for localhost and 127.0.0.1. Default = null. See IRISDataSource methods getSharedMemory() and setSharedMemory(). Also see "Shared Memory Connections".
For instance, in DBeaver: if there is no SharedMemory option visible among the driver properties, you can still add it manually.
Sorting is useless, but anyway:
USER>zw [1,2,3].addAll([4,5,6])
[1,2,3,4,5,6] ; <DYNAMIC ARRAY>
USER>w $zv
IRIS for UNIX (Ubuntu Server LTS for ARM64 Containers) 2024.1
Where do you see the error with ISC.FeatureTracker.SSL.Config? There is nothing in the code example that would point to using it. Why would you even use it? The error says that this config is not enabled, which you can check.
Usually the build happens in one thread, and you would not see this error even with an unlicensed version.
This error means that you have multiple processes trying to connect to IRIS during the build.
Even using the Community Edition may not help, because it has a limit on connections, and you may face the same issue there too.
In some scenarios it's possible to use a multi-stage build: use a Community Edition image as the build stage, and in the target image finish the build without starting IRIS, or using only one connection.
You can check the multi-stage approach with Community, or use an iris.key during the build stage.
As long as those processes do not load the same files, it's safe; it uses locks per loaded item.
It depends on how much non-Unicode data you have. If it's not much, you can try the XML way.
Another way is to use some simple scripts that $order over all globals and convert the data in place, skipping indexes and rebuilding them in full afterwards.
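Such an in-place conversion script could be sketched like this (the global name and the source encoding are illustrative assumptions, not from the original post):

```objectscript
ConvertGlobal(gn)
    ; walk every node of the global, e.g. gn = "^MyData"
    set node = $query(@gn@(""))
    while node '= "" {
        ; convert the stored 8-bit value to Unicode in place;
        ; "Latin1" here is just an example source encoding
        set @node = $zconvert(@node, "I", "Latin1")
        set node = $query(@node)
    }
    quit
```

Index globals would simply be skipped here and rebuilt after the conversion.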
I think there were multiple solutions to this task; you can try to find them.
You have to collect as much information as possible about your data.
The last time I implemented such a converter, more than 15 years ago for a 20+ year old application with a textual terminal interface, it went well.
Since an iris session can't be created as a job with inherited security, I rely on the ability to automatically log in to an iris session without entering a username and password. And when the session is opened, I try to use $system.Security.Login without a password for the user actually logged in to iTerm; it may fail and will probably end up with a black screen.
In my future no one uses Windows.
But anyway, could you check whether this can work on Windows? Maybe I can find a way to use it there.