go to post Dmitry Maslennikov · Jun 12 It was developed ages ago. I did not do it for Windows, but I think it should be possible to make it work.
go to post Dmitry Maslennikov · Jun 10 Ages ago I implemented a project named BlocksExplorer, with various ways of using it. One of them is generating a picture of the database and visualizing defragmentation. I think I tried it once on a 2TB database, but that was a long time ago. Anyway, this tool generates a picture in BMP format, where each pixel represents one block in the database. So, for a 2TB database that will be around 268435456 blocks, which makes a 16384x16384 picture, and in BMP with 3 bytes per pixel that comes to around 768MB, quite a big one. This project is fully open; you can try to modify it a bit, and maybe if you do it per global, you will be able to make a map.
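As a sanity check on the sizes quoted above (assuming the default 8 KB IRIS block size), the arithmetic can be sketched in Python:

```python
# Sanity check of the numbers above, assuming 8 KB database blocks
# and a 24-bit BMP (3 bytes per pixel), one pixel per block.
db_size = 2 * 1024**4            # 2 TB in bytes
block_size = 8 * 1024            # default IRIS block size: 8 KB
blocks = db_size // block_size   # number of blocks -> number of pixels
side = int(blocks ** 0.5)        # side of a square image
bmp_bytes = blocks * 3           # 3 bytes per pixel, ignoring BMP headers

print(blocks)                    # 268435456
print(side)                      # 16384
print(bmp_bytes // 1024**2)      # 768 (MB)
```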
go to post Dmitry Maslennikov · Jun 1 Ensemble was renamed to Interoperability, and it's part of all IRIS editions.
go to post Dmitry Maslennikov · May 28 There is one more way of getting it, using ASQ:
USER> write responseData.apply("$.items[0].titles[0].value.en_US").%Get(0)
Professor
go to post Dmitry Maslennikov · May 18
USER>:py
Python 3.12.3 (main, Feb  4 2025, 14:48:35) [GCC 13.3.0] on linux
Type quit() or Ctrl-D to exit this shell.
>>> import sys
>>> sys.path
['/usr/irissys/mgr/python', '/usr/lib/python312.zip', '/usr/lib/python3.12', '/usr/lib/python3.12/lib-dynload', '/usr/local/lib/python3.12/dist-packages', '/usr/lib/python3/dist-packages', '/usr/irissys/mgr/python', '/usr/irissys/lib/python', '/usr/local/lib/python3.12/dist-packages', '/usr/lib/python3/dist-packages', '/usr/lib/python3.12/dist-packages']
>>>
Did you try /usr/irissys/mgr/python? That's the path expected to be used for IRIS packages:
pip install whatever --target /usr/irissys/mgr/python
Another place is /usr/irissys/lib/python; it's where InterSystems places the Embedded Python support. sys.path is the list of folders where Python will look for installed modules.
go to post Dmitry Maslennikov · Apr 16 Did you try using the more appropriate %Stream package for it? In your case, you would need %Stream.FileCharacter:
Set stream = ##class(%Stream.FileCharacter).%New()
$$$ThrowOnError(stream.LinkToFile("c:\export.csv"))
set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT a, b FROM table")
if rs.%SQLCODE '= 0 {
    throw ##class(%Exception.SQL).CreateFromSQLCODE(rs.%SQLCODE, rs.%Message)
}
while rs.%Next() {
    set line = $listbuild(rs.a, rs.b)
    do stream.WriteLine($listtostring(line, ","))
}
$$$ThrowOnError(stream.%Save())
Additionally, nowadays using embedded SQL does not give much advantage unless it's a one-row result; it only makes the code a bit harder to read.
go to post Dmitry Maslennikov · Apr 15 And what was the reason for it? Are routines now deprecated in IRIS?
go to post Dmitry Maslennikov · Apr 11 The codebase available for AI to learn from is not enough in comparison to other languages. Even with much more popular languages it makes mistakes, but yes, much fewer. To help with this, we need to significantly increase the amount of ObjectScript code on GitHub, so that AI will have something to learn from.
go to post Dmitry Maslennikov · Apr 5 #include (with the #) is used in MAC routines; for classes it should be just Include, without the #.
go to post Dmitry Maslennikov · Apr 2 When you start a background job, in the master process you can read $ZCHILD, which holds the PID of the started process, and collect those PIDs in the logs. Then, in a loop, you can check $DATA(^$JOB(child)) to see if it's still running. That's the simplest approach.
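The same pattern can be sketched outside ObjectScript; a hedged Python analogue (in IRIS you would collect $ZCHILD after JOB and test $DATA(^$JOB(pid)); here Popen.poll() plays the liveness-check role):

```python
import subprocess
import time

# Start a few background children and remember them,
# like logging $ZCHILD after each JOB command.
children = [subprocess.Popen(["sleep", "0.2"]) for _ in range(3)]
pids = [p.pid for p in children]

# Poll until none are still running, like looping over
# $DATA(^$JOB(pid)) for each collected pid.
while any(p.poll() is None for p in children):
    time.sleep(0.05)

print("all children finished:", pids)
```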
go to post Dmitry Maslennikov · Mar 25 Implemented for Interoperability: it can check status (including the items in it), restart, update, recover, and check for queues and errors. There is also SQL query execution: https://www.youtube.com/embed/EVzZnkjIvoM
go to post Dmitry Maslennikov · Mar 21 Looks like there is no single specific use case for a server, and there are many variants of how it could be implemented. Do you have something in mind, how you would use it? Just thinking about the list of tools to add in a server implementation.
go to post Dmitry Maslennikov · Mar 16 The value should be static at the time of creating the record in the table. For dates it uses the range feature, so you can split it by month or year (I suppose) and move the whole bunch of records, including indexes, to another database.
go to post Dmitry Maslennikov · Mar 16 When we moved our application to AWS, we had some data which we need to keep for a while. With this feature, we can move the old data to cheaper storage. I believe the ability to move to cheaper storage is the most common case. Another option is that some table is too big, and someone would like to split it so it is stored in multiple different databases, together with the indexes.
go to post Dmitry Maslennikov · Mar 13 I don't see any difference:
Time taken: 23.218561416957527 seconds to insert 100000 at 4306.899045302852 records per second.
Time taken: 23.179011167027056 seconds to insert 100000 at 4314.247889152987 records per second.

from sqlalchemy import create_engine
import numpy as np
import time
import iris

hostname = "localhost"
port = 1972
namespace = "USER"
username = "_SYSTEM"
password = "SYS"

# Create the SQLAlchemy engine
DATABASE_URL = (
    f"iris+intersystems://{username}:{password}@{hostname}:{port}/{namespace}"
)
engine = create_engine(DATABASE_URL, echo=True)

args = {
    "hostname": hostname,
    "port": port,
    "namespace": namespace,
    "username": username,
    "password": password,
}
connection = iris.connect(**args)
# connection = engine.raw_connection()

# Generate data for each row (50 fields)
columns_count = 50
drop_table_sql = "DROP TABLE IF EXISTS test_table"
columns = [f"field_{i + 1}" for i in range(columns_count)]
create_table_sql = (
    f"CREATE TABLE test_table ({', '.join([f'{col} DOUBLE' for col in columns])})"
)
num_records = 100000

# Define SQL insert statement
sql_insert = f"INSERT INTO SQLUser.test_table ({', '.join(columns)}) VALUES ({', '.join(['?'] * columns_count)})"
record_values = []

# Execute SQL insert
try:
    start_time = time.perf_counter()  # Capture start time
    batch = 0
    cursor = connection.cursor()
    cursor.execute(drop_table_sql)
    cursor.execute(create_table_sql)
    connection.commit()
    for _ in range(num_records):
        record_values = [np.random.rand() for _ in range(columns_count)]
        cursor.execute(sql_insert, record_values)
        batch = batch + 1
        if batch >= 10000:
            connection.commit()
            print("Batch inserted successfully!")
            batch = 0
    connection.commit()
    end_time = time.perf_counter()  # Capture end time
    elapsed_time = end_time - start_time
    print(
        f"Time taken: {elapsed_time} seconds to insert {num_records} at ",
        num_records / elapsed_time,
        " records per second.",
    )
except Exception as e:
    print("Error inserting record:", e)
finally:
    cursor.close()
    connection.close()
    engine.dispose()
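A hedged side note: with a DB-API cursor like the one in the script above, batching rows through cursor.executemany (where the driver supports it) often beats row-by-row cursor.execute, since it cuts per-call overhead. A minimal sketch of that variant, reusing the same column layout and a hypothetical batch size:

```python
import random

# Same shape as the benchmark above: 50 DOUBLE columns in SQLUser.test_table.
columns_count = 50
num_records = 10000      # smaller run for illustration
batch_size = 1000        # hypothetical batch size, tune for your workload
columns = [f"field_{i + 1}" for i in range(columns_count)]
sql_insert = (
    f"INSERT INTO SQLUser.test_table ({', '.join(columns)}) "
    f"VALUES ({', '.join(['?'] * columns_count)})"
)

def insert_batched(cursor, commit):
    """Insert num_records random rows, flushing via executemany per batch."""
    rows = []
    for _ in range(num_records):
        rows.append([random.random() for _ in range(columns_count)])
        if len(rows) >= batch_size:
            cursor.executemany(sql_insert, rows)  # one round trip per batch
            commit()
            rows = []
    if rows:                                      # flush the final partial batch
        cursor.executemany(sql_insert, rows)
        commit()
```

You would call it as insert_batched(connection.cursor(), connection.commit) with the iris.connect connection from the script above, assuming its cursor implements executemany.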
go to post Dmitry Maslennikov · Mar 4 I think you are just confusing $listbuild with a list in the BPL context. In BPL, when you define it as a List Collection, it will use the class %Collection.ListOfDT (or %Collection.ArrayOfDT in the case of an array). That means you should use:
if 'context.Facilities.Find(##class(Ens.Rule.FunctionSet).SubString(context.EpicDepartmentID,1,4)) {
    do context.Facilities.Insert(##class(Ens.Rule.FunctionSet).SubString(context.EpicDepartmentID,1,4))
}
go to post Dmitry Maslennikov · Mar 4 The support for Node.js in IRIS is quite primitive and limited to only native functions: globals and methods, no SQL. Check in the IRIS folder what you have installed with IRIS for Node.js. I don't have Windows, only Docker, and in my case:
/usr/irissys/bin/iris1200.node
/usr/irissys/bin/iris800.node
/usr/irissys/bin/iris1600.node
/usr/irissys/bin/iris1400.node
/usr/irissys/bin/iris1000.node
/usr/irissys/dev/nodejs/intersystems-iris-native/bin/lnxubuntuarm64/irisnative.node
If you want to use IRIS SQL from Node.js, you can try my package, which you can install with npm:
npm install intersystems-iris

const { IRIS } = require("intersystems-iris");

async function main() {
    const db = new IRIS("localhost", 1972, "USER", "_SYSTEM", "SYS");
    console.log("connected");
    let res = await db.sql("select 1 one, 2 two");
    console.log(res.rows);
    await db.close();
}
main();

It's quite simple at the moment and only supports SQL with no parameters, but it should work, I believe.