$system.CLS.Property accepts the property by name as well, so you don't need a loop.
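For comparison, the documented $PROPERTY function also reads a property by name; a minimal sketch, where `person` is a hypothetical object instance with a Name property:

```objectscript
// $PROPERTY accesses a property dynamically, by name held in a variable
set propName = "Name"
write $PROPERTY(person, propName)
```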
This is only an example, and yes, it is for a client, where community.intersystems.com is the server the request goes to. To simplify, I just create an SSL configuration named after the server.
ClassMethod GetSSLConfiguration(host) As %String
{
    NEW $NAMESPACE
    SET $NAMESPACE = "%SYS"
    IF '##class(Security.SSLConfigs).Exists(host) {
        DO ##class(Security.SSLConfigs).Create(host)
    }
    QUIT host
}
Set tRequest = ##class(%Net.HttpRequest).%New()
Set tRequest.Server = "community.intersystems.com"
Set tRequest.Https = 1
Set tRequest.SSLConfiguration = ..GetSSLConfiguration(tRequest.Server)
....
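A minimal sketch of how such a request might continue; the request path and output handling here are my assumptions, not part of the original snippet:

```objectscript
// Send a GET request over HTTPS and dump the response to the current device
Do tRequest.Get("/")
Do tRequest.HttpResponse.OutputToDevice()
```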
Actually, with my VSCode extension you are not supposed to need to sync your code. It expects that you have only one source of truth: your local sources. But if you want, you can export any code from the server, either into the folder where your sources are already placed or anywhere else. Just use the Export action in the context menu in the Explorer, and it will export all your classes or routines.

When you try to open any file in the explorer and you already have this class or routine locally, it will show your local file instead of the server's file, simply because it does not expect any difference; as I said, your local sources take priority.
Any file opened from server will be shown in read-only mode.
It actually does not matter where you install the drivers.
My Caché runs in Docker, so I downloaded the ODBC drivers from the FTP,
extracted them right in Downloads, and ran ODBCInstall there from a terminal.
ODBCInstall produced the file mgr/cacheodbc.ini, which I copied to ~/Library/ODBC/odbc.ini, where it appears as a User DSN (/Library/ODBC/odbc.ini is for System DSNs).
DYLD_LIBRARY_PATH should point to the bin folder inside the extracted ODBC drivers folder;
in my case
export DYLD_LIBRARY_PATH=/Users/daimor/Downloads/ODBC-2018.1.2.309.0-macx64/bin
You can check the connection with iODBC Manager, launched right from the terminal:
DYLD_LIBRARY_PATH=/Users/daimor/Downloads/ODBC-2018.1.2.309.0-macx64/bin open /Applications/iODBC/iODBC\ Administrator64.app
and open Excel the same way:
DYLD_LIBRARY_PATH=/Users/daimor/Downloads/ODBC-2018.1.2.309.0-macx64/bin open /Applications/Microsoft\ Excel.app
So you'll be able to test the settings without re-logging in.
What do you have already? Did you configure the ODBC DSN and DYLD_LIBRARY_PATH?
You can find some information about configuring ODBC here.
First of all, I think you need iODBC installed on your Mac,
then the Caché ODBC drivers, with the DYLD_LIBRARY_PATH environment variable set correctly; do not forget to re-login after setting the variable.
As proof that it's possible, here it is done with Caché 2018.1.
With such an amount of data, do you have enough global buffers configured?
Is this the only slow place, or do you have others as well?
How big are your database, your index globals, and your global buffer?
USER>set desc = "Review symptoms to report with patient"
USER>set abbrv = ##class(%Regex.Matcher).%New("([^\ ]{2})[^\ ]*\ ?",desc).ReplaceAll("$1")
USER>write abbrv
Resytorewipa
I think the best solution would be to configure an SSH server, so you'll be able to use PuTTY as well. If your server is on Linux, it should only need SSH. With SSH your connection will surely be encrypted.
Look at this
Class User.Test
{

ClassMethod Test(Args...)
{
    ZWRITE Args
}

ClassMethod Test2(Arg1, Arg2, Arg3, Arg4, Arg5, Arg6, Arg7, Arg8)
{
    ZWRITE Arg1
    ZWRITE Arg2
    ZWRITE Arg3
    ZWRITE Arg4
    ZWRITE Arg5
    ZWRITE Arg6
    ZWRITE Arg7
    ZWRITE Arg8
}

}
and let's call it
USER>do ##class(Test).Test(1,2,,,,,,8)
Args=8
Args(1)=1
Args(2)=2
Args(8)=8
and second method
USER>do ##class(Test).Test2(1,2,,,,,,8)
Arg1=1
Arg2=2
Arg8=8
And another way
USER>set args=8 ;just the number of the last one
USER>set args(8)="test"
and calls
USER>do ##class(Test).Test(args...)
Args=8
Args(8)="test"
USER>do ##class(Test).Test2(args...)
Arg8="test"
What do you expect instead of "USER>"? It is a session prompt, where you enter your commands. At this point, it looks as expected to me.
1. What did you do to get it? Please add all your steps, so we can find what went wrong.
2. The IRIS container changes nothing that comes from the base Ubuntu image. If you can install mc there, it should be possible on IRIS as well. But I don't see any reason why mc should be available inside the container. Why do you need it there?
I think you are talking about subscripts and about the maximum length of a global reference. You can't do anything about it; just accept it as a fact.
It's not so important to have the key before the install; it's much more important to have it when the server is running.
But how can you be sure that your key is suitable for this platform? You can check it on a running container: enter it and go to csession. You can find some interesting methods of $SYSTEM.License in the documentation which can help you check the license file inside the container.
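For instance, a quick check inside a terminal session might look like this; the method names below are the ones I remember from the $SYSTEM.License class reference, so verify them against your version's documentation:

```objectscript
// Inspect the active license key from a terminal session
write $SYSTEM.License.KeyCustomerName(),!    ; who the key was issued to
write $SYSTEM.License.KeyExpirationDate(),!  ; expiration date
write $SYSTEM.License.LUAvailable(),!        ; license units still available
```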
The most common reasons for such an error are a missing license file or an exceeded license limit.
Just check it; you can mount the key during docker run or copy it inside the image during docker build.
I see that you use quite an old version; I would recommend considering the latest version, based on IRIS, due to the many limitations of running such an old version in Docker.
So, you want to convert a $LIST to some kind of list in C?
$LIST is a very simple format, but you can't get a particular item just by position; you have to walk the list from the first item.
In simple terms, it is just a concatenation of individual $LIST elements, so you can't say how many items you have until you walk through the list and count them.
I have not played with the C side yet, but I found CachePopList and CachePushList. Have you tried them? It looks like they may help you iterate over a $LIST.
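On the ObjectScript side, the sequential nature is visible with $LISTNEXT, which walks a list one element at a time; a minimal sketch:

```objectscript
set list = $LISTBUILD("a", "b", "c")
set ptr = 0, count = 0
// $LISTNEXT advances ptr through the list element by element;
// there is no O(1) access by position
while $LISTNEXT(list, ptr, value) {
    set count = count + 1
    write count, ": ", value, !
}
```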
Sure, all project-specific settings should be stored in the .vscode folder in the root of the project, so you can control this folder with your source control system. Look at the VSCode documentation, and at one of my new projects as an example.
It's really great news, and so cool that InterSystems has started to participate more in developer conferences. I wish I could attend all of them :)
Base64 does not work with Unicode characters encoded in two or more bytes; you should convert the string to UTF-8 first:
write $system.Encryption.Base64Encode($zcvt("тест","O","UTF8"))
0YLQtdGB0YI=
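And decoding goes the opposite way: Base64-decode first, then convert the bytes back from UTF-8:

```objectscript
// Round-trip: decode Base64, then interpret the bytes as UTF-8
write $zconvert($system.Encryption.Base64Decode("0YLQtdGB0YI="), "I", "UTF8")
; prints тест
```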
Some time ago I made an example of an Angular application with IRIS on the backend.
The source for this project is available on GitLab.
To learn how to develop an Angular application, look at the Angular documentation and frontend development resources in general. There are tools which help you develop and build the frontend side, such as webpack, which does most of the work of building your sources into a production-ready bundle.
In my simple project, you need only Docker and any editor, preferably VSCode.
With the following command you will get a running server in development mode, so you can edit Angular code and IRIS code and see the result immediately:
docker-compose up -d
This project is also deployable with Kubernetes, so after any push of changes to GitLab, it will build and test it.
node_modules never goes to source control; it's enough to have package.json and package-lock.json there, since node_modules may contain platform-specific libraries.
I hope I will manage to write a complete article about this project, with all the details.
Sorry, usually I use it to add custom methods to properties. I did not know that it doesn't work for methods such as Set and Get, but you can add any other method.
Why not use PropertyClass, so you can generate your own getter and setter for any property in a class?
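A rough sketch of the PropertyClass idea; the class and method names below are made up, and the generator details should be checked against the %Compiler.Type classes in your version:

```objectscript
/// Hypothetical property-method class: a method defined here becomes
/// <Property><Method>() on every property of a class that declares it
Class App.PropUtils [ Abstract ]
{

Method Upper() As %String [ CodeMode = generator ]
{
    // %property holds the property name during code generation
    do %code.WriteLine(" quit $zconvert(.."_%property_",""U"")")
    quit $$$OK
}

}

/// Attach it with the PropertyClass keyword
Class App.Person Extends %RegisteredObject [ PropertyClass = App.PropUtils ]
{

Property Name As %String;

}
```

With that in place, `write person.NameUpper()` would return the upper-cased name, assuming the generator mechanics work as sketched.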
There is also a ClassMethod GetGlobalSize in the class %Library.GlobalEdit, where you can choose between a fast count or not, and you will get different results.
ClassMethod GetGlobalSize(Directory As %String, GlobalName As %String, ByRef Allocated As %Integer, ByRef Used As %Integer, fast As %Boolean = 0) as %Status
Get the size of this global:
'Allocated' - total size, in MB, of blocks allocated for the global.
'Used' - total used data, in MB, for the global.
'fast' - TRUE: faster return; it won't return the value of 'Used'.
FALSE: slower return; it returns values for both 'Allocated' and 'Used'.
So, when fast=1, it just counts the allocated blocks, without caring how full of data those blocks are, and multiplies the number of blocks by the block size.
'Used' is calculated only when you pass fast=0; it computes the exact size and, to be accurate, reads all the blocks, so it can be slower.
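A quick call might look like this; the database directory and global name here are placeholders:

```objectscript
// Fast estimate: only 'Allocated' comes back when fast=1
set sc = ##class(%Library.GlobalEdit).GetGlobalSize("/opt/cache/mgr/user/", "MyGlobal", .allocated, .used, 1)
write "Allocated: ", allocated, " MB", !
```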
Web applications are defined in the Security.Applications class in the %SYS namespace.
zn "%SYS"
set props("Path") = "/opt/my/app/csp"
set props("Description") = "My Cool Application"
do ##class(Security.Applications).Create("/csp/test", .props)
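And, assuming the usual pattern for Security.* classes, you can guard against creating a duplicate with Exists; check the class reference for the exact signature:

```objectscript
// Create the application only if it is not defined yet
zn "%SYS"
if '##class(Security.Applications).Exists("/csp/test") {
    set props("Path") = "/opt/my/app/csp"
    set props("Description") = "My Cool Application"
    do ##class(Security.Applications).Create("/csp/test", .props)
}
```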
Visual Studio and Visual Studio Code are two very different products but just with similar names.
To configure Visual Studio Code, you can use these settings:
{
"objectscript.conn": {
"active": true,
"host": "localhost",
"port": 57772,
"ns": "SAMPLES",
"username": "admin",
"password": "SYS"
}
}
Where active should be true if you are going to be connected.
Very interesting, I did not know that LaTeX can be used in such a way. How is the syntax highlighting done there? Maybe I can help?
How is Pygments related? Some time ago I started working on syntax highlighting for Pygments but have not finished it yet. But I have a TextMate grammar, which can be used in many places.
I'm not an InterSystems guy and can only describe, from my point of view, how it works.
Every global reference has an internal representation in some kind of binary format; I don't know how it is converted back and forth, but this string is used to find the correct block. For example, when you look for ^C(9996,46,"yellow"), it first reads the map (block 3) to find where global ^C starts (block 44); then, using this internal format, it finds the closest node in the first pointer block. If that points to another pointer block, the same search repeats until it reaches a data block, which may also contain data for multiple nodes.
Not sure if I can explain it better, but the most important thing is that the B*-tree makes it very quick to find the final block and its neighbours.
There is a workaround with a storage driver. Here is an example of a ccontrol wrapper which can help with it, and an example Dockerfile showing how it can be used.
I would argue for using Docker instead of a VM. My choice is Docker for sure, for many reasons. It is not so difficult to make it work in Docker, and you will get a lot from it. And if it is possible for you to move to IRIS, that would be a good way to go.
I've been working on many different projects at the same time and use Docker for all of them.