Thanks. What do you mean by "run it in a loop"?
Nice! To me it is pretty surprising that the compiler is smart enough to see the difference.
I've updated my post to make my intentions clear.
I'm talking about the exposing of existing globals via SQL with Cache SQL Storage mapping. I.e. about subject of this post https://community.intersystems.com/post/art-mapping-globals-classes-1-3
Wow! A really nice hack! :) Well done!
Great! Thanks!
Yes, now I can import it. But the Direct View control and the filter do not exist. And as a side note, everything related to Ensemble fails to compile on my Cache.
It would be nice to have a way to extend the Portal with custom pages. That way colleagues wouldn't need to learn and store links to the pages.
Can you say what the minimum supported Cache version is? My Cache 2017.2 fails to parse the XML.
>you develop this view by yourself
Yes, that's clear. But what is an "ad-hoc portal page"? A way to extend the Management Portal with custom pages?
What's an ad-hoc portal page? Google has no idea.
I have huge globals, so this approach doesn't work for me.
>to estimate the age of ^%G take a look to copyright
Oh yes :)
Thanks, I'm aware of Search Mask functionality.
>... if you have some idea of the last subscript ...
That's the point: very often, when viewing a global on a production server via the Portal, you just want to see the few latest records, without having to search for the last subscripts.
Updated my question. I mean viewing them in the Portal.
Mikhail, have you considered keeping the time series in Cache and using Grafana directly? @David Loveluck seems to have gotten something working in this direction: https://community.intersystems.com/post/using-grafana-directly-iris. Cache/IRIS is a powerful database, so putting another database in the middle feels like Boeing using parts from Airbus.
David, thanks for sharing your work.
Does your system work reliably, and could your approach be taken into real use? What about performance? I have the same intention: to use Grafana directly with Cache.
Edit: answered by Robert Cemper above.
Let's say the type is string and the ExtentSize is 1000. Why does the second query need a temp file? I don't see why an 'equal to' condition should be executed differently from a 'greater than' condition.
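For context, a hypothetical pair of queries of the kind I mean (table and column names are made up; only the condition differs):

```sql
-- Hypothetical table Sample.Person with a string property Name, ExtentSize ~1000
SELECT Name FROM Sample.Person WHERE Name > 'M'  -- range condition: no temp file
SELECT Name FROM Sample.Person WHERE Name = 'M'  -- equality condition: query plan shows a temp file
</test>
```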
Good point! Thank you Robert.
No, it is not about breakpoints. With no breakpoints set, the debugger just exits without any error message and with two lines printed to the Output window:
Das Programm wurde beendet. ("The program has terminated.")
Ziel hat den Debugger beendet ("Target has ended the debugger")
But now, with one particular database, I'm getting a popup window with "#error 6704 Kein Anbinden möglich" ("no attach possible"), which I have never seen in over 10 years of using the debugger.
Ok, thanks for the response.
We have been using the debugger for many years, so we are pretty sure we know how to use it.
This approach involves copying the data, but thanks, it can be useful in some cases.
Nice, thanks. But it seems this approach will only work if the global has an ID as the first index.
Robert, does it mean that every time I have two processes accessing a table, I have to insert this way? Or is it only about certain scenarios?
What about Cache? Hopefully all generic enhancements are also being added to Cache.
Have you seen DdlAllowed flag? https://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=ROBJ_class_ddlallowed
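For illustration, a minimal sketch of what I mean (the class name is made up; DdlAllowed is the class keyword from the linked docs):

```
Class Demo.Person Extends %Persistent [ DdlAllowed ]
{
Property Name As %String;
}
```

With that keyword set, the class's projected table can be manipulated via DDL statements such as DROP TABLE.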
JavaScript as raw text strings? In my opinion, that cannot really be used in any serious project.
OK, I see. Thanks for your time!
OK. For the knowledge base on this issue: in my case, this class was imported into the namespace many, many times without issues. So it is definitely not about the first creation.
Thanks, I'm aware of the deleteextent qualifier.
Regarding "should never": to me, the reasonable way to solve an unexpected issue is:
Thank you for the answer.
DeleteExtentDefinition had only a temporary effect: no errors on the first compilation, but the same error on the second one.
What fixed the problem was deleting the whole ^rINDEXEXT global and running Rebuild All.
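For the record, the deletion part was just this one command in a terminal session (use with care, ^rINDEXEXT is a system global; Rebuild All was then run separately, in my case from Studio):

```
KILL ^rINDEXEXT
```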
The "happens typically" in your answer sounds alarming. In your experience, what factors can cause this problem?