Yes, I'm using a virtual document: the XML was loaded as an EnsLib.EDI.XML.Document and saved into the BPL context.
However, the path above does not seem to work.
Thanks Eduard.
It works in this case indeed. However, there are still cases where some of our partners need to work on XMLs without a schema. Is there some way to do that?
It can be done by adding a custom function:
Class MDMDemo.Utils.XMLDocumentUtils Extends Ens.Rule.FunctionSet
{
/// GetAt implementation for rulesets
ClassMethod ElementGetAt(ByRef pDocument As EnsLib.EDI.XML.Document, pPath As %String) As %String [ CodeMode = expression, Final ]
{
pDocument.GetValueAt(pPath)
}
}
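For what it's worth, once such a FunctionSet class is compiled, the method becomes available to rule expressions; a hedged sketch of how it might then be called in a routing-rule condition (the path "/Patient/Name" and the compared value are made-up examples):

```objectscript
// Illustrative rule-expression usage of the custom function above;
// "Document" is the virtual document in rule context, the path is hypothetical
MDMDemo.Utils.XMLDocumentUtils.ElementGetAt(Document, "/Patient/Name") = "Smith"
```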
But is there any way to access the elements directly in the rule?
This can also solve the problem. Thanks.
Solved by using SQL to query the class, and a URL like http://localhost:52880/ui/interop/rule-editor/index.html?$NAMESPACE=FHI…
Happy new year guys! 🔔
Thanks Eduard.
It seems the client missed some settings in their Docker engine installation, but we have not figured out which ones yet.
They reinstalled the Docker environment, making sure it was installed with root privileges, and then it worked.
Thanks for the help from @Stefan Cronje and @Vitaliy Serdtsev
I'm using the SQL
select count(1) from (
select distinct by (VISIT_SERIAL_NO,HQ_ORG_CODE) VISIT_SERIAL_NO,HQ_ORG_CODE
from %PARALLEL ST_SEARCH_VISIT
)
which completes in 2 seconds in my test environment.
Since the end user expects a query to complete within 3 seconds, I'll test its performance with their other conditions combined.
Thanks again for the help.
Hi Vitaliy,
Yes, Benjamin also found that the exported DDL is not consistent with the class definition, but the source code is controlled by the end user, so we'll need to wait until they can share it with us.
But I'm not sure what happened in your test with the Fill() method as mentioned above.
In the client's environment there are more than 7 million records, and the count-distinct result is only about 40,000 less than the total count, which means only a very small portion of the (VISIT_SERIAL_NO, HQ_ORG_CODE) pairs are not unique. I therefore used the populate facility to generate 6 million rows, and the count-distinct result is more than 5.99 million. With that data, the query is slow.
while the DISTINCT BY version performs much better.
Both queries were run after running Tune Table and rebuilding the indices.
You can find the class file I used in the test below. So far I have used only two indexes to handle this case; if any other index turns out to be helpful, we may add it back.
Class User.STSEARCHVISIT Extends (%Persistent, %Populate) [ ClassType = persistent, DdlAllowed, Final, Owner = {SuperUser}, ProcedureBlock, SqlRowIdPrivate, SqlTableName = ST_SEARCH_VISIT ]
{
Property HQORGCODE As %Library.String(MAXLEN = 32) [ SqlColumnNumber = 2, SqlFieldName = HQ_ORG_CODE ];
Property HQORGNAME As %Library.String(MAXLEN = 32) [ SqlColumnNumber = 3, SqlFieldName = HQ_ORG_NAME ];
Property TENANTID As %Library.String(MAXLEN = 32) [ SqlColumnNumber = 4, SqlFieldName = TENANT_ID ];
Property GROUPPATIENTSN As %Library.String(MAXLEN = 32) [ SqlColumnNumber = 5, SqlFieldName = GROUP_PATIENT_SN ];
Property ORGPATIENTSN As %Library.String(MAXLEN = 32) [ SqlColumnNumber = 6, SqlFieldName = ORG_PATIENT_SN ];
Property NAME As %Library.String(MAXLEN = 64) [ SqlColumnNumber = 7 ];
Property SEXCODE As %Library.String(MAXLEN = 2) [ SqlColumnNumber = 8, SqlFieldName = SEX_CODE ];
Property SEXNAME As %Library.String(MAXLEN = 16) [ SqlColumnNumber = 9, SqlFieldName = SEX_NAME ];
Property BIRTHDATE As %Library.Date [ SqlColumnNumber = 10, SqlFieldName = BIRTH_DATE ];
Property MARRYCODE As %Library.String(MAXLEN = 2) [ SqlColumnNumber = 11, SqlFieldName = MARRY_CODE ];
Property MARRYNAME As %Library.String(MAXLEN = 16) [ SqlColumnNumber = 12, SqlFieldName = MARRY_NAME ];
Property IDENTIFYTYPECODE As %Library.String(MAXLEN = 5) [ SqlColumnNumber = 13, SqlFieldName = IDENTIFY_TYPE_CODE ];
Property IDENTIFYTYPENAME As %Library.String(MAXLEN = 32) [ SqlColumnNumber = 14, SqlFieldName = IDENTIFY_TYPE_NAME ];
Property IDENTIFYNUMBER As %Library.String(MAXLEN = 50) [ SqlColumnNumber = 15, SqlFieldName = IDENTIFY_NUMBER ];
Property BLOODTYPECODE As %Library.String(MAXLEN = 2) [ SqlColumnNumber = 16, SqlFieldName = BLOOD_TYPE_CODE ];
Property BLOODTYPENAME As %Library.String(MAXLEN = 12) [ SqlColumnNumber = 17, SqlFieldName = BLOOD_TYPE_NAME ];
Property MOBILE As %Library.String(MAXLEN = 50) [ SqlColumnNumber = 18 ];
Property MAILINGADDRESS As %Library.String(MAXLEN = 127) [ SqlColumnNumber = 19, SqlFieldName = MAILING_ADDRESS ];
Property VISITSERIALNO As %Library.String(MAXLEN = 64) [ SqlColumnNumber = 20, SqlFieldName = VISIT_SERIAL_NO ];
Property TABLEFLAG As %Library.String(MAXLEN = 1) [ SqlColumnNumber = 21, SqlFieldName = TABLE_FLAG ];
Property VISITTYPECODE As %Library.String(MAXLEN = 24) [ SqlColumnNumber = 22, SqlFieldName = VISIT_TYPE_CODE ];
Property VISITTYPENAME As %Library.String(MAXLEN = 64) [ SqlColumnNumber = 23, SqlFieldName = VISIT_TYPE_NAME ];
Property VISITDEPTCODE As %Library.String(MAXLEN = 32) [ SqlColumnNumber = 24, SqlFieldName = VISIT_DEPT_CODE ];
Property VISITDEPTNAME As %Library.String(MAXLEN = 64) [ SqlColumnNumber = 25, SqlFieldName = VISIT_DEPT_NAME ];
Property INOUTVISITNO As %Library.String(MAXLEN = 32) [ SqlColumnNumber = 26, SqlFieldName = IN_OUT_VISIT_NO ];
Property VISITTIME As %Library.TimeStamp [ SqlColumnNumber = 27, SqlFieldName = VISIT_TIME ];
Property DISCHARGETIME As %Library.TimeStamp [ SqlColumnNumber = 28, SqlFieldName = DISCHARGE_TIME ];
Property INHOSPITALTIME As %Library.TimeStamp [ SqlColumnNumber = 29, SqlFieldName = IN_HOSPITAL_TIME ];
Property ICDCODE As %Library.String(MAXLEN = 20) [ SqlColumnNumber = 30, SqlFieldName = ICD_CODE ];
Property ICDNAME As %Library.String(MAXLEN = 64) [ SqlColumnNumber = 31, SqlFieldName = ICD_NAME ];
Property HEALTHCARDNO As %Library.String(MAXLEN = 32) [ SqlColumnNumber = 32, SqlFieldName = HEALTH_CARD_NO ];
Property HEALTHCARDTYPE As %Library.String(MAXLEN = 64) [ SqlColumnNumber = 33, SqlFieldName = HEALTH_CARD_TYPE ];
Property SURGCODE As %Library.String(MAXLEN = 20) [ SqlColumnNumber = 34, SqlFieldName = SURG_CODE ];
Property SURGNAME As %Library.String(MAXLEN = 64) [ SqlColumnNumber = 35, SqlFieldName = SURG_NAME ];
Property AGE As %Library.String(MAXLEN = 10) [ SqlColumnNumber = 36 ];
Property DISEASECODE As %Library.Integer(MAXVAL = 2147483647, MINVAL = -2147483648) [ SqlColumnNumber = 37, SqlFieldName = DISEASE_CODE ];
Property DISEASENAME As %Library.String(MAXLEN = 64) [ SqlColumnNumber = 38, SqlFieldName = DISEASE_NAME ];
Property LastChangedTime As %Library.TimeStamp [ SqlColumnNumber = 39 ];
Property LastCreateTime As %Library.TimeStamp [ SqlColumnNumber = 40 ];
Property GROUPORG As %Library.String(MAXLEN = 50) [ SqlColumnNumber = 41 ];
Parameter USEEXTENTSET = 1;
/// Bitmap Extent Index auto-generated by DDL CREATE TABLE statement. Do not edit the SqlName of this index.
Index DDLBEIndex [ Extent, SqlName = "%%DDLBEIndex", Type = bitmap ];
Index IdxMapOrg On HQORGCODE [ SqlName = IdxMapOrg, Type = index ];
Index IdxVisitorg On (VISITSERIALNO, HQORGCODE) [ SqlName = IdxVisitorg, Type = index ];
}
Hi Stefan.
It is from an ad-hoc search page with 20+ possible conditions to combine. This case is one of the slowest SQL statements we have found. In the other cases, with more conditions acting as filters, the performance is OK.
I'll try whether a bitslice index works for this case.
Thanks.
Hi Stefan,
If I change
select distinct by (VISIT_SERIAL_NO,HQ_ORG_CODE) VISIT_SERIAL_NO,HQ_ORG_CODE
into
select distinct VISIT_SERIAL_NO,HQ_ORG_CODE
the resulting counts are quite different.
Actually, the second SQL is equivalent to the same query written with a GROUP BY clause.
It seems to be caused by the GROUP BY clause being resolved against a bitmap index; if I remove it, the result is consistent.
I tuned the table and rebuilt the indices, but the result is the same.
The query plans were attached as screenshots.
Yep, that will help them. Thanks Eduard.
Hi Mr. Zhou
As David already explained, license consumption increases dramatically once the threshold of 25 connections per user is exceeded. That is, if there are 26 connections from one user account, 26 license units are consumed instead of one.
Thus, if the web application is widely used by a group of users, it might be useful to investigate why one or two users are using more than 25 connections. Possible causes:
1. Incorrect handling of web sessions by the web application. If it does not release or reuse previous sessions and keeps creating new connections to the server within a short period, it may consume license units quickly.
2. 25 connections genuinely not being enough to handle all requests. In that case, the application may need more Caché accounts to connect to the server.
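While investigating, the %SYSTEM.License class can show how license units are being consumed per user; a minimal sketch (run in a terminal session):

```objectscript
// Print a summary of local license use, including per-user connection counts
Do $System.License.ShowSummary()
```

Comparing the summary before and after reproducing the spike should reveal which user account is accumulating connections.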
Hi Robert
So far we just bypass the issue by avoiding dynamic proxies. Thanks.
Thanks, yes, that's what they are doing now.
Thanks a lot.
Thanks, I'll see what can be done through it.
I forgot to mention that the client is using a core-based license, and this method returned zero. Are there other utilities we can use? Thanks.
This also does the job, thanks.
It will generate a GUID, but...
We may need to wrap it in a function and force the call with row dependency, as Robert mentioned above.
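A hedged sketch of that wrapping (the class, method, and SQL names are made up; the GUID call itself is $SYSTEM.Util.CreateGUID()):

```objectscript
Class Demo.Util.GUID Extends %RegisteredObject
{

/// Projected as the SQL function Demo_Util.ROW_GUID. Passing a
/// row-dependent argument (e.g. the row ID) forces the function to be
/// evaluated per row rather than once for the whole statement.
ClassMethod RowGUID(seed As %String = "") As %String [ SqlName = ROW_GUID, SqlProc ]
{
    Quit $SYSTEM.Util.CreateGUID()
}

}
```

Then, in SQL, something like `SELECT Demo_Util.ROW_GUID(ID) FROM MyTable` should yield a distinct GUID per row.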
Thanks.
It works, thank you very much Robert
Oh, missed one line of code.
Set confItem = ##Class(Ens.Config.Item).%OpenId(itemId)
Do prodConf.RemoveItem(confItem)
Set tSC = ##class(EnsPortal.Template.prodConfigSCPage).CallProductionUpdateAndSaveToClass(prodConf,"","SaveProduction")
Quit:$$$ISERR(tSC) tSC
Set tSC = ##class(Ens.Director).UpdateProduction()
Quit:$$$ISERR(tSC) tSC
Quit tSC
Just add
Set tSC = ##class(EnsPortal.Template.prodConfigSCPage).CallProductionUpdateAndSaveToClass(prodConf,"","SaveProduction")
Now it's working
Hi Denny,
I think the most common cause of a refused connection is the OS firewall blocking the IRIS superserver port, which JDBC connections use. You can try disabling the firewall, or open the superserver port in the firewall settings.
I recall that SqlDBx uses ODBC to connect to databases, so it will be using a different connection.
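As a quick check from an IRIS terminal on the client machine, you can try opening a TCP connection to the superserver port yourself (1972 is the default; substitute your instance's actual port and host); a minimal sketch:

```objectscript
// Try connecting to the superserver port with a 5-second timeout;
// $Test is 1 if the OPEN succeeded, 0 if it was refused or timed out
Set dev = "|TCP|check"
Open dev:("127.0.0.1":1972):5
If $Test { Write "port reachable",! Close dev }
Else { Write "connection refused or timed out",! }
```

If this fails while a local connection on the server itself succeeds, a firewall between the two is the likely culprit.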
Hi Sai,
From the error log you attached before:
ERROR <Ens>ErrTCPReadBlockSize: TCP Read(32000) with timeout period (5) failed with : (1144):l version="1.0" encoding="UTF-8"?> Wsma2Message xmlns:ns1="urn:hl7-org:v3" ......
It seems very likely that the sender and receiver are using different Charset or Endian settings (these settings govern counted-block transfers over TCP), so the receiver cannot reconstruct the XML string from the bytes in the stream.
Would you mind checking whether your BO and BS refer to the same Charset and Endian settings?
Hi Kevin,
I think it is better to refine the requirement into a few details, so that we can map each one to a set of configurations that achieves what you want.
1. The user may only access specific DBs instead of all DBs. Therefore, we will use a specific resource to protect each specific DB.
2. The user can use SQL from third-party tools to access the DBs through ODBC and JDBC connections; therefore, he needs not only DB access but also specific SQL privileges.
Reference: https://docs.intersystems.com/irisforhealthlatest/csp/docbook/Doc.View…
We can make the user read-only by assigning only the SELECT SQL privilege to him.
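For example, a grant along these lines (the user and schema names here are placeholders; check the GRANT reference for your version):

```sql
GRANT SELECT ON SCHEMA MySchema TO ReadOnlyUser
```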
We can also use the Management Portal to define the privileges (see the attached screenshot).
3. If the user should only use SQL and not work through the portal, the simplest way is not to assign roles such as %Operator or %Manager to the user. He can then still log in to the portal but cannot use the menus to perform actions. And we should never assign %All to a user expected to have limited privileges.
Hi, how about ticking the application option (see the attached screenshot)?
Hi
According to the document
https://docs.intersystems.com/healthconnectlatest/csp/docbook/DocBook.U…
the fourth parameter should be the application name, not the dispatch class name.
Hi Yufeng,
%SYSTEM.Encryption.RSASHASign might be useful. You may use the Bitlength argument to decide the length.
Hi, Dongsheng,
Setting the pool size can be a little tricky.
Planning the pool size before implementation is good, but it depends on whether we can predict a business host's actual throughput and performance.
In my own experience, the more effective way to use pool size is as a performance-tuning knob.
Say we deploy a business host and perform several rounds of test runs with actual or estimated loads. During the tests, we observe the queues. For example, if one queue grows much faster than the others, it usually means the underlying business host is a performance bottleneck, and we need to either optimize that business host or simply increase its pool size, as long as there are spare CPU cores to use.
Before a project goes live we usually try it out for a period of time, during which we gradually determine the pool size of each business host to achieve the required performance.
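For reference, a pool size can also be adjusted programmatically through the production configuration classes used earlier in this thread; a hedged sketch (the production and item names are made up):

```objectscript
// Open the production definition, locate the business host, and raise its pool size
Set prod = ##class(Ens.Config.Production).%OpenId("Demo.Production")   // hypothetical name
Set item = prod.FindItemByConfigName("Demo.SlowOperation")             // hypothetical name
Set item.PoolSize = 4
Set tSC = prod.%Save()
// The running production must then be updated for the change to take effect
```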
Hi Scott, I'm not sure whether I have captured your question exactly. If you want two or more fields to be taken together to compose a key field, it can be done by combining the fields into one "virtual" field and using that as the key field. For example:
SELECT HQ_ORG_CODE||'|'||ORDER_PROJ_CODE||'|'||FORM_DATE As OID,* FROM tableX WHERE FORM_DATE >= ?
In this SQL, we concatenate the three fields with the separator "|" into one "virtual" field, OID; we can then use OID directly as the key field because it exists in the result set. We can also use a SQL function to concatenate the fields, as below:
SELECT STRING(HQ_ORG_CODE,ORDER_PROJ_CODE,FORM_DATE) As OID,* FROM tableX
Hope that helps.