It looks like that error message is coming directly from SQL Server and has something to do with the linked-server config within SQL Server:
https://docs.microsoft.com/en-us/archive/blogs/mdegre/access-to-the-rem…
Business rule classes store the rules as XML in an XData block with the name "RuleDefinition":
Class Demo.Rule.GenericRouter Extends Ens.Rule.Definition
{
Parameter RuleAssistClass = "EnsLib.MsgRouter.RuleAssist";
XData RuleDefinition [ XMLNamespace = "http://www.intersystems.com/rule" ]
{
<ruleDefinition alias="" context="EnsLib.MsgRouter.RoutingEngine" production="TESTINGPKG.FoundationProduction">
<ruleSet name="" effectiveBegin="" effectiveEnd="">
<rule name="">
<when condition="1">
<send transform="Demo.DTL.Generic" target="Test.DummyOperation"></send>
<return></return>
</when>
</rule>
</ruleSet>
</ruleDefinition>
}
}
One approach would be to find all of your business rule classes, retrieve the "RuleDefinition" XData block for each one, and then parse the XML to see which DTLs are called.
To find the business rule classes, have a look at %Dictionary.ClassDefinition. You could do a query like this to find your business rule classes:
select ID from %Dictionary.ClassDefinition where super='Ens.Rule.Definition'
Then, for each business rule class you can find the %Dictionary.XDataDefinition for the RuleDefinition XData block with a query like this:
select ID from %Dictionary.XDataDefinition where parent='Demo.Rule.GenericRouter' and name='RuleDefinition'
The raw XML from the XData block can be accessed via the stream object stored in the "Data" property of the %Dictionary.XDataDefinition row.
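Putting those pieces together, here's a rough sketch of that approach. It just scans the XData stream for transform="..." attributes rather than doing a full XML parse, and the class/method names are placeholders, so treat it as a starting point:
ClassMethod ListRuleDTLs() As %Status
{
    set tStmt=##class(%SQL.Statement).%New()
    set tSC=tStmt.%Prepare("select ID from %Dictionary.ClassDefinition where super='Ens.Rule.Definition'")
    quit:$$$ISERR(tSC) tSC
    set tRS=tStmt.%Execute()
    while tRS.%Next() {
        set tClass=tRS.%Get("ID")
        // the ID of an XData row is "<class name>||<XData name>"
        set tXData=##class(%Dictionary.XDataDefinition).%OpenId(tClass_"||RuleDefinition")
        continue:'$ISOBJECT(tXData)
        do tXData.Data.Rewind()
        while 'tXData.Data.AtEnd {
            set tLine=tXData.Data.ReadLine()
            continue:tLine'["transform="""
            set tDTL=$PIECE($PIECE(tLine,"transform=""",2),"""",1)
            write:tDTL'="" tClass," -> ",tDTL,!
        }
    }
    quit $$$OK
}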
You might be interested in the new Interface Maps feature launched in 2019.1. It, among other things, allows you to search for any business rules that refer to a chosen DTL.
https://docs.intersystems.com/healthconnect20191/csp/docbook/Doc.View.c…
Have you considered using web services to exchange messages between the productions? This has the advantage of allowing the two productions to later be placed on separate instances.
If you want to do it through the mapped DB, you could write a custom business operation to store the records in a custom table which is mapped to from both namespaces and then write a custom business service that polls that table for new entries.
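If you go the mapped-table route, a minimal sketch of the polling side might look something like this. The table name Demo_Shared.QueueEntry, its Processed flag, and the use of Ens.StringContainer as the request message are all assumptions for illustration:
Class Demo.Service.SharedTablePoller Extends Ens.BusinessService
{

/// Uses the basic inbound adapter so OnProcessInput is called on a timer (CallInterval setting)
Parameter ADAPTER = "Ens.InboundAdapter";

Property TargetConfigName As Ens.DataType.ConfigName;

Parameter SETTINGS = "TargetConfigName:Basic";

Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
{
    set tSC=$$$OK
    // hypothetical shared table with a Processed flag; pick up one unprocessed row per poll
    &sql(SELECT TOP 1 ID INTO :tID FROM Demo_Shared.QueueEntry WHERE Processed=0)
    quit:SQLCODE'=0 tSC
    set tRequest=##class(Ens.StringContainer).%New(tID)
    set tSC=..SendRequestAsync(..TargetConfigName,tRequest)
    &sql(UPDATE Demo_Shared.QueueEntry SET Processed=1 WHERE ID=:tID)
    quit tSC
}

}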
I have 2 productions A and B on the same IRIS instance sharing one operational database.
To be sure the two productions aren't conflicting with each other, I'm assuming that only custom tables/globals are mapped to the shared database and not any of the Ens* globals/tables/routines/packages.
The error is complaining that the message doesn't have a doctype category.
In the message trace for this test message, under "Body", what is the value of the "DocType" field?
Are these standard HL7 batch headers and footers (BHS/BTS, FHS/FTS, etc) or something custom to your organization?
If they are one of the HL7 standards, have a look at these docs on HL7 batches:
https://docs.intersystems.com/healthconnect20191/csp/docbook/Doc.View.c…
One tricky part will be triggering when an old batch ends/new batch begins -- this will depend on your local requirements.
If they are custom, there are a few approaches you can consider:
A few questions:
If these are flat-files or CSV, and if you're working with a Health Connect or IRIS interoperability production, you can look at using the Record Mapper, which will read a flat file using a format you define and allow you to work with records from the file as objects:
https://docs.intersystems.com/healthconnect20191/csp/docbook/DocBook.UI…
If the files are XML you can do something similar by importing an XSD and using XML virtual documents, or use %XML.Adaptor methods:
https://docs.intersystems.com/healthconnect20191/csp/docbook/DocBook.UI…
The ConfirmComplete setting in EnsLib.File.PassthroughService allows you to specify how the service decides if a file is complete before picking it up.
Or something uglier...
set tDefined=$ISOBJECT(##class(Ens.Config.Production).%OpenId(##class(Ens.Director).GetActiveProductionName()).FindItemByConfigName(tHostName))
How about:
&sql(select count(*) into :tDefined from ENS_Config.Item where Name=:tHostName and Production=:tProductionName)
From a custom ObjectScript Business Process, you would use ..SendRequestSync or ..SendRequestAsync to make calls to other components. These will do the same thing as <CALL> in a BPL.
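For example, inside an Ens.BusinessProcess subclass (the target name and message variables here are placeholders):
// synchronous call: waits for the response from the target before continuing
set tSC=..SendRequestSync("Some.Target.Operation",tRequest,.tResponse)
// asynchronous call: queues the request and returns immediately
set tSC=..SendRequestAsync("Some.Target.Operation",tRequest)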
Ok, EnsLib.REST.GenericOperation also expects to receive an EnsLib.HTTP.GenericMessage.
Are you using EnsLib.File.PassthroughService to pick up the files? If so, it is sending an Ens.StreamContainer message to the target component (your business operation). You'll need to create a data transformation that creates a new EnsLib.HTTP.GenericMessage and populates it with the stream content from the Ens.StreamContainer. Then you'll need a router in the middle to run the data transformation and send the resulting EnsLib.HTTP.GenericMessage to the business operation.
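If you'd rather build the message in code (for example in a custom business process or a DTL code action) rather than with assign actions, something along these lines should be close. The variable names and the Content-Type value are assumptions; adjust them for your content:
// wrap the stream from the incoming Ens.StreamContainer in an HTTP generic message
set tHttp=##class(EnsLib.HTTP.GenericMessage).%New()
do tHttp.StreamSet(pStreamContainer.Stream)
// hypothetical content type -- set whatever your REST endpoint expects
do tHttp.HTTPHeaders.SetAt("application/json","Content-Type")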
What type of object is your business process sending to your business operation?
EnsLib.HTTP.GenericOperation expects to receive a message of type EnsLib.HTTP.GenericMessage.
Thanks Anton. Maybe I didn't dig deep enough in that github link but it looks like that is for AWS, not Azure?
Neerav, I'll just note that if you use Jenna's suggestion your code doesn't need to be a part of a business service -- it can be any normal ObjectScript code and can be called from anywhere (CSP, scheduled task, etc.). And your code can make as many calls as it needs to and can receive response messages.
It will appear in message traces as if a business service sent requests to the process or operation, but in reality it's just your code sending the requests.
Jenna's approach is a good one, and it's the standard way to achieve what you describe. Having said that, if you can provide more details on your use case and/or why this approach doesn't fit your need we can help you explore alternatives.
Just wondering if anyone has put together similar examples for Azure -- preferably for uploading Azure Blobs.
Agreed, if it's a simple protocol like ASTM that would be a better option.
I was assuming that the device is using something complex like SCSI-over-USB (as scanners do), or some proprietary protocol based on USB bulk transfers -- something that requires a driver on the PC.
My understanding is that the Caché application is terminal based (accessed on the client PC through telnet).
The signature device connects to the client PC using USB.
So the question/challenge is for the terminal-based application to send a request to the signature device when a signature is needed and to receive the signature image.
Seems like you need a daemon running on the PC which uses the signature device's SDK to communicate with it. The daemon would then exchange messages with Caché to manage the signature process -- possibly via web services or web sockets.
I would double-check that a TargetConfig in the new service is set to the correct router. After that, I'd check the event log and see if there are any errors logged.
Which business service are you using to read the file? Are you using EnsLib.RecordMap.Service.FileService and specifying the record map type in the configuration?
It sounds like you might be using EnsLib.File.PassthroughService
Just wanted to point out that concatenating columns in the WHERE clause forces the database to examine every single row, which means slow performance if you have a large number of rows. It won't be able to use the indices on FirstName or LastName to improve performance.
Can you just check against FirstName and LastName directly?
There is a simple explanation!
Your output will have two columns named Category, because you've specified it explicitly and it's also included in "*". So the query engine doesn't know which of these two columns you're asking to sort by.
Either remove the explicit specification for Category, or give it a unique name:
select category as Category2 ...
You could use Apache's mod_rewrite to take all requests that fall under a certain sub-path and transform them on the fly to point to your CSP page. It could add the information about which specific page/resource was requested as a URL parameter that could be accessed by your CSP page.
For example, if a client makes a request for:
http://myserver/reporting/Dashboard1/resource2.js
mod_rewrite could change it to:
http://myserver/csp/dashboards/proxy.csp?targetResource=Dashboard1/reso…
After mod_rewrite changes the URL in the request, Apache continues processing it as usual using the new URL. Since the new URL refers to a CSP page Apache will pass it to CSP Gateway as we want.
The example I sent used the syntax for DTLs -- it isn't standard ObjectScript syntax. However, you can use the GetValueAt method of EnsLib.HL7.Message and pass it that same syntax and it will work the same way.
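For example, assuming tMsg holds an EnsLib.HL7.Message:
// count the OBX segments, then read OBX:5 of the last one
set tLastOBX=tMsg.GetValueAt("OBX(*)")
set tValue=tMsg.GetValueAt("OBX("_tLastOBX_"):5")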
Yes, you can do this using the same approach.
You can set a temp variable:
lastOBX = source.{OBX("*")}
With your sample message, lastOBX would now contain the number 2. Then you can use the variable lastOBX to refer to the last OBX segment.
To access OBX:5 in the last OBX it would look like this:
source.{OBX(lastOBX):5}
Have a look at this documentation on Virtual Property Path syntax.
Essentially, if you specify an asterisk "*" inside the parentheses for a repeating segment it will return a count of how many of those segments exist in the document.
<assign value='source.{PID:3("*")}' property='pid3Count'/>
In this example, the number of repetitions that exist in PID:3 will be stored in the variable pid3Count.
You could then use source.{PID:3(pid3Count)} to refer to the last item in PID:3.
But a clever one :)
Here's a stored procedure that accepts a setting name and returns the setting value for all components that have it. It's not SQL, but can be executed from SQL :)
You can call it this way -- this example returns the port setting for all components that have it:
call Sample.Util_SettingsByName('Port')
Here's the source code in XML export format. Copy this into a file and then import it using Studio, terminal, or the System Management Portal.
<?xml version="1.0" encoding="UTF-8"?>
<Export generator="Cache" version="25" zv="Cache for Windows (x86-64) 2016.1.1 (Build 108U)" ts="2016-10-12 16:15:39">
<Class name="Sample.Util">
<Super>%RegisteredObject</Super>
<TimeChanged>64203,58380.929948</TimeChanged>
<TimeCreated>64202,44682.614801</TimeCreated>
<UDLText name="T">
<Content><![CDATA[
/*
*****************************************************
* ** N O T I C E ** *
* - TEST/DEMO SOFTWARE - *
* This and related items are not supported by *
* InterSystems as part of any released product. *
* It is supplied by InterSystems as a demo/test *
* tool for a specific product and version. *
* The user or customer is fully responsible for *
* the maintenance of this software after delivery, *
* and InterSystems shall bear no responsibility nor *
* liabilities for errors or misuse of this item. *
* *
*****************************************************
*/
]]></Content>
</UDLText>
<Query name="SettingsByName">
<Type>%Query</Type>
<FormalSpec>SettingName:%String</FormalSpec>
<SqlProc>1</SqlProc>
<Parameter name="ROWSPEC" value="BusinessHost:%String,SettingName:%String,SettingValue:%String"/>
</Query>
<Method name="SettingsByNameExecute">
<ClassMethod>1</ClassMethod>
<FormalSpec><![CDATA[&qHandle:%Binary,SettingNames:%String=""]]></FormalSpec>
<ReturnType>%Status</ReturnType>
<Implementation><![CDATA[
s qHandle=##class(%ArrayOfObjects).%New()
&sql(select %DLIST(id) into :tHostIDs from ENS_Config.Item order by Name desc)
s tHostIDList=##class(%Library.ListOfDataTypes).%New()
s tSC=tHostIDList.InsertList(tHostIDs)
s tSC=qHandle.SetAt(tHostIDList,"HostIDs")
s tSC=qHandle.SetAt(##class(%ArrayOfDataTypes).%New(),"Counters")
s tSC=qHandle.GetAt("Counters").SetAt(0,"CurrHost")
s tSC=qHandle.GetAt("Counters").SetAt(0,"CurrSetting")
if ($L(SettingNames)>1) {
s SettingNames=$ZCONVERT(SettingNames,"U")
s tFilterList=##class(%Library.ListOfDataTypes).%New()
s tSC=tFilterList.InsertList($LISTFROMSTRING(SettingNames))
s tSC=qHandle.SetAt(tFilterList,"FilterList")
}
Quit $$$OK
]]></Implementation>
</Method>
<Method name="SettingsByNameClose">
<ClassMethod>1</ClassMethod>
<FormalSpec><![CDATA[&qHandle:%Binary]]></FormalSpec>
<PlaceAfter>SettingsByNameExecute</PlaceAfter>
<ReturnType>%Status</ReturnType>
<Implementation><![CDATA[ Quit $$$OK
]]></Implementation>
</Method>
<Method name="SettingsByNameFetch">
<ClassMethod>1</ClassMethod>
<FormalSpec><![CDATA[&qHandle:%Binary,&Row:%List,&AtEnd:%Integer=0]]></FormalSpec>
<PlaceAfter>SettingsByNameExecute</PlaceAfter>
<ReturnType>%Status</ReturnType>
<Implementation><![CDATA[
s tCurrHost=qHandle.GetAt("Counters").GetAt("CurrHost")
s tCurrSetting=qHandle.GetAt("Counters").GetAt("CurrSetting")
s tHostIDs=qHandle.GetAt("HostIDs")
s tFilterList=qHandle.GetAt("FilterList")
s oHost=qHandle.GetAt("Host")
do {
if ('$IsObject(oHost)||(oHost.VirtualSettings.Count()<tCurrSetting)) {
if (tCurrHost=tHostIDs.Count()) {
s AtEnd=1
q
}
s tCurrHost=tCurrHost+1
s tCurrSetting=1
s tHostID=tHostIDs.GetAt(tCurrHost)
s oHost=##class(Ens.Config.Item).%OpenId(tHostID,0)
s tSC=oHost.PopulateVirtualSettings()
s tSC=qHandle.SetAt(oHost,"Host")
s tSC=qHandle.GetAt("Counters").SetAt(tCurrHost,"CurrHost")
}
s tSettings=oHost.VirtualSettings
s tSetting=tSettings.GetAt(tCurrSetting)
s tStngName=$LISTGET(tSetting,2)
s tStngValue=$LISTGET(tSetting,3)
s tCurrSetting=tCurrSetting+1
} while ($IsObject(tFilterList)&&('tFilterList.Find($ZCONVERT(tStngName,"U"))))
if ('AtEnd) {
s Row=$LB(oHost.Name,tStngName,tStngValue)
}
s tSC=qHandle.GetAt("Counters").SetAt(tCurrSetting,"CurrSetting")
Quit $$$OK
]]></Implementation>
</Method>
</Class>
</Export>
Ens.StreamContainer's %New() method expects a string as the first parameter rather than a stream object.
Something like this should work:
set tRequest=##class(Ens.StreamContainer).%New()
set tSC=tRequest.StreamSet(pInput)
Or if you're trying to send one Ens.StreamContainer for each line from the input file you could do this:
while 'pInput.AtEnd {
set tReadLength=32000
set tLine=pInput.ReadLine(.tReadLength,.tSC)
set tRequest=##class(Ens.StreamContainer).%New(tLine)
//... do other stuff
}
One other thing you should be aware of. The following will not work if TargetConfigNames has more than one target selected:
set tSC = ..SendRequestAsync(..TargetConfigNames,tRequest)
You should add a loop using $LENGTH and $PIECE and do a SendRequestAsync for each item in TargetConfigNames' comma-separated string.
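For example, something like this in place of the single call (a minimal sketch; tRequest is whatever request message you built above):
// send the request to every target in the comma-separated TargetConfigNames setting
for i=1:1:$LENGTH(..TargetConfigNames,",") {
    set tTarget=$ZSTRIP($PIECE(..TargetConfigNames,",",i),"<>W")
    continue:tTarget=""
    set tSC=..SendRequestAsync(tTarget,tRequest)
    quit:$$$ISERR(tSC)
}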
The resulting XML files can be imported again using $SYSTEM.OBJ.Load().
If you prefer GOF format you can use %Global.Export() instead, however it doesn't accept wildcards so you would need to first put together a list of which globals you want to export.
For automation you can execute these methods from your own custom class or routine. If you want to schedule it to run automatically, you can create your custom class as a %SYS.Task.Definition and schedule it to run using Task Manager.
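A minimal sketch of such a task class, assuming you want the wildcard XML export; the global pattern and output path are placeholders:
Class Demo.Task.ExportGlobals Extends %SYS.Task.Definition
{

/// Runs each time Task Manager triggers the task
Method OnTask() As %Status
{
    // the ".GBL" suffix tells $SYSTEM.OBJ.Export to treat these names as globals; wildcards are allowed
    quit $SYSTEM.OBJ.Export("MyApp.Data*.GBL","/backups/globals.xml")
}

}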
You're correct -- you can't restore specific globals from an Online Backup (.cbk) file.