go to post Marc Mundt · Feb 13, 2020 Try changing EsquemasDatos.Gasometros.hl7.RSPK21.QUERYRESPONSE.CONTENT to extend %SerialObject:

Class EsquemasDatos.Gasometros.hl7.RSPK21.QUERYRESPONSE.CONTENT Extends (%SerialObject, %XML.Adaptor)
go to post Marc Mundt · Feb 12, 2020 The source for this JSON seems to think that "data" holds a string rather than an object. Still, we can convert that back to a proper object using %DynamicObject's %FromJSON() method:

USER>set myJSONObj={"app_id":"5cf57b56-c3b4-4a0d-8938-4ac4466f93af","headings":{"en":"Cita Atención Primaria","es":"Cita Atención Primaria"},"subtitle":{"en":"C.P. ISORA","es":"C.P. ISORA"},"contents":{"en":"Aqui el contenido del mensaje si aplicase","es":"Aqui el contenido del mensaje si aplicase"},"data":"{\"centro\":\"C.P. ISORA\",\"fecha\":\"yyy/mm/dd\",\"hora\":\"hh:mm\",\"profesional\":\"nombre del profesional\",\"nomUsuario\":\"nombre de usuario\",\"codcita\":\"idCita\",\"sepuedeborrar\":\"1\"}","include_player_ids":["c2917a6f-6ecf-4f45-8b31-9b72538580fd"]}

USER>write myJSONObj.data
{"centro":"C.P. ISORA","fecha":"yyy/mm/dd","hora":"hh:mm","profesional":"nombre del profesional","nomUsuario":"nombre de usuario","codcita":"idCita","sepuedeborrar":"1"}

USER>set dataObj=##class(%DynamicObject).%FromJSON(myJSONObj.data)

USER>write dataObj
11@%Library.DynamicObject

USER>set myJSONObj.data = dataObj

USER>write myJSONObj.%ToJSON()
{"app_id":"5cf57b56-c3b4-4a0d-8938-4ac4466f93af","headings":{"en":"Cita Atención Primaria","es":"Cita Atención Primaria"},"subtitle":{"en":"C.P. ISORA","es":"C.P. ISORA"},"contents":{"en":"Aqui el contenido del mensaje si aplicase","es":"Aqui el contenido del mensaje si aplicase"},"data":{"centro":"C.P. ISORA","fecha":"yyy/mm/dd","hora":"hh:mm","profesional":"nombre del profesional","nomUsuario":"nombre de usuario","codcita":"idCita","sepuedeborrar":"1"},"include_player_ids":["c2917a6f-6ecf-4f45-8b31-9b72538580fd"]}
go to post Marc Mundt · Feb 11, 2020 Yone, does Mensajes.Response.GestionPacientes.operacionResponse extend Ens.Request or %Persistent? Does it extend %XML.Adaptor? It would be helpful to see the code for Mensajes.Response.GestionPacientes.operacionResponse and the business process.
go to post Marc Mundt · Feb 11, 2020 The error suggests that the object received by the VDocRoutingEngine is not a VDoc. Try using EnsLib.MsgRouter.RoutingEngine instead.
go to post Marc Mundt · Feb 6, 2020 One possibility: you can handle the two files with two separate interfaces.

The interface for the tracking file would just load it into a record map and then stop. This has the effect of saving the information as a row in a database table.

The BPL for the PO file interface would query the tracking file record map table to find the relevant entry, open it, and add the necessary info to the PO.

To avoid timing issues where a PO file is processed before the corresponding tracking file, you could add a check in the BPL: if a row doesn't exist in the tracking table for this PO, delay for 5 seconds using the BPL "Delay" action and try again before failing with an error. A rough sketch of that lookup is below.
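To illustrate the lookup step, here is a minimal sketch of what a <code> activity in the PO interface's BPL might do. The record map class Demo.Map.Tracking.Record, its PONumber/SomeField columns, request.PONumber, and the context properties TrackingFound/TrackingInfo are all placeholder names, not anything generated for you:

 // Placeholder names throughout: adjust to your record map's generated table and columns.
 set tSQL = "SELECT ID FROM Demo_Map_Tracking.Record WHERE PONumber = ?"
 set tRS = ##class(%SQL.Statement).%ExecDirect(, tSQL, request.PONumber)
 if tRS.%Next() {
     // Found the tracking row: open it and copy what the PO needs
     set tTracking = ##class(Demo.Map.Tracking.Record).%OpenId(tRS.%Get("ID"))
     set context.TrackingInfo = tTracking.SomeField
     set context.TrackingFound = 1
 } else {
     // No tracking row yet: let the BPL <if>/<delay> branch wait and retry
     set context.TrackingFound = 0
 }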
go to post Marc Mundt · Feb 4, 2020 What happens if you remove "do result.%Display()"? I suspect that %Display() is iterating through the result set, so by the time it reaches "$$$LOGINFO("resultado siguiente: "_result.%Next())" it is already on the last row.
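As a quick illustration of iterating the rows yourself once %Display() is removed (assuming result is the %SQL.StatementResult from the query, and "Nombre" is just a stand-in for one of your column names):

 // Walk the result set once; each %Next() advances to the following row
 while result.%Next() {
     $$$LOGINFO("resultado siguiente: "_result.%Get("Nombre"))
 }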
go to post Marc Mundt · Jan 28, 2020 Apache Tika is another option. Without writing any code, it can be run from the command line and output an XLSX as a tab-separated file. https://tika.apache.org/

java -jar tika-app-1.23.jar -t sample.xlsx > sample.tsv
go to post Marc Mundt · Jan 28, 2020 You can use the same approaches that were suggested above, though you'll need to spend time figuring out the structure of the .xlsx file so you can write your extraction logic. To save the time/effort, the Apache POI project provides a Java library that can read/write MS Office formats, including XLSX:
https://poi.apache.org/
https://kalliphant.com/poi-convert-xlsx-to-csv-example/
go to post Marc Mundt · Jan 21, 2020 Have a look at the StayConnected setting under Connection Settings. By default it is set to -1, which means the adapter expects to always have an active connection and will throw an error if it doesn't. Setting StayConnected to 0 would mean the remote system can connect and disconnect as needed without triggering an error. From the documentation:

Stay Connected
Applies to all TCP adapters. If StayConnected is a positive value, the adapter stays connected to the remote system for this number of seconds between input events. A zero value means to disconnect immediately after every input event. The default of -1 means to stay permanently connected, even during idle times. Adapters are assumed idle at startup and therefore only auto-connect if they are configured with a StayConnected value of -1. The value of StayConnected controls how the TCP adapter treats disconnections. If StayConnected has a value of -1, the TCP adapter treats a disconnection as an error. If it has a value of 0 or a positive integer, the TCP adapter does not consider a disconnection an error.
go to post Marc Mundt · Jan 17, 2020 $TRANSLATE might be a possibility. It accepts a list of characters and replaces them either with other characters or just removes them. You could compare the length of the original column with the length of the column after using $TRANSLATE to remove illegal characters. For rows without illegal characters the length will match. This would identify rows that have tilde (~), pipe (|), or backtick (`) in MyField: SELECT * FROM MyTable WHERE CHAR_LENGTH($TRANSLATE(MyField,'~`|')) < CHAR_LENGTH(MyField) It's worth noting that a statement like this can't make use of indices, so it will have to scan every row in the table.
go to post Marc Mundt · Jan 10, 2020 As a next troubleshooting step, I would be tempted to take the BPL and JDBC Gateway out of the equation and test the stored procedure calls using the same JDBC driver from a Java-based SQL query tool such as Squirrel. It's not SQL Server specific, but this step-by-step walkthrough for using Squirrel to connect to Caché might save some time. [Edit: should have said "queries of the views" rather than "stored procedure calls"]
go to post Marc Mundt · Jan 10, 2020 It looks like that error message is coming directly from SQL Server and has something to do with the linked-server config within SQL Server: https://docs.microsoft.com/en-us/archive/blogs/mdegre/access-to-the-remo...
go to post Marc Mundt · Jan 8, 2020 Business rule classes store the rules as XML in an XData block with the name "RuleDefinition":

Class Demo.Rule.GenericRouter Extends Ens.Rule.Definition
{

Parameter RuleAssistClass = "EnsLib.MsgRouter.RuleAssist";

XData RuleDefinition [ XMLNamespace = "http://www.intersystems.com/rule" ]
{
<ruleDefinition alias="" context="EnsLib.MsgRouter.RoutingEngine" production="TESTINGPKG.FoundationProduction">
  <ruleSet name="" effectiveBegin="" effectiveEnd="">
    <rule name="">
      <when condition="1">
        <send transform="Demo.DTL.Generic" target="Test.DummyOperation"></send>
        <return></return>
      </when>
    </rule>
  </ruleSet>
</ruleDefinition>
}

}

One approach would be to find all of your business rule classes, retrieve the "RuleDefinition" XData block for each one, and then parse the XML to see which DTLs are called.

To find the business rule classes, have a look at %Dictionary.ClassDefinition. You could do a query like this to find your business rule classes:

select ID from %Dictionary.ClassDefinition where super='Ens.Rule.Definition'

Then, for each business rule class you can find the %Dictionary.XDataDefinition for the RuleDefinition XData block with a query like this:

select ID from %Dictionary.XDataDefinition where parent='Demo.Rule.GenericRouter' and name='RuleDefinition'

The raw XML from the XData block can be accessed via the stream object stored in the "Data" property of the %Dictionary.XDataDefinition row.
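As a rough sketch of tying those pieces together for a single rule class, here is one way to open the RuleDefinition XData block and scan its <send> elements with %XML.TextReader. The method name and the write output are only illustrative:

 // Hypothetical helper: list the DTLs referenced by one business rule class.
 // pRuleClass is a class that extends Ens.Rule.Definition, e.g. "Demo.Rule.GenericRouter".
 ClassMethod ListTransforms(pRuleClass As %String) As %Status
 {
     set tXData = ##class(%Dictionary.XDataDefinition).%OpenId(pRuleClass_"||RuleDefinition")
     if '$isobject(tXData) quit $$$ERROR($$$GeneralError,"No RuleDefinition XData in "_pRuleClass)
     // Parse the raw XML held in the Data stream
     set tSC = ##class(%XML.TextReader).ParseStream(tXData.Data,.tReader)
     quit:$$$ISERR(tSC) tSC
     while tReader.Read() {
         // <send> elements carry the DTL name in their transform attribute
         if (tReader.NodeType="element") && (tReader.LocalName="send") {
             if tReader.MoveToAttributeName("transform"), tReader.Value'="" {
                 write pRuleClass," uses transform ",tReader.Value,!
             }
         }
     }
     quit $$$OK
 }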
go to post Marc Mundt · Jan 8, 2020 You might be interested in the new Interface Maps feature launched in 2019.1. It, among other things, allows you to search for any business rules that refer to a chosen DTL. https://docs.intersystems.com/healthconnect20191/csp/docbook/Doc.View.cl...
go to post Marc Mundt · Jan 3, 2020 Have you considered using web services to exchange messages between the productions? This has the advantage of allowing the two productions to later be placed on separate instances. If you want to do it through the mapped DB, you could write a custom business operation to store the records in a custom table which is mapped into both namespaces, and then write a custom business service that polls that table for new entries.

"I have 2 productions A and B on the same IRIS instance sharing one operational database."

To be sure the two productions aren't conflicting with each other, I'm assuming that only custom tables/globals are mapped to the shared database and not any of the Ens* globals/tables/routines/packages.
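If you go the shared-table route, a polling business service could look roughly like the sketch below. This is only an outline under stated assumptions: Demo.Shared.Record with its Processed and Payload properties is hypothetical, the target name is a setting you would point at your router, and the base Ens.InboundAdapter is used only to get a periodic OnProcessInput call via its CallInterval setting:

 // Minimal sketch of a polling business service; all Demo.* names are placeholders.
 Class Demo.Service.SharedTablePoller Extends Ens.BusinessService
 {

 /// The base inbound adapter calls OnProcessInput once per CallInterval
 Parameter ADAPTER = "Ens.InboundAdapter";

 Property TargetConfigName As Ens.DataType.ConfigName;

 Parameter SETTINGS = "TargetConfigName:Basic";

 Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
 {
     set tSC = $$$OK
     // Find rows written by the other production's business operation and not yet handled
     set tRS = ##class(%SQL.Statement).%ExecDirect(,"SELECT ID FROM Demo_Shared.Record WHERE Processed = 0")
     while tRS.%Next() {
         set tRecord = ##class(Demo.Shared.Record).%OpenId(tRS.%Get("ID"))
         continue:'$isobject(tRecord)
         // Wrap the row's payload in a request message and hand it to the target
         set tReq = ##class(Ens.StringRequest).%New()
         set tReq.StringValue = tRecord.Payload
         set tSC = ..SendRequestAsync(..TargetConfigName, tReq)
         quit:$$$ISERR(tSC)
         // Mark the row as handled only after the send succeeded
         set tRecord.Processed = 1
         set tSC = tRecord.%Save()
         quit:$$$ISERR(tSC)
     }
     quit tSC
 }

 }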
go to post Marc Mundt · Jan 3, 2020 The error is complaining that the message doesn't have a doctype category. In the message trace for this test message, under "Body", what is the value of the "DocType" field?
go to post Marc Mundt · Jan 2, 2020 Are these standard HL7 batch headers and footers (BHS/BTS, FHS/FTS, etc.) or something custom to your organization?

If they are one of the HL7 standards, have a look at these docs on HL7 batches: https://docs.intersystems.com/healthconnect20191/csp/docbook/Doc.View.cl... One tricky part will be triggering when an old batch ends and a new batch begins -- this will depend on your local requirements.

If they are custom, there are a few approaches you can consider:

You could use the Record Mapper to define a record map class with header/footer and a single-field record for the HL7 content. You would then use one of the EnsLib.RecordMap.Operation.* classes instead of EnsLib.HL7.Operation.FileOperation: https://docs.intersystems.com/healthconnect20191/csp/docbook/DocBook.UI.... In particular, have a look at RolloverSchedule and/or RolloverLimit to control when a new batch file is created.

If you're comfortable creating a custom class that extends EnsLib.HL7.Operation.FileOperation, you could override the outputDocument method. In your custom version of outputDocument you could check if the file already exists using ##class(%File).Exists(pathToFile), and if it doesn't you would write out the footer to the previous file and the header to the new file before calling the standard version of outputDocument using ##super. A rough sketch of this is below. https://docs.intersystems.com/healthconnect20191/csp/docbook/Doc.View.cl...
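Here is a very rough sketch of that override. The outputDocument parameter list shown is an assumption -- copy the exact signature from EnsLib.HL7.Operation.FileOperation in your release before using it -- and the FHS/FTS segment strings and the LastFilename property are placeholders:

 Class Demo.HL7.BatchFileOperation Extends EnsLib.HL7.Operation.FileOperation
 {

 /// Remember the last file we wrote to so we can close it out with a footer
 Property LastFilename As %String;

 /// Parameter list assumed; verify against the base class in your release (some versions add arguments)
 Method outputDocument(pFilename As %String, pDocument As EnsLib.HL7.Message, pSeparators As %String) As %Status
 {
     // Full path is only needed for the existence check; the adapter methods take the bare filename
     set tFullPath = ##class(%File).NormalizeFilename(pFilename, ..Adapter.FilePath)
     if '##class(%File).Exists(tFullPath) {
         // Starting a new batch file: close out the previous one with a footer segment...
         if ..LastFilename'="" {
             set tSC = ..Adapter.PutString(..LastFilename, "FTS|1"_$char(13))   // placeholder footer
             quit:$$$ISERR(tSC) tSC
         }
         // ...and write a header segment at the top of the new file
         set tSC = ..Adapter.PutString(pFilename, "FHS|^~\&|"_$char(13))        // placeholder header
         quit:$$$ISERR(tSC) tSC
     }
     set ..LastFilename = pFilename
     // Let the standard operation append the HL7 message itself
     quit ##super(pFilename, pDocument, pSeparators)
 }

 }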
go to post Marc Mundt · Jan 2, 2020 A few questions: You didn't mention what format these files are in -- XML? flat-file/CSV? Something else? Which InterSystems product are you working with, and what kind of application is this (interoperability production, web service, etc.)?

If these are flat files or CSV, and if you're working with a Health Connect or IRIS interoperability production, you can look at using the Record Mapper, which will read a flat file using a format you define and allow you to work with records from the file as objects: https://docs.intersystems.com/healthconnect20191/csp/docbook/DocBook.UI....

If the files are XML, you can do something similar by importing an XSD and using XML virtual documents, or use %XML.Adaptor methods: https://docs.intersystems.com/healthconnect20191/csp/docbook/DocBook.UI....
go to post Marc Mundt · Dec 18, 2019 The ConfirmComplete setting in EnsLib.File.PassthroughService allows you to specify how the service decides if a file is complete before picking it up.
go to post Marc Mundt · Dec 4, 2019 Or something uglier...

set tDefined=$ISOBJECT(##class(Ens.Config.Production).%OpenId(##class(Ens.Director).GetActiveProductionName()).FindItemByConfigName(tHostName))