go to post Alex Woodhead · Apr 25, 2023
If the original install media is not available, there are currently non-Docker software distributions of Cache 2018.1 available via wrc.intersystems.com. Your customer's business may have a license and support account with InterSystems to log on and download the install media.
Is there a "*.cpf" configuration file available in the backup or in the original server install folder? This will include the configuration of the namespaces and databases involved, and how they are mapped for data and "table" definitions.
ie: A CBK is a single file containing a consistent backup of multiple databases, each of which would be restored to an equivalent, distinct database file location.
Anticipate that the OS architecture may need to be the same. ie: If it was backed up on Windows, then it would be restored to a different install that is also on Windows.
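If the ".cbk" route is taken, a minimal sketch of restoring it from the Terminal (the exact prompts vary by Caché version, and paths are chosen interactively):

// Run on the target install (same OS family as the backup source).
// The interactive restore utility lives in the %SYS namespace and will
// prompt for the .cbk file and the target directory for each database.
ZN "%SYS"
DO ^DBREST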
go to post Alex Woodhead · Apr 19, 2023
Excuse the typo: Should be "$C(10)" not "%C(10)"
set:ln[$C(10) base64.LineTerminator=$C(10)
go to post Alex Woodhead · Apr 19, 2023
Hi Scott,
Am wondering whether this is due to migrating from AIX to RedHat, and the stream defaulting to expecting a different line terminator, $C(13,10). Could try adding before the while loop:

set ln=base64.ReadLine()
set:ln[$C(10) base64.LineTerminator=$C(10)
do base64.Rewind()

Kind regards,
Alex
go to post Alex Woodhead · Apr 19, 2023
There is:
##class(%MessageDictionary).FormatText("Hello %1", "World")
Also, "Include %occMessages" at the top of your class will give access to macros:
* FormatText
* FormatTextHTML
* FormatTextJS
Both approaches work for the indeterminate number of args requirement.
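For example, a small sketch using the macro form (the class and method names are only illustrative):

Include %occMessages

Class Demo.Format [ Abstract ]
{

ClassMethod Greet(name As %String) As %String
{
	// %1, %2, ... are replaced in order by the trailing arguments
	quit $$$FormatText("Hello %1, today is %2",name,$ZDATE($HOROLOG))
}

}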
go to post Alex Woodhead · Apr 19, 2023
In the past have used Apache Batik https://xmlgraphics.apache.org/batik/
The scenario was that a message contained a diagnostic background image with layered SVG annotations, but the receiving system needed to receive a flattened PNG image instead. It can also generate TIFF or JPEG.
Batik is a Java-based integration, so it depends on whether integrating IRIS with Java is an option for your deployment.
Interesting if someone suggests a reliable Python alternative. Other directions might be PhantomJS / Node.js or ImageMagick.
go to post Alex Woodhead · Apr 18, 2023
Hi Pravin,
The following pattern could work for you. Override %JSONImport and add a callback with a single-line edit, then maintain the custom logic in the supporting callback. You will need to track behavior changes to %JSONImport when upgrading.

Method OnJSONImport(%JSONObject As %Library.DynamicAbstractObject)
{
	set %JSONObject.Version=""_%JSONObject.Version
}

/// %JSONImport imports JSON or dynamic object input into this object.<br>
/// The input argument is either JSON as a string or stream, or a subclass of %DynamicAbstractObject.<br>
/// mappingName is the name of the mapping to use for the import. The base mapping is represented by "" and is the default.
Method %JSONImport(input, %mappingName As %String = "") As %Status [ ServerOnly = 1 ]
{
	Try {
		Set sc=$$$OK
		New %JSONObject
		If $isobject(input),input.%IsA("%Library.DynamicAbstractObject") {
			// Already a dynamic object
			Set %JSONObject=input
		} Else {
			// A JSON stream or string
			Set %JSONObject=##class(%Library.DynamicAbstractObject).%FromJSON(input)
		}
		// Add callback
		Do ..OnJSONImport(%JSONObject)
		// Do the import now.
		Set sc=..%JSONImportInternal()
	} Catch ex {
		Set sc=ex.AsStatus()
	}
	Quit sc
}
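A usage sketch, assuming the methods above live in a hypothetical %JSON.Adaptor subclass MyApp.Invoice with a Version property of type %String; the callback is the hook for massaging the dynamic object (here coercing Version to a string) before the standard import runs:

set invoice=##class(MyApp.Invoice).%New()
set sc=invoice.%JSONImport({"Version":(1.20)})
if 'sc { do $system.Status.DisplayError(sc) }
write !,invoice.Version   // Version arrived as a JSON number, stored as the string "1.2"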
go to post Alex Woodhead · Apr 17, 2023
Hi Thembelani,
This site requires access over HTTPS. I created a TLS configuration in Management Portal with defaults, called "Open". Then the following works for download:

Set httpRequest = ##class(%Net.HttpRequest).%New()
Set httpRequest.Server = "msedgedriver.azureedge.net"
Set httpRequest.Port=443
Set httpRequest.ContentType = "application/octet-stream"
Set httpRequest.SSLConfiguration="Open"
Set httpRequest.Https=1
Do httpRequest.Get("/113.0.1774.0/edgedriver_win64.zip")
Set fileName = "tryedgedriver_win64.zip"
Set fileStream = ##class(%Stream.FileBinary).%New()
Set fileStream.Filename = "C:\tmp\"_fileName
Do fileStream.CopyFrom(httpRequest.HttpResponse.Data)
Do fileStream.%Save()
Do fileStream.%Close()

Kind regards,
Alex
go to post Alex Woodhead · Mar 6, 2023
I experimented with making a CSP framework for using embedded Python. https://github.com/alexatwoodhead/PyNow
It has some utility code that converts the %request arguments and presents them as a dictionary to the Python method. It also shows using XData blocks as templates, with some Excel-function and Django-like expression support. Might find something reusable in there.
go to post Alex Woodhead · Mar 6, 2023
Some ideas for an Operation:
* Expose the comma-delimited headings so they are configurable via a new property and settings.
* Toggle the overwrite/append mode; if set to overwrite, ensure the heading is added to new files.
So the code might be something like:

/// The type of adapter used to communicate with external systems
Parameter ADAPTER = "EnsLib.File.OutboundAdapter";

Property Adapter As EnsLib.File.OutboundAdapter;

Property CSVHeaderRecord As %String;

Parameter SETTINGS = "CSVHeaderRecord:Basic";

Method TestWrite(pRequest As Ens.StringContainer, pResponse As Ens.StringContainer) As %Status
{
	set filename="OutputTest.csv"
	set originalOverwrite=..Adapter.Overwrite
	if ""'=..CSVHeaderRecord {
		if originalOverwrite {
			set tSC=..Adapter.PutLine(filename,..CSVHeaderRecord)
			set ..Adapter.Overwrite=0
		} else {
			if '##class(%File).Exists(..Adapter.fixPath(..Adapter.FilePath)_filename) {
				set tSC=..Adapter.PutLine(filename,..CSVHeaderRecord)
				set ..Adapter.Overwrite=0
			}
		}
	}
	set tSC=..Adapter.PutLine(filename,pRequest.StringValue)
	set ..Adapter.Overwrite=originalOverwrite
	Quit $$$OK
}

For updating existing files using streams (may need to consider stream line terminators and character encoding):

ClassMethod AddCSVHeader(filepath = "", headerText = "")
{
	set tmpStream=##class(%Stream.TmpCharacter).%New()
	set updateStream=##class(%Stream.FileCharacter).%New()
	set tSC=updateStream.LinkToFile(filepath)
	set tSC=tmpStream.CopyFrom(updateStream)
	do updateStream.Rewind()
	do tmpStream.Rewind()
	do updateStream.WriteLine(headerText)
	do updateStream.CopyFrom(tmpStream)
	do updateStream.Flush()
	do updateStream.%Save()
}
go to post Alex Woodhead · Mar 5, 2023
Hi Alexey,
Atomic read of a single global node. In the question's example, the reference table can use a single global node. Use cases are lookup tables and application code tables (SELECT %NOLOCK ...).
Conversely, for reading complex records spanning multiple global nodes, then yes, the lock is needed for a consistent full read:
1) During the full read
2) Just before the commit of the transaction from the updating process
However, this is also much less than blocking for the full 30 seconds described in the original scenario.
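A rough sketch of the two cases, using placeholder globals (^MyLookup for a single-node code table, ^MyRecord for a record spanning two nodes):

// Single node: the read is atomic, no LOCK required
set value=$GET(^MyLookup("CODE123"))

// Multiple nodes: take a shared lock (with timeout) so both nodes
// come from the same committed version of the record
lock +^MyRecord(42)#"S":5
if $TEST {
	set header=$GET(^MyRecord(42,"Header"))
	set detail=$GET(^MyRecord(42,"Detail"))
	lock -^MyRecord(42)#"S"
}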
go to post Alex Woodhead · Mar 4, 2023
Hi Alexey,
Thanks for the question on clarification. The suggestion of the transaction was to avoid the need for read processes to lock and block. Yes, it makes sense to have a lock, with a timeout, and a success test on the update process if there is a logical chance of multiple operations updating the reference table.
Hope that makes sense.
go to post Alex Woodhead · Mar 4, 2023
Hi Prashanth,
This scenario reminded me of some experiences with dynamically loading data in previous jobs, so I will make a suggestion to consider whether "utility method waiting" is the best approach.
At the beginning of the reload of the reference table you could use "TSTART". Then at the end of the reload of the reference table you could use "TCOMMIT" (a minimal sketch follows after this post). This means ALL other processes accessing the reference table get a consistent Version1, or then Version2, of the reference table. This avoids the need to wait with locks / blocking, as other processes don't need to wait for the loading process to complete, or even be aware of when the reloading process is working.
One other suggestion (which you may already have undertaken): if you delete ALL the reference data and then add it ALL back for a small number of record changes, it can cause unnecessary activity in the Journals. Instead, if you only remove unwanted records, add new records and update changed records, this can minimize activity in the Journals for the reference table. ie: Where most reference table records are unchanged you could get better performance.
Hope this helps.
Cheers,
Alex
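A minimal sketch of the reload wrapped in a transaction, assuming placeholder globals ^MyRefTable for the reference data and ^MyStagingData for the incoming version:

ClassMethod ReloadReferenceTable() As %Status
{
	set sc=$$$OK
	TSTART
	try {
		// Apply the changes inside one transaction so readers see
		// either the old version or the new version, never a mix.
		// Touching only changed records keeps Journal activity low.
		set code=""
		for {
			set code=$ORDER(^MyStagingData(code),1,description)
			quit:code=""
			set:$GET(^MyRefTable(code))'=description ^MyRefTable(code)=description
		}
		TCOMMIT
	} catch ex {
		TROLLBACK
		set sc=ex.AsStatus()
	}
	quit sc
}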
go to post Alex Woodhead · Mar 3, 2023
I made a utility for converting between Cache Lists and Arrays and Python Lists and Dictionaries. Bi-directional. It ensures the keys are strings when going from Array keys to Dictionary keys.
https://github.com/alexatwoodhead/PyNow/blob/main/Py.Helper.xml
Also have an approach for storing (pickle) and retrieving Numpy arrays in IRIS Property Streams, if that is of use.
go to post Alex Woodhead · Feb 21, 2023
Hi Peter,
Some documentation areas that may help:
Using the FTP Inbound Adapter (IRIS Integration)
https://docs.intersystems.com/irisforhealth20223/csp/docbook/DocBook.UI....
Using SSH -> SFTP (IRIS)
https://docs.intersystems.com/csp/docbook/DocBook.UI.Page.cls?FIND=CLASS...
Cheers,
Alex
go to post Alex Woodhead · Feb 20, 2023

// Hugging the Globals for a fast ride
set tableName="MyTableName"
set key=""
for {
	set key=$Order(^Ens.LookupTable(tableName,key),1,value)
	quit:key=""
	Write !,key,"=",value
}

// As export string method with delimiters ":" and ";"
ClassMethod ExportTableAsString(tableName) As %String
{
	set (ret,key)=""
	for {
		set key=$Order(^Ens.LookupTable(tableName,key),1,value)
		quit:key=""
		set ret=ret_key_":"_value_";"
	}
	quit $E(ret,1,*-1)
}

// As export array method
ClassMethod ExportTableAsArray(tableName, Output ary)
{
	kill ary
	merge ary=^Ens.LookupTable(tableName)
}
----------------
do ##class(Someclass).ExportTableAsArray("MyTable", .ary)

// If you want to iterate over a slice of keys, for example all keys prefixed with "Q"
set tableName="MyTableName"
set (key,prefix)="Q"
for {
	set key=$Order(^Ens.LookupTable(tableName,key),1,value)
	quit:key=""
	quit:prefix'=$E(key)
	Write !,key,"=",value
}

// To confirm existence of a key in a lookup table
if $Data(^Ens.LookupTable(tableName,someKey)) {
	Write !,someKey,"=",^Ens.LookupTable(tableName,someKey)
}

// To dump the lookup table to terminal
zwrite ^Ens.LookupTable(tableName)

The top node of the LookupTable has a timestamp, so it is possible you could use this to set up / invalidate an external cache of lookup values (a small sketch follows below).
Hope this gives some ideas.
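A small sketch of using that top-node timestamp to refresh a cache; the process-private global ^||MyCache is only an illustrative cache structure:

set currentStamp=$GET(^Ens.LookupTable(tableName))
if $GET(^||MyCache(tableName,"stamp"))'=currentStamp {
	// Timestamp changed: rebuild the local copy of the lookup table
	kill ^||MyCache(tableName,"data")
	merge ^||MyCache(tableName,"data")=^Ens.LookupTable(tableName)
	set ^||MyCache(tableName,"stamp")=currentStamp
}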
go to post Alex Woodhead · Feb 16, 2023
Ompare can also be run against exports that have been re-imported into clean namespaces for profiling locally. It won't help for comparing namespace mappings or SQL, but it could still do the Lookup Table differencing for you. Either GIT or Ompare for the actual diff is a reasonable choice.
With regard to achieving an export, I would suggest using Studio on the target Ensemble instance to create a "Project" for exporting. Then add the respective Classes, Routines, HL7 schemas, Include files, Rules and any other components.
For lookup tables, if the version is too early for visibility of "LUT" types, these can be exported by manually adding, for example:
Ens.LookupTable("MyTableName").GBL
to the project, for each lookup table.
If you keep the project on the server, it is quicker and less error prone to add items to the existing export project next time. When you import with Studio into the reporting instance / git instance, you will have the opportunity to double-check which items you actually want to import.
Hope that helps.
go to post Alex Woodhead · Feb 15, 2023
Hi Colin,
I created a utility a few years back which I have open sourced at: https://openexchange.intersystems.com/package/ompare
It will show significant differences in Classes, Routines, Lookup Tables, HL7 schemas, Include files and Namespace mappings. ie: It is clever enough to ignore non-functional differences, for example comment lines, or the order of routine line labels and method definitions. This helps a lot for uncontrolled environments where updates have been integrated manually, in a different order.
The SQL profile can help diff anything with a SQL projection, so it can include: Production settings, Scheduled Tasks, CPF, etc.
It works with flat-file exports and imports, so systems can be disconnected / on different networks. It will happily diff 5 or 10 systems side-by-side.
The intention is for a client install that only includes the profilers and schedule, meaning the reporting-side web pages do not need to be installed on servers that are only being profiled for difference signatures. If needed, it can perform code diffs just via the thumb-prints, without exporting actual code, which is useful for secured code scenarios.
Also, if looking to upgrade Ensemble, there is a zero-install version of the same compare tool that might highlight some useful platform differences: https://openexchange.intersystems.com/package/Ompare-V8
Cheers,
Alex