We are using a DTL transformation to take HL7 and transform it into custom XML (the XML is a virtual document, held in an EnsLib.EDI.XML.Document object). The schema specifying the format of the XML says one element should occur no more than 24 times (maxOccurs="24" in the XSD schema). However, when tested stand-alone, the transformation always produces all 24 of these elements, with all but the last one blank.
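In case it clarifies the setup, this is a sketch of the pattern involved (the source path PID:3 and target path Items.Item() are hypothetical stand-ins for the real ones). Assigning directly to a high index such as Item(24) on a virtual document can auto-create the empty preceding siblings, so the sketch keeps its own counter and only assigns non-empty repetitions:

    <assign property='cnt' value='0' action='set'/>
    <foreach property='source.{PID:3()}' key='k'>
      <if condition='source.{PID:3(k)}&apos;=""'>
        <true>
          <assign property='cnt' value='cnt+1' action='set'/>
          <assign property='target.{Items.Item(cnt)}' value='source.{PID:3(k)}' action='set'/>
        </true>
      </if>
    </foreach>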
https://www.youtube.com/embed/3PBqQwOn7rs
Profiling CCD Documents with LEAD North’s CCD Data Profiler

Ever opened a CCD and been greeted by a wall of tangled XML? You’re not alone. Despite being a core format for clinical data exchange, CCDs are notoriously dense, verbose, and unfriendly to the human eye. For developers and analysts trying to validate their structure or extract meaningful insights, navigating these documents can feel more like archaeology than engineering.
A WSDL for a CDC vendor was provided at a URL using a custom port (not 443). Everything generated fine, but when making calls to their https:// URL with the custom port in it, no response comes back. The assumption is that their server isn't even processing the request, since Postman succeeds using the same custom https://path.to.server:NNNN port in the URL.
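For comparison, these are the sanity checks worth doing on the generated client (a sketch; the client class and operation names are hypothetical, but Location and SSLConfiguration are standard %SOAP.WebClient properties):

    // Hypothetical client class generated by the SOAP wizard
    Set client = ##class(MyVendor.CDCClient).%New()
    // Override the endpoint so the custom port is definitely in the URL
    Set client.Location = "https://path.to.server:NNNN/service/endpoint"
    // An https endpoint needs a client SSL/TLS configuration (defined in the Management Portal)
    Set client.SSLConfiguration = "VendorSSL"
    // Hypothetical operation from the WSDL
    Set result = client.SubmitRecord(payload)

If the WSDL's soap:address omitted the port, the generated Location may default to 443 even though the WSDL itself was fetched from the custom port.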
Once upon a time, the Ensemble Management Portal displayed the pool size of each component (Business Host) of the production on the Production Configuration page.
This information was very useful, especially when a production has tens or hundreds of components.
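Until something similar returns, one workaround (a sketch; the production class name is a placeholder) is to query the Ens_Config.Item table, which exposes a PoolSize column:

    SELECT Name, ClassName, PoolSize
      FROM Ens_Config.Item
     WHERE Production = 'MyPackage.MyProduction'
     ORDER BY Name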
I have the need to query an external database and write the result set/snapshot to an internal %Persistent [ DdlAllowed ] table that I built. I have built inbound SQL Services before and written them externally to replace SSIS jobs, but how would querying a database via a Service and writing the data to an internal table work?
Can I just take the inbound query structure and write it to the class file of the internal table in a DTL? If so, what would be the Target? Or does this need to be done within a BPL as a Code block?
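To make the question concrete, here is the shape I'm imagining (a sketch only; the persistent class and column names are made up): a Business Service on EnsLib.SQL.InboundAdapter whose OnProcessInput writes each row of the snapshot straight into the internal table, with no DTL or BPL involved:

    Class MyApp.SQLLoaderService Extends Ens.BusinessService
    {
    Parameter ADAPTER = "EnsLib.SQL.InboundAdapter";

    Method OnProcessInput(pInput As EnsLib.SQL.Snapshot, Output pOutput As %RegisteredObject) As %Status
    {
        // The adapter hands in each result row as a snapshot positioned on that row
        Set rec = ##class(MyApp.Data.StagingRow).%New()  // hypothetical persistent class
        Set rec.PatientId = pInput.Get("PATIENT_ID")     // hypothetical source columns
        Set rec.VisitDate = pInput.Get("VISIT_DATE")
        Quit rec.%Save()
    }
    }

Is that direct write reasonable, or is it better practice to wrap the row in a message and let a DTL or BPL do the mapping?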
I have a repeating field within HL7 that I want to create a List from. Do I have to initialize the List by using $LB, or can I just use $LIST ($LI) to keep appending to the end of the list as I loop through the field?
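For reference, this is the pattern I mean (PID:3 is a stand-in for the actual field; on an EnsLib.HL7.Message the (*) index returns the repetition count):

    Set list = ""                              // an empty string is a valid empty $LIST
    Set count = source.GetValueAt("PID:3(*)")  // number of repetitions
    For i = 1:1:count {
        Set list = list _ $LB(source.GetValueAt("PID:3("_i_")"))
    }

I believe Set $LIST(list, *+1) = value would append in place as well.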
I am trying to create a treeMapChart in IRIS BI that will then be displayed on my DeepSeeWeb dashboard. In the IRIS BI User Portal, this is an example of what my treeMapChart looks like:
I know there is a huge number of rectangles in this graphic - I care most about the common components (the largest boxes), but I still want all of the boxes to show. However, it projects to my DeepSeeWeb dashboard as the following:
I'm working with JavaScript in InterSystems IRIS, specifically in CSP pages. One issue I'm running into during development is that the browser keeps loading the cached version of my JavaScript files, even after I’ve made changes or recompiled the code.
I have to clear my browser cache or history for the changes to load and work.
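One workaround I'm considering (a sketch; the file name is a placeholder) is to version the script URL from the CSP page so every reload looks like a new resource to the browser:

    <script src="myapp.js?v=#($ZHOROLOG)#"></script>

For production you would swap the changing value for a release version number, so caching still works between releases.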
I’m using VS Code with the InterSystems ObjectScript extension and I want to keep my local folder (client-side) as the “source of truth” while still using the built-in Server Source Control features (diff, stage, commit, etc.) against my IRIS/Ensemble instance.
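For context, a minimal workspace sketch (settings.json; the server name iris-dev is assumed to already be defined under intersystems.servers via the Server Manager extension):

    {
      // Connect this local folder to the server; edits stay client-side
      "objectscript.conn": {
        "server": "iris-dev",
        "ns": "USER",
        "active": true
      }
    }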
While developing web apps, the security practice I consider safe and convenient is to create a special Role (e.g. equal to the application name) which contains the security resources the application will need (SQL tables, privileges, database access, etc.) and assign it to the Web Application. So the user gets this role once they log in to the application (via password, no password, or delegated authentication).
Convenient, right?
So, the question is, when I deploy the app as an IPM module what should I put as a database access?
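For concreteness, this is roughly what I do today from the %SYS namespace (a sketch; role, resource, and application names are placeholders), and it's this that I want the IPM module to reproduce at install time:

    ZN "%SYS"
    // Create a role bundling the resources the app needs
    Set sc = ##class(Security.Roles).Create("myapp", "Role for myapp", "%DB_MYAPP:RW,myapp_api:U")
    // Grant the role unconditionally through the web application
    // (application roles live in MatchRoles with an empty match part, i.e. ":role")
    Set props("MatchRoles") = ":myapp"
    Set sc = ##class(Security.Applications).Modify("/csp/myapp", .props)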
I think I found my solution, but I'm trying to understand better why it works. Forgive me, as my descriptions here may be scattered, but I'm trying to piece the puzzle together.
The subroutine ^routine is not executed while the queue is being processed by WorkMgr. However, it works when defined as a function. Is it mandatory to define subroutine^routine as a function for it to execute properly?
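What I observe matches this sketch: %SYSTEM.WorkMgr appears to expect each queued entry point to be a function returning a %Status, queued with the $$ syntax (names below are placeholders):

    Set queue = $SYSTEM.WorkMgr.%New()
    // Queue a function reference; a plain subroutine with no return value fails
    Set sc = queue.Queue("$$MyFunc^MyRoutine", 1, 100)
    Set sc = queue.WaitForComplete()

and in MyRoutine:

    MyFunc(from, to)
        ; ... do the chunk of work ...
        Quit 1  ; 1 is a successful %Status ($$$OK)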
Not sure what I'm missing here. I'm trying to download files that start with MTC_88 from an S3 bucket using the AmazonS3 inbound adapter as below, but I'm not getting anything, and I'm sure there are thousands of files that start with MTC_88.
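For reference, the relevant part of my setup looks roughly like this (a reconstruction, not the original code; I'm assuming the cloud storage inbound adapter settings, and the bucket name is a placeholder):

    Class MyApp.S3FileService Extends Ens.BusinessService
    {
    Parameter ADAPTER = "EnsLib.CloudStorage.InboundAdapter";
    }

with production settings along the lines of:

    StorageProvider = Amazon S3
    BucketName      = my-bucket
    BlobNamePrefix  = MTC_88

One thing I'm unsure of is whether the prefix setting accepts wildcards (e.g. MTC_88*) or only a literal prefix.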
I'm looking for some simple heuristics to estimate the size on disk of a database based on the average size of messages, the number of messages per day, and the purge frequency, for the purpose of estimating disk space requirements.
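To show the level of precision I'm after, this is the back-of-envelope shape I have in mind (all numbers purely illustrative):

    retained messages ≈ messages/day × purge window (days)
    raw data size     ≈ retained messages × average message size
    disk estimate     ≈ raw data size × overhead factor (2-3x for headers, indices, journals?)

    Example: 100,000 msgs/day × 30 days × 5 KB ≈ 15 GB raw,
             × 2.5 overhead ≈ roughly 38 GB

Is a flat overhead factor like that reasonable, or does it miss something important?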
For some build scripting with the InterSystems Package Manager, I'd like to first uninstall a package with `zpm "uninstall"` and then load it from disk using `zpm "load"`. That way anything that got deleted from the source will also be deleted from IRIS.
The problem is that `zpm "uninstall"` takes in a module name and I only have the directory path to the package source. Is there a way to get the module name for an installed IPM module given the source path? It's okay to assume that it has previously been installed from that same directory with `zpm "load"`.
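One idea I'm weighing (a sketch; it assumes the standard module.xml layout, with the name under Export/Document/Module/Name) is to read the module name out of module.xml in the source directory and feed it back to zpm:

    // dir holds the package source path, with a trailing slash
    Set sc = ##class(%XML.XPATH.Document).CreateFromFile(dir _ "module.xml", .doc)
    Do doc.EvaluateExpression("/Export/Document/Module/Name", "text()", .results)
    Set name = results.GetAt(1).Value
    zpm "uninstall " _ name
    zpm "load " _ dir

But if IPM already tracks the source path of installed modules, querying that would be cleaner.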