I have an iKnow domain of forum posts: the full text of each post is the iKnow data, and each post also has a number of views as a metadata field.
I want to get a sum of views by concept. Let's say I have a concept called "TESTEST" and there are 10 sources that contain this concept. Each source has some number of views. I want to get the total views across those sources, the "impact" of this concept, so to speak.
What's the best iKnow architecture for this use case?
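To make the use case concrete, this is the kind of query I have in mind, roughly sketched in ObjectScript. The domain id, the "Views" metadata field name, the paging values, and the exact result format returned by GetByEntities are all assumptions on my part, which is part of what I would like to validate:

    Set tDomainId = 1, tEntity = "TESTEST", tTotal = 0
    // fetch the sources containing the concept (page size 1000 is arbitrary; real code would page through)
    Do ##class(%iKnow.Queries.SourceAPI).GetByEntities(.tSources, tDomainId, $ListBuild(tEntity), 1, 1000)
    Set i = ""
    For {
        Set i = $Order(tSources(i), 1, tData)  Quit:i=""
        // assumption: each result row holds the source id followed by its external id
        Set tExtId = $List(tData, 2)
        // assumption: the "Views" metadata field holds the view count of the post
        Set tTotal = tTotal + ##class(%iKnow.Queries.MetadataAPI).GetValue(tDomainId, "Views", tExtId)
    }
    Write "Total views for ", tEntity, ": ", tTotal, !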
Has anyone used the ##class(HS.JSON.Path).%Evaluate() classmethod to interrogate JSON and return a specific object value? I need to extract the MessageHeader.destination.endpoint value from a FHIR bundle and can't seem to get that XPath-like method to work.
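In case it clarifies what I'm after, this is the value I can extract today by walking the bundle with plain dynamic objects; I was hoping %Evaluate() could replace this walk with a single path expression (pJsonStream stands for whatever holds the bundle JSON, and I assume the MessageHeader has at least one destination):

    Set tBundle = ##class(%DynamicAbstractObject).%FromJSON(pJsonStream)
    Set tEndpoint = ""
    Set tIter = tBundle.entry.%GetIterator()
    While tIter.%GetNext(.tKey, .tEntry) {
        // only the MessageHeader resource carries the destination endpoint
        Continue:(tEntry.resource.resourceType '= "MessageHeader")
        Set tEndpoint = tEntry.resource.destination.%Get(0).endpoint
        Quit
    }
    Write "endpoint: ", tEndpoint, !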
I have an existing Python script that opens a child session using the pexpect library. But currently all it does is send hard-coded commands to the Cache process and expect a hard-coded response back in order to continue in the script.
I would like to run a Cache routine from the script, pass in a parameter, and wait for a response that will be different every time (a date, in this case). So the call would be something like D $$Tag^Routine(parameter), and the script would wait for the routine to complete and return the response.
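For reference, the routine side would look roughly like this (the logic here is made up; the real tag computes a date that differs on every call). I am also assuming the terminal call would need to be W $$Tag^Routine(parameter) rather than D, so that the returned date is actually printed for pexpect to match:

    Tag(parameter) ; entry point, called as $$Tag^Routine(parameter)
        ; hypothetical logic: the real code derives a date that differs on every call
        New result
        Set result = $ZDate($Horolog + parameter, 3)
        Quit result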
I have to build an integration that connects to an external REST API; the API returns different HTTP status codes along with a body containing a description of the problem.
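This is roughly the shape of the call on my side (the server name, path, and SSL configuration name are placeholders); the part I need to get right is reading both the status code and the problem description in the body:

    Set tRequest = ##class(%Net.HttpRequest).%New()
    // placeholder host, SSL configuration and path
    Set tRequest.Server = "api.example.com"
    Set tRequest.Https = 1
    Set tRequest.SSLConfiguration = "default"
    Set tSC = tRequest.Get("/v1/resource/123")
    If $System.Status.IsError(tSC) {
        Do $System.Status.DisplayError(tSC)
        Quit
    }
    Set tStatus = tRequest.HttpResponse.StatusCode
    Set tBody = tRequest.HttpResponse.Data.Read()
    If tStatus '= 200 {
        // the API describes the problem in the response body
        Write "HTTP ", tStatus, ": ", tBody, !
    }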
I'm playing with OAuth2 against a FHIR Server, but the returned tokens cause 401 or 403 errors when I try to get FHIR resources.
I tried using fhir-client.js and Postman. The access tokens returned fail in both cases: a 401 when going through fhir-client.js and a 403 when using Postman.
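If it matters, this is essentially how I understand the token is supposed to be presented (the server, SSL configuration, and resource path are placeholders, and tAccessToken holds the token returned by the authorization server):

    Set tRequest = ##class(%Net.HttpRequest).%New()
    // placeholder server, SSL configuration and FHIR endpoint path
    Set tRequest.Server = "my-fhir-server.example.com"
    Set tRequest.Https = 1
    Set tRequest.SSLConfiguration = "default"
    Do tRequest.SetHeader("Authorization", "Bearer "_tAccessToken)
    Do tRequest.Get("/csp/healthshare/demo/fhir/r4/Patient")
    Write tRequest.HttpResponse.StatusCode, !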
I asked previously about a DR server in the cloud, but actually I'm more curious about using the backup server as an analytics server than for recovery in a DR scenario.
There is a recommended practice of using an async mirror member as a server for BI (InterSystems Analytics, DeepSee).
The question is: if I have the PRIMARY in the cloud (AWS, Google, Azure, etc.), "how far away" should the async mirror member be placed? The same cloud, the same private cloud, or does it not matter at all for analytics purposes?
I'm setting up an integration using the ASTM protocol via Interoperability, but I'm having a problem with the return communication: the message stays in a loop and never returns the information to the device simulator that is running the operation.
Has anyone here worked with this protocol and could help me untangle it?
I am wondering whether there is any mechanism available in HealthShare for sending a request from a service to an operation without storing the data in CACHE.DAT.
My company is going to receive ADTs and CCDAs from an external source (a hospital). The incoming data will cover two kinds of patients: our patients and patients who are not ours. Due to HIPAA compliance, we do not want to keep data on our servers for those patients who do not belong to our company.
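One direction I have been considering is filtering in a custom service before anything is forwarded; a very rough sketch is below (the class names and the IsOurPatient helper are hypothetical, and I realise the inbound message itself may still be persisted before the check, which is exactly what I want to avoid):

    Class Demo.ADTFilter.Service Extends Ens.BusinessService
    {
    Method OnProcessInput(pInput As EnsLib.HL7.Message, Output pOutput As %RegisteredObject) As %Status
    {
        // hypothetical lookup of the MRN against our own patient index
        Set tMRN = pInput.GetValueAt("PID:3.1")
        If '##class(Demo.ADTFilter.Utils).IsOurPatient(tMRN) {
            // not our patient: drop the message instead of forwarding it
            Quit $$$OK
        }
        Quit ..SendRequestAsync("ADT.Router", pInput)
    }
    }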
Looking forward to hearing great ideas from this community.
I'd like to validate some use cases and have the following question. I am relatively new to IRIS; perhaps someone can help:
1. I have a global m[x,y,z,f] distributed across multiple sharded instances.
2. I know that I can assign computed SQL expressions to class properties using ObjectScript.
3. Is it possible to do the same with the Globals API, i.e. define f = x + y as a computed expression in the global m[x,y,z,f]?
a. We would want to use the Globals API to change f programmatically in code.
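To make item 3 concrete, what I would like to avoid writing by hand is something like the following, where f is computed in ObjectScript before the node is set (the values are made up):

    // compute the "derived" subscript in code before setting the node
    Set x = 2, y = 3, z = 7
    Set f = x + y
    Set ^m(x, y, z, f) = "payload"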
I'm using a cursor-based embedded SQL query with host variables in the WHERE clause; however, what I'm doing doesn't seem to work. Can anyone help?
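Simplified, this is the pattern I'm attempting (the table and column names are just placeholders for my real ones):

    // host variable used in the cursor's WHERE clause
    Set minAge = 30
    &sql(DECLARE C1 CURSOR FOR
         SELECT Name INTO :name
         FROM Sample.Person
         WHERE Age > :minAge)
    &sql(OPEN C1)
    For {
        &sql(FETCH C1)
        Quit:SQLCODE'=0
        Write name, !
    }
    &sql(CLOSE C1)

My understanding is that host variables referenced in the DECLARE are evaluated when the cursor is OPENed, so they need to be set before the OPEN; I'm not sure whether that is related to my problem.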
I have an existing application on HealthShare 2015 and decided to move it to HealthShare 2018 to make use of Atelier support. I am using Eclipse Photon with Atelier Plugin 1.3.
Most things work better in Atelier compared with the built-in Studio of HealthShare. However, when I tried debugging a CSP file with Atelier I encountered two problems:
I am creating an import tool to convert a client's JSON data into IRIS classes. The sample file is over half a gigabyte. I am copying the data into an instance of %Stream.FileCharacter. My first few attempts worked fine. However, I started getting an error thrown when I try to create a DynamicAbstractObject using the %FromJSON method. See the code below. The error code given is not in the documentation, at least not in the documentation that I searched.
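Stripped down to the essentials, what I am doing is along these lines (the file path here is made up):

    Set tStream = ##class(%Stream.FileCharacter).%New()
    Set tSC = tStream.LinkToFile("C:\data\client-sample.json")
    If $System.Status.IsError(tSC) {
        Do $System.Status.DisplayError(tSC)
        Quit
    }
    Try {
        // pass the stream itself to %FromJSON rather than reading it into a string
        Set tObject = ##class(%DynamicAbstractObject).%FromJSON(tStream)
    }
    Catch ex {
        Write ex.DisplayString(), !
    }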
When I try to open a DTL in the tabbed editor I always get this error:
You are using Internet Explorer 7. This version is obsolete and is not compatible with diagram editors. Please update Internet Explorer to a recent version.
The Internet Explorer actually installed on my machine is version 11.
I'm running Eclipse Photon.
Atelier IDE 1.3.141 com.intersystems.atelier.feature.group InterSystems Corporation
The same error occurred with Atelier 1.2 on Eclipse Oxygen. I've never been able to get this to work.
As I was going through and trying to figure out why our CACHE.DAT has increased in size over the past 18 days, I found that EnsLib_HL7.Message is still retaining messages dating back to 2014 even though we have our purge set to 10 days. Has anyone else experienced this?
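My assumption about what the scheduled purge should effectively be doing is something like the manual run sketched below (with message bodies included); I have not confirmed that these match the exact settings of our task, which may be where the problem lies:

    // rough sketch of a manual purge run keeping 10 days and removing bodies too
    Set tTask = ##class(Ens.Util.Tasks.Purge).%New()
    Set tTask.NumberOfDaysToKeep = 10
    Set tTask.TypesToPurge = "messages"
    Set tTask.BodiesToo = 1
    Set tTask.KeepIntegrity = 1
    Set tSC = tTask.OnTask()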