Upon creating a namespace I selected the same database for both Globals and Routines. How can I separate the two databases? Please note that I already have data and code saved in that database.
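For context, this is the kind of change I think I need, run in the %SYS namespace (a sketch: the namespace and database names are illustrative, and the routines themselves would presumably still have to be moved into the new database afterwards):

// Point the namespace's Routines at a different, already created database
Set props("Routines") = "MYROUTINEDB"
Set sc = ##class(Config.Namespaces).Modify("MYNS", .props)
If 'sc Do $SYSTEM.Status.DisplayError(sc)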
I have to create a web socket client, but I'm unable to read any data from the server after flushing the buffer. I have no access to the server, only two examples for the client, one in Java and the other one in PHP:
Java example:
// Open a TCP connection to the server
Socket socket = new Socket("192.168.0.1", 2003);
DataOutputStream out = new DataOutputStream(socket.getOutputStream());
out.writeUTF("aPassword");   // 2-byte length prefix + modified UTF-8
out.writeInt(websiteId);     // 4 bytes, big-endian
out.flush();
DataInputStream in = new DataInputStream(socket.getInputStream());
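On the Caché side, what I have been trying is roughly the following (only a sketch: the framing in the comments is how I understand DataOutputStream to encode the values, and 12345 is just a placeholder websiteId):

// Open a raw TCP device to the same host/port (10-second open timeout)
Set dev = "|TCP|2003"
Open dev:("192.168.0.1":2003):10
Use dev
// writeUTF = 2-byte big-endian length followed by the password bytes
Set pwd = "aPassword"
Write $Char($Length(pwd)\256,$Length(pwd)#256)_pwd
// writeInt = 4-byte big-endian integer
Set id = 12345
Write $Char(id\16777216#256,id\65536#256,id\256#256,id#256)
// Flush the TCP output buffer, then try to read a reply with a 30-second timeout
Write *-3
Read reply#4:30
Use 0
Close dev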
I've created a custom class that extends the standard HS.Message.PatientSearchRequest. Inside the custom class I just want to pass the data as input only.
But I am getting ERROR #6277: Type attribute, s01:AcoPatientSearchRequest, does not specify valid type for XML input tag: pRequest (ending at line 5 character 171).
I tried adding Parameter XMLIGNOREINVALIDTAG = 1; and
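For reference, the custom class I am experimenting with is shaped like this (a sketch: the package name is made up, and the XMLTYPE/XMLNAME overrides are only my guess at making the s01:AcoPatientSearchRequest type attribute resolve for the pRequest tag):

Class ACO.AcoPatientSearchRequest Extends HS.Message.PatientSearchRequest
{

/// Guess: make the projected XML type/name match what the client sends
Parameter XMLTYPE = "AcoPatientSearchRequest";

Parameter XMLNAME = "AcoPatientSearchRequest";

}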
One of the REST APIs we need to call takes one of its request parameters as a byte array. I am trying to create a message class and couldn't figure out which type corresponds to Byte[]. I need to read from a jpg file and then convert it to byte[] before I invoke this web service. I'd appreciate it if anyone can point me in the right direction. I can use the %Stream.FileBinary class to store the jpg file after loading it from a file; I'm trying to figure out how to convert that to byte[].
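What I have so far is along these lines (a sketch, assuming the API will accept the bytes as a base64 string, which is how byte[] is commonly carried in JSON/SOAP; the file path is a placeholder, and %xsd.base64Binary is the datatype I was considering for the message property):

// Link the jpg and base64-encode it chunk by chunk
Set stream = ##class(%Stream.FileBinary).%New()
Set sc = stream.LinkToFile("C:\temp\photo.jpg")
Set b64 = ""
While 'stream.AtEnd {
    // Chunk size is a multiple of 3 so the base64 pieces concatenate cleanly
    Set chunk = stream.Read(3*8000)
    // Second argument of 1 asks for no CR/LF in the output (newer versions)
    Set b64 = b64_$System.Encryption.Base64Encode(chunk,1)
}
// b64 would then go into the request message property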
So I am a front-end developer working with a REST API backed by a Caché database. The back-end developers currently use Caché through a Windows virtual machine, and they claim it is not possible to have a development server to work with. All the work they do is directly on the production server, and changes are immediate. I think this is a bad idea going forward, and we most definitely need a development server that has access to the same code base on a different version (through git) so that we can do development. We also do all our testing on this production server with test accounts, but we cannot do automated testing with this setup.
I'm running into an issue performing UPDATEs that I'm not getting on INSERTs. It's probably obvious, but I'm just not seeing it and could use a little help.
I'm going over an HL7 message and, depending on varying criteria, the relevant variables get items added to them like the following:
Set patientId = pRequest.GetValueAt("PID:3")  ; pull the patient ID from PID-3
Set sqlColumns = sqlColumns_",patient_id"  ; append the column name
Set sqlValues = sqlValues_",?"  ; and a matching ? placeholder
Set par($i(p)) = patientId  ; store the parameter value
After building those variables, I check to see whether the accession number is already in the table.
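For the UPDATE path, what I build looks roughly like this (a sketch with an illustrative table name, and accessionNumber comes from earlier in the routine; the idea is that the collected columns have to be re-paired with their placeholders as "col = ?"):

// Turn ",patient_id,..." into "patient_id = ?, ..."
Set sqlSet = ""
For i=1:1:$Length(sqlColumns,",") {
    Set col = $Piece(sqlColumns,",",i)
    Continue:col=""
    Set sqlSet = sqlSet_$Select(sqlSet="":"",1:", ")_col_" = ?"
}
Set sql = "UPDATE lab_result SET "_sqlSet_" WHERE accession_number = ?"
Set par($i(p)) = accessionNumber  ; the WHERE parameter goes last
Set stmt = ##class(%SQL.Statement).%New()
Set sc = stmt.%Prepare(sql)
Set par = p  ; par... argument passing needs the count in the top node
Set rs = stmt.%Execute(par...)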
I'm fairly new to this engine, so I was wondering what the best approach is for creating a dynamic mapper to downgrade from one HL7 version to another (e.g. 2.4 to 2.3) with the DTL Editor. It seems like something that would be quite commonly needed, but I haven't found a method that works yet.
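To make it concrete, what I have been sketching in the DTL Editor compiles to a class like the one below (just a skeleton under my assumptions: both sides are ADT_A01, IGNOREMISSINGSOURCE papers over fields that only exist in 2.4, and only a couple of example assigns are shown):

Class Demo.HL7.ADT24To23 Extends Ens.DataTransformDTL [ DependsOn = EnsLib.HL7.Message ]
{

Parameter IGNOREMISSINGSOURCE = 1;

XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='EnsLib.HL7.Message' targetClass='EnsLib.HL7.Message' sourceDocType='2.4:ADT_A01' targetDocType='2.3:ADT_A01' create='new' language='objectscript' >
<assign value='source.{PID}' property='target.{PID}' action='set' />
<assign value='"2.3"' property='target.{MSH:12}' action='set' />
</transform>
}

}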
I have the problem that when I try to connect to the SFTP server, an error appears saying the user and password are wrong. However, these are correctly stored in the credentials, and I entered !SFTP in the SSL Configuration setting. The connection works perfectly with FileZilla. Does anyone have a tip for me?
In our client's environment, multiple sources connect to one database (all through JDBC connections) and perform various operations. Sometimes they find that data was deleted for no apparent reason. They therefore want a feature like SQL Server Database Audit Specifications that can log who deleted data in a specific database, at what time, and from which IP.
I've checked IRIS auditing but didn't find a feature for that. How can we audit and log deletes in a known database? The deletions might be performed by a DELETE statement or a TRUNCATE TABLE statement.
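For what it's worth, this is the direction I have been poking at (only a sketch: it assumes the %System/%SQL/XDBCStatement audit event, once enabled under System Administration > Security > Auditing > Configure System Events, captures SQL issued over JDBC, and that the audit log is readable as the %SYS.Audit table from the %SYS namespace; column names may differ by version):

// Run in %SYS after enabling the audit event
Set sql = "SELECT UTCTimeStamp, Username, Description, EventData FROM %SYS.Audit"
Set sql = sql_" WHERE EventSource = '%System' AND EventType = '%SQL'"
Set rs = ##class(%SQL.Statement).%ExecDirect(,sql)
While rs.%Next() {
    Write rs.UTCTimeStamp,"  ",rs.Username,"  ",rs.Description,!
}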
I am trying to create a scheduled task, but I get a "not implemented" error when I try to run it.
Class PICIS.Core.Tasks.CleanEntry Extends %RegisteredObject
{

ClassMethod ClearTasks(pBackupFile As %String = "d:\Temp\BackupTasks.xml", pDelete As %Boolean = 0)
{
    // Create backup file
    Set tBackup = ##class(%Stream.FileCharacter).%New()
    Set tBackup.Filename = pBackupFile
    Do tBackup.WriteLine("<?xml version=""1.0"" encoding=""UTF-8""?>")
    Do tBackup.WriteLine("<Tasks>")
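My understanding (please correct me if this is wrong) is that for the Task Manager to run it, the class has to extend %SYS.Task.Definition and implement OnTask(), since the base OnTask() just returns a "not implemented" error. So the restructured version I have in mind is roughly (a sketch; the parameter and property names are mine):

Class PICIS.Core.Tasks.CleanEntry Extends %SYS.Task.Definition
{

Parameter TaskName = "CleanEntry";

/// Appears as a configurable setting when the task is scheduled
Property BackupFile As %String [ InitialExpression = "d:\Temp\BackupTasks.xml" ];

/// The Task Manager calls OnTask() when the scheduled time arrives
Method OnTask() As %Status
{
    Do ..ClearTasks(..BackupFile)
    Quit $$$OK
}

}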
Does InterSystems have any best practices for deploying IRIS into Kubernetes? For example, one question we have is whether IRIS should have a dedicated AKS cluster or whether it can be added to an existing cluster.
Hello! I have a question about how to parse a %Library.ListOfObjects to JSON.
I send POST data like the payload below from the client using REST; I want to parse "templates" and then save "codeForm" and "codeName" in the database (a parsing sketch follows the payload):
{
"whoIs": "",
"templates": [{
"codeForm": "FORM_FIOGROUP",
"codeName": "operationDate",
"orderNumber": "1",
"codeFormat": "YYYY-MM-DD",
"header": "DATE",
"dbfFormatType": "Date",
"dbfFormatLength": "8",
"valueFrom": "Payment"
}, {
"codeForm": "FORM_FIOGROUP",
We are currently in the process of implementing REST APIs using IRIS, and we are also looking at using InterSystems API Manager.
Our aim is to implement a microservices architecture where services are small in size, bounded by context, autonomously developed, and independently deployable.
We are following a spec-first approach: we first define the API specs in SwaggerHub and then use the IRIS API Management Service to build the REST classes, i.e. the specification class, dispatch class and implementation class.
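For the generation step itself, what we do is roughly the following (a sketch: the application package name and file path are ours, and the exact argument list of %REST.API.CreateApplication may differ between IRIS versions, so treat it as an assumption and check the class reference):

// Load the OpenAPI spec exported from SwaggerHub and generate the REST classes
Set file = ##class(%Stream.FileCharacter).%New()
Set sc = file.LinkToFile("C:\specs\myapi.json")
Set spec = ##class(%DynamicObject).%FromJSON(file)
Set sc = ##class(%REST.API).CreateApplication("myapp.api", spec)
If 'sc Do $SYSTEM.Status.DisplayError(sc)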
I need to generate a DDL file from a .cls class that already exists; the idea is to create a mirror table in SQL. Is it possible to do this export, or do I need to write the CREATE TABLE manually?
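What I was hoping for is something along these lines (only a sketch: I believe recent IRIS versions expose DDL export under $SYSTEM.SQL.Schema, but the method name and argument order below are an assumption to verify against the class reference first):

// Check the available methods and signatures first
Do $SYSTEM.SQL.Schema.Help()
// Then export the DDL for one table to a file, something like:
Set sc = $SYSTEM.SQL.Schema.ExportDDL("MyApp.Patient",,"C:\temp\patient.sql")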
I have a routine tag that takes an argument, and I want this argument to be an array reference. So I try something like:
do mytag(.myarr)
The mytag tag adds subscripts to myarr.
When I evaluate myarr after the tag call, only the subscripts I passed in are retained in myarr. The subscripts added by mytag are missing. Is there a way to pass an array so it behaves the way I want it to?
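For comparison, here is the minimal shape I expected to work (a self-contained routine sketch; the label and subscript names are made up):

demo ; set one subscript, pass the array by reference, then dump it
    Set myarr("existing")=1
    Do mytag(.myarr)
    ZWrite myarr  ; I expected to see both "existing" and "added" here
    Quit
mytag(arr) ; arr is received by reference, so added subscripts should persist
    Set arr("added")=2
    Quit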
I usually save the path in the database, like "C:\folder\picture.png", but now I want to save the photo itself in the IRIS or Caché database. Which way is better for recovering the image and maintaining the original quality?
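The stream-based option I am considering looks like this (a sketch; the class and property names are illustrative):

Class Demo.Person Extends %Persistent
{

/// A binary stream keeps the original bytes untouched
Property Photo As %Stream.GlobalBinary;

}

// Loading the file into the property, e.g. from a terminal session:
Set p = ##class(Demo.Person).%New()
Set file = ##class(%Stream.FileBinary).%New()
Set sc = file.LinkToFile("C:\folder\picture.png")
Do p.Photo.CopyFrom(file)  ; binary copy, no character translation
Set sc = p.%Save()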
In the Event Log there is an Assert event type along with the Error, Warning, Alert, Trace and Info types. What is the Assert event type in the Event Log view, and what is its use in a production?
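From what I can tell it is tied to the assertion logging macros, something like this inside a business host (a sketch; GetPendingCount is a made-up helper, and my understanding that $$$ASSERT only writes an Assert entry when its condition is false is exactly what I would like confirmed):

Set count = ..GetPendingCount()  ; hypothetical helper on my business host
$$$ASSERT(count>=0)  ; should log an Assert event only if the condition is false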
Hi Community, I am trying to figure this out and any help is appreciated. Thanks in advance.
The service is a file passthrough that picks up a PDF and then uses the BPL code below to create the REST wrapper and send the stream data. I am not grabbing the stream from the inbound message correctly; "request.Stream" or "request.StreamFC" always produces the following error.
Here is what is sent and returned in the message:
Background: we have our own SQL map that predates InterSystems'. A program writes an XML file for each table-map class, as $system.OBJ.Export would, and $system.OBJ.LoadDir loads the XML files back in as classes.
The reason is a long story, but we need to update the EXTENTSIZE parameter (only) in existing classes. This does not seem to happen. As a test I used $system.OBJ.Export to make an XML file and edited EXTENTSIZE in the two places it appears in the XML.
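Separately, I have been wondering whether setting it directly through %Dictionary and recompiling would be a more reliable route (a sketch: it assumes %Dictionary.StorageDefinition exposes ExtentSize as a property mirroring the <ExtentSize> element, and that the storage is named "Default"):

Set storage = ##class(%Dictionary.StorageDefinition).%OpenId("My.TableMap||Default")
Set storage.ExtentSize = 1000000
Set sc = storage.%Save()
// Recompile so the change takes effect
Set sc = $SYSTEM.OBJ.Compile("My.TableMap","ck")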
I can start Caché without issue. There are no errors in the console log. However, when I access the Management Portal, the page fails to display, instead showing the error page with the message "CSP application closed the connection before sending a complete response". This was working yesterday, so the root cause is not clear. Reviewing all the log files, the only error I have found is in csp.log. The error is:
I don't know if the title is accurate enough. I have legacy code that I need to optimize. It's a routine written in ObjectScript. It accepts 4 parameters and runs 6 nested FOR...$ORDER loops reading a big global.
The thing is, when I run the routine the first time it takes around 60 seconds. If I run it again it takes 5 seconds. If I wait around 6 to 10 minutes and run it again, it takes 60 seconds again, but if I keep running it every 1, 2, 3... minutes it still takes only 5 seconds.
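For reference, the structure of the routine is essentially this pattern, six levels deep (a sketch; the global and subscript names are made up):

Set s1=""
For {
    Set s1=$Order(^BigGlobal(s1)) Quit:s1=""
    Set s2=""
    For {
        Set s2=$Order(^BigGlobal(s1,s2)) Quit:s2=""
        ; ...four more nested $Order levels, then the row-level processing...
    }
}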