Any insights, news, alternatives, or experience with sequence pattern querying along the lines of Oracle's MATCH_RECOGNIZE, i.e. something more declarative than coding it directly in COS? Both obvious solutions (shadowing the data to Oracle Cloud, or implementing a MATCH_RECOGNIZE compiler in COS from scratch) seem like serious overkill. Does InterSystems have plans to adopt this "2007 ANSI standard proposal" and the SQL:2016 standard?
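For reference, this is the kind of declarative pattern query meant here; a sketch in Oracle's MATCH_RECOGNIZE syntax (the classic "V-shape" example; the Ticker table and its columns are purely illustrative), which Caché SQL does not provide:

SELECT *
FROM Ticker
MATCH_RECOGNIZE (
    PARTITION BY symbol
    ORDER BY tstamp
    MEASURES STRT.tstamp AS start_tstamp,
             LAST(DOWN.tstamp) AS bottom_tstamp,
             LAST(UP.tstamp)   AS end_tstamp
    ONE ROW PER MATCH
    AFTER MATCH SKIP TO LAST UP
    PATTERN (STRT DOWN+ UP+)
    DEFINE
        DOWN AS DOWN.price < PREV(DOWN.price),
        UP   AS UP.price   > PREV(UP.price)
) MR;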
I want to create a custom list inheriting from %ListOfDataTypes in order to implement methods like Map, Filter, and ForEach, but I have some questions. Just to mention, I have already read the topic about DeclarativeCOS.
When I define a property like the following, Caché uses %Collection.ListOfDT or %Collection.ListOfObj, as shown in the documentation.
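A minimal sketch of the standalone case, assuming the goal is a subclass of %Library.ListOfDataTypes with one extra iteration method (the class and callback names are made up):

/// A list with a simple ForEach: calls classname/methodname once per element,
/// passing the element value as the single argument.
Class Demo.FunctionalList Extends %Library.ListOfDataTypes
{

Method ForEach(classname As %String, methodname As %String)
{
    For i=1:1:..Count() {
        Do $CLASSMETHOD(classname, methodname, ..GetAt(i))
    }
}

}

Used as a standalone object this works as expected:

Set list = ##class(Demo.FunctionalList).%New()
Do list.Insert("a"), list.Insert("b")
Do list.ForEach("Demo.Callbacks", "Print")

Note, however, that a property declared as "Property Items As list Of %String;" is implemented through %Collection.ListOfDT rather than through a %ListOfDataTypes subclass, which is presumably the behaviour the documentation reference above describes.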
I have a lookup table and a record batch table. I would like to count the records in a batch, grouped by the key in the responseKey column; the valid keys are stored in the lookup table for comparison. In other words, I would like an SQL join/pivot that uses the keys stored in the lookup table as columns and the counts as values.
So far I have managed to do this, but it is not efficient; I would like to fire that SQL once rather than once per count.
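One common way to get all of the counts in a single statement is conditional aggregation; a sketch with made-up table and column names (BatchRecord, BatchId, responseKey) and a fixed key list:

SELECT r.BatchId,
       SUM(CASE WHEN r.responseKey = 'KEY_A' THEN 1 ELSE 0 END) AS KeyACount,
       SUM(CASE WHEN r.responseKey = 'KEY_B' THEN 1 ELSE 0 END) AS KeyBCount,
       SUM(CASE WHEN r.responseKey = 'KEY_C' THEN 1 ELSE 0 END) AS KeyCCount
FROM BatchRecord r
GROUP BY r.BatchId

If the keys are only known at run time, the same statement can be generated from the lookup table and executed once through %SQL.Statement.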
I'm attempting to create a pop-up box with "Yes" and "No" for the buttons. I currently have a JavaScript confirm box, but it has "OK" and "Cancel" as the response buttons. I have read that jQuery is one way to do this, but is it possible to do it with CSP instead of jQuery?
Hello! I am still trying to get the sample email working on my computer. I am getting the error below. I have tried a number of things and researched it on the net, but I am not sure what I am doing wrong.
Currently, we are receiving an alert that states, "Write Daemon still on pass 31". It's been that way for a few hours.
I was wondering whether it is possible to identify what the write daemon still has left to work on, so that we can see how to reduce it and possibly identify issues with the way something is written.
Sometimes, we need to copy part of the properties of an object into a different one. The simplest thing would be to do the following:
Set obj1.FirstName = obj2.FirstName
Set obj1.SecondName = obj2.SecondName
What happens if the object contains a large number of properties, or we only want to extract an important group of data and complement the information in another object?
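One approach (a sketch, not the only option; the utility class name is made up) is a small helper that copies a named set of properties using indirect property access:

/// Copy the listed properties from source to target.
/// propList is a comma-delimited list of property names.
ClassMethod CopyProperties(source As %RegisteredObject, target As %RegisteredObject, propList As %String) As %Status
{
    Set sc = $$$OK
    For i=1:1:$LENGTH(propList, ",") {
        Set prop = $PIECE(propList, ",", i)
        Try {
            Set $PROPERTY(target, prop) = $PROPERTY(source, prop)
        } Catch ex {
            // property missing on one side: record the error and continue
            Set sc = $$$ADDSC(sc, ex.AsStatus())
        }
    }
    Quit sc
}

Do ##class(Demo.ObjectUtils).CopyProperties(obj2, obj1, "FirstName,SecondName")

For the "copy everything important" case, the property names can also be read from %Dictionary.CompiledProperty instead of being passed in.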
I am trying to create a list of IDs of type integer. I have created a class like the one below, and I am trying to call the class from a BPL, first to initialize the list. In the BPL I have an assign action that uses set, and a context variable set up to hold the list data; the property uses the class I created, ##class(MSI.IN835.EOBList).%New(). I keep getting an error back saying the method does not exist. Why does it not like the %new method? I am new to Caché ObjectScript, so if there is a better way to do this, I am open to that as well.
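For reference, a sketch of what such a class could look like (the class name is from the post; the rest is assumed), inheriting %New, Insert, GetAt, and Count from %Library.ListOfDataTypes:

Class MSI.IN835.EOBList Extends %Library.ListOfDataTypes
{

/// Add an id, coercing it to an integer.
Method AddId(id As %Integer) As %Status
{
    Do ..Insert(+id)
    Quit $$$OK
}

}

In the BPL assign action the value expression would then be ##class(MSI.IN835.EOBList).%New(), assigned to a context property whose type is MSI.IN835.EOBList. One thing worth checking: ObjectScript member names are case-sensitive, so calling %new() instead of %New() fails with a "method does not exist" error.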
A function iterates over the global ^data(a,b,c) with $ORDER using three nested for-loops. At a certain point, for example at a=10, b=20, c=30, the function exits, and later it has to resume the iteration from the same point. What would be the easiest way to do this? My solution looks too ugly.
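One way to avoid threading the resume point through three nested $ORDER loops is to walk the global with $QUERY instead, so the resume point is a single saved global reference; a sketch (class name made up, and assuming the data lives at exactly three subscript levels):

/// Walk ^data with $QUERY. lastRef is the reference of the last node
/// processed, e.g. $NAME(^data(10,20,30)); pass "" to start from the top.
ClassMethod Walk(ByRef lastRef As %String = "", maxCount As %Integer = 1000)
{
    Set ref = $SELECT(lastRef="":$NAME(^data),1:lastRef)
    Set count = 0
    For {
        Set ref = $QUERY(@ref)
        Quit:ref=""
        // skip nodes that do not have exactly three subscripts
        If $QLENGTH(ref)'=3 Continue
        Set a = $QSUBSCRIPT(ref,1), b = $QSUBSCRIPT(ref,2), c = $QSUBSCRIPT(ref,3)
        // ... process ^data(a,b,c) here ...
        Set count = count+1
        If count>=maxCount Set lastRef=ref Quit
    }
    If count<maxCount Set lastRef=""  // the whole global has been visited
}

Calling it repeatedly with the same lastRef variable processes at most maxCount nodes per call and resumes exactly where the previous call stopped.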
Hi! I have a local project written in Caché and Atelier on my PC, and I need to move it to my notebook. I tried exporting globals, classes, MAC routines, and the CSP files with the frontend stuff, but after I created my applications on the notebook and imported the set, it just didn't work. I think it's because I have some settings in the Management Portal. How can I export portal settings, and what should I export to get my applications working on another computer?
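For the code and data, $SYSTEM.OBJ.Export can bundle everything into a single XML file (the item names below are placeholders for your own project); web application definitions and similar portal settings are not part of such an export and have to be recreated on the target instance, either by hand in the portal or scripted, for example with a %Installer manifest:

// export classes, routines and globals into one file
Set items("MyApp.*.cls") = ""     // all classes in the MyApp package
Set items("MyRoutine.mac") = ""   // a MAC routine
Set items("MyData.gbl") = ""      // a global
Do $SYSTEM.OBJ.Export(.items, "C:\temp\myapp-export.xml")

// on the notebook, import and compile with
Do $SYSTEM.OBJ.Load("C:\temp\myapp-export.xml", "ck")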
I want to share four functions with you. I hope you can use them at some point.
DNI: the initials of the Spanish national identity document (Documento Nacional de Identidad). It is composed of a series of digits followed by a check letter, and it proves the identity and personal data of the holder as well as Spanish nationality; the check letter is validated in the sketch after this list. Example: 94494452X
NIE: the foreigner identity number (Número de Identidad de Extranjero), the identification code for foreigners in Spain.
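As an illustration of the kind of check these functions perform (a sketch; the class and method names are made up), the DNI control letter is derived from the number modulo 23, used as a 1-based index into a fixed letter table:

/// Returns 1 if the DNI has a valid control letter, 0 otherwise.
/// Example: ##class(Demo.SpanishId).ValidDNI("94494452X") returns 1.
ClassMethod ValidDNI(dni As %String) As %Boolean
{
    Set letters = "TRWAGMYFPDXBNJZSQVHLCKE"
    Set number = $EXTRACT(dni, 1, *-1)
    Set letter = $ZCONVERT($EXTRACT(dni, *), "U")
    If (number'?1.8N) Quit 0
    Quit (letter=$EXTRACT(letters, (number#23)+1))
}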
I have a service from which I need to receive a notification when it starts. Usually this service will be off, but when it is started I need a method to be called so I can send an email and be alerted. The only method I know of is "OnProcessInput", but that runs when an input is received (new file, new message, ...), and I could not find any event that fires when the service starts. I tried "OnInit" but it does not seem to work. Any ideas?
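For reference, OnInit() is the callback Ensemble runs when the service starts, so the override signature is worth double-checking; a minimal sketch (class name and alert text assumed), raising an alert that the production's alert handling (Ens.Alert routed to, for example, an EnsLib.EMail.AlertOperation) can turn into an email:

Class Demo.StartupNotifyService Extends Ens.BusinessService
{

Parameter ADAPTER = "Ens.InboundAdapter";

/// Called once when the service starts.
Method OnInit() As %Status
{
    // $$$LOGALERT writes an Alert entry to the Event Log; the production's
    // alert configuration decides whether it becomes an email.
    $$$LOGALERT("Service "_..%ConfigName_" has started")
    Quit $$$OK
}

Method OnProcessInput(pInput As %RegisteredObject, Output pOutput As %RegisteredObject) As %Status
{
    Quit $$$OK
}

}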
Hi, I am trying to work with REST over IIS 8; both client and server are Caché servers. I have the following problem: when I send a PUT command, I get the following error: <HTML><HEAD><TITLE>Length Required</TITLE> <META HTTP-EQUIV="Content-Type" Content="text/html; charset=us-ascii"></HEAD> <BODY><h2>Length Required</h2> <hr><p>HTTP Error 411. The request must be chunked or have a content length.</p> </BODY></HTML>
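As the error says, IIS wants the PUT to carry either a Content-Length header or chunked transfer encoding. With %Net.HttpRequest, writing the payload into EntityBody before calling Put() lets the client compute and send a Content-Length; a sketch (server, path, and payload are placeholders):

Set req = ##class(%Net.HttpRequest).%New()
Set req.Server = "myserver.example.com"
Set req.Port = 80
Set req.ContentType = "application/json"
// writing the body here is what gives the request a Content-Length
Do req.EntityBody.Write("{""name"":""value""}")
Set sc = req.Put("/myapp/api/resource/1")
If $$$ISERR(sc) Do $SYSTEM.Status.DisplayError(sc)
Write req.HttpResponse.StatusCode,!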
I have a ZEN application that displays PDF files in an 'iframe' embedded in a ZEN popup dialog, which is in turn embedded in a ZEN page running on IE 11 with its document mode set to 5, which is enough to make most reasonable people give up and start selling hot dogs in the park.
We are experiencing a strange problem with one of our clients on Caché 2008: from time to time they can't run any of their Crystal Reports, and each time we have to re-compile the classes used in the affected report to get it running; then it happens again for the same reports.
Does developing a RESTful API in Caché remove the requirement to use the InterSystems.Data.CacheClient.dll and generate proxy classes using the Caché Object Binding Wizard for .NET web development? If anyone has links to sample applications using .NET with Caché and REST Services, I would be grateful if you could share them.
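In general, yes: a REST endpoint is plain HTTP plus (typically) JSON, so a .NET client can call it with HttpClient and needs neither InterSystems.Data.CacheClient.dll nor generated proxy classes. On the Caché side the entry point is a dispatch class extending %CSP.REST; a sketch (class name, route, and payload are illustrative):

Class Demo.API.Dispatch Extends %CSP.REST
{

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
  <Route Url="/ping" Method="GET" Call="Ping"/>
</Routes>
}

/// Return a small JSON document.
ClassMethod Ping() As %Status
{
    Set %response.ContentType = "application/json"
    Write "{""status"":""ok""}"
    Quit $$$OK
}

}

The dispatch class is then set as the Dispatch Class of a web application in the Management Portal, and any HTTP-capable client can call it.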
Apache Spark has rapidly become one of the most exciting technologies for big data analytics and machine learning. Spark is a general data processing engine created for use in clustered computing environments. At its heart is the Resilient Distributed Dataset (RDD), which represents a distributed, fault-tolerant collection of data that can be operated on in parallel across the nodes of a cluster. Spark is implemented in a combination of Java and Scala, so it ships as a library that can run on any JVM.