Jenna Poindexter · Oct 27, 2016 Thanks Bachhar, Expanding on this a bit, we also need to be able to query the message store for messages we have sent/received from a given interface. These are all HL7 messages, which means they are stored in EnsLib.HL7.Message. I'm assuming there is a way to correlate a message in EnsLib.HL7.Message with either Ens.MessageBody or Ens.MessageHeader, where regular Ensemble messages are stored. What would be the proper way to correlate these? I looked at the fields in EnsLib.HL7.Message but don't see anything that stands out as an obvious connection between the two tables.
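In case it helps others reading this: EnsLib.HL7.Message is itself a subclass of Ens.MessageBody, and each Ens.MessageHeader row carries a MessageBodyId column pointing at its body. A sketch of a query correlating the two follows; the config item name is a placeholder, and column choices are illustrative.

```
// Sketch: list HL7 messages sent/received by a given interface.
// pConfigName (e.g. "MyHL7Service") is a placeholder -- substitute your own item name.
ClassMethod ListHL7ForInterface(pConfigName As %String)
{
    set tSQL = "SELECT h.ID, h.TimeCreated, h.SourceConfigName, h.TargetConfigName, m.ID AS HL7Id "_
               "FROM Ens.MessageHeader h "_
               "JOIN EnsLib_HL7.Message m ON m.ID = h.MessageBodyId "_
               "WHERE h.SourceConfigName = ? OR h.TargetConfigName = ?"
    set tRS = ##class(%SQL.Statement).%ExecDirect(, tSQL, pConfigName, pConfigName)
    while tRS.%Next() {
        write tRS.%Get("ID"), " ", tRS.%Get("HL7Id"), " ", tRS.%Get("TimeCreated"), !
    }
}
```

Filtering on h.MessageBodyClassName = 'EnsLib.HL7.Message' instead of the join also works if you only need the header side.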
Jenna Poindexter · Oct 19, 2016 So this begs the question: how could one invalidate the result cache for the entire cube without running %KillCache, since that purges the underlying globals and would interfere with actively running queries, correct?
Jenna Poindexter · Oct 19, 2016 So if we were to use the APIs to update the dimension table directly, for example, this would not be adequate to ensure that future queries showed the proper data?
Jenna Poindexter · Oct 19, 2016 As mentioned already, running ##class(HoleFoods.Cube).%KillCache() actually goes in and purges the internal globals associated with the result cache for that particular cube. Not really the best thing to do on a production system. A safer way to invalidate the cache for a given cube is to run %ProcessFact for an individual record within the cube's source table. For example:

&sql(select top 1 id into :id from <sourcetable>)
do ##class(%DeepSee.Utils).%ProcessFact(<cubename>, id)

replacing <sourcetable> with the source table for your cube and <cubename> with the name of the cube. This has the effect of invalidating the result cache for the cube without completely purging the internal globals that store the cache. This is much safer to run during production times than the %KillCache() method.
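The two steps above can be wrapped into a small utility; this is a sketch under the assumption that the cube's source class is a plain persistent table, and the method/parameter names here are mine, not part of any API:

```
// Sketch: invalidate a cube's result cache by reprocessing one source row.
// pSourceTable is the SQL name of the cube's source table (an assumption for illustration).
ClassMethod InvalidateCubeCache(pCubeName As %String, pSourceTable As %String) As %Status
{
    set tRS = ##class(%SQL.Statement).%ExecDirect(, "SELECT TOP 1 ID FROM "_pSourceTable)
    if 'tRS.%Next() quit $$$ERROR($$$GeneralError, "source table is empty")
    // Reprocessing a single fact marks the cube's result cache as stale
    quit ##class(%DeepSee.Utils).%ProcessFact(pCubeName, tRS.%Get("ID"))
}
```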
Jenna Poindexter · Oct 6, 2016 One concern with using this method is that it actually goes in and kills all of the cache for the cube by killing the globals containing the cache. What effect will this have if it is done while users are using the Analyzer, for example?
Jenna Poindexter · Oct 5, 2016 After further review and advice from others, I have discovered that there are two problems.
1. The order of my inheritance needs to be switched to Extends (%Persistent, %Populate).
2. The POPSPEC attribute should actually point to a method name, i.e. POPSPEC="Name()" and not POPSPEC="NAME".
This was a problem with me misreading the documentation on POPSPEC.
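Putting both fixes together, a minimal populatable class looks something like this (class and property names are illustrative only):

```
// Minimal sketch of a populatable class; Demo.Person is a made-up name.
Class Demo.Person Extends (%Persistent, %Populate)
{
// POPSPEC names a generator method, hence the parentheses
Property Name As %String(POPSPEC = "Name()");

Property DOB As %Date(POPSPEC = "Date()");
}
```

With that in place, do ##class(Demo.Person).Populate(100) generates 100 sample rows.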
Jenna Poindexter · Jun 23, 2016 I think this is what you want. You will first want to compact the database, which moves all free space to the end of the database, and then you will want to truncate it. Here is a link to the documentation: http://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=...
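The same two steps can, I believe, be scripted from the %SYS namespace via SYS.Database; this is only a sketch, and the method signatures should be verified against the class reference for your version:

```
// Sketch, to be run in %SYS: compact a database, then return trailing free space.
// Verify SYS.Database method signatures against your version's class reference.
ClassMethod ShrinkDatabase(pDir As %String) As %Status
{
    // Move free space to the end of the database file
    set tSC = ##class(SYS.Database).FileCompact(pDir, 0, .tFree)
    quit:$$$ISERR(tSC) tSC
    // Return the trailing unused space to the operating system
    quit ##class(SYS.Database).ReturnUnusedSpace(pDir, 0, .tNewSize)
}
```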
Jenna Poindexter · Jun 23, 2016 Here's an example of how one might use the FileSet query in the %File class and the Delete class method in %File to purge backup files in a given directory before a given date.

/// Purge backups older than <var>DaysToKeep</var>
/// <var>Directory</var> points to the directory path containing the backups.
/// Only *.cbk files will be purged
ClassMethod PurgeBackups(Directory As %String, DaysToKeep As %Integer = 14) As %Integer
{
    // Calculate the oldest date to keep files on or after
    set BeforeThisDate = $zdt($h-DaysToKeep_",0",3)
    // Gather the list of files in the specified directory
    set rs=##class(%ResultSet).%New("%File:FileSet")
    do rs.Execute(Directory,"*.cbk","DateModified")
    // Step through the files in DateModified order
    while rs.Next() {
        set DateModified=rs.Get("DateModified")
        if BeforeThisDate]DateModified {
            // Delete the file
            set Name=rs.Get("Name")
            do ##class(%File).Delete(Name)
        }
        // Stop when we get to files with last modified dates on or after our delete date
        if DateModified]BeforeThisDate quit
    }
}
Jenna Poindexter · Jun 21, 2016 Funny you should ask this as I was just looking at how to do this today. Most operating systems offer a way to search for files given a certain filter, such as being older than a certain date, and then pipe that list to another command, such as delete. Here is a class method I wrote to do this on a Windows 2012 R2 server running Caché:

ClassMethod PurgeFiles(Path As %String, OlderThan As %Integer)
{
    set Date=$zd($h-OlderThan)
    set cmd="forfiles /P "_Path_" /D -"_Date_" /C ""cmd /c del @path"""
    set sc=$zf(-1,cmd)
}

This method accepts a path and an integer indicating the number of days to keep files for. It then constructs a command line using the "forfiles" command, passing the path and a calculated date. For each file it finds, it executes the command cmd /c del <path>, which deletes the file. There are probably more elegant, cross-platform-compatible ways to do this, but this is one solution that I had.
Jenna Poindexter · Jun 3, 2016 And yes, there are many ways to accomplish this task from the Cache server side. The document data model doesn't really apply here, as this is not a new application. Once 2016.2 is released, we could just map the globals as objects using CacheSQLStorage and then use the .$toJSON() method to export to JSON at that point. That's one of the great things about the Cache technology: there are so many different options and choices for doing the same thing. This post is really just about a discussion on what the best way is to represent a global structure in JSON, not the specifics of how to accomplish that in code.
Jenna Poindexter · Jun 3, 2016 Stefan - My use case is that I have a Cache application (not object based) and I want to provide a set of REST services for accessing the raw global structures, all methods. I want this REST interface to be generic in nature, such that it can be used to read, set, and kill any node of any global. In order to do this, I need an encoding method for sending and receiving the global data. JSON seems the best way to package the data, thus I was looking for a good JSON structure that could represent the global structure.
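For anyone sketching such an encoding: one possible shape (purely illustrative, not an established convention) is an array of node descriptors, each carrying the full subscript list and the node value, so arbitrary depth and mixed subscript types can round-trip cleanly:

```
{
  "global": "^Person",
  "nodes": [
    { "subscripts": [1], "value": "Smith,John" },
    { "subscripts": [1, "address", 2], "value": "123 Main St" }
  ]
}
```

Keeping subscripts as a JSON array (rather than encoding them into a single string) avoids any escaping issues with delimiters that may appear inside subscript values.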
Jenna Poindexter · Jun 2, 2016 You could create a SQL stored procedure to return ##class(%Library.Function).HostName(), such as:

Class Utils.Procedures Extends %RegisteredObject
{
ClassMethod hostname() As %String [ SqlProc ]
{
    Quit ##class(%Library.Function).HostName()
}
}

And once that is done, you can then use that stored procedure from a SQL query, such as:

SELECT Utils.Procedures_hostname()

which on my system returns poindextwin10vm, the hostname of my Windows system.
Jenna Poindexter · Jun 1, 2016 Hi Andy - Take a look at this documentation; I think it may answer your questions. It explains building both read-only and read/write indexes on an active system. http://docs.intersystems.com/latest/csp/docbook/DocBook.UI.Page.cls?KEY=GSQL_indices#GSQL_indices_build_readonly
Jenna Poindexter · Jun 1, 2016 Continuing my testing with Ens.Util.HTML.Parser, I am trying to parse information contained within paragraphs. For example:

<p>This is paragraph one</p>
<p>This is paragraph two with some <i>italics</i>.</p>
<p>This is paragraph three</p>

I want to parse the contents of each of these paragraphs, including the italics content contained within paragraph two. I set up my template to:

+<p>{paragraph}</p>+

This kind of works, except in the case of the second paragraph it stops when it hits the <i> tag, even though if it were to continue it would eventually hit the </p> tag. So what ends up being parsed is:

This is paragraph one
This is paragraph two with some
This is paragraph three

What I am expecting is:

This is paragraph one
This is paragraph two with some <i>italics</i>
This is paragraph three

Does this seem to be a limitation of the parser? Is there any way to get what I'm trying to get from this document using the parser?
Jenna Poindexter · Jun 1, 2016 I was able to use this method (Ens.Util.HTML.Parser) to successfully parse disease information from the CDC's website. Basically I created a persistent class to store disease names along with the source URLs from the CDC's a-z web pages. The template looks like this:

<div,class=span16><ul>+<li><a,class=noLinking,href={pageurl}>{pagetitle}</a></li>+</ul>

and the class method that actually does the parsing of all of the pages:

ClassMethod getDiseasesOrCondition(Output tCount As %Integer) As %Status
{
    set tCount=0
    set template="<div,class=span16><ul>+<li><a,class=noLinking,href={pageurl}>{pagetitle}</a></li>+</ul>"
    for alpha=1:1:26 {
        kill tOut
        set url="http://www.cdc.gov/DiseasesConditions/az/"_$c(96+alpha)_".html"
        do ##class(Ens.Util.HTML.Parser).test(url, template, .tOut)
        for i=1:1 {
            quit:'$d(tOut("pageurl",i))
            if tOut("pageurl",i)?1"http://www.cdc.gov/".e {
                set iCDC=##class(iCDC.DiseaseOrCondition).%New()
                set iCDC.title=tOut("pagetitle",i)
                set iCDC.sourceUrl=tOut("pageurl",i)
                set tSC=iCDC.%Save()
                set tCount=$i(tCount)
            }
        }
    }
    quit $$$OK
}

A little checking was needed to verify that the returned source URLs were actually pointing to the CDC's website, but other than that, Ens.Util.HTML.Parser worked exactly as I had hoped it would. Very clean and straightforward for my needs.
Jenna Poindexter · Jun 1, 2016 Would it be possible to implement a POST instead and pass the information containing the / characters as part of the body rather than the URL?
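A minimal sketch of that approach with %CSP.REST (the class, route, and method names here are assumptions for illustration, not taken from the original application): route the operation as a POST and read the payload from the request body, where "/" needs no escaping.

```
/// Sketch: accept data containing "/" in the POST body rather than the URL.
Class Demo.REST.Handler Extends %CSP.REST
{

XData UrlMap [ XMLNamespace = "http://www.intersystems.com/urlmap" ]
{
<Routes>
<Route Url="/global" Method="POST" Call="SetNode"/>
</Routes>
}

ClassMethod SetNode() As %Status
{
    // The body may safely contain "/" and other URL-hostile characters
    set tBody = %request.Content.Read()
    // ... parse tBody and apply it to the target global node ...
    quit $$$OK
}

}
```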
Jenna Poindexter · May 30, 2016 Assuming this is on 2016.2, you could override the %ToJSONValue method which is used to generate the JSON.