go to post Roger Merchberger · Mar 13, 2020 To add to what Peter said, there is a clue in the Read method documentation; snippet follows:

If no len is passed in, i.e. 'Read()', then it is up to the Read implementation as to how much data to return. Some stream classes use this to optimize the amount of data returned to align this with the underlying storage of the stream.

[[ Extra emphasis mine. ]]

And if for some reason the FindAt method was called, it only reads 20,000 characters per chunk; snippet follows:

    Method FindAt(position As %Integer, target As %CacheString, ByRef tmpstr As %CacheString = "", caseinsensitive As %Boolean = 0) As %Integer
    {
        If caseinsensitive Set target=$zconvert(target,"l")
        ;; truncated for brevity...
        While '..AtEnd {
            Set tmp=..Read(20000)
            If caseinsensitive Set tmp=$zconvert(tmp,"l")
            Set tmpstr=$extract(tmpstr,*-targetlen+2,*)_tmp

[[ extra emphasis also mine. ]]

Hope this helps!
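If the default chunk size matters for your use case, one way to make it deterministic is simply to pass an explicit length to Read(). A minimal sketch (the file path is hypothetical, and the chunk size is just an example):

```objectscript
    // Read a stream in explicit 32,000-character chunks so the result
    // doesn't depend on any one stream class's default Read() sizing.
    Set stream = ##class(%Stream.FileCharacter).%New()
    Do stream.LinkToFile("C:\Temp\sample.txt")  // hypothetical path
    While 'stream.AtEnd {
        Set chunk = stream.Read(32000)
        // ... process chunk here ...
    }
```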
go to post Roger Merchberger · Jan 13, 2020 Vitaliy, Thanks for the code, but how would I integrate changing the TranslateTable in the DTL of my Production Process? That's where the hash is getting created; I tried several permutations of ' set source.Stream.TranslateTable="" ' but all gave me a <PROPERTY DOES NOT EXIST> error. The CharEncodingTable is a Readonly Internal value, and trying to change that gives me a <CANNOT SET THIS PROPERTY> error. Although, looking at the source code of the Ens.StreamContainer class, that might give me an idea how to rewrite the class _easily_ to accomplish what I need... Thanks for the input!
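Following up on that last thought: since the goal is a hash of the raw bytes, one way to sidestep the translate table entirely might be to re-open the underlying file as a binary stream before hashing. A rough, untested sketch -- it assumes the contained stream is file-based (so Filename exists), and the hash call is just an example:

```objectscript
    // Re-read the file behind the container as raw binary, bypassing
    // any character translation, then hash those bytes.
    Set filename = container.Stream.Filename   // assumes a file-based stream
    Set raw = ##class(%Stream.FileBinary).%New()
    Do raw.LinkToFile(filename)
    Set hash = $SYSTEM.Encryption.SHA1HashStream(raw)  // example hash call
```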
go to post Roger Merchberger · Jan 13, 2020 SubdirectoryLevels: 0 I honestly didn't think that it would make a difference: when Charset = Native, it works fine; it only acts odd when Charset = Binary. That said, I changed ArchivePath to c:\Export2\CSV_Complete\ -- restarted the production and tested -- no change in behaviour. Thanks!
go to post Roger Merchberger · Jan 13, 2020 Yes, it recreates in the same folder, as the production then re-sends the file (with the randomized filename) via SFTP to the remote server. If it makes a difference, the production is on a Windows 10 system, although I have access to a Linux server for testing as well. Settings: File Path: C:\Export\ Archive Path: C:\Export\CSV_Complete [[ I have two different business services looking for two different extensions ]] Work Path: [[ Null ]] I don't see a 'DeleteFromServer' setting... will continue digging. Also, Confirm Complete: Readable (default) File Access Timeout: 2 Hope this helps, and thanks!
go to post Roger Merchberger · Sep 17, 2019 And, at the terminal prompt: W $SYSTEM.SQL.DATEPART("hh","2019-09-10 23:01:55") also returns '23' -- and I'm GMT-4 currently... Hope this helps!
go to post Roger Merchberger · Sep 10, 2019 I'm not sure if this is the simplest or the quickest, but if your licensing includes the ability to create an Ensemble Production (that's the 2012 term, anyway - we've not yet upgraded to HealthShare or Iris) you could create a pair of Services based on EnsLib.File.PassthroughService -- one for the primary and one for the secondary journaling directory. Each service will scan its directory (how often is configurable with the Call Interval setting) and you can fire a Business Process or Operation when a new file hits. You can even set up a File Specification so that if someone manually puts a file in one of the directories, the process won't 'fire' if it's not supposed to.

You could even have different rules for each process -- for example, if the Secondary Journaling directory gets a new file created in it, the Business Process could have a rule to fire off an email to the system administrator(s) warning that the drive housing the primary journaling directory may be full or compromised in some way.

Hope this helps!
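As a rough sketch of what the service side could look like -- the class name and target config name here are hypothetical, not from a real production:

```objectscript
    /// Watches a journaling directory via the file inbound adapter and hands
    /// each new file to a downstream process. FilePath, FileSpec, and
    /// CallInterval are configured on the adapter in the production settings.
    Class Demo.JournalDirService Extends Ens.BusinessService
    {

    Parameter ADAPTER = "EnsLib.File.InboundAdapter";

    Method OnProcessInput(pInput As %Stream.Object, Output pOutput As %RegisteredObject) As %Status
    {
        Set tRequest = ##class(Ens.StreamContainer).%New(pInput)
        // "JournalAlertProcess" is a hypothetical target config name
        Quit ..SendRequestAsync("JournalAlertProcess", tRequest)
    }

    }
```

You'd deploy one instance of this per journaling directory, each with its own FilePath, and put the "email the admins" rule in the target process.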
go to post Roger Merchberger · Sep 4, 2019 Eduard, I'm not 100% sure I understand _exactly_ what you're trying to accomplish -- do you want to execute code when the Production starts, or when the individual Business Process is fired? Either way, I'll explain both in my example below...

Did you associate your class with a particular production or business process? In Studio, I created a New >> Production >> Business Process, which asks for a Package Name (I used TEST), a Class Name (I used StartItUp), and a description (Testorama) -- and if you click Finish it will take you right to the graphical Business Process editor. You then need to add an activity to the process; I used "code" and entered

    set ^gfxdbg(+$o(^gfxdbg("z"),-1)+1)=$h

under the 'Code' section. Then I connected the <start> to the <code> and the <code> to the <end> nodes (graphically). I then saved & compiled the .bpl. After that's saved & compiled, you can press <SHIFT><CTRL><V> to view the code that Studio created, and you'll see this:

    /// Testorama
    Class TEST.StartItUp Extends Ens.BusinessProcessBPL
    {

    /// BPL Definition
    XData BPL [ XMLNamespace = "http://www.intersystems.com/bpl" ]
    {
    <process language='objectscript' request='Ens.Request' response='Ens.Response' height='2000' width='2000' >
    <sequence xend='324' yend='440' >
    <code xpos='204' ypos='244' >
    <![CDATA[ set ^gfxdbg(+$o(^gfxdbg("z"),-1)+1)=$h]]>
    </code>
    </sequence>
    </process>
    }

    /// except this comment - I added this manually. Insert extra methods here.

    }

Except... one line near the bottom: I added that comment manually to show you where to add the next segment of code. In that spot, I added this:

    Method OnInit() As %Status
    {
        set ^initdbg(+$o(^initdbg("z"),-1)+1)=$h
        quit $$$OK
    }

    ClassMethod OnProductionStart() As %Status
    {
        set ^proddbg(+$o(^proddbg("z"),-1)+1)=$h
        quit $$$OK
    }

I used different globals to highlight the different functions of the methods.
Every time OnInit() is run, it'll add a new node to ^initdbg, and every time OnProductionStart() is run, it'll add a new node to ^proddbg. If you've made it this far... you're still not quite done. :-) You still need to go back to the Management Portal in your production, and click the (+) next to Processes to add a new Business Process. Under the Business Process Class, click the dropdown and you should now see TEST.StartItUp (scroll to the bottom to see it). Select that for the BP Class, give the Process a Name (below, I used "Fizzle2_Stuff" -- yea, it's a dumb name, but I didn't want to reconfigure _everything_ again just to give it something different. Sorry 'bout that.), and click "Enable Now." If the process is created and the dot stays green, you're... still not quite there. But almost! In your command prompt, if you type

    zw ^proddbg

you _should_ see an entry. This is set every time the production starts -- or more accurately, when the production starts the Business Process. If you type

    zw ^initdbg

you will not get any output. Why? Because OnInit() isn't executed on production start - it's executed on "subjob" start. When the Business Process gets some input from a Business Service that uses an InboundAdapter, then it'll populate. I have a test Service that scans a directory on my hard drive (based on the EnsLib.File.InboundAdapter) and can send it to a Business Process; that's set under the "Target Config Names" setting. My example has a _lot_ of testing things that are named weird... so be warned. :-) Hopefully that part's not too confusing...
Anyway, once you have the Service pointing to the Process, every time new input goes through the queue on the Service and is sent to the Process, the OnInit() method fires, and you'll see a new entry when you run

    zw ^initdbg

If this doesn't quite make sense, I apologize, and maybe I'll rework the tutorial to make it clearer and highlight every single step it took to get this tested fully... please respond with feedback on how this could be better understood. If you have any questions, please feel free to ask! [[ Edited initial paragraph for clarity. ]]
go to post Roger Merchberger · Mar 21, 2019 Not discounting the simple, direct answer above, but just for more information... if you had many users on many systems that you need to update (I was in that situation once or twice... :-) ) you can also change passwords, etc. from the SECURITY menu in the %SYS namespace:

    ZN "%SYS"
    D ^SECURITY

Choose Option 1 for the User setup menu, which contains:

    1) Create user
    2) Edit user
    3) List users
    4) Detailed list users
    5) Delete user
    6) Export users
    7) Import users
    8) Exit

You can use Option #2 to edit the users, then using Option #6 you can export all of the updated user settings to an .xml file, transfer that file to the other servers you need to update, and then just use Option #7 to import those new settings 'en masse' into the new system. This can work for updating Roles, Services, Resources, etc. as well. Hope this helps!
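The same export/import can also be scripted if you'd rather not drive the menus, using the Security.Users class in %SYS. A minimal sketch (the file paths are hypothetical):

```objectscript
    // Run in the %SYS namespace: export all user definitions to XML,
    // then import that file on the target system.
    ZN "%SYS"
    Set sc = ##class(Security.Users).Export("C:\Temp\UsersExport.xml", .nExported)
    Write "Exported: ", nExported, !
    // ...copy the file to the other server, then there (also in %SYS):
    Set sc = ##class(Security.Users).Import("C:\Temp\UsersExport.xml", .nImported)
    Write "Imported: ", nImported, !
```

Security.Roles, Security.Services, and Security.Resources have analogous Export/Import methods for the other item types.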
go to post Roger Merchberger · Mar 5, 2019 Is it possible to add a bit more information? I'm guessing (but it's just a guess) that this is something set up on your server itself -- the #5001 is a user-defined error and can say "anything" the programmer wanted. If what you're editing is a class, I wonder if the code in that class is defined as [ ReadOnly ] and, when modification is attempted, it's checking for changes and throwing the error...

I did try editing a straight .INT routine mapped to a new database -- making a change but not compiling, then dismounting and remounting the database in Read Only mode and compiling. You get quite a few <PROTECT> errors, and a "#5883: Item 'QQQ' is mapped from a database that you do not have permission on" showed up, but nothing like what you described above. I know this isn't much, but I hope it helps!
go to post Roger Merchberger · Feb 28, 2019 With that version of Ensemble, the compact/truncate sequence could cause database corruption, so it was disabled. It's still possible to shrink the database, but it will require downtime (if the database needs to run 24x7, you'll have to plan for this) and you'll need enough hard drive space to house both the current "large" version of the database and the compacted version.

You'll need to create a new empty database, then in a terminal prompt go into the %SYS namespace and do a GBLOCKCOPY:

    ZN "%SYS"
    D ^GBLOCKCOPY

Choose Option 1 (Interactive Copy), then option 1 (Copy from Database to Database), choose the original directory, then the new directory. (Note where these directories are -- you'll need that later.) It will then ask you if you want to copy all the globals; choose Yes. You may get a prompt about a global having no data; if you get that, choose "yes" to include it anyway.

Once the database has been copied, shut down Ensemble and move the CACHE.DAT file out of the original directory (I would recommend copying it to an external hard drive or somesuch -- you can't have too many backups!), then copy the now-smaller CACHE.DAT from the destination directory into the original. Restart Ensemble and verify that your data and applications work as advertised; then you can delete the destination database through Ensemble. Hope this helps!
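Before swapping the files, it can be worth spot-checking that the copy is complete. A rough sketch using extended global references -- both directory paths and the global name here are hypothetical; substitute a global you know holds data:

```objectscript
    // Compare a sample global's presence in the old and new databases
    // using extended global references to each database directory.
    Set old = "^^C:\InterSystems\Ensemble\mgr\olddb\"
    Set new = "^^C:\InterSystems\Ensemble\mgr\newdb\"
    Write "old: ", $Data(^[old]MyAppData), "  new: ", $Data(^[new]MyAppData), !
```

If $Data disagrees between the two, something was missed in the copy and it's worth re-running GBLOCKCOPY before touching the original CACHE.DAT.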