Peter Cooper · Mar 18, 2018

Hi Evgeny

Fascinating conversation... I am aware of projections but don't use them in my systems.

I think there is some confusion when I use the term "Test" server - in my usage this is used for end-user acceptance testing, and there is usually more than one project/release undergoing end-user acceptance testing at any one time - copying the Cache.dat would carry over releases that are not ready to go.

= =

I guess it depends on the nature of the operation - I work for individual clients rather than having a monolithic product - and (as above) there will be several projects on the go at any time for each client - so what I do works for me.

If there is a possible problem with the compile (your projections) then, I think, the solution is a staging server - individual releases are deployed to it and, once proven, that Cache.dat is copied to the live server.

My method works in my situation. I guess there is no single "correct" solution that will work for all cases.

Peter
Peter Cooper · Mar 16, 2018

Hey Kev

Try this:

set x1="""this is a quoted string with a ' in it"""
set x={"fred":"123", "TheQuotedString":(x1)}
write x.%ToJSON()
{"fred":"123","TheQuotedString":"\"this is a quoted string with a ' in it\""}

The %ToJSON() does the escaping for you - or do you need something else?

Peter
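A quick round-trip sketch (continuing from the x above) to confirm the escaping is lossless - %FromJSON gives back the original string:

set y = ##class(%DynamicObject).%FromJSON(x.%ToJSON())
write y.TheQuotedString
"this is a quoted string with a ' in it"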
Peter Cooper · Mar 16, 2018

Hi Evgeny

What situation do you have in mind that could cause the compilation to be unsuccessful?

With proper version control and release procedure this has rarely happened in my experience - when it does it's been due to unresolved dependencies, and in that case a re-compile fixes it.

There is one possibility where it *could* happen - if the VC system allows multiple reservations/branches for the same class - but we don't allow that.

= =

I can't see how deploying/copying the Cache.dat will avoid problems when you have multiple developers or multiple projects on the test server.

= =

I guess the only 100% way is to have a staging server that a deployment can be copied to and tested on before deploying to the live server - in this case it is tightly controlled and copying the Cache.dat is possible.

Peter
Peter Cooper · Mar 16, 2018

Just realised that copying the Cache.dat is only sensible for a single developer.

If you have more than one developer working on different projects, all deploying to the same test server, then copying won't work - you would get bits and pieces from different projects.

Even for a single developer it's a bit dodgy - you could be working on two or more projects at the same time, waiting on user acceptance testing - if one passes then you want to deploy that to live but not the others.

The more I think about it the more I believe that my method of working is the only one that works for all possibilities - or so I believe - if anyone has a better method please tell.

Peter
Peter Cooper · Mar 16, 2018

Hi John

Well hidden indeed - you would only find it by reading every line of the docs for each release.

Peter
Peter Cooper · Mar 16, 2018

Just had to do this at a client and missed out another thing that I have to do...

This is a ZEN app that has custom components subclassed from ZEN - these create CSS and JS files, but only in the master namespace/CSP application.

These files need to be copied to the other namespaces/CSP applications.

Peter
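A minimal sketch of that copy step, assuming hypothetical source and target CSP directories (the real paths depend on your CSP application configuration):

// copy the generated .css and .js files from the master CSP
// application directory to an affiliate's directory
// (both paths are hypothetical - adjust to your installation)
set src = "C:\InterSystems\Cache\CSP\master\"
set tgt = "C:\InterSystems\Cache\CSP\affiliate1\"
set rs = ##class(%File).FileSetFunc(src, "*.css;*.js")
while rs.%Next() {
    set name = rs.%Get("Name")    // full path of the source file
    set ok = ##class(%File).CopyFile(name, tgt_##class(%File).GetFilename(name))
    write:'ok "failed to copy ", name, !
}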
Peter Cooper · Mar 16, 2018

Hi Robert and all

You can achieve the separation quite easily with routine and class package mapping.

I have a client that has overseas affiliates - they all share the same code/class base but have their own namespace, with routines and packages mapped to the master namespace. Works just fine.

The only issue is that developing the code is more complex, as the different affiliates have started to need different functionality starting from the same base screen.

Peter
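For reference, a minimal sketch of creating such a package mapping programmatically (run in %SYS; the namespace, package and database names here are hypothetical):

zn "%SYS"
// map the APP package in namespace AFFILIATE1 to the master database
set props("Database") = "MASTER"
set sc = ##class(Config.MapPackages).Create("AFFILIATE1", "APP", .props)
do:$System.Status.IsError(sc) $System.Status.DisplayError(sc)

Routine mappings can be created the same way with Config.MapRoutines.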
Peter Cooper · Mar 15, 2018

Hi All

The way I work is: personal development machines - deploy to test server for user acceptance testing - deploy to live.

The version control system that I use is TrakWarePro from Globalware - sadly no longer in existence - but it works for us, not only to maintain versions but to ship releases between the three environments.

When deploying, the classes need to be compiled (obviously), but I don't trust ISC compiling *all* the required classes and SQL statements etc. Neither does $System.OBJ.CompileAll() work 100% in resolving the dependencies in the correct order.

Also a release will need to set SQL access on tables for any new data classes.

So I have developed a do ##class(setup.rCompileAll).doit() method that does all the necessary - compiles in the correct order, sets the SQL access, etc.

Usually a deployment will require changing data / updating indices / adding pages to the access database etc., so there is usually a setup class that contains the code to do this.

So I have:

a Studio project "pfcXXX"
a version control project "pfcXXX"
a setup class "setup.rPFCxxx.cls"

And all this has worked 99.9% over 10-plus years - I can't actually remember when it went wrong, but nothing in this world is 100%.

The downtime can be as little as 5 minutes for a simple release, or up to an hour or so if the data changes are complex.

The only downside is that the system is unavailable to the users whilst this process is happening - I know about the technique of using mirroring, updating one mirror and then swapping - but this is overkill for my users.

Peter
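A minimal sketch of what such a setup method might look like - the class list, compile order, role and table names are all hypothetical, not the actual setup.rCompileAll code:

Class setup.rCompileAll Extends %RegisteredObject
{

ClassMethod doit() As %Status
{
    set sc = $$$OK
    // compile in an explicit order rather than trusting automatic
    // dependency resolution (class names are hypothetical)
    for class = "app.LookupTables","app.Invoice","app.InvoiceLine" {
        set sc = $System.OBJ.Compile(class, "ck")
        quit:$System.Status.IsError(sc)
    }
    quit:$System.Status.IsError(sc) sc
    // grant SQL access on the new tables (role/table names are hypothetical)
    for table = "app.Invoice","app.InvoiceLine" {
        set sql = "GRANT SELECT,INSERT,UPDATE,DELETE ON "_table_" TO AppUserRole"
        set rs = ##class(%SQL.Statement).%ExecDirect(, sql)
        set:rs.%SQLCODE<0 sc = $$$ERROR($$$GeneralError, rs.%Message)
        quit:$System.Status.IsError(sc)
    }
    quit sc
}

}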
Peter Cooper · Mar 15, 2018

Hi Evgeny

Thanks - only hope I can find the time and energy to keep it up :}

Some questions about the community HTML editor...

1. Is there an easy way to add hyperlinks in-line, or do you have to manually edit the source?
2. Also, are there any checking rules that are enforced re hyperlinks?
3. How do you upload images (screen shots)?
4. Is it possible to embed video (again screen shots), or is it better to use YouTube and add a link?
5. Is there a better free HTML editor/method of working to use to create articles, rather than the built-in one?

Peter
Peter Cooper · Mar 13, 2018

Hi Evgeny

We met briefly at last autumn's developer meet at the Belfry - keep up the great work.

New tag - absolutely (please).

Angular2, Angular4 and now Angular5 are enhancements of the same basic product. Angular1 (now AngularJS) is a different thing.

Have a look at https://dzone.com/articles/learn-different-about-angular-1-angular-2-amp... (that's not a typo, the URI is as spelt), or google "differences between angularjs and angular2".

Peter
Peter Cooper · Mar 13, 2018

Hi Kev

Hope you are keeping well.

Have you tried

write $System.OBJ.UnCompile("*")

from terminal? - replace the "*" with a more specific wildcard. Or:

write $System.OBJ.Delete("class name")
write $System.OBJ.DeletePackage("package name")

I had similar issues - but difficult to pin down.

Peter
Peter Cooper · Mar 13, 2018

Hi Sabarinathan

I use pdfPrintCmd - see http://www.verypdf.com/app/pdf-print-cmd/index.html

It's not free, but I have had no issues with it over many years of use.

The idea is:

a. Write out the PDF to a directory
b. set xResult=$zf(-1, "print command exe")

The great advantage is that it has command line options that control margins, double-sided printing etc.

Also it can be run in the background - this is very useful when doing batches of PDFs where creating the PDF using FOP can take several seconds - the idea is:

a. scan the data via SQL
b. create the PDF
c. use pdfPrintCmd to send it to a printer

To give you an idea of it working, I have a client that produces around 50 multi-page passports a day - the printing takes around an hour - it's set going as a Cache background job and the printer just chugs away.

Peter
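A minimal sketch of step b - the executable path and file name here are placeholders, and any margin/duplex options would come from the pdfPrintCmd docs, not from this sketch:

// build the shell command (paths are hypothetical)
set exe = "C:\tools\pdfPrintCmd\pdfprint.exe"
set pdf = "C:\output\invoice123.pdf"
set cmd = exe_" """_pdf_""""
// $zf(-1) runs the command and returns its exit status (typically 0 = success)
set xResult = $zf(-1, cmd)
write:xResult'=0 "print command failed with code ", xResult, !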
Peter Cooper · Jul 6, 2017

FAO Mike

Hi Mike

You ask some very good questions!

It's been a few (10+) years since I did some metrics on Cache performance - working on it now (when I have time). I will publish the results here.

But my initial findings are much what I said - given enough memory (8Gb allocated to global buffers) it don't much matter - with a 200Mb global (10,000,000 rows with links to 2 other tables) I can't see any significant difference in performance between parent/child and foreign key - or bitmap versus normal indices.

Going to try it with 50,000,000 rows and limit the amount of memory allocated to global buffers.

Watch this space.

Peter
Peter Cooper · Jul 6, 2017

FAO Scott

Hi Scott

Sorry it's taken me so long to get back to you re real-world parents and children.

It's the example you gave of Cache relationships - and I was a bit quick-fire with my answer - sorry! Please let me expand...

Real-world parents and children is an interesting problem, and the simple solution that you described is *not* the way I would model it!! This is regardless of the actual implementation - e.g. parent/child or foreign keys etc.

= =

If I wanted to model family trees I would have a class "Person" and another class "PersonRelationship" - the second of these would have links to two instances in the "Person" table.

Something like the following (in relationships rather than foreign keys - but that's implementation rather than the OO design). PS - I was typing on the fly, so there may be errors!!!!

= =

Class Person Extends %Persistent
{
Relationship RelationshipsAsFirst As PersonRelationship [ Cardinality = many, Inverse = Person1 ];
Relationship RelationshipsAsSecond As PersonRelationship [ Cardinality = many, Inverse = Person2 ];
Property Name As %String;
Property DoB As %Date;
// ...etc
}

Class PersonRelationship Extends %Persistent
{
Relationship Person1 As Person [ Cardinality = one, Inverse = RelationshipsAsFirst ];
Relationship Person2 As Person [ Cardinality = one, Inverse = RelationshipsAsSecond ];
Property RelationshipType As SomeLookupTable;
// ...etc
}

The SomeLookupTable would describe (in words) the relationship, e.g. "Son Of" and "Father Of".

For me this has some beauty. You can...

Construct a family tree of infinite depth, both forwards and backwards
Use recursive SQL programming to construct the tree
Give a "child" multiple links - e.g. "gene father" and "step father"
Keep the record complete - e.g. a woman might have a "gene father", then a "step father", and then go back and be linked to the same "gene father" (life happens)
Model surrogate parents via AI fathers or surrogate mothers, or same-sex relationships
Extend it to, say, pets

Some care has to be taken in the database insert/amend, e.g.:

avoid recursive relationships - a person being her own grandmother is not physically possible
a person cannot have the same mother and father (well, with AI and embryo manipulation this *may* become a reality - but no problemo, the model will still work)

= =

Hope this is clear and of interest - if you need any more info please ask.

= =

PS - I was involved around 30 years ago in computing an "inbreeding coefficient" for rare breeds (think small populations of endangered species in zoos). The aim was to give a metric on how inbred an individual was - small populations where it was common for both grandparents to be the same individual. The logic was intense to get to a metric - you could have the case where the same individual was both grandfathers and all 4 great-grandfathers - not so good for preserving the gene pool!

Peter
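To illustrate the "recursive programming" point, a minimal sketch of walking up the tree with dynamic SQL - the class and column names match the sketch above, but the RelationshipType codes are hypothetical:

ClassMethod PrintAncestors(personId As %Integer, depth As %Integer = 0)
{
    // find all "parent of" links pointing at this person
    // (the RelationshipType codes are made up for the example)
    set sql = "SELECT Person1, Person1->Name AS ParentName FROM PersonRelationship WHERE Person2 = ? AND RelationshipType->Code IN ('GeneFather','GeneMother')"
    set rs = ##class(%SQL.Statement).%ExecDirect(, sql, personId)
    while rs.%Next() {
        write ?(depth*2), rs.%Get("ParentName"), !
        // each call gets its own result set, so plain recursion is safe
        do ..PrintAncestors(rs.%Get("Person1"), depth + 1)
    }
}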
Peter Cooper · Jun 24, 2017

Wolf

Thinking about this, the first point is: do the children get added to or amended? If it's a lot (e.g. with case notes) then that will lead to mega block splitting, as the data size for the whole record grows.

OTOH - if it's an invoice that (essentially) never changes then it's fixed and p/c is cool.

== ==

Apart from the bitmap indices only working for integer IDs - if you really have squillions of products that you need to process, then one-many must be the business solution, as it gives the performance of

select sum(qtySold) from invoiceLines where ProductID=123

Peter
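For reference, the kind of index that makes that aggregate fast - a sketch against a hypothetical invoice-lines class:

Class app.InvoiceLine Extends %Persistent
{
Property Product As app.Product;
Property QtySold As %Integer;
// a bitmap index on the product link - only valid because the
// class uses the default positive-integer IDs
Index ProductIdx On Product [ Type = bitmap ];
}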
Peter Cooper · Jun 24, 2017

FAO Mike Kiddow

All of us Cache gurus are getting carried away with the internals and very deep design concepts that rely on an in-depth knowledge of how Cache works...

I guess you do not fully follow what we are discussing (sorry, not being dismissive at all) - it's just that me, Wolf, Dan, Otto and Kyle are just being super-dooper Cache egg-heads.

Bottom line is...

For any reasonably sized database (i.e. fewer than ~10,000,000 rows) on a modern server with adequate memory - it don't much matter!!! Cache will do the biz.

= =

But please note that the example from Scott is not correct...

With real children and real parents, the parent-child relationship is incorrect on many levels - because, as I said before, "a parent can die" and you don't want the children's records to disappear!!!!

In that case I would have (as a first pass) a class "People" and another class "PeopleRelationships" which links different instances of People with each other.

= =

OO methodology seems intuitively easy to understand - but it's not. And on top of that are the pros and cons of the Cache performance considerations that we have been discussing.

But give it a go.

As I said, it don't much matter unless you are in serious terabyte country.

Peter
Peter Cooper · Jun 24, 2017

Hi Wolf

Long time no meet!!!

This is such an interesting conversation...

And it all depends on the actuality - if you have a parent with a squillion children, then the answer on p/c performance is not so good.

Also if you have a business object where the many side is constantly added to - e.g. a patient's lab results - again it's different - that leads to block splitting and a bunch of pointer blocks having to be loaded into memory.

So... my business-case argument is that with an invoice header and invoice lines, parent/child (versus one/many) is the most efficient - they (really) never change, so no index block splitting.

But I do take Dan's and Otto's comment about bitmaps only doing integer IDs.

= =

Tell you a story... around 12 years ago I was involved with a client with a 1.2Tb database - I was criticised by the in-house IT staff for using naked global references in one module - blah blah - but it was 2% faster than full global references, and in a tight loop 2% meant 30 minutes off the processing time.

= =

Having said that - I *will* be trying out one-many with a cascade delete.

Peter
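For anyone who hasn't met them, a minimal illustration of naked global references (the global name and subscripts are made up):

// a full reference sets the "naked indicator" to ^Orders(123,...)
set ^Orders(123, 1) = "first line"
// a naked reference reuses all but the last subscript level,
// skipping the re-parse of the global name on each access
set ^(2) = "second line"    // same as set ^Orders(123,2)="second line"
write ^(2)                  // prints "second line"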
Peter Cooper · Jun 21, 2017

Hi Kyle

Thanks for your excellent comment.

I agree - sort of...

But it's a balance (as always) of buffer loading - it may be the case that there is an occasional need to just grab the dates - but if that's only a 10% (say) need, whereas the 90% need is to display/process the header and the lines together, then for me the 90% need should win out.

Also, if the dates (or whatever) are indexed, then a selection on a date range (say) will only pull the required rows from the index.

= =

But as I said before, it depends on the size of the system - my clients have modest needs (maybe 3 million rows max), and with a 64Gb machine all or most of the blocks are in memory anyway.

But thanks for the thoughts - I will certainly start looking at one-many with a cascade delete.

Peter