David - I completely agree. It's an excellent suggestion and we'll get it put in place.
David,
Thanks for your feedback!
In general, Developer Download always makes the latest GA version of our two Community Edition products available. In this case we wanted to be able to launch by Global Summit, so we went with the Preview since the 2019.1.1 kits were not yet fully GA. There are ongoing discussions about the pros and cons of making containers available here rather than having people fetch them directly from Docker. We'll let you know the final decision!
Excellent enhancements! Much needed and will be widely used I am sure :)
Thank you Stefan!
@Murray Oldfield - once again, bravo! Thank you for making this information more available to the community!
@Murray Oldfield - thank you for the time you spent putting together these performance articles!
Thank you! It's great having all 3 examples in one place :)
For anyone currently at Global Summit 2019, @Amir Samary will be demonstrating the new Evaluation Service TODAY from 1:30 to 2:15 in the Tech Exchange, table 3. Come check it out and give us your feedback!
Thanks,
Ben Spead
Manager, Application Services, InterSystems
You can grab a Windows version of Caché 2018.1 from the WRC Distribution site and install just Studio from that kit. Alternatively, since InterSystems IRIS Studio can also connect to earlier Caché versions, you can grab a Windows InterSystems IRIS Community kit from download.InterSystems.com and install Studio from there.
Vikram ... for now, I suggest that you use the opportunity to try InterSystems IRIS for free hosted in a cloud container:
https://www.intersystems.com/try-intersystems-iris-for-free/
Keep your eyes open for an announcement during Global Summit about another way that you can get your hands on InterSystems IRIS for evaluation purposes.
Thanks!
Ben
Note - if you're trying to do the former, you can also download and install Atelier, which is the InterSystems IDE plug-in for Eclipse:
Support should indeed help you out.
What exactly are you trying to do? Are you already running an InterSystems IRIS server at your organization and you just want to install Studio to connect to it? Or are you looking to install a local evaluation copy of the data platform as well as the IDE?
I am assuming that when you are talking about 'folders' you mean the structure which individual items are exported into when you use your source control hooks, correct? To achieve this you need to loop over all items in the namespace and call the source-control related export on each of them.
The way we do that for our internal systems is to use the BaselineExport() method in the %Studio.SourceControl.ISC class. %Studio.SourceControl.ISC is our source control hooks class for Perforce. I haven't tried calling BaselineExport() while another set of hooks is configured for the namespace, but it may *just work*, especially if your GitLab hooks use the ^Sources global to describe the export structure. Give it a try and let us know if it helps (if not, I can get you the code for that method and you could adapt it for your purposes).
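A minimal sketch of that call, run from a Terminal session in the target namespace (the exact BaselineExport() signature may vary by version, so treat this as an assumption to verify against your kit):

```objectscript
// Hedged sketch - run in the namespace you want to export.
// BaselineExport() walks all items and exports them using the
// source control export structure (driven by the ^Sources global).
set sc = ##class(%Studio.SourceControl.ISC).BaselineExport()
if $system.Status.IsError(sc) {
    do $system.Status.DisplayError(sc)
}
```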
Could you please give a little more of a description of what you are hoping to accomplish? A JS file is executed on the client, whereas "Caché Code" (by this I assume you mean ObjectScript?) is executed on the server.
You can edit JS files using Studio, you can create ObjectScript class projections to automatically create JS files with JS logic in them, you can send JS from a server process to the web browser, etc. - there are many ways for Caché code to interact with, inform, or manipulate JS files. We need more details about what you want to do.
Regarding your second question, best practice is generally to use System Defaults, which are set in your namespace and store the production settings (rather than storing them in the Production class). This lets you avoid having differences in the Production class between branches.
Steve - #2 is helpful if you want to leverage existing structures for authentication, auditing, transport, or other functions that rely on CSP sessions. This can be used as part of a strategy of incrementally moving an application from a CSP-based architecture to Angular.
If your machine is virtualized, just clone the VM (that is what we do for all of our test upgrades). NOTE - make sure to stand it up on an isolated network segment so that it doesn't try to do any inbound file processing or other communication that it should not (especially if you clone LIVE).
Thanks again for sharing the start of your journey with the Community. I am curious if you are planning to provide another Update Article? I would love to hear about how your journey has progressed over the past year and what you have learned along the way which could help others!
Cheers,
Ben
Bouncing off of @Robert Cemper 's hints, if you are using a default SMP (system management portal) login you can just pass in the arguments for the username and password fields as follows:
https://mytest.myserver.com/csp/sys/UtilHome.csp?CacheUserName=tech&CachePassword=demo
I just tested this and it worked like a charm :) (use your own credentials of course)
Exactly! We are actively using it internally within InterSystems - both in internal application development and in HealthShare product development. As discussed in the video, we have found incredible value in this tool for decreasing technical debt (and making sure new changes don't add to it). We thought customers might find value in it as well.
You are most welcome. In terms of why TestCoverage was released on Open Exchange, it is something we had been exploring internally over the past year and wanted to share with the Community in time for Global Summit 2018. In terms of whether or not it will actually make it into the product, I can't speak to that, but perhaps the author @Timothy Leavitt can comment (I believe there were at least exploratory discussions with Product Management on this topic).
The first half of that should be helpful. Code coverage may be helpful too if they are able to move to a CI-type build infrastructure (I don't know how well Test Coverage would work in a more dynamic dev environment, as longitudinal history might be harder to maintain ... I haven't really thought about that too much before).
Mike,
We're using UnitTesting for application validation for internal application development within InterSystems. If you have any specific questions, feel free to create new Questions in the D.C. and tag me, or if you would prefer a general discussion you can ask your Account Manager or Sales Engineer to set up a discussion with me.
There have also been several Global Summit presentations which have touched on the topic - not sure if you've seen these?
Best,
Ben Spead
Manager, AppServices, InterSystems
Sorry - I forgot that you first need to specify the source control class in the Management Portal. Put "Source Control" in the Search box on the SMP homepage, and then go to that page (e.g. http://localhost:57772/csp/sys/mgr/%25CSP.UI.Portal.SourceControl.zen). Select the Namespace in the left column and then "%Studio.SourceControl.ISC". Save the changes and try the BaselineExport() again. When you are done with the export, change the Source Control back to "None"
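If you would rather script those Portal steps, a hedged sketch follows. It assumes the %Studio.SourceControl.Interface API for setting the active source control class, which may vary by version, so verify against your installation first:

```objectscript
// Hedged alternative to the Portal steps above (API is an assumption).
// Point the current namespace at the Perforce hooks class:
do ##class(%Studio.SourceControl.Interface).SourceControlClassSet("%Studio.SourceControl.ISC")

// ... run ##class(%Studio.SourceControl.ISC).BaselineExport() here ...

// When finished, set source control back to "None":
do ##class(%Studio.SourceControl.Interface).SourceControlClassSet("")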
As a general tip, if you do enable and enforce source control, then you wouldn't need to be querying your class definitions to find variations between environments - you could see all of that (and so much more!) in your source control system ;)
I understand ... so you want to only export specific things which it has found to have differences in the code. That makes sense since you are trying to use it at the end of a piece of code already in place. I am afraid I don't know how to do that, but if you contact Support they may be able to suggest a trick or two.
Is there a reason that the use of a Virtual Namespace is a requirement? If not, you can use ##class(%Studio.SourceControl.ISC).BaselineExport() to export the entire contents of the Namespace to disk for file-based diffing (this is how I normally do it)
I can only comment on the original functionality in %Studio.SourceControl.ISC - if you're extending it you will need to do some debugging to see exactly why you are seeing the behavior that you are.
In the original %Studio.SourceControl.ISC class, when working in Connected mode (i.e., Caché can issue p4 commands in real time to the Perforce server), the p4 command should take care of changing the file back to ReadOnly, which is what triggers GetStatus() to see that it is uneditable and therefore not checked out. It may be that your GetStatus() isn't correctly interpreting the ReadOnly state of the file, or, if you are on UNIX, it may be that Caché doesn't see it properly as ReadOnly (this can happen if you are running your instance as root). Note - I think there may be a bug where, after check-in, the %Studio.SourceControl.Change table isn't updated appropriately, but the primary indicator of checked out / checked in should be the ReadOnly bit on the file.
Hopefully this is enough to get you moving on this, and if not then I suggest you call Support to have them take a look with you and debug. If there are any other questions I can answer in this forum I am happy to try.
Best of luck!
Adrian - are you writing your own Perforce hooks, or using the sample Perforce hooks that ship with Caché (%Studio.SourceControl.ISC.cls)?
%Studio.SourceControl.ISC.cls checks whether the file in the local workspace is ReadOnly or ReadWrite. If ReadOnly, it assumes the file is not checked out and prompts the user to check it out. If ReadWrite, it checks whether this is a multi-developer or single-developer instance. If single-developer, the user can just edit it. If multi-developer, it checks %Studio.SourceControl.Change to see whether the current user is the one who checked it out - if so, they can edit it; if not, they get a message in the Output window saying who is editing it, and the item is treated as ReadOnly for them.
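The first branch of that logic can be sketched with the %Library.File class; this is only an illustration of the check described above, and the workspace path is hypothetical:

```objectscript
// Hedged sketch of the ReadOnly check; the path is a made-up example.
set filename = "/workspace/cls/MyPackage/MyClass.xml"
if ##class(%Library.File).ReadOnly(filename) {
    // ReadOnly => treated as not checked out; prompt the user to check out
    write "Not checked out",!
} else {
    // ReadWrite => consult %Studio.SourceControl.Change for the owner
    write "Checked out - verify owner",!
}
```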
Eduard - I completely agree!
The nice (or annoying, depending on your perspective) thing about moving to the discipline of unit testing is that it forces you to write your code in a more modular (and therefore more testable) manner, and refactoring to meet these goals is a great byproduct of a testing focus :)
Alex - I don't think it's a smart idea to call into a specific line number of a routine, since the contents will change over time. Instead, call the line labels/functions within the routine from outside the routine, and pass all appropriate variables, etc., that way.
Trying to call a specific line number will certainly lead to faster than desired 'bit rot' in your unit test library.
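To illustrate the difference (MyRoutine and its labels are hypothetical names):

```objectscript
; Hedged illustration - MyRoutine, Init, and Compute are made-up names.
; Call a subroutine label with arguments:
do Init^MyRoutine(param1)
; Call an extrinsic function label:
set result = $$Compute^MyRoutine(x)
; Avoid label+offset calls such as: do Init+3^MyRoutine
; They break as soon as lines are added to or removed from the routine.
```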
Welcome to the InterSystems Developer Community!!
1) ObjectScript is a (very, very large) superset of the M language (previously known as MUMPS)
2) InterSystems IRIS uses much of the core technology that was in Caché, but it is a break from the past in many ways and allows next-gen development tools, architectures, etc. InterSystems IRIS is an upgrade path which current Caché-based applications should seriously consider. Zen is a legacy web development technology which runs on top of Caché (it ships with both Caché and InterSystems IRIS)
3) The answer to this depends on whether you are taking straight globals, or persistent objects. Any data persisted via InterSystems Object or Relational structures can automatically be accessed via SQL. If you are just writing straight globals (e.g. set ^Patient(123)=foobar) then you can't query unless you set up an SQLStorage mapping to your globals
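To make the contrast in 3) concrete, here is a hedged sketch (Demo.Patient is a hypothetical class name):

```objectscript
/// Hedged sketch - Demo.Patient is a made-up persistent class.
/// Any data saved through it is automatically queryable via SQL
/// as the table Demo.Patient (e.g. SELECT Name FROM Demo.Patient).
Class Demo.Patient Extends %Persistent
{
Property Name As %String;
}
```

By contrast, `set ^Patient(123)="foobar"` writes a bare global node that SQL cannot see until a class with a matching SQLStorage mapping is defined over it.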
4) You can map your existing global structures to object definitions for OO/SQL access using SQLStorage (https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cl…). It looks like there isn't a lot in the docs on that, so you should contact Support for more details on how to set it up to map your existing data to definitions which you can query with traditional SQL.
Hope that helps!