go to post Rich Taylor · Apr 21, 2021 Let me elaborate a bit more on Dmitry's suggestion. IRIS for Health has full FHIR server capability built in. Rather than implement the API yourself, and have to keep up with the changing FHIR versions, you could use that. Where the data comes from is a separate issue: for that you can use the interoperability of IRIS to reach out to your external systems and supply the data needed to complete the FHIR request. This stays with your use case of IRIS as an ESB to the rest of your environment. You can still use InterSystems API Manager to provide access to the service and manage that interface.
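Once a FHIR endpoint is enabled, clients just talk to it over REST; nothing FHIR-specific has to be hand-built. A minimal sketch of a client call in ObjectScript, assuming a hypothetical R4 endpoint at /csp/healthshare/demo/fhir/r4 on a local instance:

    // Query the built-in FHIR server's REST endpoint (server, port, and path are hypothetical)
    Set req = ##class(%Net.HttpRequest).%New()
    Set req.Server = "localhost", req.Port = 52773
    Do req.SetHeader("Accept", "application/fhir+json")
    // Search for Patient resources by family name
    Set sc = req.Get("/csp/healthshare/demo/fhir/r4/Patient?family=Smith")
    If $SYSTEM.Status.IsOK(sc) {
        // The response is a FHIR Bundle; read it as a dynamic object
        Set bundle = {}.%FromJSON(req.HttpResponse.Data)
        Write "Matches found: ", bundle.total, !
    }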
go to post Rich Taylor · Mar 12, 2021 Alexey, I feel that this would be counterproductive. Let me explain why. There is a fundamental difference between the purpose of journaling and that of auditing. Journals protect against data loss, and the developers are in a position to determine whether or not a particular update to the database is important to the integrity of the system. Auditing is there to help protect the security of the data. Giving a developer the opportunity to turn off an auditing event deemed important to capture rather defeats that purpose. It might be worth looking into what this external program is; perhaps there is a native API that would accomplish this. You could also take a look at our gateways to see if you could ingest this external functionality to use directly in Cache. I'd also look at our IRIS product to see if a migration to that platform would provide the needed functionality or a better pathway to utilizing the external program. Finally, look at why this external program is called so often; perhaps the calls can be optimized to reduce the audit events if this is a major issue.
go to post Rich Taylor · Feb 19, 2021 Weird, I don't see a log. That message pretty definitively says we have a license issue. I had based my earlier response on the fact that he seemed to be able to get some jobs working, which would imply that the instance was running. That wouldn't happen if a license limit was exceeded on startup; as the message indicates, the instance just shuts down. Mohana, have you been trying this in different environments? To echo Erik, please let us know how you are making out!
go to post Rich Taylor · Feb 18, 2021 The Community edition uses a core-based license. It appears that your instance is running successfully and that some routines do execute; therefore I do not believe that this is a license issue. If you had exceeded the number of allowed cores then the instance would not start. I would look at the routines that are not executing successfully in the background. It is possible that they are using Cache syntax that is no longer supported or has changed names. Try executing these routines in the foreground instead of as a background job and verify that you get the results you expect. If that works, try jobbing them off from the terminal session to see if they will run in the background at all, as in the sketch below. I would also examine the log files to see if you are getting any errors captured from the background execution.
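For example, from a Terminal session (the routine name here is hypothetical):

    // Run the routine in the foreground first and verify the results
    Do ^MyNightlyTask
    // If that works, job it off and check whether it runs in the background at all
    Job ^MyNightlyTask
    Write "Background process: ", $ZCHILD, !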
go to post Rich Taylor · Jan 6, 2021 The best way to approach this would be to engage with your sales engineer. We are always available to help evaluate use cases for our technology and to assist you in understanding the implementation of such interfaces. You can additionally begin your investigation with our documentation and learning resources. Here are a couple of links to get you started:

Enabling Productions to Use Managed File Transfer Services
First Look: Managed File Transfer (MFT) with Interoperability Productions
Managed File Transfer video
go to post Rich Taylor · Jan 4, 2021 I found that installing using "snap" installs an older version that does not support using the results of a previous 'docker login' command. Getting the latest version from https://github.com/mayflower/docker-ls/releases works.
go to post Rich Taylor · Dec 11, 2020 @John Murray Interesting. I had not seen this update; I am fairly sure the earlier versions didn't allow that functionality. That is an interesting option, though I think many will not make the effort to configure this way without a clear need to do so. I will definitely try playing around with it.
go to post Rich Taylor · Dec 11, 2020 Scott, One thing to keep in mind is that all code that you create or edit in VSCode is stored locally on your development machine as well as on the server (when saved and compiled). There is no need to export the code as it is already "exported", just not packaged up into a single file. To the question of how to get this project into production: the "proper" way is to have source control enabled with a proper development workflow (DevOps / continuous integration) such that you would just promote the work to the production stage. The implementation of your workflow should take care of moving the artifacts of your development into production. Given the way you present the question I am going to assume that you don't have source control or a development workflow in place. So to take all the code you have carefully developed and tested in the project and move it to production you can take two approaches. Keep in mind that this will only get the code that you have changed and not other artifacts like configuration globals or settings.

1. The safest (and I use the word safe very loosely here; refer back to my "proper" comment above) is to take the folder/directory where your project currently lives and copy it to a new location. Edit the connection settings, then import and compile the code: right-click the folder where all your code lives ('src') and select "Import and Compile". A scripted alternative is sketched below.
2. Same as #1, except you edit the connection settings in the same folder where you did development. THIS IS DANGEROUS if you forget and start doing more work.

A PROPER SOURCE CONTROL AND WORKFLOW process is really a better way to go. It will take a little effort to configure for your desired flow. Again, I am making an assumption that you are not using Docker containers, so automating the process will be a little more involved. Tools like Chef and Puppet will help; you will need to research what would work best for you. As I said, this will take some effort to set up, but in the end it will help you in time and consistency of process. Take a look at this article series on the community, which may help: https://community.intersystems.com/post/continuous-delivery-your-intersy...
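If you prefer to script that "import and compile" step rather than use the right-click menu, a minimal sketch from a Terminal session on the target instance (the source path is hypothetical):

    // Recursively import and compile all class files under the project's src folder
    // "ck" = compile each item, keep generated source
    Set sc = $SYSTEM.OBJ.ImportDir("/home/me/myproject/src", "*.cls", "ck", .errors, 1)
    If '$SYSTEM.Status.IsOK(sc) Do $SYSTEM.Status.DisplayError(sc)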
go to post Rich Taylor · Nov 18, 2020 Then that wonderful Ian Fleming intro gets reduced to "vodka martini, shaken not stirred"
go to post Rich Taylor · Jun 4, 2020 Great write up! Thanks, saved me lots of googling and trying methods that don't really work :)
go to post Rich Taylor · Jun 1, 2020 @Armin Gayl The answer depends a lot on background. If your team is experienced with Cache then you are probably already comfortable with Studio. Studio is stable and works well with InterSystems products. If you are primarily a Windows environment, working primarily with Cache/IRIS classes and ObjectScript, staying with this IDE is fine. On the other hand, if any of the following is true I would recommend VSCode with the ObjectScript plug-in:

- you work across platforms (Windows, Linux, Mac), OR
- you want to easily integrate source control, OR
- you need to program in multiple languages and/or with different components (Docker, Angular CLI, ...), OR
- you need to attract new talent.

Why?

- multi-platform (I work with a Linux desktop, so this was important to me)
- easy source control integrations
- much more resource-friendly than Eclipse, which I found to be a resource hog
- plug-ins just seem to work better; for example, the Docker plugin(s) on Eclipse were a disaster, while the one I have on VSCode is great
- well accepted in the market

Let me delve into point 4 above and that last bullet point, as I think this gets overlooked or considered unimportant to the IDE question because "they have to learn a new language (ObjectScript) anyway". My opinion is that the issue of learning a new language is really overstated. Any programmer today can't even get out of bed without knowing several development languages. The real problem is HOW you work with those languages; in other words, the IDE. If you can bring in someone who is already familiar with the IDE and the general workflow, which is similar regardless of language, then you remove one barrier to entry. Now it becomes learning just another scripting language, which is a process that new developers are used to. VSCode is quite popular and is trending up. I participated in a Hackathon with InterSystems several months ago. In the end there were 70 teams that submitted projects; figure an average of three people per team as a rough estimate. Every single developer we interacted with was using VSCode. This, along with some great templates from @Evgeny Shvarov, made it easy for them to get working with IRIS. In fact, something like 12 teams used IRIS, including the second place team (the first place team's solution was related to a process where they were forbidden by law from storing any data). So that's my $0.02. I would stay with Studio if that is your comfort zone and you work primarily within InterSystems technology. If not, go with Visual Studio Code. I would not consider Eclipse, for many of the reasons others have stated and because it is really resource heavy.
go to post Rich Taylor · May 7, 2020 @Robert Cemper Whatever gets the process to work right! I would consider WSL2 also. It's probably just me, but conceptually I had a problem with running a virtual environment (a VM) so that I could run another virtual environment (a container). WSL was not functioning well enough at the time, so I went all in on Ubuntu as my native OS. Good luck!
go to post Rich Taylor · May 7, 2020 Erik, Nice! I was not aware that the create volume option would allow Durable %SYS to function. Thanks for that.
go to post Rich Taylor · May 7, 2020 Robert, The cause is the fact that Docker Desktop (Docker for Windows) is a bit misleading. By default it is not really using Windows containers at all, though it can. From what I understand, while it is getting better, true Windows containers are still a bit problematic. Also, all our images are Ubuntu based anyway. What is really happening is that there is a small Linux (Moby) VM running under Docker Desktop. So when you share volumes from the Windows file system to the container you are going through two transitions: one from the container to the host Linux, then from the host Linux out to Windows. While this works for plain Docker volumes, there are permission issues when trying to use Durable %SYS, as you surmised. I have heard of people getting into the Moby Linux environment and messing with the permissions, but this seems too much trouble and too fragile for my tastes. You do have some options though:

- A better option might be to use WSL2 (Windows Subsystem for Linux). This will enable you to run an Ubuntu environment within Windows. You do need to be on WSL2 and not the first version, which was too limited. Here are a couple of links:
  https://code.visualstudio.com/blogs/2020/03/02/docker-in-wsl2
  https://www.hanselman.com/blog/HowToSetUpDockerWithinWindowsSystemForLinuxWSL2OnWindows10.aspx
- You could just run a Linux VM.
- You could go with the nuclear option and just switch to Linux as a desktop. I took this route over a year ago, before WSL2 came out, and have not looked back.

Hope this helps.
go to post Rich Taylor · Apr 21, 2020 Kevin, The best option is to work with IRIS for Health Community Edition, which is free for development and education. You can get it from Docker as a container to run on your own system, or on AWS, Azure, or GCP if you want to work in the cloud. AWS, at least, has a free tier that is good for 750 hours a month for up to a year. This is more than adequate for education and simple development; I used it for demos for a time.

https://hub.docker.com/_/intersystems-iris-for-health
https://aws.amazon.com/marketplace/pp/B07N87JLMW?qid=1587469562959&sr=0-3&ref_=srh_res_product_title
https://azuremarketplace.microsoft.com/en-us/marketplace/apps/intersystems.intersystems-iris-health-community?tab=Overview
https://console.cloud.google.com/marketplace/details/intersystems-launcher/intersystems-iris-health-community-edition?filter=category:database&filter=price:free&id=31edacf5-553a-4762-9efc-6a4272c5a13c&pli=1

If you follow the 'Learning' link in the top bar you will find many education resources, including some quick start topics on IRIS. And, of course, you can ask questions here.
go to post Rich Taylor · Apr 16, 2020 No problem. The newer JSON handling (dynamic objects) is quite flexible. When you move to IRIS you also get the %JSON.Adaptor class. Add that to the inheritance of any class and you extend that flexibility to whole object structures.
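A minimal sketch of both (the class and property names are hypothetical):

    // Dynamic objects: build and serialize JSON on the fly
    Set obj = {"name":"Kevin", "active":true}
    Write obj.%ToJSON(),!

    // %JSON.Adaptor (IRIS): add it to a class definition...
    Class Demo.Person Extends (%RegisteredObject, %JSON.Adaptor)
    {
    Property Name As %String;
    }

    // ...and the whole object can round-trip JSON
    Set p = ##class(Demo.Person).%New()
    Set p.Name = "Kevin"
    Do p.%JSONExportToString(.json)
    Write json,!  // -> {"Name":"Kevin"}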
go to post Rich Taylor · Feb 21, 2020 Damiano, Keep in mind that Studio is VERY namespace-centric. A single running instance of CStudio can only talk to a single namespace at a time, and even running multiple copies of CStudio can run into issues related to this and to how projects track the information. As Dmitriy Maslennikov has indicated, you can look at Visual Studio Code with the VSCode-ObjectScript plug-in, as long as you are on Cache 2016.2+ or IRIS. You can also use the Atelier plugin for Eclipse (Photon version only), which has much the same capabilities. One last thought: why do you have two namespaces? If this is just to separate the code from the application data then you really don't need two namespaces; you need to configure a single namespace that references the two databases, one for data and one for code. I would review the documentation on namespaces to be sure you are on the right track. https://cedocs.intersystems.com/ens201813/csp/docbook/DocBook.UI.Page.cls?KEY=GSA_config#GSA_config_namespace I would also encourage you to engage with your sales engineer to review your architecture and development direction.
go to post Rich Taylor · Feb 20, 2020 I don't think there is any option to do this. By executing this again from the command line you are indicating that you want to open another copy of CStudio. Why not just use File (menu) -> Open in CStudio and navigate to the class you want?
go to post Rich Taylor · Aug 5, 2019 If you used package mapping you may have forgotten to map the global too. Examine the class definition to find the global name to map. If you map the class but not the global, you get the code of the class, but the storage stays local. This allows the sharing of definitions across namespaces without sharing the data. Add a global mapping to share the data too.
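For a class using default storage, the storage definition names the globals to map; a class MyApp.Person (hypothetical) will typically keep its data in ^MyApp.PersonD and its indices in ^MyApp.PersonI. A sketch of adding the mappings programmatically with the Config.MapGlobals API, assuming hypothetical namespace ("APP") and database ("DATADB") names:

    // Run in %SYS: map the data and index globals into the APP namespace
    ZNspace "%SYS"
    Set props("Database") = "DATADB"
    For gbl = "MyApp.PersonD", "MyApp.PersonI" {
        Set sc = ##class(Config.MapGlobals).Create("APP", gbl, .props)
        If '$SYSTEM.Status.IsOK(sc) Do $SYSTEM.Status.DisplayError(sc)
    }

The same mappings can be added through the Management Portal on the namespace's Global Mappings page.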