We are pleased to invite you to the InterSystems UK Developer Community Meetup on the 17th of October!
The UK Developer Community Meetup is an informal gathering of developers, engineers, and DevOps practitioners to discuss successes and lessons learned from building and supporting solutions with InterSystems products.
The holiday season is fast approaching, and it's time to get ready! Join the Advent of Code 2024 with InterSystems and take on our ObjectScript challenge!
I want to create an interface-specific purge job. Please let me know if there are any holes in my approach. I realize that an interface that went from HospitalAService to HospitalARouter to PracticeBOperation would require two separate executes in my example below, but I want that granularity, as there are some intermediate steps in our workflows for which we don't need to retain messages.
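Roughly, this is the shape of the example (the method and variable names are mine; it deletes the body first, then the header, and does not yet touch search-table entries or other per-message data):

ClassMethod PurgeInterface(source As %String, target As %String, daysToKeep As %Integer = 30) As %Status
{
    // Date-level cutoff in ODBC format; Ens.MessageHeader.TimeCreated is stored in UTC
    Set cutoff = $ZDateTime($ZTimestamp - daysToKeep, 3)
    Set rs = ##class(%SQL.Statement).%ExecDirect(, "SELECT ID FROM Ens.MessageHeader WHERE SourceConfigName = ? AND TargetConfigName = ? AND TimeCreated < ?", source, target, cutoff)
    While rs.%Next() {
        Set id = rs.%Get("ID")
        Set hdr = ##class(Ens.MessageHeader).%OpenId(id)
        Continue:'$IsObject(hdr)
        // Delete the message body first, then the header itself
        If (hdr.MessageBodyClassName '= "") && (hdr.MessageBodyId '= "") {
            Do $ClassMethod(hdr.MessageBodyClassName, "%DeleteId", hdr.MessageBodyId)
        }
        Set hdr = ""
        Do ##class(Ens.MessageHeader).%DeleteId(id)
    }
    Quit $$$OK
}

So HospitalAService -> HospitalARouter and HospitalARouter -> PracticeBOperation would each be one execute.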
I would like to start a discussion regarding Caché Objects and Caché SQL.
It is my understanding that the creators of Caché Objects see Caché SQL as the reporting arm of Objects and as such SQL is essential to Caché Objects.
I once met a Caché Objects programmer who was writing code to $Order through the globals because they thought Caché SQL was too slow and inefficient. I attempted to convince them otherwise.
So, what say you? Is SQL essential to Caché Objects?
I am going to develop an ASP.NET Core application, and I would like to use the IRIS Entity Framework in it. I searched but couldn't find an IRIS Entity Framework provider for .NET Core. Please kindly help me overcome this issue.
I have a class that has a property called Tags (like DescriptiveWords, but tags), where multiple tags are possible. I am trying to decide between a list of objects and an array of objects.
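To make it concrete, the two declarations I am weighing look like this (Demo.Tag is a placeholder class):

Class Demo.Article Extends %Persistent
{
// Ordered collection, addressed by position: ..Tags.GetAt(1), ..Tags.Insert(tag)
Property Tags As list Of Demo.Tag;

// Keyed collection, addressed by a subscript of my choosing: ..Tags.SetAt(tag, "colour")
// Property Tags As array Of Demo.Tag;
}

A list preserves insertion order, while an array makes me pick a meaningful key for each element.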
We are receiving MDM documents from one data source and need to convert them to ORU messages to send to the receiving data source.
Can HealthShare (HS) components make this conversion, or do we need to use an Ensemble mapping to convert the MDM to an ORU message?
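If a DTL turns out to be the way, I imagine the skeleton would look something like this (the DocTypes and copied fields are illustrative only, not a working mapping):

Class Demo.MDMToORU Extends Ens.DataTransformDTL [ DependsOn = EnsLib.HL7.Message ]
{
Parameter IGNOREMISSINGSOURCE = 1;

XData DTL [ XMLNamespace = "http://www.intersystems.com/dtl" ]
{
<transform sourceClass='EnsLib.HL7.Message' targetClass='EnsLib.HL7.Message' sourceDocType='2.5:MDM_T02' targetDocType='2.5:ORU_R01' create='new' language='objectscript'>
<assign value='source.{MSH}' property='target.{MSH}' action='set' />
<assign value='"ORU"' property='target.{MSH:MessageType.MessageCode}' action='set' />
<assign value='"R01"' property='target.{MSH:MessageType.TriggerEvent}' action='set' />
</transform>
}
}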
Ensemble 2015. Working on a way to send NACK'd HL7 messages to a flat file for external review/troubleshooting. (Similar to the way BadMessageHandler deals with validation errors.)
I think I have the Alert piece down, but need assistance with the exact syntax to do an SQL query in the DTL (or a custom function) to pull the HL7 message Raw Content into the Alert, based on the SessionID.
(Also, anything special to write alerts to the File Operation?)
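My rough attempt at the custom function so far, assuming the alert carries the SessionId and the original body is an EnsLib.HL7.Message (the method name is mine):

ClassMethod RawContentBySession(sessionId As %String) As %String
{
    Set id = ""
    // First HL7 message body recorded under this session
    &sql(SELECT TOP 1 MessageBodyId INTO :id
         FROM Ens.MessageHeader
         WHERE SessionId = :sessionId
           AND MessageBodyClassName = 'EnsLib.HL7.Message'
         ORDER BY ID)
    Quit:(SQLCODE'=0)||(id="") ""
    Set msg = ##class(EnsLib.HL7.Message).%OpenId(id)
    Quit $Select($IsObject(msg):msg.RawContent, 1:"")
}

(I gather RawContent can truncate very large messages, in which case streaming the message out via OutputToLibraryStream would be safer.)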
Would you like to be sure your data transforms work as expected with a single command? And what about writing unit tests for your data transforms in a quick and simple way?
When talking about interoperability, there are usually a lot of data transformations involved. Those transformations convert data between different systems or applications in your code, so they are doing a very important job.
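As a first taste of where this is going, a unit test for a transform can be as small as this sketch (the transform class, assertions, and sample file path are placeholders):

Class Demo.DTLTest Extends %UnitTest.TestCase
{
Method TestADTToORU()
{
    // Load a sample source message from disk
    Set source = ##class(EnsLib.HL7.Message).ImportFromFile("/data/samples/adt.hl7", .sc)
    Do $$$AssertStatusOK(sc, "Sample message loaded")

    // Run the transform and check the output
    Set sc = ##class(Demo.ADTToORU).Transform(source, .target)
    Do $$$AssertStatusOK(sc, "Transform completed")
    Do $$$AssertEquals(target.GetValueAt("1:9.1"), "ORU", "Message type is ORU")
}
}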
Just wondering if there's a quick way to get the DocType, without knowing it in advance, from a message body by building a string from the VersionID, MessageType, and TriggerEvent fields? (Similar to how it might be done dynamically in Rhapsody)
Can this then be used to set the DocType for a source message?
If this is not the best practice what is a suitable alternative approach?
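Here is the rough shape of what I am picturing, using numeric virtual-property paths so no DocType is needed up front (I know some trigger events share another event's structure, so MSH:9.3 may matter in a production-grade version):

ClassMethod ResolveDocType(msg As EnsLib.HL7.Message) As %String
{
    Set type = msg.GetValueAt("1:9.1")   // MessageType, e.g. ADT
    Set event = msg.GetValueAt("1:9.2")  // TriggerEvent, e.g. A01
    Set vers = msg.GetValueAt("1:12")    // VersionID, e.g. 2.5
    Quit vers_":"_type_"_"_event
}

If I remember rightly, Set typed = msg.PokeDocType(..ResolveDocType(msg)) would then return a copy of the message with that DocType applied.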
Is there a way to get the list of Business Services from a command-line call? We are trying to see if we can automate bringing down our inbound Business Services during a failover.
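Something along these lines, wrapped in an iris session call so our failover tooling can run it, is what we are hoping works (the production name is a placeholder; I believe Ens.Director.EnableConfigItem does the enable/disable):

ClassMethod DisableInboundServices(productionName As %String)
{
    Set production = ##class(Ens.Config.Production).%OpenId(productionName)
    Quit:'$IsObject(production)
    For i=1:1:production.Items.Count() {
        Set item = production.Items.GetAt(i)
        // BusinessType() should return "service" for Business Services
        If item.BusinessType() = "service" {
            Write "Disabling ", item.Name, !
            Do ##class(Ens.Director).EnableConfigItem(item.Name, 0, 1)
        }
    }
}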
Our team is looking for a way to export all of our Caché SQL tables into Microsoft SQL Server. I have only found a method to export one table at a time into an ASCII file. We have over 170 tables, so this would be very tedious and time-consuming. Is there a way to export directly from Caché to SQL Server? Alternatively, is it possible to export the entire database in a single shot, or even multiple tables to text files?
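The closest I have come up with myself is a loop over INFORMATION_SCHEMA that dumps each table to CSV, which SQL Server could then bulk-load. It is naive (no quoting or escaping of values), so treat it as a starting point:

ClassMethod ExportAllTables(dir As %String)
{
    // Every base table visible in this namespace
    Set tables = ##class(%SQL.Statement).%ExecDirect(, "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'")
    While tables.%Next() {
        Set fullName = tables.%Get("TABLE_SCHEMA")_"."_tables.%Get("TABLE_NAME")
        Set rows = ##class(%SQL.Statement).%ExecDirect(, "SELECT * FROM "_fullName)
        Set file = ##class(%Stream.FileCharacter).%New()
        Do file.LinkToFile(dir_$Translate(fullName, ".", "_")_".csv")
        // Header row taken from the result-set metadata
        Set meta = rows.%GetMetadata(), cols = meta.columns.Count()
        For c=1:1:cols { Do file.Write($Select(c>1:",", 1:"")_meta.columns.GetAt(c).colName) }
        Do file.WriteLine("")
        // Data rows, values written as-is
        While rows.%Next() {
            For c=1:1:cols { Do file.Write($Select(c>1:",", 1:"")_rows.%GetData(c)) }
            Do file.WriteLine("")
        }
        Do file.%Save()
    }
}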
We are not using GitHub or any source control so far. We have Ensemble HL7 interfaces (business services, processes, operations) available in production. Now we want to deploy these interfaces to a brand-new cloud server with an IRIS instance.
On the current production server we have Studio access, but on the new cloud IRIS server we only have VS Code access.
I have exported all the classes from the current production using Studio, and I have the exported XML file with me.
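My current plan is to copy the XML to the new server and load and compile it from an IRIS terminal (or the VS Code ObjectScript console); the path below is a placeholder:

Set sc = $System.OBJ.Load("/home/irisowner/production-export.xml", "ck")
Write $System.Status.GetErrorText(sc)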
This code snippet sends an XML request to a server and saves the response to a file. The class method "test" runs the code:
Class objectscript.postXML
{
classmethod test() {
    // Build the HTTP request and describe the XML payload
    Set HTTPRequest = ##class(%Net.HttpRequest).%New()
    Set HTTPRequest.ContentType = "text/xml"
    Set HTTPRequest.NoDefaultContentCharset = 1
    Set HTTPRequest.Location = "ITOMCZ"
    Set HTTPRequest.Server = "wph.foactive.com"

    // Strip headers the target server does not expect
    Do HTTPRequest.RemoveHeader("User-Agent")
    Do HTTPRequest.RemoveHeader("Accept-Encoding")
    Do HTTPRequest.RemoveHeader("Connection")
    Do HTTPRequest.SetHeader("Expect","100-continue")

    // Load the request body from a file on disk
    Set RequestXML = ##class(%Library.File).%New("c:\test.xml")
    Do RequestXML.Open("RS")
    Do HTTPRequest.EntityBody.CopyFrom(RequestXML)
    Do RequestXML.%Close()

    // Send the POST and dump the request and response for inspection
    Do HTTPRequest.Post(HTTPRequest.Location)
    Do $System.OBJ.Dump(HTTPRequest)
    Do $System.OBJ.Dump(HTTPRequest.HttpResponse)
    Write HTTPRequest.HttpResponse.Data.Size
    Write HTTPRequest.ContentLength

    // Save the response body to a file whose extension matches the Content-Type
    Set ResponseStream = ##class(%Stream.FileBinary).%New()
    // Second part is typically the file extension, i.e.: application/pdf -> pdf
    Set FileType = $Piece(HTTPRequest.HttpResponse.GetHeader("CONTENT-TYPE"),"/",2)
    Set ResponseStream.Filename = "C:\test."_FileType
    Write ResponseStream.CopyFrom(HTTPRequest.HttpResponse.Data)
    Write ResponseStream.%Save()
    Do ResponseStream.%Close()
}
}
NewBie's Corner, Session 28: Various Methods to Traverse a Global
Welcome to NewBie's Corner, a weekly or biweekly post covering basic Caché material.
Judging from the number of responses to Session 27, Traversing a Global, developers are passionate about their methods. I am not here to judge the merit of the various methods.
Over the next few pages I will demonstrate a number of methods to traverse a Global. If you don't already have a favorite, they may help you pick one.
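As a first taste, here is the workhorse that most of the methods build on: a $Order loop over the first subscript level of a global (^Names is just a sample global):

// Visit every first-level subscript of ^Names in collation order
Set key = ""
For {
    Set key = $Order(^Names(key))
    Quit:key=""
    Write key, " = ", $Get(^Names(key)), !
}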
Hi All,
How can I get only the folders (including sub-folders) from a particular drive using Caché?
We need to retrieve only the folders from a given drive using Caché.
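My rough attempt so far, assuming the FileSet query of %File works the way I remember (Type = "D" marks a directory); it recurses into each sub-folder:

ClassMethod ListFolders(dir As %String, indent As %String = "")
{
    // includedirs = 1 so directories come back alongside files
    Set rs = ##class(%File).FileSetFunc(dir, "*", , 1)
    While rs.%Next() {
        If rs.%Get("Type") = "D" {
            Write indent, rs.%Get("Name"), !
            Do ..ListFolders(rs.%Get("Name"), indent_"  ")
        }
    }
}

Calling it as Do ##class(Demo.Util).ListFolders("C:\") should then print the folder tree of the drive.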
When deploying IRIS in a container, what are the recommendations for permitting users terminal access? There is docker exec -it, but that does not work from a user's workstation, where (in the old deployment model) they would connect to the server over SSH using PuTTY and open a terminal session from there.
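One workaround we are considering is to keep the old SSH habit, but point it at the Docker host and wrap the terminal in docker exec (the container and instance names below are placeholders):

# From the user's workstation, via the Docker host:
ssh user@docker-host
docker exec -it iris iris session IRIS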
If you've got more than one developer on a project, do you each work in your own namespace? Or do you all use a common namespace?
Through my work at George James Software I have encountered many different Caché and Ensemble development setups. At risk of over-generalizing, the older and more established users of InterSystems technologies seem more likely to have all their developers working in a common namespace, whereas the newer 'converts' tend to favour giving each developer their own namespace.
I am looking for any pointers on how InterSystems IRIS for Health can monitor a filesystem/folder where users or applications drop CSV files via FTP, and load each file into the IRIS DB. I understand that I will need to create a record map for the CSV files. I am looking for any configuration references on how to process files using the file inbound adapter, with the intent to pick up the CSV files as they are dropped in the target location, pass them to a business process, and ingest them into the IRIS database.
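From the documentation I believe the production entry boils down to something like the sketch below, pairing EnsLib.RecordMap.Service.FileService (which uses the file inbound adapter) with a record map and a target process; every name and path here is a placeholder, so I would appreciate confirmation:

Class Demo.CSVProduction Extends Ens.Production
{
XData ProductionDefinition
{
<Production Name="Demo.CSVProduction">
  <Item Name="CSV.FileService" ClassName="EnsLib.RecordMap.Service.FileService" Enabled="true">
    <Setting Target="Adapter" Name="FilePath">/data/csv/in/</Setting>
    <Setting Target="Adapter" Name="FileSpec">*.csv</Setting>
    <Setting Target="Adapter" Name="ArchivePath">/data/csv/archive/</Setting>
    <Setting Target="Host" Name="RecordMap">Demo.PatientCSV</Setting>
    <Setting Target="Host" Name="TargetConfigNames">CSV.LoaderProcess</Setting>
  </Item>
</Production>
}
}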