go to post Julian Matthews · Jan 10, 2019 You might be hitting a hard limit on the performance of the hardware you're using. Are you able to add more resources and try again?
go to post Julian Matthews · Jan 9, 2019 What type of business service are you using? If you are using a single job on the inbound, my guess is that you're hitting a limit on how fast the adapter can handle each message (in your case, around 15ms per message). You could look at increasing the pool size and the jobs per connection if you're not worried about the order in which the messages are received into your process.
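For illustration only, a minimal sketch of how such an item might look in the production class definition, assuming a hypothetical HL7 TCP service named "HL7.In" (the item name, class, and values here are placeholders, not taken from your production):

Class Demo.Production Extends Ens.Production
{

XData ProductionDefinition
{
<Production Name="Demo.Production">
  <!-- PoolSize > 1 runs several jobs for the item in parallel, so FIFO ordering is no longer guaranteed -->
  <Item Name="HL7.In" ClassName="EnsLib.HL7.Service.TCPService" PoolSize="4" Enabled="true">
    <!-- JobPerConnection is the adapter setting that spawns a separate job per inbound connection -->
    <Setting Target="Adapter" Name="JobPerConnection">1</Setting>
  </Item>
</Production>
}

}

The same values can also be changed from the item's Settings tab in the Management Portal.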
go to post Julian Matthews · Dec 27, 2018 Hi Alexandr. If you are looking to run a task at specific times, you could create a new task class which extends %SYS.Task.Definition so that it becomes selectable as an option in the task manager. For example, I have a folder from which I need to periodically delete files older than x days. To achieve this, I have a class that looks like this:

Class DEV.Schedule.Purge Extends %SYS.Task.Definition
{

Parameter TaskName = "Purge Sent Folder";

Property Directory As %String;

Property Daystokeep As %Integer(VALUELIST = ",5,10,15,20,25,30") [ InitialExpression = "30" ];

Method OnTask() As %Status
{
    Set tSC = ..PurgeSentFolder(..Directory, ..Daystokeep, "txt")
    Quit tSC
}

Method PurgeSentFolder(Directory As %String, DaysToKeep As %Integer, Extension As %String) As %Status
{
    Set tSC = $$$OK
    // Calculate the oldest date on or after which files should be kept
    Set BeforeThisDate = $ZDT($H - DaysToKeep _ ",0", 3)
    // Gather the list of matching files in the specified directory
    Set rs = ##class(%ResultSet).%New("%File:FileSet")
    Set ext = "*." _ Extension
    Do rs.Execute(Directory, ext, "DateModified")
    // Step through the files in DateModified order
    While rs.Next() {
        Set DateModified = rs.Get("DateModified")
        If BeforeThisDate ] DateModified {
            // File is older than the cut-off, so delete it
            Set Name = rs.Get("Name")
            Do ##class(%File).Delete(Name)
        }
        // Stop once we reach files modified on or after the cut-off date
        If DateModified ] BeforeThisDate Quit
    }
    Quit tSC
}

}

Then I created a new task in the scheduler, selected the namespace where the new class exists, and filled in the properties and times I want the task to run.
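If you want to sanity-check the purge logic from Terminal before scheduling it, the same method can be exercised directly (the directory below is just a placeholder path):

Set task = ##class(DEV.Schedule.Purge).%New()
Set task.Directory = "C:\Sent\"
Set task.Daystokeep = 5
Write task.OnTask()

A return value of 1 means the purge ran without error.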
go to post Julian Matthews · Dec 24, 2018 Hi Eric. My first check would be the console log for that instance to see if there's anything wobbling in the background, specifically any entries around the time the monitor thinks it has gone down. Failing that, it's probably worth going to the WRC. The last thing you need this close to Christmas is the Primary dropping just when you need the Mirror to be working.
go to post Julian Matthews · Dec 24, 2018 If you go to the Management Portal for the "down" mirror, are there any errors that might point to the issue? I recently saw this happen where the mirror had run out of space to store the journal files, so it stopped functioning and was showing as "down".
go to post Julian Matthews · Dec 11, 2018 Take a look at this documentation. It goes into a lot of detail with some useful diagrams. *edit* There is also this useful Mirroring 101 forum post.
go to post Julian Matthews · Nov 29, 2018 Hi Stephen. Are you able to select the specific queue from the Queues page and press the Abort All button, or does it return an error?
go to post Julian Matthews · Nov 15, 2018 Yes, a new schema can do this along with a transformation. If you have an existing schema, it's probably best to clone it and then edit the clone to speed things up.
go to post Julian Matthews · Nov 2, 2018 So I found that it is possible to save single messages using the "HL7 V2.x Message Viewer", which might not be suitable if you're looking to export lots of messages. One option could be to add a new HL7 file-out operation, search for the messages you wish to "export" from the relevant Router, and then resend them to the new operation, which can be selected as the target from the Resend Messages page.
go to post Julian Matthews · Oct 18, 2018 Sorry John, I hadn't had my coffee when I read your post. When you look at the header info of the first message within the Trace, does its Time Processed come before or after the Time Created of message 2?
go to post Julian Matthews · Oct 1, 2018 Hi all, I have answered my own question. For each number, I will need to create a new HS.SDA3.PatientNumber, set the required information, and then insert each HS.SDA3.PatientNumber into the HS.SDA3.PatientNumbers list that exists within the HS.SDA3.Patient object. For the benefit of anyone else that stumbles across this post (and myself when I forget this in a few weeks' time and end up finding my own post), these are the steps I followed to try it out in Terminal:

Set Patient = ##class(HS.SDA3.Patient).%New()
Set PatNum1 = ##class(HS.SDA3.PatientNumber).%New()
Set PatNum2 = ##class(HS.SDA3.PatientNumber).%New()
Set PatNum1.Number = "123456"
Set PatNum1.NumberType = "MRN"
Set PatNum2.Number = "9999991234"
Set PatNum2.NumberType = "NHS"
Do Patient.PatientNumbers.Insert(PatNum1)
Do Patient.PatientNumbers.Insert(PatNum2)
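To double-check the list afterwards, you can loop over it in the same Terminal session (a quick sketch using the objects created above):

For i=1:1:Patient.PatientNumbers.Count() { Write Patient.PatientNumbers.GetAt(i).NumberType," - ",Patient.PatientNumbers.GetAt(i).Number,! }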
go to post Julian Matthews · Sep 21, 2018 Hi Akio. Generally speaking, the password for the "_SYSTEM" account is set by the user during the install process, so I don't think there will be a default.
go to post Julian Matthews · Jul 27, 2018 Thank you Vitaliy, not sure how I missed that in the documentation!
go to post Julian Matthews · Jul 17, 2018 Is the Business Process custom? If so, it's possible that a bit of bad code is returning an error state and then continuing to process the message as expected. It might help if you provide some more detail on the BP itself.
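For illustration, the sort of pattern I'd be looking for is sketched below (the target name and logic are made up, not taken from your production): a call fails and the error is logged, so it shows up against the message, but the method still returns a good status, so the message carries on through the process as if nothing went wrong.

Method OnRequest(pRequest As Ens.Request, Output pResponse As Ens.Response) As %Status
{
    // Call a downstream target (the name is a placeholder)
    Set tSC = ..SendRequestSync("Some.Operation", pRequest, .pResponse)
    If $$$ISERR(tSC) {
        // The error status gets logged, so it is visible in the Event Log and trace...
        $$$LOGSTATUS(tSC)
    }
    // ...but the method quits with a good status anyway, so the process
    // continues handling the message as if nothing had failed
    Quit $$$OK
}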
go to post Julian Matthews · Jun 15, 2018 Hi Guilherme. I think your best starting point will be providing your system specifications, the OS you're running Studio on, and the version of Studio/Cache you are running. Depending on the issue, it could be any number of things causing your problems.
go to post Julian Matthews · Jun 15, 2018 You could run the compile command you would normally use in Terminal from within Studio via the Output window; however, the only "speed boost" would be the time saved by not launching and logging in to Terminal. If you're suffering from really slow compile times, it might just be that your namespace is huge, or a sign that your hardware needs upgrading. Get in touch with either the WRC or your sales engineer, and they'll happily help out.
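For reference, the kind of command I mean, typed into Terminal or into Studio's Output window (the package name is just a placeholder):

Do $System.OBJ.CompileAll("ck")
Do $System.OBJ.CompilePackage("MyPackage","ck")

The first recompiles the whole namespace; the second limits the compile to a single package, which is usually the quicker option while developing.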
go to post Julian Matthews · Jun 4, 2018 Thanks for the response. It sounds like I should be fine with the machine I'm running (Win7, 120GB SSD, 8GB RAM, dual-core i5 CPU). My biggest hits are at startup, but once up and running it's pretty snappy. I should probably try to be more patient!
go to post Julian Matthews · May 29, 2018 The accepted answer would probably be your best shot. Say, for example, you wanted a count of all messages that have come from a service called "TEST Inbound"; you could use the SQL query option (System Explorer > SQL) to run the following:

SELECT COUNT(*)
FROM Ens.MessageHeader
WHERE SourceConfigName = 'TEST Inbound'

If you wanted to put a date range in as well (which is advisable if what you're searching is a high-throughput system and your retention is large):

SELECT COUNT(*) AS Total
FROM Ens.MessageHeader
WHERE SourceConfigName = 'TEST Inbound'
  AND TimeCreated >= '2018-04-30 00:00:00'
  AND TimeCreated <= '2018-04-30 23:59:59'