go to post Julian Matthews · Mar 21, 2023 Hey Yuri. The users are held within the SQL table "Security.Users" in the %SYS namespace, so you could use embedded SQL to return the information. However, as you're unlikely to be executing your code directly from the %SYS namespace, I'd suggest creating a function that you pass the email address and that returns the username. Something like:

Class Demo.Utils.General.Users
{

ClassMethod UserFromEmail(Email As %String, Output Username As %String) As %Status
{
    //Initially set this to null, as we want to return it empty when we get no results
    Set Username = ""
    //Hold the current namespace in a variable so we can switch back once the SQL has been run
    Set CurrNamespace = $NAMESPACE
    //Change namespace to %SYS
    Set $NAMESPACE = "%SYS"
    //Run query to get the Username based on the email address - note the use of the UPPER function to remove issues with case sensitivity
    &SQL(SELECT ID INTO :Username FROM Security.Users WHERE UPPER(EmailAddress) = UPPER(:Email))
    //Set namespace back to the namespace the function was run from
    Set $NAMESPACE = CurrNamespace
    //Evaluate SQLCODE for result
    //Less than 0 is an error
    If SQLCODE < 0 {
        WRITE "SQLCODE="_SQLCODE
        QUIT 0
    }
    //Greater than 0 can really only mean code 100, which is no results found
    If SQLCODE > 0 {
        QUIT 1 //No result found
    }
    Else {
        QUIT 1 //Result found
    }
}

}

DEMO> WRITE ##class(Demo.Utils.General.Users).UserFromEmail("YuriMarx@ACME.XYZ",.Output)
1
DEMO> WRITE Output
YMARX

This is by no means perfect as I have thrown it together for the example - please forgive the messy if/else's!
go to post Julian Matthews · Mar 14, 2023 There are a few "gotchas" when it comes to character encoding, but the key thing in your case is understanding the character encoding being used by the receiving system. This should be specified in the receiving system's specification, but many times it's not. If I had to guess, it's most likely that the receiving system is using UTF-8, simply because latin1/ISO-8859-1 encodes the pound symbol as hex "A3" whereas UTF-8 encodes it as "C2 A3". As a solitary "A3" is not valid UTF-8, there's nothing to print, which is why you get the ? instead. I'm sure there are other character sets where this can happen, but I would start there.
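A quick way to see the difference from a terminal session (a minimal sketch, assuming the Latin1 and UTF8 translation tables are available on your instance):

//Encode the pound symbol with each character set
Set latin1 = $ZCONVERT("£","O","Latin1")
Set utf8 = $ZCONVERT("£","O","UTF8")
//Dump the raw bytes: latin1 gives the single byte A3, utf8 gives the two bytes C2 A3
ZZDUMP latin1
ZZDUMP utf8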
go to post Julian Matthews · Nov 16, 2022 Hey Andy. When you're copying the router from your production, it will continue to reference the same rules class in its settings. After you have copied the router, you will want to hit the magnifying glass next to the rule name and then use the "Save As" option in the Rule Editor. Once you have done this, go back to your production and change the rule assigned to your router via the dropdown to select the new rule. Just make sure you create a new alias for your rule on the "General" tab of the rule page.
go to post Julian Matthews · Nov 9, 2022 Hey William. I'm pretty sure you just need to query the table Ens.MessageHeader. This should give you the process by way of the column SourceConfigName, and the status of the discarded messages. For example:

SELECT *
FROM Ens.MessageHeader
WHERE SourceConfigName = 'ProcessNameHere'
AND Status = 'Discarded'

You may want to consider including a time range depending on the size of the underlying database.
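For example, something along these lines (the TimeCreated range is just an illustration - adjust it to suit):

SELECT ID, SourceConfigName, Status, TimeCreated
FROM Ens.MessageHeader
WHERE SourceConfigName = 'ProcessNameHere'
AND Status = 'Discarded'
AND TimeCreated >= '2022-11-01 00:00:00'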
go to post Julian Matthews · Oct 28, 2022 I ended up extending EnsLib.HL7.Operation.TCPOperation and overriding the OnGetReplyAction method. From there, I copied the default method's content but prepended it with a check that does the following:
- Checks pResponse is an object
- Loops through the HL7 message in search of an ERR segment
- Checks the value of ERR:3.9 against a lookup table
If any of the above fail, the response message is passed to the original ReplyCodeAction logic; otherwise it quits with the result from the lookup table. Using a lookup table makes adding/amending error actions accessible to the wider team rather than burying them within the ObjectScript, and falling back to the original ReplyCodeAction logic keeps the operation from breaking on an unexpected error, as it retains the robustness of the original method.
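A very rough sketch of the shape of that override. The OnGetReplyAction signature can vary between versions, so copy the exact signature from EnsLib.HL7.Operation.ReplyStandard on your own instance; the lookup table name "HL7ErrorActions" is just an example:

Class Demo.HL7.Operation.TCPOperation Extends EnsLib.HL7.Operation.TCPOperation
{

Method OnGetReplyAction(pRequest As EnsLib.HL7.Message, Output pResponse As EnsLib.HL7.Message, ByRef pSC As %Status) As %String
{
    Set tAction = ""
    If $IsObject(pResponse) {
        //Walk the segments of the response looking for an ERR segment
        For i=1:1:pResponse.SegCount {
            Set tSeg = pResponse.GetSegmentAt(i)
            If $IsObject(tSeg) && (tSeg.Name = "ERR") {
                //ERR:3.9 holds the original text of the error in 2.5+ schemas
                Set tAction = ##class(Ens.Util.FunctionSet).Lookup("HL7ErrorActions", tSeg.GetValueAt("3.9"))
                If tAction '= "" Quit
            }
        }
    }
    //If the lookup gave us an action, use it; otherwise fall back to the standard ReplyCodeActions handling
    If tAction '= "" Quit tAction
    Quit ##super(pRequest, .pResponse, .pSC)
}

}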
go to post Julian Matthews · Oct 25, 2022 Hey Patty. If you simply need the empty NTE to be added in using the DTL, you can set its first field to an empty string to force the segment to appear - see the sketch below. Note that my example simply hardcodes the first NTE of the first repetition with no care for the content. You will likely need a for each where you evaluate whether the source NTE:1 has a value, and only set it to an empty string if there is no content in the source.
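A rough sketch of what that set action can look like in the DTL's XData (the virtual document path here is invented and will depend entirely on your target structure/DocType):

<assign value='""' property='target.{OBXgrp(1).NTE(1):1}' action='set' />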
go to post Julian Matthews · Oct 21, 2022 Hey Kev. The main way to build on this would be to use something like Prometheus and Grafana to pull the data out and display it in a human-readable way; this has been covered on the forums a few times. However, if you were to upgrade past IRIS 2020, you should find that you are able to utilise System Alerting and Monitoring (SAM) in your environment.
go to post Julian Matthews · Sep 28, 2022 So upon further review, it seems that the first ACK is being generated by the operation, and the second one is the body of the HTTP response. Basically, the operation will attempt to parse the HTTP response into an HL7 message, and if that doesn't happen, it will "generate" an ACK and write the HTTP response data at the end of the generated ACK. In my case, although there is an HL7 message in the response, it's not being parsed for some reason, so the code moves on to generating its own ACK, followed by the HTTP response body, which is the second ACK I was seeing. I'm now replicating the HTTP operation and attempting to pin down exactly where it's falling down; failing that, I will likely reach out to WRC as it seems to be an issue deeper than I can dive.
go to post Julian Matthews · Sep 21, 2022 If you have a record map configured and have the generated class, then the next step would be to use this generated class as the source of a transform. From there, you can transform the data into the SDA class you have set as the target of the transform. Once you have this SDA message, you can send it wherever your requirements need it to go.
go to post Julian Matthews · Aug 11, 2022 Your process is most likely using ..SendRequestAsync() to send to the operation and has "pResponseRequired" set to 1 (or not set at all, so it's using the default value of 1). There's nothing inherently wrong with this, but if you just want to send to the operation and not worry about the response going back to your process, you could change the "pResponseRequired" flag to 0 in your call. So it would look a little like this:

Set tSC = ..SendRequestAsync("TargetOperationName",ObjToSend,0)

However, you may wish to consider whether this approach is appropriate to your setup, or whether you would be better off using SendRequestSync() and dealing with the response synchronously.
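If you do go the synchronous route, a minimal sketch (the response variable name is just an example) would be:

//Send synchronously and wait for the operation's response
Set tSC = ..SendRequestSync("TargetOperationName", ObjToSend, .OpResponse)
If $$$ISERR(tSC) {
    //Handle the error from the call
}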
go to post Julian Matthews · Aug 10, 2022 To parse the JSON, the below is a starting point for reading the content of the stream into a dynamic object and then saving each value into its own variable:

Set DynamicObject = [].%FromJSON(pRequest.Stream)
Set Name = DynamicObject.name
Set DOB = DynamicObject.DOB
Set SSN = DynamicObject.SSN

You could then store these wherever you need to. If your SQL table is external, your operation could use the SQL outbound adapter to write these into your external DB. ETA: If you then need to pick out the values within the content of name (which I assume has come from an HL7 message), you could use $PIECE to pick the data out of the delimited string you're receiving, as in the sketch below.
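A minimal sketch, assuming the name arrives as an HL7-style caret-delimited string such as "SMITH^JOHN":

//Split the caret-delimited name into its components
Set FamilyName = $PIECE(Name, "^", 1)
Set GivenName = $PIECE(Name, "^", 2)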
go to post Julian Matthews · Jul 29, 2022 Hey Daniel. As a starting point, I would not add the trace when viewing the DTL in Studio; instead, I would add it when using the Data Transformation Builder. If this is not working for you, make sure the router has "Log Trace Events" enabled in its settings and that it has been restarted since enabling the trace. I have been caught out numerous times enabling the trace and then forgetting to restart the process/router in the production.
go to post Julian Matthews · Jul 21, 2022 The issue of no results being returned is likely elsewhere in your query. To test this, I created a basic table with the following:

CREATE TABLE Demo.BBQ (
    Name varchar(100),
    Type varchar(50),
    isActive bit
)

And I then added a few rows:

Insert into Demo.BBQ (Name, Type, isActive) VALUES ('Super Grill''s BBQ Hut','Outdoor',1)
Insert into Demo.BBQ (Name, Type, isActive) VALUES ('Bobs BBQ Bistro','Indoor',1)
Insert into Demo.BBQ (Name, Type, isActive) VALUES ('Rubbish Grill''s BBQ Resort','Not Known',0)

Note that the doubled-up single quotes used in the inserts are stored as single quotes in the table. Running a query using the LIKE function (and, if needed, excluding the inactive location) then returns the expected rows - see the queries below. Doubling up a single quote to escape it is not an InterSystems-specific approach; it is the standard SQL way of escaping a single quote.
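The two queries were along these lines (reconstructed here as a sketch, since the original screenshots aren't shown):

SELECT Name, Type, isActive
FROM Demo.BBQ
WHERE Name LIKE '%Grill''s%'

SELECT Name, Type, isActive
FROM Demo.BBQ
WHERE Name LIKE '%Grill''s%'
AND isActive = 1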
go to post Julian Matthews · Jun 20, 2022 That just means you're one step closer to solving the issue! What are the data types of each field you're attempting to insert into? Have you tried inserting a row with just one of the fields populated? Something like:

INSERT INTO Phu_Replay_Schema.ReplayMessageModel (Completed) VALUES (1)
go to post Julian Matthews · Jun 20, 2022 Hey Lewis. Could you try swapping out "true" for its bool equivalent? So try:

INSERT INTO Phu_Replay_Schema.ReplayMessageModel (Completed, MessageHeaderId, NewHeaderId, NewTargetName) VALUES (1, 3616, null, 'Router_ReplayHL7')
go to post Julian Matthews · Jun 13, 2022 Hey Ciaran. To overcome the issue of multiple ORCs overwriting your PIDgrp NTE in your copy, you will likely need to maintain a separate count of what is being copied over. In the DTL, that means initialising your own counter before the for each, using it (rather than the for each key) when setting the target, and incrementing it within the loop - there's a rough sketch of the idea below. You could then take it a step further by setting the value of your PIDgrp NTE to your count value before incrementing it. As a warning, I did find that the DTL builder went a bit weird when it comes to auto-applying the key counts: I was adding a for each on a repeating field and it assigned a key of, say, k1, and then when adding the sets within the for each loop it would randomly use k2 or k3 in the field. It might just be a limitation of how I was building things up, but it's one to keep an eye out for, as it'll give you unexpected results if it happens to you. Good luck :)
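A very rough sketch of the counter idea in DTL XData form. The virtual document paths (PIDgrpgrp, ORCgrp, NTE) are invented for illustration and depend entirely on your DocType, and the variable name tCount is just an example:

<assign value='1' property='tCount' action='set' />
<foreach property='source.{PIDgrpgrp(1).ORCgrp()}' key='k1'>
  <assign value='source.{PIDgrpgrp(1).ORCgrp(k1).NTE(1):1}' property='target.{PIDgrpgrp(1).NTE(tCount):1}' action='set' />
  <assign value='tCount+1' property='tCount' action='set' />
</foreach>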
go to post Julian Matthews · May 19, 2022 Hey Ephraim. I have thrown together a task which should do what you need. The code is a bit verbose and could be cut down a touch, but hopefully it's human-readable enough for you to pick out what it's doing. Effectively, it takes the current date to grab a date from last month, and then gets the first and last dates of that month to use in the audit export method.

Class Demo.Tasks.MonthlyAudit Extends %SYS.Task.Definition
{

Method OnTask() As %Status
{
    Set tSC = $$$OK
    //Get current date
    Set CurrentDatetime = $ZDATETIME($HOROLOG,3)
    //The report needs to be for last month, so get a date from last month based on today's date
    Set LastMonth = $SYSTEM.SQL.DATEADD("MM",-1,CurrentDatetime)
    //Get last day of last month as a $HOROLOG value
    Set LastDayHoro = $SYSTEM.SQL.LASTDAY(LastMonth)
    //Convert the $HOROLOG value into a date
    Set LastMonthEnd = $ZDATETIME(LastDayHoro,3)
    //Get first day of last month
    Set LastMonthStart = $SYSTEM.SQL.DATEPART("YYYY",LastMonthEnd)_"-"_$SYSTEM.SQL.DATEPART("MM",LastMonthEnd)_"-01"
    //Switch to the %SYS namespace
    ZNspace "%SYS"
    Set tSC = ##class(%SYS.Audit).Export("AuditExport.xml",,,LastMonthStart_" 00:00:00",LastMonthEnd_" 23:59:59")
    Quit tSC
}

}

Then, when setting the task up, I would set it to run on the first Monday of the month, and it will grab everything from the previous month.
go to post Julian Matthews · Mar 3, 2022 I believe the privilege comes from the OS user that is launching Terminal. Have you tried running Terminal as an Admin and seeing if it runs as expected?
go to post Julian Matthews · Feb 17, 2022 Hey Kurro. You should be able to include the allowed source in the rule constraint (the "Source" field of the rule's constraint in the Rule Editor). Alternatively, you could add a rule above it that deals with messages from any other source, for example by returning them; a sketch of both approaches is below.
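A rough sketch in rule XData form. The config item names are invented, and it's worth double-checking the exact XML against a rule exported from your own instance:

<!-- Option 1: constrain the rule so it only applies to messages from the allowed source -->
<rule name="OnlyFromAllowedSource" disabled="false">
  <constraint name="source" value="From_SystemA"></constraint>
  <when condition="1">
    <send transform="" target="To_SystemB"></send>
    <return></return>
  </when>
</rule>

<!-- Option 2: a rule placed above your existing rules that returns anything not from the allowed source -->
<rule name="BlockOtherSources" disabled="false">
  <when condition="source!=&quot;From_SystemA&quot;">
    <return></return>
  </when>
</rule>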
go to post Julian Matthews · Feb 1, 2022 Hey Stefan. Is the file saved as a class when trying to run the formatter? I found previously that it's not enough to just set the language in a new class, but the file needs to be saved as the formatter relies on the file extension.