Julian Matthews · Sep 21, 2022 Hey Marc. Thank you for sharing this - I have no idea how I've not come across this before!
Julian Matthews · Sep 21, 2022 If you have a record map configured and have the generated class, then the next step would be to use that generated class as the source of a transform. From there, you can transform the data into the SDA class you have set as the transform's target. Once you have this SDA message, you can send it wherever your requirements need it to go.
Julian Matthews · Sep 20, 2022 If you're working with a CSV, then you could look at the Record Mapper or the Complex Record Mapper, depending on the input format. From there, you could create a transform that uses the class generated by the record mapper as your source and the appropriate SDA class as your destination. However, if you're working with an actual xls/xlsx file, then I'm out of ideas.
Julian Matthews · Sep 20, 2022 Hey Marc. Firstly, thank you for sharing this. It does seem to closely follow what I had intended to do, with a slight variation or two. Would you mind giving some insight into the line "set tReadLen=..#CHUNKSIZE"? I'm not familiar with the # symbol being used in this way - is it acting as a modulo in this context?
Julian Matthews · Aug 11, 2022 Your Process is most likely using ..SendRequestAsync() to send to the Operation and has "pResponseRequired" set to 1 (or not set at all, so it's using the default value of 1). There's nothing inherently wrong with this, but if you just want to send to the Operation and not worry about the response going back to your Process, you could change the "pResponseRequired" flag to 0 in your call. So it would look a little like this:

Set tSC = ..SendRequestAsync("TargetOperationName",ObjToSend,0)

However, you may wish to consider whether this approach is appropriate for your setup, or whether you would be better off using SendRequestSync() and dealing with the response synchronously.
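For the synchronous route, a minimal sketch of what that call might look like (the target name, request object, and 30-second timeout are just placeholders carried over from the example above):

```objectscript
// Send to the operation and wait up to 30 seconds for the reply.
// pResponse is passed by reference and holds the operation's response on success.
Set tSC = ..SendRequestSync("TargetOperationName", ObjToSend, .pResponse, 30)
If $$$ISERR(tSC) {
    // Handle the failure rather than discarding it, e.g. log the status
    $$$LOGSTATUS(tSC)
}
```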
Julian Matthews · Aug 10, 2022 To parse the JSON, the below is a starting point for reading the content of the stream into a dynamic object and then saving each value into its own variable:

Set DynamicObject = [].%FromJSON(pRequest.Stream)
Set Name = DynamicObject.name
Set DOB = DynamicObject.DOB
Set SSN = DynamicObject.SSN

You could then store these wherever you need to. If your SQL table is external, then your Operation could use the SQL Outbound Adapter to write them to your external DB. ETA: If you then need to pick out the values within the content of name (which I assume has come from a HL7 message), you could use $PIECE to pick out the data from the delimited string you're receiving.
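For example, if name arrives in a caret-delimited HL7 style such as "Doe^John" (an assumed format for illustration), $PIECE can split it:

```objectscript
Set Name = DynamicObject.name        // e.g. "Doe^John"
Set Surname = $PIECE(Name,"^",1)     // first caret-delimited piece: "Doe"
Set Forename = $PIECE(Name,"^",2)    // second caret-delimited piece: "John"
```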
Julian Matthews · Jul 29, 2022 Hey Daniel. As a starting point, I would not add the trace when viewing the DTL in Studio; instead I would add it when using the Data Transformation Builder: (screenshot) Which gives me: (screenshot) If this is not working for you, make sure that the router has "Log Trace Events" enabled in its settings and that the router has been restarted since enabling the trace. I have been caught out numerous times by enabling the trace and then forgetting to restart the process/router in the production.
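For completeness, the same kind of trace can be emitted from code in a production host class via the $$$TRACE macro (the HL7 path below is purely illustrative):

```objectscript
// Writes a Trace entry to the Event Log when "Log Trace Events"
// is enabled for the host emitting the trace
$$$TRACE("Transforming message for MRN "_source.GetValueAt("PID:3.1"))
```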
Julian Matthews · Jul 21, 2022 The issue of no results being returned is likely elsewhere in your query. To test this, I created a basic table with the following:

CREATE TABLE Demo.BBQ (
  Name varchar(100),
  Type varchar(50),
  isActive bit)

And I then added a few rows:

Insert into Demo.BBQ (Name, Type, isActive) VALUES ('Super Grill''s BBQ Hut','Outdoor',1)
Insert into Demo.BBQ (Name, Type, isActive) VALUES ('Bobs BBQ Bistro','Indoor',1)
Insert into Demo.BBQ (Name, Type, isActive) VALUES ('Rubbish Grill''s BBQ Resort','Not Known',0)

This then gave me a table that looks like this (note that the doubled single quotes used in the inserts are stored as single quotes in the table): (screenshot) If I then run a query using the LIKE function: (screenshot) And if I want to exclude the inactive location: (screenshot) Doubling up a single quote to escape the character is not an InterSystems-specific approach, but is the standard SQL way of escaping a single quote.
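The queries in the screenshots were along these lines (a sketch of the escaped-quote LIKE against the Demo.BBQ table above):

```sql
-- The doubled single quote inside the literal matches a single quote in the data
SELECT Name, Type FROM Demo.BBQ WHERE Name LIKE '%Grill''s%'

-- And excluding the inactive location:
SELECT Name, Type FROM Demo.BBQ WHERE Name LIKE '%Grill''s%' AND isActive = 1
```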
Julian Matthews · Jul 20, 2022 Lose one of the single quotes where my arrow is (there was a typo in my first reply). That way your final line should be: like '%Forceps McGill''s Neonatal NETS%' and i.Active = 1
Julian Matthews · Jul 20, 2022 There are two ways around this: If this is something you're running as a one-off in, say, the SQL query tool within the Management Portal, then you will want to double the single quote, as in "Select * from cnd.Facilities where name like '%Grill''s BBQ%'", and this will escape the single quote. If this is a query being called within your code and the value could be anything passed to it, then you will want to use parameterised SQL queries. An example can be found for SQL adapters here.
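As a sketch of the parameterised approach (using %SQL.Statement directly rather than the adapter, with the table and value from the example above):

```objectscript
// The ? placeholder is bound to the value at execution time,
// so the single quote in the data needs no escaping at all
Set tResult = ##class(%SQL.Statement).%ExecDirect(,
    "SELECT * FROM cnd.Facilities WHERE name LIKE ?", "%Grill's BBQ%")
If tResult.%SQLCODE < 0 {
    Write "SQLCODE: ", tResult.%SQLCODE, !
}
While tResult.%Next() {
    Write tResult.%Get("name"), !
}
```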
Julian Matthews · Jul 20, 2022 Hey Scott. I'm not sure if this will work at all, but have you tried extending the timeout for your CSP gateway? Management Portal -> System Administration -> Configuration -> Web Gateway Management is the route to this. The username you'll then need is "CSPSystem" and the password should be the default password used when installing the system. From within here you can navigate to "Default Parameters" and increase the Server Response Timeout parameter.
Julian Matthews · Jun 20, 2022 That just means you're one step closer to solving the issue! What are the data types of each field you're attempting to insert into? Have you tried inserting a row with just one of the fields populated? Something like:

INSERT INTO Phu_Replay_Schema.ReplayMessageModel (Completed) VALUES (1)
Julian Matthews · Jun 20, 2022 Hey Lewis. Could you try swapping out "true" for its bool equivalent? So try:

INSERT INTO Phu_Replay_Schema.ReplayMessageModel (Completed, MessageHeaderId, NewHeaderId, NewTargetName) VALUES (1, 3616, null, 'Router_ReplayHL7')
Julian Matthews · Jun 14, 2022 I have to admit, I'm not familiar with where that code has come from, so it's difficult to comment on the syntax. That said, I think I can confidently say that your first line has the OBXgrp hard-coded to the first repetition, while the second line has it set to k2. You will want a ForEach for the OBXgrp and a separate one for the NTEs within the OBXgrp.
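In DTL XML terms, the nested loops would look something like the sketch below. The group paths and the :3 field reference (NTE-3, the comment field) are schema-dependent, so treat them as illustrative only:

```xml
<foreach property='source.{ORCgrp().OBXgrp()}' key='k1'>
  <foreach property='source.{ORCgrp(1).OBXgrp(k1).NTE()}' key='k2'>
    <assign property='target.{ORCgrp(1).OBXgrp(k1).NTE(k2):3}'
            value='source.{ORCgrp(1).OBXgrp(k1).NTE(k2):3}'
            action='set'/>
  </foreach>
</foreach>
```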
Julian Matthews · Jun 13, 2022 Hey Ciaran. To overcome the issue of multiple ORCs overwriting your PIDgrp NTE in your copy, you will likely need to maintain a separate count of what is being copied over. So you could look to do something similar to this (pay specific attention to actions 2, 9, and 10): (screenshot) Which would then give the following result: (screenshot) You could then take it a step further by setting the value of your PIDgrp NTE to your count value before incrementing it: (screenshot) Which would then give you: (screenshot) As a warning, I did find that the DTL builder went a bit weird when it came to auto-applying the key counts. So I would add a ForEach on a repeating field and it would assign a key of, say, k1, but when adding the sets within the ForEach loop, it would randomly use k2 or k3 in the field. It might just be a limitation of how I was building things up, but it's one to keep an eye out for, as it'll give you unexpected results if it happens to you. Good luck :)
Julian Matthews · Jun 13, 2022 Hey Eduard, thanks for your detailed reply. As there is no source message to send in my specific use case, a registered class seems to be the way to go!
Julian Matthews · May 19, 2022 Hey Ephraim. I have thrown together a task which should do what you need. The code is a bit verbose and could be cut down a touch, but hopefully it's human-readable enough for you to pick out what it's doing. Effectively, it takes the current date, grabs a date from last month, and then gets the first and last dates of that month to use in the audit method.

Class Demo.Tasks.MonthlyAudit Extends %SYS.Task.Definition
{

Method OnTask() As %Status
{
    Set tSC = $$$OK
    // Get the current date
    Set CurrentDatetime = $ZDATETIME($HOROLOG,3)
    // The report needs to be for last month, so get a date from last month based on today's date
    Set LastMonth = $SYSTEM.SQL.DATEADD("MM",-1,CurrentDatetime)
    // Get the last day of last month as a horolog
    Set LastDayHoro = $SYSTEM.SQL.LASTDAY(LastMonth)
    // Convert the horolog into an ODBC-format date
    Set LastMonthEnd = $ZDATETIME(LastDayHoro,3)
    // Get the first day of last month (reuse YYYY-MM from the month end so the zero padding is kept)
    Set LastMonthStart = $EXTRACT(LastMonthEnd,1,7)_"-01"
    // Switch to the %SYS namespace
    ZNspace "%SYS"
    Set tSC = ##class(%SYS.Audit).Export("AuditExport.xml",,,LastMonthStart_" 00:00:00",LastMonthEnd_" 23:59:59")
    Quit tSC
}

}

Then, when setting the task up, I would set it to run on the first Monday of the month, and it will grab everything from the previous month.
Julian Matthews · Apr 14, 2022 Product Alerts & Advisories (link): HS2022-01-Communication (link)
Julian Matthews · Apr 11, 2022 Hey Ephraim. You will see from looking at the class method being called that there is a start date parameter, which Michael left blank so that it exports everything up to the end date. In your case, you could do the following to fulfil your example:

##class(%SYS.Audit).Export("AuditExport.xml",,,"2022-03-01 00:00:00","2022-04-11 23:59:59")

However, this is only useful for your specific date range, which is where Michael's use of $ZDT and $H comes into play. If you wanted to execute the task and have it return the last 30 days, you could do this:

##class(%SYS.Audit).Export("AuditExport.xml",,,$zdt($h-30,3)_" 00:00:00",$zdt($h,3)_" 23:59:59")