Hey Lionel.

I did write an article about this a little while ago which I hope can walk you through what you're looking to achieve. The difference is that you'd pivot from using the ORCGrp counts as I did, and instead use RXCgrp, setting the transform to have a source and target class of RDE O11.

If you do try to follow this approach, I'm more than happy to answer any questions you have.

If you have a record map configured and have the generated class, then the next step would be to use this generated class in a transform as your source. From there, you can transform the data into the SDA class you have set in your target for transform.
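If you want to try the transform out from a terminal session before wiring it into a production, you can call the generated DTL's Transform() method directly. The class and property names below (Demo.Record.MyRecordMap, Demo.DTL.RecordToSDA, PatientName) are placeholders for your own generated record map and transform classes:

```objectscript
// Build a sample source message using the generated record-map class
// (all class/property names here are placeholders for your own)
Set source = ##class(Demo.Record.MyRecordMap).%New()
Set source.PatientName = "SMITH^JOHN"

// Run the DTL; Transform() returns a %Status and hands back the target by reference
Set tSC = ##class(Demo.DTL.RecordToSDA).Transform(source, .target)
If $$$ISERR(tSC) { Do $System.Status.DisplayError(tSC) }
```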

Once you have this SDA message, you can send it wherever your requirements need it to go.

If you're working with a CSV, then you could look at the Record Mapper or Complex Record Mapper depending on the input format. From there, you could then create a transform that uses the generated class from the record mapper as your source, and then the appropriate SDA class as your destination.

However, if you're working with an actual xls/xlsx file, then I'm out of ideas.

Your Process is most likely using ..SendRequestAsync() to send to the Operation and has "pResponseRequired" set to 1 (or not set at all, so it's using the default value of 1).

There's nothing inherently wrong with this, but if you just want to send to the Operation and not worry about the response going back to your process, you could change the "pResponseRequired" flag to 0 in your call. So it would look a little like this:

Set tSC = ..SendRequestAsync("TargetOperationName", ObjToSend, 0)

However, you may wish to consider whether this approach is appropriate for your setup, or if you would be better off using SendRequestSync() and dealing with the response synchronously.
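For comparison, a synchronous call would look something like this (the target name is a placeholder, and the response variable name is just an example):

```objectscript
// Send and wait for the Operation's reply; the response comes back by reference
Set tSC = ..SendRequestSync("TargetOperationName", ObjToSend, .pResponse)
If $$$ISERR(tSC) { Quit tSC }
// pResponse now holds the Operation's response message
```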

To parse the JSON, the below is a starting point for reading the content of the stream into a dynamic object, and then saving each value into its own variable.

Set DynamicObject = {}.%FromJSON(pRequest.Stream)
Set Name = DynamicObject.name
Set DOB = DynamicObject.DOB
Set SSN = DynamicObject.SSN

You could then store these wherever you need to. If your SQL table is external, then you could have your Operation use the SQL Outbound Adapter to write these into your external DB.
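A minimal sketch of such an Operation, assuming the table and column names below (Demo.Patient with Name, DOB and SSN columns) match your external schema, and that you have a request message class carrying those three properties:

```objectscript
Class Demo.Operation.PatientSQL Extends Ens.BusinessOperation
{

Parameter ADAPTER = "EnsLib.SQL.OutboundAdapter";

/// Insert one patient row; table and message class names are placeholders
Method InsertPatient(pRequest As Demo.Msg.Patient, Output pResponse As Ens.Response) As %Status
{
    // Parameterised insert; the adapter manages the DSN/connection settings
    Set tSQL = "INSERT INTO Demo.Patient (Name, DOB, SSN) VALUES (?, ?, ?)"
    Set tSC = ..Adapter.ExecuteUpdate(.tRowsAffected, tSQL, pRequest.Name, pRequest.DOB, pRequest.SSN)
    Set pResponse = ##class(Ens.Response).%New()
    Quit tSC
}

}
```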

ETA: If you then need to pick out the values within the content of name (which I assume has come from an HL7 message), you could use $PIECE to pick out the data from the delimited string you're receiving.
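For example, if Name arrived as an HL7-style caret-delimited string (the sample value here is made up):

```objectscript
Set Name = "SMITH^JOHN^A"
// $PIECE(string, delimiter, n) returns the nth delimited piece
Set FamilyName = $PIECE(Name, "^", 1)  // "SMITH"
Set GivenName  = $PIECE(Name, "^", 2)  // "JOHN"
```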

Hey Daniel.

As a starting point, I would not be adding the trace when viewing the DTL in Studio, and instead I would add it when using the Data Transformation Builder:

Which gives me:

If this is not working for you, make sure that the router has "Log Trace Events" enabled in its settings and that it has been restarted since enabling the trace. I have been caught out numerous times by enabling the trace and then forgetting to restart the process/router in the production.

The issue of no results being returned is likely elsewhere in your query.

To test this, I created a basic table with the following:

CREATE TABLE Demo.BBQ (
    Name varchar(100),
    Type varchar(50),
    isActive bit
)

And I then added a few rows:

INSERT INTO Demo.BBQ (Name, Type, isActive)
VALUES ('Super Grill''s BBQ Hut', 'Outdoor', 1)

INSERT INTO Demo.BBQ (Name, Type, isActive)
VALUES ('Bobs BBQ Bistro', 'Indoor', 1)

INSERT INTO Demo.BBQ (Name, Type, isActive)
VALUES ('Rubbish Grill''s BBQ Resort', 'Not Known', 0)

This then gave me a table that looks like this (note that the doubled single quotes used in the inserts are stored as single quotes in the table):

If I then run a query using LIKE:
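The query itself was in a screenshot, but it would be along these lines (the search term is just an example, again using a doubled single quote to escape the apostrophe):

```sql
SELECT Name, Type, isActive
FROM Demo.BBQ
WHERE Name LIKE '%Grill''s%'
```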

And if I want to exclude the inactive location:
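Again as a sketch, this is the same query with a filter added on the isActive column:

```sql
SELECT Name, Type, isActive
FROM Demo.BBQ
WHERE Name LIKE '%Grill''s%'
  AND isActive = 1
```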

Doubling up a single quote to escape it is not an InterSystems-specific approach; it is the standard SQL way of escaping a single quote.